
Wednesday, August 29, 2018

PowerShell in Azure Functions - Lessons Learned

I love writing PowerShell in Azure Functions - not having to worry about a VM (or VMs) is a mixed blessing, and I hope to share a few tips that will result in fewer hairs being torn.

Forget the Write-* cmdlets except Write-Output. Write-Error works just like throw (which I would then much prefer to use).
It can be inconvenient that Write-Verbose output in (ordinary PowerShell) functions is lost, and you cannot use Write-Output inside a function instead, as that would go to the output of the function, not the log stream. But if you really need that verbose output there is a trick: you can redirect it (read more on that here). Try running the code below.


# this will do nothing
Write-Verbose "Verbose"
# redirect verbose stream
Write-Verbose "Verbose redirected to success stream" -Verbose 4>&1
# verbose output in function
function function_with_verbose {
    [CmdletBinding()]
    param ()
    Write-Verbose "this is verbose"
    Write-Verbose "more verbose"
    # output result
    4
}
# redirect verbose stream
$result = function_with_verbose -Verbose 4>&1
# assuming just a single output object
Write-Output "output from function"
($result | Select-Object -Last 1)
Write-Output "The verbose stream"
# everything but the last
$result[0..($result.length-2)]

Depending on your use case this may or may not be worth the trouble. I expect that at some point the other streams will be shown in the logging output.

There is some documentation on importing modules in a Function App, but what worked best for me was to first use Save-Module to download the module to disk. Then, under Platform features in the app, there is an item called Advanced tools (Kudu). Click it and a new tab opens. At the top click Debug Console and select either CMD or PowerShell.
I usually create a new folder (the big + sign) in the root, lib, and inside that another folder, modules. Here you can drag and drop the module folder you just downloaded.
You can zip the module folders before uploading if you like; they are unzipped automatically. Note down the full path to the psd1 file that you will import. Importing in the function app is then simply

Import-Module "D:\home\lib\modules\AzureRM.profile\5.3.4\AzureRM.Profile.psd1"

I have often seen a -Global appended to this command. I am not sure why; I have had no luck getting global variables to work. This leads to my next point: when using any of the Azure PowerShell modules you need to authenticate, using e.g. Login-AzureRmAccount. The problem is that if you have multiple functions running at the same time, their contexts will leak into each other - especially something like Select-AzureRmSubscription will mess you up!

Luckily there is a solution for this (the same applies to Login-AzureRmAccount).

$DefaultProfile = Select-AzureRmSubscription -SubscriptionId $SubscriptionId -Tenant $TenantId -Scope Process

The $DefaultProfile is then passed to all subsequent calls, e.g.

Get-AzureRmResource -ResourceType 'Microsoft.DevTestLab/labs/virtualMachines' -ResourceGroupName $ResourceGroupName -ExpandProperties -DefaultProfile $DefaultProfile

Now, if a different instance of the same function runs at the same time, it will not interfere. As it is tedious to add this parameter everywhere, you can use $PSDefaultParameterValues, which also removes the risk of forgetting it somewhere.

$PSDefaultParameterValues = @{'*:DefaultProfile' = $DefaultProfile}
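To see the mechanism in isolation, here is a small local sketch - Get-Thing is a made-up function standing in for the AzureRM cmdlets, not a real cmdlet:

```powershell
# $PSDefaultParameterValues supplies a parameter to every matching cmdlet or
# advanced function; '*:DefaultProfile' means "any command, parameter DefaultProfile"
function Get-Thing {
    [CmdletBinding()]
    param($DefaultProfile)
    "using profile: $DefaultProfile"
}

$PSDefaultParameterValues = @{'*:DefaultProfile' = 'context-A'}
Get-Thing                               # -> "using profile: context-A"

# An explicit argument still wins over the default
Get-Thing -DefaultProfile 'context-B'   # -> "using profile: context-B"
```

Note that the defaults only apply to cmdlets and advanced functions (hence the [CmdletBinding()]).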

I use a Managed Service Identity to login to Azure. Under Platform features there is an item "Managed service identity" - click it and select On.

To run the code below you need the MSI application id. You find it in Azure Active Directory under App registrations (select All apps) and search for your function app name. Copy the application id. I have added it to Application settings, which makes it accessible via $env:ApplicationId:


$apiVersion = "2017-09-01"
$resourceURI = "https://management.azure.com/"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret"="$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token
$DefaultProfile = Login-AzureRmAccount -Tenant $TenantId -AccountId $env:ApplicationId -AccessToken $accessToken -Scope Process

Note that the login may fail if the MSI has access to no resources in any subscription. Then again, an MSI without access to anything would be rather pointless.

Migrate Azure function app from consumption plan

If your function app runs on a consumption plan and you have ever considered moving it to a different plan, then this may have stopped you

Being greyed out does not mean it is impossible, only that there is no portal support for it yet. Luckily it is possible using PowerShell. The easiest way is probably to start the Azure Cloud Shell (top bar in the portal)

The shell pops up at the bottom of the browser. Where it reads Bash, click and select PowerShell.
First we need to select the relevant subscription (you are already logged in). If you only have a single subscription in your tenant, skip this step.
The subscription id is what follows /subscriptions/ in the browser's URL, e.g.

https://portal.azure.com/#@mytenant.onmicrosoft.com/resource/subscriptions/9734191f-63d9-4b3d-880e-8de9a40942f2
/resourceGroups/rgfuncapp/providers/Microsoft.Web/sites/funcapp/appServices

Copy this value and enter (paste using mouse right click)

Select-AzureRmSubscription -SubscriptionId your_subscription_id
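If you would rather not pick the id out of the URL by eye, a regular expression can extract it for you - a small convenience sketch using the example URL from above:

```powershell
# The subscription id is the GUID following /subscriptions/ in the portal URL
$url = 'https://portal.azure.com/#@mytenant.onmicrosoft.com/resource/subscriptions/9734191f-63d9-4b3d-880e-8de9a40942f2/resourceGroups/rgfuncapp'
if ($url -match '/subscriptions/([0-9a-fA-F-]{36})') {
    $SubscriptionId = $Matches[1]
}
$SubscriptionId   # 9734191f-63d9-4b3d-880e-8de9a40942f2
```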

Before continuing make sure you have a new plan (the one you wish to move to) in the same resource group and in the same region.

To move you need just a single command

Set-AzureRmWebApp -Name "[function name]" -ResourceGroupName "[resource group]" -AppServicePlan "[new app service plan name]"

You need to reload the browser before you see the change in the portal.

Please refer to https://github.com/Azure/Azure-Functions/issues/155 for further details. As you may notice, this blog post simply elaborates on a comment made on that issue, but it took me a while to find it, so hopefully this helps someone while we wait for portal support.



Friday, August 24, 2018

PowerShell: MS Graph API authentication with Service Principal

I had to access the MS Graph API from an Azure Function App, and after wasting some time trying to get it to work with a Managed Service Identity (you can get a token, but you cannot assign the MSI any roles, yet), I opted for the good ol' Service Principal (SP).

There are several blog posts on how to get a token for various Microsoft APIs, and most of the code is very similar, but they all lack one essential detail; without it you may get varying results.

I experienced getting a token the API claimed was invalid, getting one that was expired, not getting a token because the SP secret was supposedly incorrect (it was not), and sometimes just an error that no overload of the method could be found. Clearly I was doing something wrong.

For good measure, here is a short guide on creating a working SP. Go to Azure AD -> App Registrations and create a new app registration.
The application type is web app/API; you can put anything in the sign-on URL.


This will also create an enterprise application.

In the newly registered app click Settings, and under API Access click Required permissions. Add a new permission, select the Microsoft Graph API, and check off the permissions needed.



When done you need to click Grant Permissions.

Next click Keys. Fill in a description, select when the key should expire and click Save. The key will be generated and shown. Save this for later.

Back in your app registration copy the Application ID.

Next we need a dll file, Microsoft.IdentityModel.Clients.ActiveDirectory.dll - this is the main contribution of this blog post. It just so happens that there are many different versions of it, and you need the right one to get a working token.
I found that the one in AzureRm.Profile 5.3.4 works just fine; I would guess versions close to it behave the same. You can get it using Save-Module:

Save-Module AzureRm.Profile -RequiredVersion 5.3.4 -Path C:\Temp

Now find Microsoft.IdentityModel.Clients.ActiveDirectory.dll and use Add-Type to load it

Add-Type -Path "C:\Temp\AzureRM.profile\5.3.4\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"

Next we need an important piece of information. Run this in a fresh PowerShell session:

[appdomain]::currentdomain.getassemblies() | Where-Object {$_.fullname -like "Microsoft.IdentityModel.Clients.ActiveDirectory*"} | Select-Object -Property Fullname

The result is what we need in the following function: it lets us specify exactly which dll we are referring to. There could be many versions of it loaded, and if the wrong one is used we get an undesirable result.
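The same assembly-qualified syntax works for any .NET type, so you can try it in isolation before wiring it into the function below - StringBuilder is used here purely as a harmless example:

```powershell
# Every .NET type knows its assembly-qualified name - the same
# "Type, Assembly, Version=..., Culture=..., PublicKeyToken=..." format
# used in the bracketed type literals in the function below
$aqn = [System.Text.StringBuilder].AssemblyQualifiedName
$aqn

# PowerShell accepts the full name in a type literal (here via a [type] cast),
# which pins the type to one specific assembly version
$sb = ([type]$aqn)::new()
$null = $sb.Append('pinned to a specific assembly')
$sb.ToString()
```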

Function Get-AADToken {
    [CmdletBinding()]
    [OutputType([string])]
    Param (
        [String]$TenantID,
        [string]$ServicePrincipalId,
        [securestring]$ServicePrincipalPwd,
        $resourceAppIdURI = 'https://graph.microsoft.com/'
    )
    Try {
        # Set Authority to Azure AD Tenant
        $authority = 'https://login.windows.net/' + $TenantId

        $ClientCred = [Microsoft.IdentityModel.Clients.ActiveDirectory.ClientCredential, Microsoft.IdentityModel.Clients.ActiveDirectory, Version=2.28.3.860, Culture=neutral, PublicKeyToken=31bf3856ad364e35]::new($ServicePrincipalId, $ServicePrincipalPwd)
        $authContext = [Microsoft.IdentityModel.Clients.ActiveDirectory.AuthenticationContext, Microsoft.IdentityModel.Clients.ActiveDirectory, Version=2.28.3.860, Culture=neutral, PublicKeyToken=31bf3856ad364e35]::new($authority)
        # AcquireTokenAsync returns a Task - wait for it to complete so that
        # Exception and Result are populated before we inspect them
        $authResult = $authContext.AcquireTokenAsync($resourceAppIdURI, $ClientCred)
        $null = [System.Threading.Tasks.Task]::WaitAny($authResult)

        if($authResult.Exception)
        {
            throw $authResult.Exception.InnerException.Message
        }

        $Token = $authResult.Result.AccessToken
    }
    Catch {
        Throw $_
    }
    $Token
}

The function is called as follows

# load this specific Microsoft.IdentityModel.Clients.ActiveDirectory.dll
Add-Type -Path "C:\temp\AzureRM.profile\5.3.4\Microsoft.IdentityModel.Clients.ActiveDirectory.dll"

# your Azure AD tenant
$TenantId = '55da3b96-2993-4ef3-ad6f-f0a066401f60'

# the application id from the app registration
$AppId = '135fee95-c7c3-48f5-9821-fcaf29fd8a3c'

# the key we created - obviously do not store this in cleartext
$ServicePrincipalPwd = '5a1mXQYcZNZADD8h2lSYxzSGHSF0U+chrpk0L5E0Cgw=' | ConvertTo-SecureString -AsPlainText -Force
# get the token
$Token = Get-AADToken -TenantID $TenantId -ServicePrincipalId $AppId -ServicePrincipalPwd $ServicePrincipalPwd

Now that we have a token, it is time to put it to work

$Headers = @{
    "Authorization" = "Bearer $token"
}

try {
    $Response = Invoke-RestMethod -Uri 'https://graph.microsoft.com/v1.0/users/' -Method Get -UseBasicParsing -Headers $Headers
}
catch {
    $_
    $_.Exception.ErrorDetails.Message
}

Saturday, July 28, 2018

Azure Function App - Frontend and Backend

There are (many) different ways function apps can call other function apps. Perhaps the most obvious (classic) way is making a web request from one function endpoint to another. My "frontend" functions are protected with App Service Authentication - one must log in with Azure AD to authenticate oneself (use the "express" settings to get this configured quickly).

Once configured add your users to the Managed application in the "Users and groups" tab.


These users will be allowed access to all the functions in your Function App. That seems pretty secure! You can even add Conditional Access to the application for added security.

The only problem is that if you want to make requests to other functions in the same Function App, then the calling function would also have to authenticate, and I have so far given up on getting that to work.

So I had to cook up an alternative. What I arrived at is having two Function App instances: one is the frontend, with authentication done using AAD as mentioned before; the backend is not protected by AAD authentication, but you do need a function key to access a given function (i.e. no anonymous calls to the endpoint), and we can encrypt the response (also with a key). Both keys are stored in Azure Key Vault.

Create two function apps and a key vault. In the key vault, create a secret called encryptionKey whose value is 32 characters long (256 bits), and another secret named after the backend function whose value is that function's key (found in the Manage tab of the function, named default).
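Any method that produces 32 characters will do for the encryptionKey secret; here is one quick sketch:

```powershell
# Generate a random 32-character alphanumeric key; 32 single-byte UTF-8
# characters give the 256-bit AES key the encryption code below expects
$pool = ([char]'a'..[char]'z') + ([char]'A'..[char]'Z') + ([char]'0'..[char]'9')
$encryptionKey = -join [char[]](Get-Random -InputObject $pool -Count 32)
$encryptionKey.Length   # 32
```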


The next step is to enable Managed service identity on both function apps. You can do this under Platform features, the same place you find Application settings. Then note down the application id of each function app; you find these in the Azure portal under Azure Active Directory -> Enterprise applications, named the same as your function apps.
Add these values to their respective Application settings under the name ApplicationId.

In both Function Apps create a PowerShell Http trigger function.

Code for the frontend

# get a token for the key vault
$apiVersion = "2017-09-01"
$resourceURI = "https://vault.azure.net"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret"="$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token

# remember to set these
if(-not $accessToken) {throw "unable to fetch access token"}
if(-not $env:ApplicationId) {throw "application id not set in environmental settings"}

# get the function key first
# get the base url from the "overview" tab in the key vault
$secretName = 'somebackend'
$uri = "https://cbfuncappkv.vault.azure.net/secrets/{0}?api-version=2016-10-01" -f $secretName
$Headers = @{Authorization ="Bearer $accessToken"}
$KeyvaultResponse = (Invoke-WebRequest -UseBasicParsing -Uri $uri -Method Get -Headers $Headers).Content | ConvertFrom-Json
# get the value of the secret
$FunctionKey = $KeyvaultResponse | Select-Object -ExpandProperty value

# now ready to make a request to the backend
$uri = "https://cbfuncappbackend.azurewebsites.net/api/somebackend?code={0}" -f $FunctionKey
$Headers = @{'content-type' = "application/x-www-form-urlencoded"}
# oh oh, the response we got back is encrypted!
$EncryptedOutput = (Invoke-WebRequest -UseBasicParsing -Uri $uri -Method Get -Headers $Headers).Content | ConvertFrom-Json

# retrieve the encryption key from key vault
$secretName = 'encryptionKey'
# get the base url from the "overview" tab in the key vault
$uri = "https://cbfuncappkv.vault.azure.net/secrets/{0}?api-version=2016-10-01" -f $secretName
$Headers = @{Authorization ="Bearer $accessToken"}
$KeyvaultResponse = (Invoke-WebRequest -UseBasicParsing -Uri $uri -Method Get -Headers $Headers).Content | ConvertFrom-Json
# get the value of the secret
$encryptionKey = $KeyvaultResponse | Select-Object -ExpandProperty value

$Key = ([system.Text.Encoding]::UTF8).GetBytes($encryptionKey)
# decrypt the secure string
$DecryptedSecureString = $EncryptedOutput | ConvertTo-SecureString -Key $Key
# copy the content of the secure string into unmanaged memory
$ptr = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($DecryptedSecureString)
# convert the BSTR to a string, then zero and free the unmanaged copy
$DecryptedOutput = [System.Runtime.InteropServices.Marshal]::PtrToStringBSTR($ptr)
[System.Runtime.InteropServices.Marshal]::ZeroFreeBSTR($ptr)

# html part - build the response page around the decrypted output
$html = @"
<head><style>$style</style></head>
<title>Hello PS Backend</title>
<h1>Hello PS Backend</h1>
<h5>Time is $(Get-Date)</h5>
$DecryptedOutput
"@

# output as a webpage
@{
    headers = @{ "content-type" = "text/html"}
    body    = $html
} | ConvertTo-Json | Out-File -Encoding Ascii -FilePath $res

And for the backend


# get a token for the key vault
$apiVersion = "2017-09-01"
$resourceURI = "https://vault.azure.net"
$tokenAuthURI = $env:MSI_ENDPOINT + "?resource=$resourceURI&api-version=$apiVersion"
$tokenResponse = Invoke-RestMethod -Method Get -Headers @{"Secret"="$env:MSI_SECRET"} -Uri $tokenAuthURI
$accessToken = $tokenResponse.access_token

# remember to set these
if(-not $accessToken) {throw "unable to fetch access token"}
if(-not $env:ApplicationId) {throw "application id not set in environmental settings"}

# retrieve the encryption key from key vault
$secretName = 'encryptionKey'
# get the base url from the "overview" tab in the key vault
$uri = "https://cbfuncappkv.vault.azure.net/secrets/{0}?api-version=2016-10-01" -f $secretName
$Headers = @{Authorization ="Bearer $accessToken"}
$KeyvaultResponse = (Invoke-WebRequest -UseBasicParsing -Uri $uri -Method Get -Headers $Headers).Content | ConvertFrom-Json
# get the value of the secret
$encryptionKey = $KeyvaultResponse | Select-Object -ExpandProperty value

# secure and encrypt the below output
$Output = "Hello from the backend" 
# convert our encryption key to byte array, if string is 32 characters, we get 8*32=256 bit encryption
$Key = ([system.Text.Encoding]::UTF8).GetBytes($encryptionKey)
# convert to secure string, then to en encrypted string (the string must be secure before it can be encrypted)
$EncryptedOutput = $Output | ConvertTo-SecureString -AsPlainText -Force | ConvertFrom-SecureString -key $key

# write encrypted output
Out-File -Encoding Ascii -FilePath $res -inputObject $EncryptedOutput
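The backend's encryption and the frontend's decryption can be exercised together on any machine, no Azure required (the key below is a placeholder, not a real secret):

```powershell
# A 32-character key gives ConvertFrom-SecureString a 256-bit AES key
$encryptionKey = '0123456789abcdef0123456789abcdef'
$Key = [System.Text.Encoding]::UTF8.GetBytes($encryptionKey)

# Backend side: plain text -> secure string -> encrypted string
$Output = 'Hello from the backend'
$EncryptedOutput = $Output | ConvertTo-SecureString -AsPlainText -Force |
    ConvertFrom-SecureString -Key $Key

# Frontend side: encrypted string -> secure string -> plain text
$DecryptedSecureString = $EncryptedOutput | ConvertTo-SecureString -Key $Key
$ptr = [System.Runtime.InteropServices.Marshal]::SecureStringToBSTR($DecryptedSecureString)
$DecryptedOutput = [System.Runtime.InteropServices.Marshal]::PtrToStringBSTR($ptr)
[System.Runtime.InteropServices.Marshal]::ZeroFreeBSTR($ptr)
$DecryptedOutput   # Hello from the backend
```

Because both sides derive the key from the same secret, anything encrypted in one function can be decrypted in the other.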

Lastly, we need to grant both function apps access to the secrets in the key vault; the Get operation on secrets is sufficient.



Optionally enable AAD authentication on the frontend Function App before running the example, and in that case remember to add your own user!

For added security you could add a timer-triggered function that resets the keys in the key vault at regular intervals. To make sure matching encryption keys are used at both ends, you could include the version of the encryption key as part of the response.
I also think you can use service endpoints on the key vault so that only these functions are able to retrieve the keys in the first place.

The result should look like this


Wednesday, July 25, 2018

Simple website in Azure Function App written in PowerShell

Just as the title says, in this post I will show how to write a simple website using an Azure Function App in the still "experimental" language PowerShell. You can skip ahead and view the result here.

Doug Finke already showed how to do this, but in his example you need to write the HTML code yourself. Being the lazy programmer I am, I wanted to use ConvertTo-Html.

I am assuming you are familiar with rolling a Function App. Go ahead and create an HTTP trigger function with the language set to PowerShell.


Name it however you like and leave other settings to default.

The real magic happens with the discovery of the -Fragment switch on ConvertTo-Html. It gives you only the body markup, meaning you can combine multiple fragments - and that is exactly what is needed to output HTML from a Function App.
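You can see the difference locally: without -Fragment you get a complete HTML document, with it just a table you can embed wherever you like.

```powershell
$data = [pscustomobject]@{ Name = 'AzureRM.profile'; Version = '5.3.4' }

# Full document: <html>, <head> and <body> wrappers included
$full = ($data | ConvertTo-Html) -join "`n"

# Fragment: just the <table> element, ready to drop into your own page
$fragment = ($data | ConvertTo-Html -Fragment) -join "`n"
$fragment
```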

The code part is pretty basic. We have some CSS, some semi-static HTML, and then using ConvertTo-Html to output available PS-modules.


# inline CSS - stole this somewhere, sorry dude, can't remember
$style = @"
h1, h2, h3, h4, h5, th { text-align: center; }
table { margin: auto; font-family: Segoe UI; box-shadow: 10px 10px 5px #888; border: thin ridge grey; }
th { background: #0046c3; color: #fff; max-width: 400px; padding: 5px 10px; }
td { font-size: 11px; padding: 5px 20px; color: #000; }
tr { background: #b8d1f3; }
tr:nth-child(even) { background: #dae5f4; }
tr:nth-child(odd) { background: #b8d1f3; }
"@

# html part - show Azure modules and non-Azure modules available in Function Apps
$html = @"
<head><style>$style</style></head>
<title>Hello PS Website</title>
<h1>Hello PS Website</h1>
<h5>Time is $(Get-Date)</h5>
<h2>Azure modules</h2>
$(get-module -ListAvailable | where-object {$_.name -like "*azure*"} | ConvertTo-Html -Fragment -property Name, version)
<h2>Other modules</h2>
$(get-module -ListAvailable | where-object {$_.name -notlike "*azure*"} | ConvertTo-Html -Fragment -property Name, version)
"@

# thank you Doug!
@{
    headers = @{ "content-type" = "text/html"}
    body    = $html
} | ConvertTo-Json | Out-File -Encoding Ascii -FilePath $res

The output will look something like this

Thursday, June 14, 2018

Set Expiry Date on DevTest Lab VMs

A colleague of mine recently handed me the script from https://github.com/Azure-Samples/virtual-machines-powershell-auto-expired/ and asked for my help with what he thought was a permission issue.
The script is two years old, so things have changed quite a bit since, but it is also just not very clever: it fetches all resources in the entire subscription (something this colleague did not have permission to do), which by all means is a bad idea.

I made some improvements and wanted to share them. Simply run the script below and you are prompted to select a lab, then one or more VMs in that lab, and finally the number of days until the VMs expire.

Note that I will not be updating the script below with fixes, so grab it from TechNet here.



# you can remove the TenantId if you have just a single tenant
$TenantId = ''
Select-AzureRmSubscription -TenantId $TenantId -SubscriptionId '' | Out-Null
Function Set-AzureVirtualMachineExpiredDate 
{ 
    [CmdletBinding()] 
    Param 
    ( 
        [Parameter(Mandatory=$true, ValueFromPipeline, Position=1)][ValidateNotNull()][String]$VMName, 
        [Parameter(Mandatory=$true)][ValidateNotNull()][String]$LabName, 
        [Parameter(Mandatory=$true)][ValidateNotNull()][String]$LabResourceGroupName,
        [Parameter(Mandatory=$true)][ValidateNotNull()][DateTime]$ExpiredUTCDate 
    ) 
 
    Begin{
        $Jobs = @()
    }

    Process
    {
        try {
            # get vm info 
            $targetVMInfo = Get-AzureRmResource -ResourceName "$LabName/$VMName" -ResourceGroupName $LabResourceGroupName `
                                                -ResourceType 'Microsoft.DevTestLab/labs/virtualMachines' -ExpandProperties
        }
        catch {
            Throw "$VMName not found in $LabName, error was:`n$_" 
        }
     
        # get vm properties 
        $vmProperties = $targetVMInfo.Properties 
     
        # set expired date
        $vmProperties | Add-Member -MemberType NoteProperty -Name expirationDate -Value $ExpiredUTCDate -Force 
        
        Write-Host "Setting expiry date to $ExpiredUTCDate on $LabName/$VMName..."
        $Jobs += Set-AzureRmResource -ResourceId $targetVMInfo.ResourceId -Properties $vmProperties -Force `
                    -ErrorAction Stop -AsJob
    } # end of process

    End
    {
        Write-Host "Waiting for jobs to complete..."
        $Jobs | Wait-Job | Receive-Job | ForEach-Object {
            Write-Host "Expiry date on $($_.Name) set to $($_.Properties.expirationDate)"
        }
    }
} 

$Lab = Get-AzureRmResource -ResourceType 'Microsoft.DevTestLab/labs' | Out-GridView -Title "Select DevTest Lab" -PassThru
$LabName = $Lab | Select-Object -ExpandProperty Name

$VM = Get-AzureRmResource -ResourceName "$LabName/*" -ResourceType 'Microsoft.DevTestLab/labs/virtualMachines' | `
        Out-GridView -Title "Select VM" -PassThru

$AddDays = 1..14 | Out-GridView -Title "Expire in days..." -PassThru

$VM | ForEach-Object {("$($_.Name)".Split('/') | Select-Object -Last 1)} | Set-AzureVirtualMachineExpiredDate `
                                    -LabName $LabName `
                                    -LabResourceGroupName $Lab.ResourceGroupName `
                                    -ExpiredUTCDate (Get-Date).AddDays($AddDays)

Out-GridView with Selected Properties

A script is worth a thousand words, right?



<#
 I use Out-GridView (alias: ogv) a lot for interactively selecting objects. Sometimes I also need some extra information
 not described in the object itself.
 Let's say we need to enumerate the files in c:\temp for a list of computers. After collecting all the files we wish to
 use ogv to display some of the properties: the name of the file, the size in kb and the computer on which the file was
 found. We will pretend (because this is an example anyone can run) that the file object is missing the last part, so we
 add it to the object using Add-Member.

 Another use case is simply that ogv will not show the properties we wish to see. Using Select-Object creates a new
 object, and if we use the -PassThru parameter of ogv it is not the original object we get back. In the example below we
 convert the size of each file to kb, which also uses Select-Object and a calculated property to do the conversion.
#>

$ComputerNames = @($env:COMPUTERNAME)
$FilesInTemp = @()

foreach($ComputerName in $ComputerNames)
{
    $FilesInTemp += Invoke-Command -ScriptBlock {
        Get-ChildItem -Path c:\temp -File
    } -ComputerName $ComputerName | `
        Add-Member -Name MachineName -Value $ComputerName -MemberType NoteProperty -PassThru
}
# Now we have a list of files that we can select from, but below fails
$FilesInTemp |  Select-Object -Property Name, @{ Name = 'SizeInKb'; Expression = {  $_.Length/1KB }}, MachineName | `
                Out-GridView -Title "Select files to delete (example 1)" -PassThru | `
                Remove-Item -WhatIf
<#
 the problem is that Select-Object creates a new object with just the selected properties. Why Select-Object? Try running 
 the line below
#>
Get-ChildItem -Path c:\temp -File | Out-GridView
<#
 we did get some decent properties, but it shows the same as we would get from Format-Table, i.e. the default properties.
 If we want something different we use Select-Object, but as mentioned we then get an entirely new object (with just the
 selected properties), which is why Remove-Item fails.
 A solution which can be applied in probably every case is found below.
 The only difference is that we add the object to itself as a member and later "extract" it before the pipe to Remove-Item
#>
$FilesInTemp = @()
foreach($ComputerName in $ComputerNames)
{
    $FilesInTemp += Invoke-Command -ScriptBlock {
        Get-ChildItem -Path c:\temp -File
    } -ComputerName $ComputerName | `
        Add-Member -Name MachineName -Value $ComputerName -MemberType NoteProperty -PassThru | `
        ForEach-Object {$_ | Add-Member -Name _self -Value $_ -MemberType NoteProperty -PassThru}
}

$FilesInTemp | Select-Object -Property Name, @{ Name = 'SizeInKb'; Expression = {  $_.Length/1KB }}, MachineName, _self | `
    Out-GridView -Title "Select files to delete (example 2)" -PassThru | `
    Select-Object -ExpandProperty _self | `
    Remove-Item -WhatIf

<#
 We can make this even easier with some helper functions. Below I have used _self as the property name. Some may
 recognize the name from Python; it is the equivalent of "this" in C#
#>

Function Add-Self
{
    process
    {
        $_ | Add-Member -Name '_self' -Value $_ -MemberType NoteProperty -PassThru
    }
}

Function Get-Self
{
    process
    {
        $_ | Select-Object -ExpandProperty '_self'
    }
}
$FilesInTemp = @()
foreach($ComputerName in $ComputerNames)
{
    $FilesInTemp += Invoke-Command -ScriptBlock {
        Get-ChildItem -Path c:\temp -File
    } -ComputerName $ComputerName | `
        Add-Member -Name MachineName -Value $ComputerName -MemberType NoteProperty -PassThru | `
        Add-Self
}

$FilesInTemp | Select-Object -Property Name, @{ Name = 'SizeInKb'; Expression = {  $_.Length/1KB }}, MachineName, _self | `
    Out-GridView -Title "Select files to delete (example 3)" -PassThru | `
    Get-Self | `
    Remove-Item -WhatIf
