Using PowerShell Modules in Azure Functions

This post was originally published on this site

Previously, I blogged about how I created a PowerShell GitHub Dashboard using Azure Functions to run a PowerShell script, and how I didn't use PowerShell modules because I hadn't found an easy way to do it with Azure Functions. Stefan informed me that you can easily do it using FTP! Today, I'm publishing a guest blog post that Stefan authored, walking you through how to use PowerShell modules in Azure Functions!


 

Steve Lee published a couple of blog posts about how he created a PowerShell Open Source Community Dashboard in Power BI. In his last blog post he explained how he used PowerShell, an Azure Storage table, an Azure Function, and Power BI to create the dashboard.

In his solution, the Azure Function executes a PowerShell script which calls the GitHub REST APIs and stores the result in an Azure Storage table, which is finally queried by Power BI.

Azure Functions, and especially PowerShell Azure Functions, are something I have been interested in for the last couple of weeks. I already wrote a blog post called “PowerShell Azure Functions lesson learned”. If you are just starting to explore PowerShell Azure Functions, I would highly recommend reading it.

In his blog post Steve mentions that within Azure Functions you can only run a PowerShell script, and that you cannot use PowerShell modules within your PowerShell Azure Function. Well, that's not correct: there is a way to use PowerShell modules within your PowerShell Azure Function.

In this blog post I'll explain how you can start using PowerShell modules in Azure Functions. I have to give credit to David O'Brien for introducing PowerShell Azure Functions to me.

 

Which PowerShell Modules are available with a PowerShell Azure Function?

First you need to create an Azure Function. On the GitHub page about Azure Functions you can find all the info to get started.

After creating your PowerShell Azure Function, you can start exploring the currently available PowerShell modules, just like you would on your local machine.

Get-Module -ListAvailable | Select-Object Name, Version | Sort-Object -Property Name

Let's first start by retrieving the currently installed PowerShell modules in Azure Functions and outputting the result to the log window. We are also interested in the locations of the PowerShell modules.

Write-Output 'Getting PowerShell Module'

$result = Get-Module -ListAvailable |
    Select-Object Name, Version, ModuleBase |
    Sort-Object -Property Name |
    Format-Table -Wrap |
    Out-String

Write-Output "`n$result"

Copy the above PowerShell code into run.ps1 in your Azure Function and click Run.

pic1

If we expand the log window we see that the following PowerShell Modules are installed.

pic2

I prefer to see the output in my PowerShell ISE, so I often use the following code in my Azure Function.

$result = Get-Module -ListAvailable |
    Select-Object Name, Version, ModuleBase |
    Sort-Object -Property Name |
    ConvertTo-Json -Depth 1

Out-File -Encoding Ascii -FilePath $res -InputObject $result

You can now call your Azure Function from the PowerShell ISE with the following PowerShell script.

$HttpRequestUrl = '[enter azure function url here]'

Invoke-RestMethod -Uri $HttpRequestUrl -Method POST -ContentType 'application/json' | ConvertFrom-Json

This results in the following.

pic3

The next step is checking the PSModulePath environment variable to find out where all the PowerShell modules are stored.

$result = $env:PSModulePath |
    ConvertTo-Json -Depth 1

Out-File -Encoding Ascii -FilePath $res -InputObject $result
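Since PSModulePath is a single delimited string, a quick way to read it is to split it into one directory per line. A minimal sketch (assuming the ';' separator used on Windows-hosted Function Apps):

```powershell
# Split PSModulePath into individual directories, one per line.
# Windows hosts use ';' as the separator (':' on Linux).
$env:PSModulePath -split ';' | ForEach-Object { Write-Output $_ }
```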


pic4

 

Where can we store our own PowerShell Modules?

Because Function Apps are built on App Service, all the deployment options available to standard Web Apps are also available for Function Apps 🙂 App Service has a deployment option to configure deployment credentials.

Remark: when you configure an FTP/deployment username and password, it applies to all the Web Apps within your Microsoft Azure account.

To upload PowerShell modules via FTP, we need to configure the Function App settings.

Select Function app settings -> Go to App Service Settings to configure the Deployment credentials.

pic5

Next, in the App Service settings for the Azure Function, select Deployment credentials.

pic6

Enter an FTP/deployment username and password.

pic7

Now you are ready to connect to your Azure Function (Web App) via an FTP client.

You can find the FTP Hostname in your Azure Portal.

pic8

 

Connect via FTP Client to Azure Function Web App

Open your favorite FTP client and connect to the FTP endpoint. I'm using FileZilla as my favorite FTP client.

pic9

Copy and paste the host and username from the App Service overview page in the Azure Portal and click Quickconnect.

pic10

After connecting via FTP we can create a new directory called Modules under wwwroot\HttpTriggerPowerShellDemo (or the name of your own Azure Function).

pic11

After creating the Modules directory, we can upload our PowerShell module(s) to this remote directory. Make sure you have first downloaded the module to your local system. We are going to upload my favorite PowerShell module, Wunderlist.

Drag and drop the local Wunderlist module to the remote Modules folder.

pic12

You have now uploaded the Wunderlist module to your Azure Function.
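With the module in place, one convenient pattern is to append the new Modules folder to PSModulePath so modules there can be imported by name. A sketch; the folder name HttpTriggerPowerShellDemo is this demo's Function name, so substitute your own:

```powershell
# Hypothetical path: replace HttpTriggerPowerShellDemo with your Function's name.
$modulesFolder = 'D:\Home\site\wwwroot\HttpTriggerPowerShellDemo\Modules'

# Append the folder once; afterwards Import-Module Wunderlist resolves by name.
if ($env:PSModulePath -notlike "*$modulesFolder*") {
    $env:PSModulePath = $env:PSModulePath + ';' + $modulesFolder
}
```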

 

How can we start using this module in our Azure Function?

First we can check whether we can load the module and return the help for one of the functions in the module.

Use the following code in your Azure Function:

Write-Output "Loading Wunderlist Module"
Import-Module 'D:\Home\site\wwwroot\HttpTriggerPowerShellDemo\Modules\Wunderlist\1.0.10\Wunderlist.psm1'
$Result = Get-Help Get-WunderlistUser | Out-String
Write-Output $Result

Result:

pic13

To get my Wunderlist module working, you also need to configure a ClientID and AccessToken. For more information, check the blog post Using Wunderlist Module in Azure Automation–Part 2. Long story short: you need to store the ClientID and AccessToken as environment variables using the Web App settings.

Go to Function App Settings and click on Configure app settings.

pic14

Add ClientID and AccessToken variables in the Application settings. Don’t forget to Save the new settings.

pic15

We can now use these environment variables in our Azure Function without any extra configuration.
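Inside the Function, the app settings simply appear as environment variables. A minimal sketch, assuming the setting names ClientID and AccessToken from the previous step:

```powershell
# App settings surface as environment variables, so the secrets
# never need to be hard-coded in run.ps1.
$clientId    = $env:ClientID
$accessToken = $env:AccessToken

if (-not $clientId -or -not $accessToken) {
    Write-Output 'ClientID/AccessToken app settings are not configured.'
}
```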

pic16

As a final test, we are going to check whether we can create a new Wunderlist task from our Azure Function.

By using an HTTP POST with a body value as input for our Function, we can do some more advanced Azure Function activities.

Use the following code in your Azure Function:

# Get the input request
$in = Get-Content $req -Raw | ConvertFrom-Json
Write-Output $in.Title

# Import Wunderlist Module
Write-Output "Loading Wunderlist Module"
Import-Module 'D:\Home\site\wwwroot\HttpTriggerPowerShellDemo\Modules\Wunderlist\1.0.10\Wunderlist.psm1'

# Create new Wunderlist Task
Write-Output "Creating Wunderlist Task"
$Result = New-WunderlistTask -listid '267570849' -title $in.Title
Write-Output $Result
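From the client side, the Function can then be invoked with a JSON body whose Title property matches what the script reads from $in.Title. A sketch; the Function URL is a placeholder, read here from a hypothetical FunctionUrl environment variable:

```powershell
# Build a JSON body with the Title property the Function expects.
$body = @{ Title = 'Write blog post about Azure Functions' } | ConvertTo-Json

# $env:FunctionUrl is a hypothetical stand-in for your Function's URL.
$HttpRequestUrl = $env:FunctionUrl
if ($HttpRequestUrl) {
    Invoke-RestMethod -Uri $HttpRequestUrl -Method POST -Body $body -ContentType 'application/json'
}
```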

pic17

I hope you have now learned how to use your “own” PowerShell modules within your PowerShell Azure Functions.

Stefan Stranger
Secure Infrastructure Consultant
Microsoft

Code Coverage – Part 2


In my last post on code coverage, I shared the process for collecting coverage for your environment. This week, I'll describe a way to use our tools to create new tests, and show how you can measure the increase in coverage for PowerShell Core after adding them. To recap: we can collect code coverage with the OpenCover module and then inspect the coverage. In this case I would like to know about coverage for a specific cmdlet. For this post, we're going to focus on the Clear-Content cmdlet, because its coverage is OK but not fantastic, and it is small enough to go over easily.

Here’s a partial capture from running the OpenCover tools:

coverage-2a

By selecting the class Microsoft.PowerShell.Commands.ClearContentCommand, we can drill into the specifics of the class which implements the Clear-Content cmdlet. We can see that we have about 47% line coverage for this class, which isn't fantastic. By inspecting the red highlights, we can see what's missing.

coverage-2b

coverage-2c

coverage-2d

It looks like there are some error conditions, as well as some code which checks whether the underlying provider supports ShouldProcess, that are not being tested. We can create tests for these missing areas fairly easily, but first I need to know where these new tests should go.
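For instance, Pester tests along these lines could target those gaps (an illustrative sketch, not the tests actually merged in the PR):

```powershell
Describe "Clear-Content untested paths" -Tags "CI" {
    It "Throws when the path does not exist" {
        # Exercises an error-condition branch.
        { Clear-Content -Path TestDrive:\does-not-exist.txt -ErrorAction Stop } |
            Should Throw
    }

    It "Supports -WhatIf without clearing the file" {
        # Exercises the ShouldProcess branch.
        $file = Join-Path TestDrive: "whatif.txt"
        Set-Content -Path $file -Value "keep me"
        Clear-Content -Path $file -WhatIf
        Get-Content -Path $file | Should Be "keep me"
    }
}
```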

Test Code Layout

Now is a good time to describe how our tests are laid out.

https://github.com/PowerShell/PowerShell/test contains all of the test code for PowerShell. This includes our native tests, C# tests, and Pester tests, as well as the tools we use. Our Pester tests should all be found in https://github.com/PowerShell/PowerShell/test/powershell, and in that directory there is more structure to make it easier to find tests. For example, if you want to find the tests for a specific cmdlet, you would look in the appropriate module directory. In our case, we're adding tests for Clear-Content, which should be found in https://github.com/PowerShell/PowerShell/test/powershell/Modules/Microsoft.PowerShell.Management. (You can always find which module a cmdlet resides in via Get-Command.) If we look in this directory, we can already see the file Clear-Content.Tests.ps1, so we'll add our tests to that file. If that file didn't exist, you would just create a new file for your tests. Sometimes the tests for a cmdlet may be combined with other tests; take this as an opportunity to split up the file to make it easier for the next person adding tests. If you want more information about how we segment our tests, you can review https://github.com/PowerShell/PowerShell/docs/testing-guidelines/testing-guidelines.md.

New Test Code

Based on the missing code coverage, I created the following replacement for Clear-Content.Tests.ps1, which you can see in this PR: https://github.com/PowerShell/PowerShell/pull/3157. After rerunning the code coverage tools, I can see that I've really improved coverage for this cmdlet.

coverage-2e

There seems to be a small issue with OpenCover, as some close braces are being marked as missed, but you can see the improvement:

coverage-2f

Now it's your turn, and we could really use your help. If there are areas of the product that you rely on which don't have the tests you think they should have, please consider adding tests!

 

 

 

Managing Security Settings on Nano Server with DSC


We have released DSC resources building upon the previously released security and registry cmdlets for applying security settings. You can now implement Microsoft-defined security baselines using DSC.

AuditPolicyDsc

SecurityPolicyDsc

GPRegistryPolicy

Install all 3 from the Gallery with the command:

Install-Module SecurityPolicyDsc, AuditPolicyDsc, GPRegistryPolicy

A sample configuration, below, takes the security baselines for Windows Server 2016 and uses the .inf, .csv, and .pol files containing the desired security settings, extracted from the exported Group Policy Objects. (You can find information on extracting the necessary files in the Registry cmdlets blog post.) Simply pass the files into the new DSC resources, and you have successfully implemented security baselines using DSC!

This is most useful for Nano Server, since Nano Server doesn't support Group Policy; however, this approach will work for all installation options. Note that it's not a good idea to manage the same server using both Group Policy and DSC, since the two engines will constantly attempt to overwrite each other if they are both managing the same setting.

WARNING: As with all security settings, you can easily lock yourself out of remote access to your machine if you are not careful. Be sure to carefully review security settings before applying them to Nano Server, and stage test deployments before using security baselines in production!

Configuration SecurityBaseline
{
    Import-DscResource -ModuleName AuditPolicyDsc, SecurityPolicyDsc, GPRegistryPolicy
    node localhost
    {
        SecurityTemplate baselineInf
        {
            Path = "C:\Users\Administrator\Documents\GptTmpl.inf"
            # https://msdn.microsoft.com/powershell/dsc/singleinstance
            IsSingleInstance = "Yes"
        }
        AuditPolicyCsv baselineCsv
        {
            IsSingleInstance = "Yes"
            CsvPath = "C:\Users\Administrator\Documents\audit.csv"
        }
        RegistryPolicy baselineGpo
        {
            Path = "C:\Users\Administrator\Documents\registry.pol"
        }
    }
}
# Compile the MOF file
SecurityBaseline
# Apply the configuration
Start-DscConfiguration -Path ./SecurityBaseline

MSA2040 (SAS) Cluster Slow Speed of Network


Hi there,

I have a problem at a customer site.

They run a small environment with 3x DL360 Gen9 servers connected by SAS to an MSA2040.

They have 3 disk groups with about 6 disks each, 1.2 TB SAS 10k drives.

The RAID level is RAID 6.

They run an ESXi 6.0 cluster (Update 2, I think); only Windows VMs reside on the cluster.

When I do a speed test on the file server (writing locally), I get around 569 MB/s write, 532 MB/s read.

When I do the speed test on a Citrix server (writing locally), I get 395 MB/s write, 200 MB/s read.

When I start the speed test and write from the Citrix server to the file server, I only get 30 MB/s.

The servers are all connected with 2 NICs (Gbit) to an HPE switch.

How can I find the bottleneck in this situation? It must be something network related, or do you have any other hints?

Building a GitHub Dashboard using PowerShell, AzureStorageTable, AzureFunction, and PowerBI

Last week, I published a PowerShell Community Dashboard, and today I'm going to share the code and cover some of the learnings. The code is published as a module on the PowerShell Gallery.
Make sure you get v1.1, as I found an issue where, if you're not a member of the PowerShell org on GitHub, you won't have permission to query the members, so I changed the code to accommodate that. You can install the module using:

Install-Module PSGitHubStats

(and it works on PowerShell Core 6.0 including Linux! I only tested it with alpha.15, though…)

Once installed, you can just run it manually:

PS C:\> Get-PSDownloadStats -publishedSinceDate 1-1-2017 -accessToken $accesstoken

Tag             Name                                                 OS      Distro    Count Published
---             ----                                                 --      ------    ----- ---------
v6.0.0-alpha.15 powershell-6.0.0-alpha.15.pkg                        MacOS   MacOS     1504  1/25/2017 7:25:52 PM
v6.0.0-alpha.15 powershell-6.0.0_alpha.15-1.el7.centos.x86_64.rpm    Linux   CentOS    436   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 powershell_6.0.0-alpha.15-1ubuntu1.14.04.1_amd64.deb Linux   Ubuntu14  368   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 powershell_6.0.0-alpha.15-1ubuntu1.16.04.1_amd64.deb Linux   Ubuntu16  951   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win10-win2k16-x64.msi      Windows Windows10 349   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win10-win2k16-x64.zip      Windows Windows10 70    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-win2k8r2-x64.msi      Windows Windows7  119   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-win2k8r2-x64.zip      Windows Windows7  34    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-x86.msi               Windows Windows7  192   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-x86.zip               Windows Windows7  17    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win81-win2k12r2-x64.msi    Windows Windows8  74    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win81-win2k12r2-x64.zip    Windows Windows8  21    1/25/2017 7:25:52 PM

PS C:\> $contributors = Get-PSGitHubReport -startDate 1-1-2017 -repos powershell/powershell -accessToken $accesstoken
PS C:\> $contributors | ? {$_.Org -eq "Community"} | sort -Property Total -Top 10 -Descending

   Org: Community

Name             PRs               Issues           PR Comments      Issue Comments   Total            End Date
----             ---               ------           -----------      --------------   -----            --------
iSazonov         8                 4                44               46               102              2017-02-03 10...
vors             5                 4                12               6                27               2017-02-03 10...
thezim           0                 2                0                9                11               2017-02-03 10...
juneb            0                 4                0                4                8                2017-02-03 10...
Jaykul           0                 3                0                5                8                2017-02-03 10...
pcgeek86         0                 3                0                5                8                2017-02-03 10...
jeffbi           0                 0                0                6                6                2017-02-03 10...
MaximoTrinidad   0                 2                0                3                5                2017-02-03 10...
g8tguy           0                 0                0                5                5                2017-02-03 10...
mwallner         0                 1                0                3                4                2017-02-03 10...

The $accesstoken is something you would generate; it's needed because the number of queries I have to do against the GitHub API will likely exceed the unauthenticated rate limit.
I ran over the rate limit many times while generating my report, even as an authenticated user. I solved this by adding a sleep command to the report generation.
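A more targeted approach than a fixed sleep is to ask the GitHub /rate_limit endpoint how many requests remain and wait until the reset time only when running low. A sketch, not the module's actual code; Get-RateLimitDelay is a hypothetical helper:

```powershell
# Hypothetical helper: given a GitHub /rate_limit response, return how many
# seconds to sleep before issuing more requests.
function Get-RateLimitDelay($RateLimit, [int]$MinimumRemaining = 50) {
    if ($RateLimit.resources.core.remaining -ge $MinimumRemaining) { return 0 }
    $reset = [DateTimeOffset]::FromUnixTimeSeconds($RateLimit.resources.core.reset).UtcDateTime
    [Math]::Max(0, [Math]::Ceiling(($reset - [DateTime]::UtcNow).TotalSeconds))
}

# Usage in the report loop (network call shown for illustration):
#   $limit = Invoke-RestMethod -Uri 'https://api.github.com/rate_limit' `
#                -Headers @{ Authorization = "token $accesstoken" }
#   Start-Sleep -Seconds (Get-RateLimitDelay $limit)
```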

One thing you may notice with the module vs the dashboard is that you get the raw numbers rather than just the rankings.
On the dashboard, we decided to only show the rankings so that people don’t focus specifically on the numbers.

Get-PSDownloadStats should be pretty straightforward.
There is some specialized logic in that function to determine the target operating system for a release package, which unfortunately depends on the filename.
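In spirit, that logic is a pattern match over the package filename. A simplified, illustrative version (Get-TargetOS is a made-up name, and the patterns cover only the package names shown above):

```powershell
# Illustrative helper: infer the target OS/distro from a release filename.
function Get-TargetOS([string]$FileName) {
    switch -Regex ($FileName) {
        '\.pkg$'              { 'MacOS';    break }
        'el7\.centos.*\.rpm$' { 'CentOS';   break }
        'ubuntu1\.14'         { 'Ubuntu14'; break }
        'ubuntu1\.16'         { 'Ubuntu16'; break }
        '\.(msi|zip)$'        { 'Windows';  break }
        default               { 'Unknown' }
    }
}

Get-TargetOS 'powershell-6.0.0-alpha.15.pkg'            # MacOS
Get-TargetOS 'PowerShell_6.0.0-alpha.15-win7-x86.msi'   # Windows
```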

Publishing to AzureTable is fairly simple once you figure out the magic sauce to provide in the headers to make sure you’re calling the appropriate version of the REST API:

if ($publishToAzure)
{
    $json = $pkg | ConvertTo-Json -Compress
    $date = [datetime]::UtcNow.ToString("R", [System.Globalization.CultureInfo]::InvariantCulture)
    [string] $canonicalizedResource = "/$storageAccount/$storageTable"
    $contentType = "application/json"
    [string] $stringToSign = "POST`n`n$contentType`n$date`n$canonicalizedResource"
    $headers = @{"Prefer"="return-no-content";"Authorization"=(CreateAuthHeader -canonicalizedString $stringToSign -storageAccount $storageAccount -storageKey $storageKey);
        "DataServiceVersion"="3.0;NetFx";"MaxDataServiceVersion"="3.0;NetFx";"Accept"="application/json;odata=nometadata";
        "Accept-Charset"="UTF-8";"x-ms-version"="2013-08-15";"x-ms-date"=$date}
    $null = Invoke-RestMethod -Uri $storageUrl -Headers $headers -Body $json -Method Post -ContentType $contentType
}

I deliberately chose to use an AzureTable for a few reasons:

  • Power BI supports reading from AzureTable natively (although I think you can only do it from the Power BI desktop app, as I couldn't find the option in the web interface)
  • I didn’t need the relational capabilities nor the additional cost of AzureSQL for my purposes
  • I can import JSON directly into AzureTable

The most complicated part of working with AzureTable is correctly crafting the authentication header, built from a canonicalized string that AzureTable expects in order to protect against replay attacks.
The string is defined in the previous code section as $stringToSign, while this bit of code hashes it and converts the result to Base64:

Function CreateAuthHeader([string]$canonicalizedString,[string]$storageAccount,[string]$storageKey)
{
    [string]$signature = [string]::Empty
    [byte[]]$bytes = [System.Convert]::FromBase64String($storageKey)
    [System.Security.Cryptography.HMACSHA256] $SHA256 = New-Object System.Security.Cryptography.HMACSHA256(,$bytes)
    [byte[]] $dataToSha256 = [System.Text.Encoding]::UTF8.GetBytes($canonicalizedString)
    $signature = [System.Convert]::ToBase64String($SHA256.ComputeHash($dataToSha256))
    "SharedKey $($storageAccount):$signature"
}
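Exercised in isolation with a throwaway key (the key below is fake, hard-coded for demonstration, and never a real storage key), the signing step looks like this:

```powershell
# Demonstration only: a fake Base64 key, never a real storage credential.
$fakeKey = [Convert]::ToBase64String([byte[]](1..64))

# Same HMAC-SHA256 signing as CreateAuthHeader, inlined for clarity.
$hmac = New-Object System.Security.Cryptography.HMACSHA256 (,[Convert]::FromBase64String($fakeKey))
$sig  = [Convert]::ToBase64String($hmac.ComputeHash([System.Text.Encoding]::UTF8.GetBytes('canonicalized-string')))
"SharedKey mystorage:$sig"
```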

Ilya's RFC on Get-StringHash should help make this simpler and more readable, eliminating several lines of code from that function.

Once I had the module working on the command line, I validated it was correctly uploaded to Azure using Microsoft Azure Storage Explorer. Now I needed to have the script run regularly. I considered both Azure Automation and Azure Functions, and decided to use the latter as it was newer, which gave me an opportunity to learn it. One immediate problem I had is that Azure Functions today only supports PowerShell v4; I had originally used PowerShell classes for my internal types, and thus changed them all to PSCustomObjects.
I've since contacted the Azure Functions team and asked them to support both Windows PowerShell v5.x and PowerShell Core 6.0 in the future.

With Azure Functions, you really only have the ability to run a PowerShell script. This means that unless you install the Azure PowerShell module at runtime (and PowerShellGet isn’t part of PowerShell v4), you really can only use what is available with PowerShell. Azure Automation would be better suited if you want to use modules.

I just cut and pasted the code out of my module into the web interface and added a call to my functions at the end, supplying all the necessary parameters to make sure I was getting the right data from GitHub and uploading it correctly into AzureTable. One thing to note is that I pass -UseBasicParsing to all my Invoke-WebRequest calls, as the ParsedHtml property in the output relies on MSHTML, which in turn relies on Internet Explorer.
IE is not available in the Azure Functions container your script runs in. I should also mention that I use both Invoke-WebRequest and Invoke-RestMethod; the former is needed when I have to access the response headers, specifically for pagination of the GitHub API response, which is handled by this bit of code:

if ($null -ne $output.Headers.Link) {
    $links = $output.Headers.Link.Split(",").Trim()
    foreach ($link in $links) {
        if ($link -match "<(?<url>.*?)>;\srel=`"(?<rel>.*?)`"") {
            if ($matches.rel -eq 'next') {
                $query = $matches.url
            }
        }
    }
}
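To see what that loop does in isolation, the same match can be run against a sample Link header (the URLs below are illustrative):

```powershell
# A sample GitHub Link header with next/last relations (illustrative URLs).
$linkHeader = '<https://api.github.com/repositories/1/issues?page=2>; rel="next", ' +
              '<https://api.github.com/repositories/1/issues?page=10>; rel="last"'

foreach ($link in $linkHeader.Split(",").Trim()) {
    if ($link -match '<(?<url>.*?)>;\srel="(?<rel>.*?)"' -and $matches.rel -eq 'next') {
        $query = $matches.url   # the URL of the next page to fetch
    }
}
$query   # https://api.github.com/repositories/1/issues?page=2
```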

I'll be working to add this capability to Invoke-RestMethod so that this bit of code can be removed.

Building the Power BI visualization is a whole other topic, and @MSFTzachal really did most of the work, but my recommendation is to use the Power BI desktop app, which I found easier and more powerful than the current web interface.

Steve Lee
Principal Software Engineer Manager
PowerShell Core

Don’t Login on Untrusted Computers


A password is only as secure as the computer or network it is used on. As such, never log in to a sensitive account from a public computer, such as the computers in a cyber cafe, hotel lobby, or conference hall. Bad guys target public computers like these and infect them on purpose. The moment you type your password on an infected computer, cyber criminals can harvest it. If you have no choice but to use a public computer, change your password at the next opportunity you have to access a trusted computer.

Installing latest PowerShell Core 6.0 Release on Linux just got easier!

As we continue our journey from alpha releases toward beta, you can continue to download the latest releases from our GitHub repository.
However, our goal has always been to enable installation through popular existing Linux package management tools like apt-get and yum. I am pleased to announce that we have now published PowerShell Core 6.0 alpha.15 to https://packages.microsoft.com!

Install PowerShell Core on Ubuntu 14.04

# Import the public repository GPG keys
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

# Register the Microsoft Ubuntu repository
curl https://packages.microsoft.com/config/ubuntu/14.04/prod.list | sudo tee /etc/apt/sources.list.d/microsoft.list

# Update apt-get
sudo apt-get update

# Install PowerShell
sudo apt-get install -y powershell

# Start PowerShell
powershell

Install PowerShell Core on Ubuntu 16.04

# Import the public repository GPG keys
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

# Register the Microsoft Ubuntu repository
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/microsoft.list

# Update apt-get
sudo apt-get update

# Install PowerShell
sudo apt-get install -y powershell

# Start PowerShell
powershell

Install PowerShell Core on CentOS

# Enter superuser mode
sudo su 

# Register the Microsoft RedHat repository
curl https://packages.microsoft.com/config/rhel/7/prod.repo > /etc/yum.repos.d/microsoft.repo

# Exit superuser mode
exit

# Install PowerShell
sudo yum install -y powershell

# Start PowerShell
powershell

After registering the Microsoft repository once as superuser, you just need to run either sudo apt-get install powershell or sudo yum update powershell (depending on which distro you are using) to update it.

We plan to simultaneously publish to https://packages.microsoft.com and our GitHub repository for each new release.

Steve Lee
Principal Software Engineer Manager
PowerShell Core