Category Archives: Misc.

vCenter Server 6.0 Update 3g/3h Upgrade or Migration Path

This post was originally published on this site

The upgrade and migration path for vCenter Server 6.0 Update 3g and Update 3h has an important note that you may need to consider. There is currently no upgrade or migration path to vCenter Server 6.5 or 6.7 if you deployed vCenter Server Update 3g or 3h with the embedded Postgres DB. We have noted

The post vCenter Server 6.0 Update 3g/3h Upgrade or Migration Path appeared first on VMware Support Insider.

Introducing docs.vmware.com

Today we are pleased to announce the official launch of docs.vmware.com. This portal unifies the product documentation for all VMware products, versions, and languages into a single platform so you can find the information that you are looking for more quickly and easily. VMware products offer a wide range of business solutions from desktop virtualization to supporting your hybrid cloud. We’ve heard your feedback that finding the right information can be difficult. Our search was out of date, the look and feel were not modern, the content was siloed, and the docs were not available on mobile devices. To address these problems, we decided to start from scratch. The design of this site is meant to enable you to better filter content, find relevant answers, and create custom views of information that you can access on any device.

Key Features for You

Here are a few features that we hope will help you find the information that you need.

It’s All About Search

Our site design features search on every page. We built a new search experience powered by Elasticsearch with an intuitive taxonomy to help you more easily locate relevant content. You can search the entire site or a particular book.

Filter to Find What You Need

With tens of thousands of pages of information, you need help reducing the scope of content to the products that you care about. Use the filters to limit your search to particular products, versions, subjects or information types.

Create Custom Docs Collections and Share Content

Your VMware stack spans many products and versions and you need a way to manage the product documentation for your specific portfolio. MyLibrary enables you to assemble custom doc sets and share as HTML collections. You can add and organize product documentation from any product, any version, and any language.

Products currently supported

Currently, we support the products listed below:

  • VMware vSphere
  • VMware NSX for vSphere
  • VMware vSAN
  • VMware vRealize Automation
  • VMware Identity Manager
  • VMware Horizon 7
  • VMware Integrated OpenStack
  • VMware Cloud Foundation
  • VMware Workstation Pro, Player
  • VMware Fusion
  • VMware Site Recovery Manager
  • VMware vRealize Orchestrator

Your Feedback

The launch of this site is only Step 1. We’ve put in new feedback mechanisms so we can hear from you more often and make more frequent improvements to the site and content. Explore docs.vmware.com now and let us know if you want to get involved in helping us evolve our docs.

The post Introducing docs.vmware.com appeared first on Support Insider.

Beta Launch of docs.vmware.com

Today we are pleased to announce the beta launch of the docs.vmware.com site. This portal unifies the product documentation for all products, versions, and languages into a single site so you can find the information that you are looking for more quickly. VMware products offer a wide range of business solutions from desktop virtualization to supporting your hybrid cloud. We’ve heard your feedback that finding the right information can be difficult. Our search was out of date, the look and feel were not modern, the content was siloed, and the docs were not available on mobile devices. To address these problems, we decided to create a new site from scratch. The design of this site is meant to enable you to better filter content, find relevant answers, and create custom views of information that you can access on any device.

Key Features for You

Here are a few features that we hope will help you find the information that you need.

It’s All About Search

Our site design features search on every page. We’ve put in a new search engine powered by Elasticsearch and a taxonomy to help you more easily locate relevant content. You can search the entire site or a particular book.

Filter to Find What You Need

With tens of thousands of pages of information, you need help reducing the scope of content to the products that you care about. Use the filters to limit your search to particular products, versions, or information types.

Create Custom Collections and Share Content

Use the MyLibrary feature to assemble custom doc sets that you can share as HTML collections.

Your Feedback

The beta launch of this site is only Step 1. We’ve put in new feedback mechanisms so we can hear from you more often and make more frequent improvements to the site and content.

Let us know if you want to get involved in helping us evolve our docs. Expect to hear more about the docs.vmware.com site as we prepare for the official launch.

The post Beta Launch of docs.vmware.com appeared first on Support Insider.

DSC Resource Kit Release March 2017

We just released the DSC Resource Kit!

This release includes updates to 10 DSC resource modules, including 19 new DSC resources. In these past 6 weeks, 155 pull requests (the most ever!) have been merged and 71 issues have been closed, all thanks to our amazing community!

A few weeks ago we also released a new DSC resource module called SecurityPolicyDsc with resources to configure local security policies through secedit. Thank you to Jason Walker for this great new module!

The modules updated in this release are:

  • OfficeOnlineServerDsc
  • PSDscResources
  • SecurityPolicyDsc
  • SharePointDsc
  • xCertificate
  • xExchange
  • xPSDesiredStateConfiguration
  • xRemoteDesktopSessionHost
  • xSQLServer
  • xWindowsUpdate

For a detailed list of the resource modules and fixes in this release, see the Included in this Release section below.

Our last community call for the DSC Resource Kit was last week on March 1. A recording of our updates as well as summarizing notes are available. Join us next time at 9AM PST on April 12 to ask questions and give feedback about your experience with the DSC Resource Kit. Keep an eye on the community agenda for the link to the call.

We strongly encourage you to update to the newest version of all modules using the PowerShell Gallery, and don’t forget to give us your feedback in the comments below, on GitHub, or on Twitter (@PowerShell_Team)!

All resources with the ‘x’ prefix in their names are still experimental – this means that those resources are provided AS IS and are not supported through any Microsoft support program or service. If you find a problem with a resource, please file an issue on GitHub.

Included in this Release

You can see a detailed summary of all changes included in this release in the table below. For past release notes, go to the README.md or Changelog.md file on the GitHub repository page for a specific module (see the How to Find DSC Resource Modules on GitHub section below for details on finding the GitHub page for a specific module).

Module Name Version Release Notes
OfficeOnlineServerDsc 1.0.0.0
  • Added documentation to the module to finalise for release
  • Renamed resources to shorten names before release
    • “OfficeOnlineServerWebAppsFarm” becomes “OfficeOnlineServerFarm”
    • “OfficeOnlineServerWebAppsMachine” becomes “OfficeOnlineServerMachine”
PSDscResources 2.5.0.0
  • Enable codecov.io code coverage reporting
  • Group
    • Added support for domain based group members on Nano server.
  • Added the Archive resource
  • Update Test-IsNanoServer cmdlet to properly test for a Nano server rather than the core version of PowerShell
  • Registry
    • Fixed bug where an error was thrown when running Get-DscConfiguration if the registry key already existed
SecurityPolicyDsc 1.2.0.0
  • SecurityTemplate: Remove [ValidateNotNullOrEmpty()] attribute for IsSingleInstance parameter
  • Fixed typos
SharePointDsc 1.6.0.0
  • Updated SPWebApplication to allow Claims Authentication configuration
  • Updated documentation with guidance on installing binaries from network locations instead of locally
  • New resource: SPFarmPropertyBag
  • Bugfix in SPSite, which wasn’t returning the quota template name correctly
  • Bugfix in SPAppManagementServiceApp, which wasn’t returning the correct database name
  • Bugfix in SPAccessServiceApp, which did not return the database server
  • Bugfix in SPDesignerSettings, which filtered site collections with an incorrect parameter
  • Updated the parameters in SPFarmSolution to use the full namespace
  • Bugfix in SPFarmSolution where it returned non-declared parameters
  • Corrected typo in parameter name in Get method of SPFeature
  • Added check in SPHealthAnalyzerRuleState for incorrect default rule schedule of one rule
  • Improved check for CloudSSA in SPSearchServiceApp
  • Bugfix in SPSearchServiceApp in which the database and database server were not returned correctly
  • Improved runtime of SPSearchTopology by streamlining wait processes
  • Fixed bug with SPSearchServiceApp that would throw an error about the SDDL string
  • Improved output of test results for AppVeyor and VS Code based test runs
  • Fixed issue with SPWebAppPolicy if OS language is not En-Us
  • Added SPFarm resource; set SPCreateFarm and SPJoinFarm as deprecated, to be removed in version 2.0

xCertificate 2.4.0.0
  • Converted AppVeyor build process to use AppVeyor.psm1.
  • Correct Param block to meet guidelines.
  • Moved shared modules into modules folder.
  • xCertificateExport:
    • Added new resource.
  • Cleanup xCertificate.psd1 to remove unnecessary properties.
  • Converted AppVeyor.yml to use DSCResource.tests shared code.
  • Opted-In to markdown rule validation.
  • Examples modified to meet standards for auto documentation generation.
xExchange 1.14.0.0
  • xExchDatabaseAvailabilityGroup: Added parameters AutoDagAutoRedistributeEnabled and PreferenceMoveFrequency
xPSDesiredStateConfiguration 6.1.0.0
  • Moved DSC pull server setup tests to DSCPullServerSetup folder for new common tests
  • xArchive:
    • Updated the resource to be a high quality resource
    • Transferred the existing “unit” tests to integration tests
    • Added unit and end-to-end tests
    • Updated documentation and examples
  • xUser
    • Fixed error handling in xUser
  • xRegistry
    • Fixed bug where an error was thrown when running Get-DscConfiguration if the registry key already existed
  • Updated Test-IsNanoServer cmdlet to properly test for a Nano server rather than the core version of PowerShell
xRemoteDesktopSessionHost 1.4.0.0
  • Updated CollectionName parameter to validate length between 1 and 15 characters, and added tests to verify.
xSQLServer 6.0.0.0
  • Changes to xSQLServerConfiguration
    • BREAKING CHANGE: The parameter SQLInstanceName is now mandatory.
    • Resource can now be used to define the configuration of two or more different DB instances on the same server.
  • Changes to xSQLServerRole
    • xSQLServerRole now correctly reports that the desired state is present when the login is already a member of the server roles.
  • Added new resources
    • xSQLServerAlwaysOnAvailabilityGroup
  • Changes to xSQLServerSetup
    • Properly checks for use of SQLSysAdminAccounts parameter in $PSBoundParameters. The test now also properly evaluates the setup argument for SQLSysAdminAccounts.
    • xSQLServerSetup should now function correctly for the InstallFailoverCluster action, and also supports cluster shared volumes. Note that the AddNode action is not currently working.
    • It now detects whether the features Client Connectivity Tools (CONN) and Client Connectivity Backwards Compatibility Tools (BC) are installed.
    • Now it can correctly determine the right cluster when only parameter InstallSQLDataDir is assigned a path (issue 401).
    • Now the only mandatory path parameter is InstallSQLDataDir when installing Database Engine (issue 400).
    • It can now handle mandatory parameters, and no longer uses wildcards to find the variables containing paths (issue 394).
    • Changed so that instead of connection to localhost it is using $env:COMPUTERNAME as the host name to which it connects. And for cluster installation it uses the parameter FailoverClusterNetworkName as the host name to which it connects (issue 407).
    • When called with Action = “PrepareFailoverCluster”, the SQLSysAdminAccounts and FailoverClusterGroup parameters are no longer passed to the setup process (issues 410 and 411).
    • Solved the problem that InstanceDir and InstallSQLDataDir could not be set to just a qualifier, i.e. "E:" (issue 418). All paths (except SourcePath) can now be set to just the qualifier.
  • Enables CodeCov.io code coverage reporting.
  • Added badge for CodeCov.io to README.md.
  • Examples
    • xSQLServerMaxDop
      • 1-SetMaxDopToOne.ps1
      • 2-SetMaxDopToAuto.ps1
      • 3-SetMaxDopToDefault.ps1
    • xSQLServerMemory
      • 1-SetMaxMemoryTo12GB.ps1
      • 2-SetMaxMemoryToAuto.ps1
      • 3-SetMinMaxMemoryToAuto.ps1
      • 4-SetMaxMemoryToDefault.ps1
    • xSQLServerDatabase
      • 1-CreateDatabase.ps1
      • 2-DeleteDatabase.ps1
  • Added tests for resources
    • xSQLServerMaxDop
    • xSQLServerMemory
  • Changes to xSQLServerMemory
    • BREAKING CHANGE: The mandatory parameters now include SQLInstanceName. The DynamicAlloc parameter is no longer mandatory
  • Changes to xSQLServerDatabase
    • When the system is not in desired state the Test-TargetResource will now output verbose messages saying so.
  • Changes to xSQLServerDatabaseOwner
    • Fixed code style, added updated parameter descriptions to schema.mof and README.md.
xWindowsUpdate 2.6.0.0
  • Converted appveyor.yml to install Pester from PSGallery instead of from Chocolatey.

  • Fixed PSScriptAnalyzer issues.
  • Fixed common test breaks (markdown style, and example style).
  • Added CodeCov.io reporting
  • Deprecated xMicrosoftUpdate as its functionality is replaced by xWindowsUpdateAgent

How to Find Released DSC Resource Modules

To see a list of all released DSC Resource Kit modules, go to the PowerShell Gallery and display all modules tagged as DSCResourceKit. You can also enter a module’s name in the search box in the upper right corner of the PowerShell Gallery to find a specific module.

Of course, you can also always use PowerShellGet (available in WMF 5.0) to find modules with DSC Resources:

# To list all modules that are part of the DSC Resource Kit
Find-Module -Tag DSCResourceKit 
# To list all DSC resources from all sources 
Find-DscResource

To find a specific module, go directly to its URL on the PowerShell Gallery:
http://www.powershellgallery.com/packages/< module name >
For example:
http://www.powershellgallery.com/packages/xWebAdministration

How to Install DSC Resource Modules From the PowerShell Gallery

We recommend that you use PowerShellGet to install DSC resource modules:

Install-Module -Name < module name >

For example:

Install-Module -Name xWebAdministration

To update all previously installed modules at once, open an elevated PowerShell prompt and use this command:

Update-Module

After installing modules, you can discover all DSC resources available to your local system with this command:

Get-DscResource

How to Find DSC Resource Modules on GitHub

All resource modules in the DSC Resource Kit are available open-source on GitHub.
You can see the most recent state of a resource module by visiting its GitHub page at:
https://github.com/PowerShell/< module name >
For example, for the xCertificate module, go to:
https://github.com/PowerShell/xCertificate.

All DSC modules are also listed as submodules of the DscResources repository in the xDscResources folder.

How to Contribute

You are more than welcome to contribute to the development of the DSC Resource Kit! There are several different ways you can help. You can create new DSC resources or modules, add test automation, improve documentation, fix existing issues, or open new ones.
See our contributing guide for more info on how to become a DSC Resource Kit contributor.

If you would like to help, please take a look at the list of open issues for the DscResources repository.
You can also check issues for specific resource modules by going to:
https://github.com/PowerShell/< module name >/issues
For example:
https://github.com/PowerShell/xPSDesiredStateConfiguration/issues

Your help in developing the DSC Resource Kit is invaluable to us!

Questions, comments?

If you’re looking into using PowerShell DSC, have questions or issues with a current resource, or would like a new resource, let us know in the comments below, on Twitter (@PowerShell_Team), or by creating an issue on GitHub.

Katie Keim
Software Engineer
PowerShell Team
@katiedsc (Twitter)
@kwirkykat (GitHub)

Using PowerShell Modules in Azure Functions

Previously, I blogged about how I created PowerShell GitHub Dashboard using Azure Functions to run a PowerShell script and didn’t use PowerShell Modules as I didn’t find an easy way to do it with Azure Functions.  Stefan informed me that you can easily do it using FTP!  Today, I’m publishing a guest blog post that Stefan authored that walks you through how to use PowerShell Modules in Azure Functions!


Steve Lee published a couple of blog posts about how he created a PowerShell Open Source Community Dashboard in PowerBI. In his last blog post, he explained how he used PowerShell, Azure StorageTable, Azure Function and PowerBI to create the Dashboard.

In his solution, the Azure Function is executing a PowerShell script which calls the Github REST APIs and stores the result in an Azure StorageTable, finally queried by PowerBI.

Azure Functions, and especially PowerShell Azure Functions, are something I’ve been interested in for the last couple of weeks. I already wrote a blog post called “PowerShell Azure Functions lesson learned”. If you are just starting to explore PowerShell Azure Functions, I would highly recommend taking a look.

In his blog post, Steve mentions that within Azure Functions you can only run a PowerShell script and that you cannot use PowerShell modules within your PowerShell Azure Function. Well, that’s not correct: there is a way to use PowerShell Modules within your PowerShell Azure Function.

In this blog post I’ll explain how you can start using PowerShell Modules in Azure Functions. I have to give credit to David O’Brien for introducing PowerShell Azure Functions to me.

Which PowerShell Modules are available with a PowerShell Azure Function?

First you need to create an Azure Function. On the Github page about Azure Functions you can find all the info to get started.

After creating your PowerShell Azure Function, you can start exploring the current available PowerShell Modules just like you would do on your local machine.

Get-Module -ListAvailable | Select-Object Name, Version | Sort-Object -Property Name

Let’s first start by retrieving the currently installed PowerShell Modules in Azure Functions and outputting the result to the log window. We are also interested in the locations of the PowerShell Modules.

Write-Output 'Getting PowerShell Module'

$result = Get-Module -ListAvailable |
    Select-Object Name, Version, ModuleBase |
    Sort-Object -Property Name |
    Format-Table -Wrap |
    Out-String

Write-Output "`n$result"

Copy the above PowerShell code into run.ps1 of the Azure Function and click Run.

If we expand the log window we see that the following PowerShell Modules are installed.

I prefer to see the output result in my PowerShell ISE so I often use the following code in my Azure Function.

$result = Get-Module -ListAvailable |
    Select-Object Name, Version, ModuleBase |
    Sort-Object -Property Name |
    ConvertTo-Json -Depth 1

Out-File -Encoding Ascii -FilePath $res -InputObject $result

You can now call your Azure Function from the PowerShell ISE with the following PowerShell script.

$HttpRequestUrl = '[enter azure function url here]'

Invoke-RestMethod -Uri $HttpRequestUrl -Method POST -ContentType 'application/json' | ConvertFrom-Json

This results in the following.

The next step is checking the Environment variable PSModulePath to find out where all the PowerShell Modules are stored.

$result = $env:PSModulePath |
    ConvertTo-Json -Depth 1

Out-File -Encoding Ascii -FilePath $res -InputObject $result



Where can we store our own PowerShell Modules?

Well because Function apps are built on App Service, all the deployment options available to standard web apps are also available for function apps 🙂 The App Service has a deployment option to configure deployment credentials.

Remark: when you configure an FTP/deployment username and password, this applies to all your Web Apps within your Microsoft Azure account.

To upload PowerShell Modules via FTP we need to configure the Function App Settings.

Select Function app settings -> Go to App Service Settings to configure the Deployment credentials.

Next, in the App Service settings for the Azure Function, select Deployment credentials.

Enter an FTP/deployment username and password.

Now you are ready to connect via an FTP client to your Azure Function (Web App).

You can find the FTP Hostname in your Azure Portal.


Connect via FTP Client to Azure Function Web App

Open your favorite FTP Client and connect to the FTP endpoint. I’m using Filezilla as my favorite FTP client.

Copy and paste the host and username from the App Service overview page on the Azure Portal and click Quickconnect.

After connecting via FTP, we can create a new directory called Modules under wwwroot\HttpTriggerPowerShellDemo (or the name of your Azure Function).

After the creation of the Modules directory we can upload our PowerShell Module(s) to this remote directory. Make sure you have first downloaded your module to your local system. We are going to upload my favorite PowerShell Module Wunderlist.

Drag and drop the local Wunderlist module to the remote Modules folder.
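If you prefer to script the upload rather than drag and drop, a minimal sketch using System.Net.WebClient could look like the following. The host, credentials, and paths here are placeholders, and the remote Modules directories must already exist (FTP uploads do not create intermediate folders):

```powershell
# Hypothetical scripted upload of one module file over FTP.
# Replace host, credentials, and paths with your own deployment values.
$ftpHost   = 'ftp://waws-prod-xxx.ftp.azurewebsites.windows.net'
$localFile = 'C:\Modules\Wunderlist\1.0.10\Wunderlist.psm1'
$remote    = "$ftpHost/site/wwwroot/HttpTriggerPowerShellDemo/Modules/Wunderlist/1.0.10/Wunderlist.psm1"

$client = New-Object System.Net.WebClient
# The FTP/deployment user configured in the App Service settings earlier.
$client.Credentials = New-Object System.Net.NetworkCredential('deployUser', 'deployPassword')
$client.UploadFile($remote, $localFile)
```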

You have now uploaded the Wunderlist module to your Azure Function.

How can we start using this module in our Azure Function?

First we can check if we can load the module and return the help from one of the Functions in the Module.

Use the following code in your Azure Function:

Write-Output "Loading Wunderlist Module"
Import-Module 'D:\Home\site\wwwroot\HttpTriggerPowerShellDemo\Modules\Wunderlist\1.0.10\Wunderlist.psm1'
$Result = Get-Help Get-WunderlistUser | Out-String
Write-Output $Result

Result:

To get my Wunderlist Module working you need to also configure a ClientId and AccessToken. For more information you can check the following blog post Using Wunderlist Module in Azure Automation–Part 2. Long story short, you need to store the ClientID and AccessToken as Environment variables using the Web App Settings.

Go to Function App Settings and click on Configure app settings.

Add ClientID and AccessToken variables in the Application settings. Don’t forget to Save the new settings.

We can now use these Environment variables in our Azure Function without any extra configuration needed.
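As a quick illustration (a sketch, assuming the app-setting names ClientID and AccessToken created above), reading those settings inside the function is plain environment-variable access:

```powershell
# App settings surface as environment variables inside the Function App.
# ClientID and AccessToken are the hypothetical setting names from above.
$clientId    = $env:ClientID
$accessToken = $env:AccessToken

$configured = [bool]$clientId -and [bool]$accessToken
Write-Output "Wunderlist settings configured: $configured"
```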

As a final test, we are going to test if we can create a new Wunderlist Task from our Azure Function.

By using an HTTP POST method with a body value as input for our Function, we can do some more advanced Azure Function activities.

Use the following code in your Azure Function:

# Get the input request
$in = Get-Content $req -Raw | ConvertFrom-Json
Write-Output $in.Title

# Import Wunderlist Module
Write-Output "Loading Wunderlist Module"
Import-Module 'D:\Home\site\wwwroot\HttpTriggerPowerShellDemo\Modules\Wunderlist\1.0.10\Wunderlist.psm1'

# Create new Wunderlist Task
Write-Output "Creating Wunderlist Task"
$Result = New-WunderlistTask -listid '267570849' -title $in.Title
Write-Output $Result

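To try the POST path from your local machine, a call like the following sketch should work (the function URL, key, and task title are placeholders; copy the real URL from the Azure Portal):

```powershell
# Hypothetical function URL - replace with your own from the Azure Portal.
$uri  = 'https://<your-function-app>.azurewebsites.net/api/HttpTriggerPowerShellDemo?code=<function key>'
# The function reads $in.Title from the JSON body, so send a Title property.
$body = @{ Title = 'Task created from PowerShell' } | ConvertTo-Json

Invoke-RestMethod -Uri $uri -Method Post -ContentType 'application/json' -Body $body
```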
Hope you have now learned how to use your “own” PowerShell modules within your PowerShell Azure Functions.

Stefan Stranger
Secure Infrastructure Consultant
Microsoft

Code Coverage – Part 2

In my last post on code coverage, I shared the process for you to collect coverage for your environment. This week, I’ll be describing a way to use our tools to create new tests and show how you can measure the increase of coverage for PowerShell Core after adding new tests. To recap, we can collect code coverage with the OpenCover module and then inspect the coverage; in this case, I would like to know about coverage for a specific cmdlet. For this post, we’re going to focus on the Clear-Content cmdlet, because its coverage is OK but not fantastic, and it is small enough to go over easily.

Here’s a partial capture from running the OpenCover tools:

By selecting the class Microsoft.PowerShell.Commands.ClearContentCommand, we can drill into the specifics of the class which implements the Clear-Content cmdlet. We can see that we have about 47% line coverage for this class, which isn’t fantastic; by inspecting the red highlights we can see what’s missing.


It looks like there are some error conditions, and some code paths that check whether the underlying provider supports ShouldProcess, that are not being tested. We can create tests for these missing areas fairly easily, but I need to know where these new tests should go.

Test Code Layout

Now is a good time to describe how our tests are laid out.

https://github.com/PowerShell/PowerShell/test contains all of the test code for PowerShell. This includes our native tests, C# tests, and Pester tests, as well as the tools we use. Our Pester tests should all be found in https://github.com/PowerShell/PowerShell/test/powershell, and in that directory there is more structure to make it easier to find tests. For example, if you want to find the tests for a specific cmdlet, you would look in the appropriate module directory. In our case, we’re adding tests for Clear-Content, which should be found in https://github.com/PowerShell/PowerShell/test/powershell/Modules/Microsoft.PowerShell.Management. (You can always find which module a cmdlet resides in via Get-Command.) If we look in this directory, we can already see the file Clear-Content.Tests.ps1, so we’ll add our tests to that file. If that file didn’t exist, you would just create a new file for your tests. Sometimes the tests for a cmdlet may be combined with other tests; take this as an opportunity to split up the file to make it easier for the next person adding tests. If you want more information about how we segment our tests, you can review https://github.com/PowerShell/PowerShell/docs/testing-guidelines/testing-guidelines.md.

New Test Code

Based on the missing code coverage, I created the following replacement for Clear-Content.Tests.ps1 which you can see in this PR: https://github.com/PowerShell/PowerShell/pull/3157. After rerunning the code coverage tools, I can see that I’ve really improved coverage for this cmdlet.
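For a flavor of what such tests look like (an illustrative sketch only, not the exact tests from the PR), a Pester test for one of the missing error conditions might be:

```powershell
# Illustrative sketch - the real tests live in the PR linked above.
# TestDrive: is a temporary PSDrive that Pester provides to each test run.
Describe "Clear-Content additional cases" {
    It "should throw for a path that does not exist" {
        { Clear-Content -Path TestDrive:\noSuchFile.txt -ErrorAction Stop } | Should Throw
    }
}
```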

There seems to be a small issue with OpenCover as some close braces are not being marked as missed, but you can see the improvement:

Now it’s your turn and we could really use your help. If you have areas of the product that you rely on, and don’t have the tests that you think they should have, please consider adding tests!

Managing Security Settings on Nano Server with DSC

We have released DSC resources building upon the previously released security and registry cmdlets for applying security settings. You can now implement Microsoft-defined security baselines using DSC.

AuditPolicyDsc

SecurityPolicyDsc

GPRegistryPolicy

Install all 3 from the Gallery with the command:

Install-Module SecurityPolicyDsc, AuditPolicyDsc, GPRegistryPolicy

A sample configuration, below, takes the Security Baselines for Windows Server 2016 and extracts the .inf, .csv and .pol containing the desired security settings from the exported Group Policy Objects. (You can find information on extracting the necessary files in the Registry cmdlets blogpost.) Simply pass the files into the new DSC resources, and you have successfully implemented security baselines using DSC!

This is most useful for Nano Server, since Nano Server doesn’t support Group Policy. However, this approach will work for all installation options. It’s not a good idea to manage the same server using both Group Policy and DSC since the two engines will constantly attempt to overwrite each other if they are both managing the same setting.

WARNING: As with all security settings, you can easily lock yourself out of remote access to your machine if you are not careful. Be sure to carefully review security settings before applying them to Nano Server, and stage test deployments before using security baselines in production!

Configuration SecurityBaseline
{
    Import-DscResource -ModuleName AuditPolicyDsc, SecurityPolicyDSC, GpRegistryPolicy
    node localhost
    {
        SecurityTemplate baselineInf
        {
            Path = "C:\Users\Administrator\Documents\GptTmpl.inf"
            # https://msdn.microsoft.com/powershell/dsc/singleinstance
            IsSingleInstance = "Yes"
        }
        AuditPolicyCsv baselineCsv
        {
            IsSingleInstance = "Yes"
            CsvPath = "C:\Users\Administrator\Documents\audit.csv"
        }
        RegistryPolicy baselineGpo
        {
            Path = "C:\Users\Administrator\Documents\registry.pol"
        }
    }
}
#Compile the MOF file
SecurityBaseline 
Start-DscConfiguration -Path ./SecurityBaseline 

Building a GitHub Dashboard using PowerShell, AzureStorageTable, AzureFunction, and PowerBI

Last week, I published a PowerShell Community Dashboard, and today I’m going to share the code and cover some of the learnings. The code is published as a module on the PowerShell Gallery.
Make sure you get v1.1, as I found an issue where if you’re not a member of the PowerShell Org on GitHub, you won’t have permission to query the members, so I changed the code to accommodate that. You can install the module using:

Install-Module PSGitHubStats

(and it works on PowerShell Core 6.0 including Linux! I only tested it with alpha.15, though…)

Once installed, you can just run it manually:

PS C:\> Get-PSDownloadStats -publishedSinceDate 1-1-2017 -accessToken $accesstoken

Tag             Name                                                 OS      Distro    Count Published
---             ----                                                 --      ------    ----- ---------
v6.0.0-alpha.15 powershell-6.0.0-alpha.15.pkg                        MacOS   MacOS     1504  1/25/2017 7:25:52 PM
v6.0.0-alpha.15 powershell-6.0.0_alpha.15-1.el7.centos.x86_64.rpm    Linux   CentOS    436   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 powershell_6.0.0-alpha.15-1ubuntu1.14.04.1_amd64.deb Linux   Ubuntu14  368   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 powershell_6.0.0-alpha.15-1ubuntu1.16.04.1_amd64.deb Linux   Ubuntu16  951   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win10-win2k16-x64.msi      Windows Windows10 349   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win10-win2k16-x64.zip      Windows Windows10 70    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-win2k8r2-x64.msi      Windows Windows7  119   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-win2k8r2-x64.zip      Windows Windows7  34    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-x86.msi               Windows Windows7  192   1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win7-x86.zip               Windows Windows7  17    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win81-win2k12r2-x64.msi    Windows Windows8  74    1/25/2017 7:25:52 PM
v6.0.0-alpha.15 PowerShell_6.0.0-alpha.15-win81-win2k12r2-x64.zip    Windows Windows8  21    1/25/2017 7:25:52 PM

PS C:\> $contributors = Get-PSGitHubReport -startDate 1-1-2017 -repos powershell/powershell -accessToken $accesstoken
PS C:\> $contributors | ? {$_.Org -eq "Community"} | sort -Property Total -Top 10 -Descending

   Org: Community

Name             PRs               Issues           PR Comments      Issue Comments   Total            End Date
----             ---               ------           -----------      --------------   -----            --------
iSazonov         8                 4                44               46               102              2017-02-03 10...
vors             5                 4                12               6                27               2017-02-03 10...
thezim           0                 2                0                9                11               2017-02-03 10...
juneb            0                 4                0                4                8                2017-02-03 10...
Jaykul           0                 3                0                5                8                2017-02-03 10...
pcgeek86         0                 3                0                5                8                2017-02-03 10...
jeffbi           0                 0                0                6                6                2017-02-03 10...
MaximoTrinidad   0                 2                0                3                5                2017-02-03 10...
g8tguy           0                 0                0                5                5                2017-02-03 10...
mwallner         0                 1                0                3                4                2017-02-03 10...

The $accesstoken is a personal access token you generate on GitHub; it's needed because the number of queries I have to make against the GitHub API will likely exceed the unauthenticated rate limit.
I ran over the rate limit many times while generating my report, even as an authenticated user. I solved this by adding a sleep command to the report generation.
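That throttling fix can be sketched as follows (illustrative Python; the `fetch` callable and the URLs are hypothetical stand-ins for the GitHub API calls, not part of the module):

```python
import time

def fetch_all(fetch, urls, delay=1.0):
    """Call fetch(url) for each URL, sleeping between calls to stay
    under the API rate limit (the same idea as the report's fix)."""
    results = []
    for url in urls:
        results.append(fetch(url))
        time.sleep(delay)
    return results

# Example with a stubbed-out fetch function
pages = fetch_all(lambda url: f"data from {url}",
                  ["https://api.github.com/a", "https://api.github.com/b"],
                  delay=0.01)
print(pages)
```

A fixed delay is crude but effective here; a smarter version would inspect the rate-limit headers the API returns and sleep only when the remaining quota runs low.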

One thing you may notice with the module vs the dashboard is that you get the raw numbers rather than just the rankings.
On the dashboard, we decided to only show the rankings so that people don’t focus specifically on the numbers.

Get-PSDownloadStats should be pretty straightforward.
There is some specialized logic in that function to determine the target operating system for the release package which unfortunately depends on the filename.

Publishing to AzureTable is fairly simple once you figure out the magic sauce to provide in the headers to make sure you’re calling the appropriate version of the REST API:

if ($publishToAzure)
{
    $json = $pkg | ConvertTo-Json -Compress
    $date = [datetime]::UtcNow.ToString("R", [System.Globalization.CultureInfo]::InvariantCulture)
    [string] $canonicalizedResource = "/$storageAccount/$storageTable"
    $contentType = "application/json"
    [string] $stringToSign = "POST`n`n$contentType`n$date`n$canonicalizedResource"
    $headers = @{"Prefer"="return-no-content";"Authorization"=(CreateAuthHeader -canonicalizedString $stringToSign -storageAccount $storageAccount -storageKey $storageKey);
        "DataServiceVersion"="3.0;NetFx";"MaxDataServiceVersion"="3.0;NetFx";"Accept"="application/json;odata=nometadata";
        "Accept-Charset"="UTF-8";"x-ms-version"="2013-08-15";"x-ms-date"=$date}
    $null = Invoke-RestMethod -Uri $storageUrl -Headers $headers -Body $json -Method Post -ContentType $contentType
}

I deliberately chose to use an AzureTable for a few reasons:

  • Power BI supports reading from AzureTable natively (although I think you can only do it from the Power BI Desktop app, as I couldn’t find the option in the web interface)
  • I didn’t need the relational capabilities nor the additional cost of AzureSQL for my purposes
  • I can import JSON directly into AzureTable

The most complicated part of working with AzureTable is correctly crafting the authentication header built from a canonicalized string AzureTable expects to protect against replay attacks.
The string is defined in the previous code section as $stringToSign while this bit of code hashes it and converts the result to Base64:

Function CreateAuthHeader([string]$canonicalizedString,[string]$storageAccount,[string]$storageKey)
{
    [string]$signature = [string]::Empty
    [byte[]]$bytes = [System.Convert]::FromBase64String($storageKey)
    [System.Security.Cryptography.HMACSHA256] $SHA256 = New-Object System.Security.Cryptography.HMACSHA256(,$bytes)
    [byte[]] $dataToSha256 = [System.Text.Encoding]::UTF8.GetBytes($canonicalizedString)
    $signature = [System.Convert]::ToBase64String($SHA256.ComputeHash($dataToSha256))
    "SharedKey $($storageAccount):$signature"
}
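For reference, the same Shared Key signing scheme can be sketched in Python (the account name and key below are made-up illustration values, not real credentials):

```python
import base64
import hashlib
import hmac

def create_auth_header(canonicalized_string, storage_account, storage_key):
    # Decode the Base64 storage key, HMAC-SHA256 the canonicalized string,
    # and Base64-encode the digest -- mirroring CreateAuthHeader above.
    key = base64.b64decode(storage_key)
    digest = hmac.new(key, canonicalized_string.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {storage_account}:{base64.b64encode(digest).decode()}"

# Hypothetical values for illustration only
string_to_sign = "POST\n\napplication/json\nFri, 03 Feb 2017 10:00:00 GMT\n/mystorageacct/mytable"
fake_key = base64.b64encode(b"not-a-real-key").decode()
print(create_auth_header(string_to_sign, "mystorageacct", fake_key))
```

Because the date is part of the signed string, a captured request can't simply be replayed later, which is the point of the canonicalization.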

Ilya’s RFC on Get-StringHash should help make this simpler and more readable, eliminating several lines of code from that function.

Once I had the module working from the command line, I validated that the data was correctly uploaded to Azure using Microsoft Azure Storage Explorer. Next, I needed the script to run regularly. I considered both Azure Automation and Azure Functions and decided on the latter because it was newer, which gave me an opportunity to learn it. One immediate problem is that Azure Functions today only supports PowerShell v4. I had originally used PowerShell classes for my internal types, so I changed them all to PSCustomObjects.
I’ve since contacted the Azure Functions team and asked them to support both Windows PowerShell v5.x and PowerShell Core 6.0 in the future.

With Azure Functions, you really only have the ability to run a PowerShell script. This means that unless you install the Azure PowerShell module at runtime (and PowerShellGet isn’t part of PowerShell v4), you can really only use what ships with PowerShell. Azure Automation would be better suited if you want to use modules.

I just cut and pasted the code out of my module into the web interface and added a call to my functions at the end, supplying all the necessary parameters to make sure I was getting the right data from GitHub and uploading it correctly into AzureTable. One thing to note is that I pass -UseBasicParsing to all my Invoke-WebRequest calls, because the ParsedHtml property in the output relies on MSHTML, which relies on Internet Explorer.
IE is not available in the Azure Functions container where your script runs. I should also mention that I use both Invoke-WebRequest and Invoke-RestMethod; the former is needed when I need access to the response headers, specifically for pagination of the GitHub API response, which is handled by this bit of code:

if ($null -ne $output.Headers.Link) {
    $links = $output.Headers.Link.Split(",").Trim()
    foreach ($link in $links) {
        if ($link -match "<(?<url>.*?)>;\s*rel=`"(?<rel>.*?)`"") {
            if ($matches.rel -eq 'next') {
                $query = $matches.url
            }
        }
    }
}
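The same Link-header parsing can be sketched in Python (the sample header below is a made-up illustration of GitHub's `<url>; rel="next"` format):

```python
import re

def next_page_url(link_header):
    # GitHub's Link header looks like:
    #   <https://api.github.com/...&page=2>; rel="next", <...>; rel="last"
    # Return the URL tagged rel="next", or None if there is no next page.
    for link in link_header.split(","):
        m = re.match(r'\s*<(?P<url>.*?)>;\s*rel="(?P<rel>.*?)"', link)
        if m and m.group("rel") == "next":
            return m.group("url")
    return None

header = ('<https://api.github.com/repositories/1/issues?page=2>; rel="next", '
          '<https://api.github.com/repositories/1/issues?page=5>; rel="last"')
print(next_page_url(header))  # https://api.github.com/repositories/1/issues?page=2
```

The loop in the PowerShell version does the same thing: walk each comma-separated link, match out the URL and relation, and keep the one marked next for the following request.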

I’ll be working to add this capability into Invoke-RestMethod so this bit of code can be removed.

Building the Power BI visualization is a whole other topic and @MSFTzachal really did most of the work, but my recommendation is to use the Power BI desktop app which I found easier and more powerful than the current web interface.

Steve Lee
Principal Software Engineer Manager
PowerShell Core

Installing latest PowerShell Core 6.0 Release on Linux just got easier!

This post was originally published on this site
As we continue our journey through the Alpha releases and on to Beta, you can continue to download the latest releases from our GitHub repository.
However, our goal has always been to enable installation through popular existing Linux package management tools like apt-get and yum. I am pleased to announce that we have now published PowerShell Core 6.0 alpha.15 to https://packages.microsoft.com!

Install PowerShell Core on Ubuntu 14.04

# Import the public repository GPG keys
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

# Register the Microsoft Ubuntu repository
curl https://packages.microsoft.com/config/ubuntu/14.04/prod.list | sudo tee /etc/apt/sources.list.d/microsoft.list

# Update apt-get
sudo apt-get update

# Install PowerShell
sudo apt-get install -y powershell

# Start PowerShell
powershell

Install PowerShell Core on Ubuntu 16.04

# Import the public repository GPG keys
curl https://packages.microsoft.com/keys/microsoft.asc | sudo apt-key add -

# Register the Microsoft Ubuntu repository
curl https://packages.microsoft.com/config/ubuntu/16.04/prod.list | sudo tee /etc/apt/sources.list.d/microsoft.list

# Update apt-get
sudo apt-get update

# Install PowerShell
sudo apt-get install -y powershell

# Start PowerShell
powershell

Install PowerShell Core on CentOS

# Enter superuser mode
sudo su 

# Register the Microsoft RedHat repository
curl https://packages.microsoft.com/config/rhel/7/prod.repo > /etc/yum.repos.d/microsoft.repo

# Exit superuser mode
exit

# Install PowerShell
sudo yum install -y powershell

# Start PowerShell
powershell

After registering the Microsoft repository once as superuser, you just need to run sudo apt-get install powershell or sudo yum update powershell (depending on which distro you are using) to update it.

We plan to simultaneously publish to https://packages.microsoft.com and our GitHub repository for each new release.

Steve Lee
Principal Software Engineer Manager
PowerShell Core

ALERT: vRealize Automation patch to prevent potential data loss when using bulk import

This post was originally published on this site

VMware Support Alert: Using bulk import, customers can bring unmanaged machines under the management of vRealize Automation. However, if transient issues occur while performing the import operation, the machine being imported may be destroyed, resulting in data loss.

To guard against such data loss, customers are urged to read KB article: VMware vRealize Automation 7.0 patch to prevent potential data loss when using bulk import (2144526)

Any updates to this information will be made in the KB article itself.

The post ALERT: vRealize Automation patch to prevent potential data loss when using bulk import appeared first on Support Insider.