Tag Archives: PowerShell

PowerShellGet 3.0 Preview 6 Release

This post was originally published on this site

Since our last blog post we have released two preview versions (previews 4 and 6) of the PowerShellGet 3.0 module. These releases introduced publish functionality and repository wildcard search, and fixed error handling in Find-PSResource for when a repository is not accessible. Note that preview 5 was unlisted due to a packaging error discovered post-publish.

To install the latest preview version of the module, open any PowerShell console and run:

Install-Module PowerShellGet -Force -AllowPrerelease -Repository PSGallery

Highlights of the releases

Preview 4 (3.0.0-beta4) Highlights

New Feature

Wildcard search for the -Repository parameter in Find-PSResource. This allows the user to return results from all registered PSRepositories instead of just the highest-priority repository. To use this feature, add -Repository '*' to your call to Find-PSResource.

Bug Fix

Fixed poor error handling for when a repository is not accessible in Find-PSResource.

Preview 6 (3.0.0-beta6) Highlight

New Feature

The cmdlet Publish-PSResource was introduced, which allows users to publish PowerShell resources to any registered PSRepository.

What’s next

We have three planned upcoming releases for the module:

  • The Preview 7 release will focus on update functionality, along with several bug fixes that have been reported by users through our preview releases.
  • The Release Candidate (RC) release will resolve any remaining bugs not resolved in our Preview 7 release.
  • The 3.0 General Availability (GA) release will be the same as the RC version so long as no blocking or high-risk bugs are found in the release candidate. If there are any blocking or high-risk bugs, we will release another release candidate before GA.

Migration to PowerShellGet 3.0

We hope to ship the latest preview of PowerShellGet 3.0 in the next preview of PowerShell 7.1 (preview 6). The goal for this version of PowerShellGet, which will ship in PowerShell 7.1 preview 6, is to contain a compatibility module that will enable scripts with PowerShellGet 2.x cmdlets (ex. Install-Module) to be run using the PowerShellGet 3.0 module. This means that users will likely not need to update their scripts to use PowerShellGet 2.x cmdlets with PowerShell 7.1. It is important to note, as well, that on systems which contain any other version of PowerShell, the PowerShellGet 2.x module will still be available and used.

We hope to ship PowerShellGet 3.0 with a compatibility layer into PowerShell 7.1 as the sole version of PowerShellGet in the package. However, we will only do this if we reach GA, with a high bar for release quality, in time for the PowerShell 7.1 release candidate.

How to get support and give feedback

We cannot overstate how critical user feedback is at this stage in the development of the module. Feedback from preview releases helps inform design decisions without incurring breaking changes once the module is generally available and used in production. To help us make key decisions around the behavior of the module, please give us feedback by opening issues in our GitHub repository.

Sydney Smith PowerShell Team

The post PowerShellGet 3.0 Preview 6 Release appeared first on PowerShell.

Native Commands in PowerShell – A New Approach – Part 2

Native Commands in PowerShell
A New Approach – Part 2

In my last post I went through some strategies for executing native executables and having them participate more fully in the PowerShell environment. In this post, I'll be going through a couple of experiments I've done with the Kubernetes kubectl utility.

Is there a better way?

It may be possible to create a framework that inspects the help of an application and automatically generates the code that calls the underlying application. This framework could also handle mapping the output to objects more suitable for the PowerShell environment.

Possibilities in wrapping

The aspect that makes this possible is that some commands have consistently structured help that describes how the application can be used. If this is the case, then we can iteratively call the help, parse it, and automatically construct much of the infrastructure needed to allow these native applications to be incorporated into the PowerShell environment.

First Experiment – Microsoft.PowerShell.Kubectl Module

I created a wrapper to take the output of kubectl api-resources and create functions for each returned resource. This way, instead of running kubectl get pod, I could run Get-KubectlPod (a much more PowerShell-like experience). I also wanted the function to return objects that I could then use with other PowerShell tools (Where-Object, ForEach-Object, etc.). To do this, I needed a way to map the JSON output of the kubectl tool to PowerShell objects. I decided that it was reasonable to use a more declarative approach that maps each property in the JSON to a PowerShell class member.

There were some problems that I wanted to solve with this first experiment:

  • wrap kubectl api-resources in a function
    • automatically create object output from kubectl api-resources
  • Auto-generate functions for each resource that could be retrieved (only resource get for now)
    • only support name as a parameter
  • Auto-generate the conversion of output to objects to look similar to the usual kubectl output

When it came to wrapping kubectl api-resources I took the static approach rather than auto-generation. First, because it was my first attempt, so I was still finding my feet. Second, because this is one of the kubectl commands that does not emit JSON. So, I took the path of parsing the output of kubectl api-resources -o wide. My concern was that I wasn't sure whether the table changes width based on the screen width. I calculated column positions based on the fields I knew to be present and then parsed each line using those offsets. You can see the code in the function get-kuberesource and the constructor for the PowerShell class KubeResource. My plan was that these resources would drive the auto-generation of the Kubernetes resource functions.
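As a rough sketch of the offset-based parsing (ConvertFrom-FixedWidthTable and the sample column layout here are hypothetical; the real logic lives in get-kuberesource and the KubeResource constructor):

```powershell
# Hypothetical helper illustrating offset-based parsing of fixed-width output
# like 'kubectl api-resources -o wide'. Column start positions are taken from
# the header line; each row is then sliced with Substring at those offsets.
function ConvertFrom-FixedWidthTable {
    param([string[]]$Lines)
    $header  = $Lines[0]
    $fields  = -split $header                     # column names from the header
    $offsets = foreach ($f in $fields) { $header.IndexOf($f) }
    foreach ($line in ($Lines | Select-Object -Skip 1)) {
        $obj = [ordered]@{}
        for ($i = 0; $i -lt $fields.Count; $i++) {
            $start = $offsets[$i]
            $end   = if ($i -lt $fields.Count - 1) { $offsets[$i + 1] } else { $line.Length }
            $obj[$fields[$i]] = $line.Substring($start, $end - $start).Trim()
        }
        [pscustomobject]$obj
    }
}

# Sample shaped like 'kubectl api-resources' output:
$sample = @(
    'NAME        SHORTNAMES  NAMESPACED  KIND'
    'namespaces  ns          false       Namespace'
    'pods        po          true        Pod'
)
ConvertFrom-FixedWidthTable $sample
```

This sketch assumes no column name appears earlier in the header as a substring of another; the real parser would also need to cope with rows shorter than the last offset.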

Now that I have retrieved the resources, I can auto-generate a specific function for each resource that calls kubectl get <resource>. At the time, I wanted some flexibility in the creation of these proxy functions, so I provided a way to include a specific implementation, if desired (see the $proxyFunctions hashtable). I'm not sure that's needed now, but we'll get to that later. The problem is that while the resource data can be returned as JSON, that JSON has absolutely no relation to the way the data is represented in the kubectl get pod table. Of course, in PowerShell we can create formatting to present any member of an object (JSON or otherwise), but I like to be sure that the properties seen in a table are properties that I can use with Where-Object, etc. Since I want to return the data as objects, I created classes for a couple of resources by hand but thought there might be a better way.

I determined that when you get data from Kubernetes, the table (both normal and wide) output is created on the server. This means the mapping of the properties of the JSON object to the table columns is defined in the server code. It's possible to request data as custom columns, but you need to provide the value for each column using a JSON path expression. So, it's not possible to automatically generate those tables. However, I thought it might be possible to provide a configuration file that could be read to automatically generate a PowerShell class. The configuration file would need to define the mapping between the columns in the table and the properties of the object. The file would include the name of the column and the expression used to get the value for the object. This allows users to retrieve the JSON object and construct their custom object without touching the programming logic of the module, only a configuration file. I created the ResourceConfiguration.json file to encapsulate all the resources that I had access to and provide a way to customize the object members where desired.

Here’s an example:

  {
    "TypeName": "namespaces",
    "Fields": [
      {
        "PropertyName": "NAME",
        "PropertyReference": "$o.metadata.NAME"
      },
      {
        "PropertyName": "STATUS",
        "PropertyReference": "$o.status.phase"
      },
      {
        "PropertyName": "AGE",
        "PropertyReference": "$o.metadata.creationTimeStamp"
      }
    ]
  },

This JSON is converted into a PowerShell class whose constructor takes the JSON object and assigns the values to the members, according to the PropertyReference. The module automatically attaches the original JSON to a hidden member originalObject so if you want to inspect all the data that’s available, you can. The module also automatically generates a proxy function so you can get the data:

function Get-KubeNamespace
{
  [CmdletBinding()]
  param ()
  (Invoke-KubeCtl -Verb get -resource namespaces).Foreach({[namespaces]::new($_)})
}
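For reference, the class generated from the namespaces configuration entry would look roughly like this (a hand-written approximation of the module's output, not its exact code):

```powershell
# Approximation of the PowerShell class generated from the 'namespaces'
# entry in ResourceConfiguration.json. Each PropertyReference becomes an
# assignment in the constructor; the raw JSON object is kept on a hidden
# originalObject member for inspection.
class namespaces {
    [string]$Name
    [string]$Status
    [string]$Age
    hidden [object]$originalObject

    namespaces([object]$o) {
        $this.Name   = $o.metadata.name
        $this.Status = $o.status.phase
        $this.Age    = $o.metadata.creationTimeStamp
        $this.originalObject = $o
    }
}

# Constructing one from JSON shaped like the kubectl output:
$json = '{"metadata":{"name":"default","creationTimeStamp":"2020-05-06"},"status":{"phase":"Active"}}'
[namespaces]::new(($json | ConvertFrom-Json))
```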

This function is then exported so it’s available in the module. When used, it behaves very closely to the original:

PS> Get-KubeNamespace

Name                 Status Age
----                 ------ ---
default              Active 5/6/2020 6:13:07 PM
default-mem-example  Active 5/14/2020 8:14:45 PM
docker               Active 5/6/2020 6:14:25 PM
kube-node-lease      Active 5/6/2020 6:13:05 PM
kube-public          Active 5/6/2020 6:13:05 PM
kube-system          Active 5/6/2020 6:13:05 PM
kubernetes-dashboard Active 5/18/2020 8:44:01 PM
openfaas             Active 5/6/2020 6:51:22 PM
openfaas-fn          Active 5/6/2020 6:51:22 PM

PS> kubectl get namespaces --all-namespaces

NAME                   STATUS   AGE
default                Active   26d
default-mem-example    Active   18d
docker                 Active   26d
kube-node-lease        Active   26d
kube-public            Active   26d
kube-system            Active   26d
kubernetes-dashboard   Active   14d
openfaas               Active   26d
openfaas-fn            Active   26d

But importantly, I can use the output with Where-Object and ForEach-Object, or change the format to a list, etc.:

PS> Get-KubeNamespace |? name -match "faas"

Name        Status Age
----        ------ ---
openfaas    Active 5/6/2020 6:51:22 PM
openfaas-fn Active 5/6/2020 6:51:22 PM

Second Experiment – Module KubectlHelpParser

I wanted to see if I could read any help content from kubectl that would enable me to auto-generate a complete proxy of the kubectl command, including general parameters, command-specific parameters, and help. It turns out that kubectl help is regular enough that this is quite possible.

When retrieving help, kubectl provides subcommands that also have structured help. I created a recursive parser that allowed me to retrieve all of the help for all of the available kubectl commands. This means that if an additional command is provided in the future, and the help for that command follows the existing pattern for help, this parser will be able to generate a command for it.
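A minimal sketch of the idea (Get-HelpSubcommand is a hypothetical name; the real KubectlHelpParser also captures usage, options, and examples, and invokes --help again for each subcommand it discovers):

```powershell
# Hypothetical sketch: pull subcommand names and descriptions out of the
# 'Commands' sections of kubectl-style help text. A section starts at a
# line like 'Basic Commands (Beginner):' and ends at the first line that
# is not indented.
function Get-HelpSubcommand {
    param([string]$HelpText)
    $inSection = $false
    foreach ($line in $HelpText -split "`r?`n") {
        if ($line -match 'Commands.*:\s*$') { $inSection = $true; continue }
        if ($inSection -and $line -notmatch '^\s') { $inSection = $false; continue }
        if ($inSection -and $line -match '^\s+(\S+)\s+(\S.*)$') {
            [pscustomobject]@{ Name = $Matches[1]; Description = $Matches[2].Trim() }
        }
    }
}

# Works against captured help output, e.g.:
$help = @'
kubectl controls the Kubernetes cluster manager.

Basic Commands (Beginner):
  create         Create a resource from a file or from stdin.
  expose         Take a replication controller, service, deployment or pod

Usage:
  kubectl [flags] [options]
'@
Get-HelpSubcommand $help
```

Applying the same function recursively to each subcommand's own --help output is what makes the parser pick up commands added in future kubectl versions, as long as their help follows the existing pattern.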

PS> kubectl --help
kubectl controls the Kubernetes cluster manager.

 Find more information at: https://kubernetes.io/docs/reference/kubectl/overview/

Basic Commands (Beginner):
  create         Create a resource from a file or from stdin.
  expose         Take a replication controller, service, deployment or pod and expose it as a new Kubernetes Service
  run            Run a particular image on the cluster
  set            Set specific features on objects

Basic Commands (Intermediate):
  explain        Documentation of resources
  get            Display one or many resources
. . .

PS> kubectl set --help

Configure application resources

 These commands help you make changes to existing application resources.

Available Commands:
  env            Update environment variables on a pod template
  . . .
  subject        Update User, Group or ServiceAccount in a RoleBinding/ClusterRoleBinding

Usage:
  kubectl set SUBCOMMAND [options]

PS> kubectl set env --help

Update environment variables on a pod template.

 List environment variable definitions in one or more pods, pod templates. Add, update, or remove container environment
variable definitions in one or more pod templates (within replication controllers or deployment configurations). View or
modify the environment variable definitions on all containers in the specified pods or pod templates, or just those that
match a wildcard.

 If "--env -" is passed, environment variables can be read from STDIN using the standard env syntax.

 Possible resources include (case insensitive):

  pod (po), replicationcontroller (rc), deployment (deploy), daemonset (ds), job, replicaset (rs)

Examples:
  # Update deployment 'registry' with a new environment variable
  kubectl set env deployment/registry STORAGE_DIR=/local
  . . .
  # Set some of the local shell environment into a deployment config on the server
  env | grep RAILS_ | kubectl set env -e - deployment/registry

Options:
      --all=false: If true, select all resources in the namespace of the specified resource types
      --allow-missing-template-keys=true: If true, ignore any errors in templates when a field or map key is missing in
the template. Only applies to golang and jsonpath output formats.
  . . .
      --template='': Template string or path to template file to use when -o=go-template, -o=go-template-file. The
template format is golang templates [http://golang.org/pkg/text/template/#pkg-overview].

Usage:
  kubectl set env RESOURCE/NAME KEY_1=VAL_1 ... KEY_N=VAL_N [options]

Use "kubectl options" for a list of global command-line options (applies to all commands).

The main function of the module will recursively collect the help for all of the commands and construct an object representation that I hope can then be used to generate the proxy functions. This is still very much a work in progress, but it is definitely showing promise. Here’s an example of what it can already do.

PS> import-module ./KubeHelpParser.psm1
PS> $res = get-kubecommands
PS> $res.subcommands[3].subcommands[0]
Command             : set env
CommandElements     : {, set, env}
Description         : Update environment variables on a pod template.

                       List environment variable definitions in one or more pods, pod templates. Add, update, or remove container environment variable definitions in one or more pod templates (within replication controllers or deployment configurations). View or modify the environment variable definitions
                      on all containers in the specified pods or pod templates, or just those that match a wildcard.

                       If "--env -" is passed, environment variables can be read from STDIN using the standard env syntax.

                       Possible resources include (case insensitive):

                        pod (po), replicationcontroller (rc), deployment (deploy), daemonset (ds), job, replicaset (rs)
Usage               : kubectl set env RESOURCE/NAME KEY_1=VAL_1 ... KEY_N=VAL_N [options]
SubCommands         : {}
Parameters          : {[Parameter(Mandatory=$False)][switch]${All}, [Parameter(Mandatory=$False)][switch]${NoAllowMissingTemplateKeys}, [Parameter(Mandatory=$False)][System.String]${Containers} = "*", [Parameter(Mandatory=$False)][switch]${WhatIf}…}
MandatoryParameters : {}
Examples            : {kubectl set env deployment/registry STORAGE_DIR=/local, kubectl set env deployment/sample-build --list, kubectl set env pods --all --list, kubectl set env deployment/sample-build STORAGE_DIR=/data -o yaml…}
PS> $res.subcommands[3].subcommands[0].usage
Usage                                                               supportsFlags hasOptions
-----                                                               ------------- ----------
kubectl set env RESOURCE/NAME KEY_1=VAL_1 ... KEY_N=VAL_N [options]         False       True
PS> $res.subcommands[3].subcommands[0].examples
Description                                                   Command
-----------                                                   -------
Update deployment 'registry' with a new environment variable  kubectl set env deployment/registry STORAGE_DIR=/local
. . .

PS> $res.subcommands[3].subcommands[0].parameters.Foreach({$_.tostring()})

[Parameter(Mandatory=$False)][switch]${All}
[Parameter(Mandatory=$False)][switch]${NoAllowMissingTemplateKeys}
[Parameter(Mandatory=$False)][System.String]${Containers} = "*"
[Parameter(Mandatory=$False)][switch]${WhatIf}
. . .
[Parameter(Mandatory=$False)][System.String]${Selector}
[Parameter(Mandatory=$False)][System.String]${Template}

There are still a lot of open questions and details to work out here:

  • how are mandatory parameters determined?
  • how do we keep a map of used parameters?
  • does parameter order matter?
  • can reasonable debugging be provided?
  • do we have to “boil the ocean” to provide something useful?

I believe it may be possible to create a more generic framework which would allow a larger number of native executables to be more fully incorporated into the PowerShell ecosystem. These are just the first steps in the investigation, but it looks very promising.

Call To Action

First, I’m really interested in knowing whether a framework that can auto-generate functions wrapping a native executable would be useful. The obvious response might be “of course”, but how much of a solution is really needed to provide value? Second, I would really like to know if you would like us to investigate specific tools for this sort of treatment. If it is possible to make this a generic framework, I would love to have more examples of tools which would be beneficial to you and would test our ability to handle them.

James Truher
Software Engineer
PowerShell Team

The post Native Commands in PowerShell – A New Approach – Part 2 appeared first on PowerShell.

Resolving PowerShell Module Assembly Dependency Conflicts

When writing a PowerShell module, especially a binary module (i.e. one written in a language like C# and loaded into PowerShell as an assembly/DLL), it’s natural to take dependencies on other packages or libraries to provide functionality.

Taking dependencies on other libraries is usually desirable for code reuse. However, PowerShell always loads assemblies into the same context, and this can present issues when a module’s dependencies conflict with already-loaded DLLs, preventing two otherwise unrelated modules from being used together in the same PowerShell session. If you’ve been hit by this yourself, you may have seen an error message like this:

Assembly load conflict error message.

In this blog post, we’ll look at some of the ways dependency conflicts can arise in PowerShell, and some of the ways to mitigate dependency conflict issues. Even if you’re not a module author, there are some tricks in here that might help you with dependency conflicts arising in modules that you use.

Why do dependency conflicts occur?

In .NET, dependency conflicts arise when two versions of the same assembly are loaded into the same Assembly Load Context (this term means slightly different things on different .NET platforms, which we’ll cover later). This conflict issue is a common problem that occurs essentially everywhere in software where versioned dependencies are used. Because there are no simple solutions (and because many dependency resolution frameworks try to solve the problem in different ways), this situation is often called “dependency hell”.

Conflict issues are compounded by the fact that a project almost never deliberately or directly depends on two versions of the same dependency, but instead depends on two different dependencies that each require a different version of the same dependency.

For example, say your .NET application, DuckBuilder, brings in two dependencies to perform parts of its functionality, and looks like this:

Two dependencies of DuckBuilder rely on different versions of Newtonsoft.Json

Because Contoso.ZipTools and Fabrikam.FileHelpers both depend on Newtonsoft.Json, but different versions, there may be a dependency conflict, depending on how each dependency is loaded.

Conflicting with PowerShell’s dependencies

In PowerShell, the dependency conflict issue is exacerbated because PowerShell modules, and PowerShell’s own dependencies, are all loaded into the same shared context. This means the PowerShell engine and all loaded PowerShell modules must not have conflicting dependencies.

One scenario in which this can cause issues is where a module has a dependency that conflicts with one of PowerShell’s own dependencies. A classic example of this is Newtonsoft.Json:

FictionalTools module depends on newer version of Newtonsoft.Json than PowerShell

In this example, the module FictionalTools depends on Newtonsoft.Json version 12.0.3, which is a newer version of Newtonsoft.Json than 11.0.2 that ships in the example PowerShell. (To be clear, this is an example and PowerShell 7 currently ships with Newtonsoft.Json 12.0.3.)

Because the module depends on a newer version of the assembly, it won’t accept the version that PowerShell already has loaded, but because PowerShell has already loaded a version of the assembly, the module can’t load its own using the conventional load mechanism.

Conflicting with another module’s dependencies

Another common scenario in PowerShell is that a module is loaded that depends on one version of an assembly, and then another module is loaded later that depends on a different version of that assembly.

This often looks like the following:

Two PowerShell modules require different versions of the Microsoft.Extensions.Logging dependency

In the above case, the FictionalTools module requires a newer version of Microsoft.Extensions.Logging than the FilesystemManager module.

Let’s imagine these modules load their dependencies by placing the dependency assemblies in the same directory as the root module assembly and allowing .NET to implicitly load them by name. If we are running PowerShell 7 (on top of .NET Core 3.1), we can load and run FictionalTools and then load and run FilesystemManager without issue. However, if in a new session we load and run FilesystemManager and then FictionalTools, we will encounter a FileLoadException from the FictionalTools command, because it requires a newer version of Microsoft.Extensions.Logging than the one loaded, but cannot load it because an assembly of the same name has already been loaded.

PowerShell and .NET

PowerShell runs on the .NET platform, and since we’re discussing assembly dependency conflicts, we must discuss .NET. .NET is ultimately responsible for resolving and loading assembly dependencies, so we must understand how .NET operates here to understand dependency conflicts.

We must also confront the fact that different versions of PowerShell run on different .NET implementations, which respectively have their own mechanisms for assembly resolution.

In general, PowerShell 5.1 and below run on .NET Framework, while PowerShell 6 and above run on .NET Core. These two flavours of .NET load and handle assemblies somewhat differently, meaning resolving dependency conflicts can vary depending on the underlying .NET platform.

Assembly Load Contexts

In this post, we’ll use the term “Assembly Load Context” (ALC) frequently. An Assembly Load Context is a .NET concept that essentially means a runtime namespace into which assemblies are loaded and within which assemblies’ names are unique. This concept allows assemblies to be uniquely resolved by name in each ALC.

Assembly reference loading in .NET

The semantics of assembly loading (whether versions clash, whether that assembly’s dependencies are handled, etc.) depend on both the .NET implementation (.NET Core vs .NET Framework) and the API or .NET mechanism used to load a particular assembly.

Rather than go into deep detail describing these here, there is a list of links in the Further reading section that go into great detail on how .NET assembly loading works in each .NET implementation.

In this blog post, we’ll refer to the following mechanisms:

  • Implicit assembly loading (effectively Assembly.Load(AssemblyName)), when .NET implicitly tries to load an assembly by name from a static assembly reference in .NET code.
  • Assembly.LoadFrom(), a plugin-oriented loading API that will add handlers to resolve dependencies of the loaded DLL (but which may not do so the way we want).
  • Assembly.LoadFile(), a bare-bones loading API intended to load only the assembly asked for with no handling of its dependencies.

The way these APIs work has changed in subtle ways between .NET Core and .NET Framework, so it’s worth reading through the further reading links.

Differences in .NET Framework vs .NET Core

Importantly, Assembly Load Contexts and other assembly resolution mechanisms have changed between .NET Framework and .NET Core.

In particular .NET Framework has the following features:

  • The Global Assembly Cache, for machine-wide assembly resolution
  • Application Domains, which work like in-process sandboxes for assembly isolation, but also present a serialization layer to contend with
  • A limited assembly load context model, explained in depth here, that has a fixed set of assembly load contexts, each with their own behaviour:
    • The default load context, where assemblies are loaded by default
    • The load-from context, for loading assemblies manually at runtime
    • The reflection-only context, for safely loading assemblies to read their metadata without running them
    • The mysterious void that assemblies loaded with Assembly.LoadFile(string path) and Assembly.Load(byte[] asmBytes) live in

.NET Core (and .NET 5+) has eschewed this complexity for a simpler model:

  • No Global Assembly Cache; applications bring all their own dependencies (PowerShell, as the plugin host, complicates this slightly for modules, since its dependencies in $PSHOME are shared with all modules). This removes an exogenous factor for dependency resolution in applications, making dependency resolution more reproducible.
  • Only one Application Domain, and no ability to create new ones. The Application Domain concept lives on purely to be the global state of the .NET process.
  • A new, extensible Assembly Load Context model, where assembly resolution can be effectively namespaced by putting it in a new ALC. .NET Core processes begin with a single default ALC into which all assemblies are loaded (except for those loaded with Assembly.LoadFile(string) and Assembly.Load(byte[])), but are free to create and define their own custom ALCs with their own loading logic. When an assembly is loaded, the first ALC it is loaded into is responsible for resolving its dependencies, creating opportunities to create powerful .NET plugin loading mechanisms.

In both implementations, assemblies are loaded lazily when a method requiring their type is run for the first time.

For example, here are two versions of the same code that load a dependency at different times.

The first will always load its dependency when Program.GetRange() is called, because the dependency reference is lexically present within the method:

using Dependency.Library;

public static class Program
{
    public static List<int> GetRange(int limit)
    {
        var list = new List<int>();
        for (int i = 0; i < limit; i++)
        {
            if (i >= 20)
            {
                // Dependency.Library will be loaded when GetRange is run
                // because the dependency reference occurs directly within the method
                DependencyApi.Use();
            }

            list.Add(i);
        }
        return list;
    }
}

The second will load its dependency only when the limit parameter exceeds 20, because of the internal indirection through a method:

using Dependency.Library;

public static class Program
{
    public static List<int> GetNumbers(int limit)
    {
        var list = new List<int>();
        for (int i = 0; i < limit; i++)
        {
            if (i >= 20)
            {
                // Dependency.Library is only referenced within
                // the UseDependencyApi() method,
                // so will only be loaded when limit > 20
                UseDependencyApi();
            }

            list.Add(i);
        }
        return list;
    }

    private static void UseDependencyApi()
    {
        // Once UseDependencyApi() is called, Dependency.Library is loaded
        DependencyApi.Use();
    }
}

This is a good thing for .NET to do, since it minimizes memory use and filesystem reads, making programs more resource efficient.

Unfortunately, a side effect of this is that if an assembly load is going to fail, we won’t necessarily find out when the program first loads, but only when the code path that tries to load the assembly is run.

It can also set up timing conditions for assembly load conflicts; if two parts of the same program try to load different versions of the same assembly, which one is loaded depends on which code path runs first.

For PowerShell this means that the following factors can affect an assembly load conflict:

  • Which module was loaded first?
  • Was the cmdlet/code path that uses the dependency library run?
  • Does PowerShell load a conflicting dependency at startup or only under certain code paths?

Quick fixes and their limitations

In some cases it’s possible to make small adjustments to your module and fix things with a minimum of fuss. But these solutions tend to come with caveats, so that while they may apply to your module, they won’t work for every module.

Change your dependency version

The simplest way to avoid dependency conflicts is to agree on a common dependency version. This may be possible when:

  • Your conflict is with a direct dependency of your module and you control the version
  • Your conflict is with an indirect dependency, but you can configure your direct dependencies to use a workable indirect dependency version
  • You know the conflicting version and can rely on it not changing

A good example of this last scenario is with the Newtonsoft.Json package. This is a dependency of PowerShell 6 and above, and isn’t used in Windows PowerShell, meaning a simple way to resolve versioning conflicts is to target the lowest version of Newtonsoft.Json across the PowerShell versions you wish to target.

For example, PowerShell 6.2.6 and PowerShell 7.0.2 both currently use Newtonsoft.Json version 12.0.3. So, to create a module targeting Windows PowerShell, PowerShell 6 and PowerShell 7, you would target Newtonsoft.Json 12.0.3 as a dependency and include it in your built module. When the module is loaded in PowerShell 6 or 7, PowerShell’s own Newtonsoft.Json assembly will already be loaded, but the version will be at least the required one for your module, meaning resolution will succeed. In Windows PowerShell, the assembly will not be already present in PowerShell, and so will be loaded from your module instead.
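In an SDK-style project file, pinning to that common version is a single package reference (shown here with 12.0.3, per the example above; adjust to the lowest common version across the PowerShell versions you target):

```xml
<!-- Pin the shared dependency to the lowest version used across the
     targeted PowerShell versions (12.0.3 in this example). -->
<ItemGroup>
  <PackageReference Include="Newtonsoft.Json" Version="12.0.3" />
</ItemGroup>
```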

Generally, when targeting a concrete PowerShell package, like Microsoft.PowerShell.Sdk or System.Management.Automation, NuGet should be able to resolve the right dependency versions required. It’s only in the case of targeting both Windows PowerShell and PowerShell 6+ that dependency versioning becomes more difficult, either because of needing to target multiple frameworks, or due to targeting PowerShellStandard.Library.

Circumstances where pinning to a common dependency version won’t work include:

  • The conflict is with an indirect dependency, and there’s no configuration of your dependencies that can use the common version
  • The other dependency version is likely to change often, meaning settling on a common version will only be a short-term fix

Add an AssemblyResolve event handler to create a dynamic binding redirect

When changing your own dependency version isn’t possible, or is too hard, another way to make your module play nicely with other dependencies is to set up a dynamic binding redirect by registering an event handler for the AssemblyResolve event.

As with a static binding redirect, the point here is to force all consumers of a dependency to use the same actual assembly. This means we need to intercept calls to load the dependency and always redirect them to the version we want.

This looks something like this:

// Register the event handler as early as you can in your code.
// A good option is to use the IModuleAssemblyInitializer interface
// that PowerShell provides to run code early on when your module is loaded.

// This class will be instantiated on module import and the OnImport() method run.
// Make sure that:
//  - the class is public
//  - the class has a public, parameterless constructor
//  - the class implements IModuleAssemblyInitializer
public class MyModuleInitializer : IModuleAssemblyInitializer
{
    public void OnImport()
    {
        AppDomain.CurrentDomain.AssemblyResolve += DependencyResolution.ResolveNewtonsoftJson;
    }
}

// Clean up the event handler when the module is removed
// to prevent memory leaks.
//
// Like IModuleAssemblyInitializer, IModuleAssemblyCleanup allows
// you to register code to run when a module is removed (with Remove-Module).
// Make sure it is also public with a public parameterless constructor
// and implements IModuleAssemblyCleanup.
public class MyModuleCleanup : IModuleAssemblyCleanup
{
    public void OnRemove(PSModuleInfo psModuleInfo)
    {
        AppDomain.CurrentDomain.AssemblyResolve -= DependencyResolution.ResolveNewtonsoftJson;
    }
}

internal static class DependencyResolution
{
    private static readonly string s_modulePath = Path.GetDirectoryName(
        Assembly.GetExecutingAssembly().Location);

    public static Assembly ResolveNewtonsoftJson(object sender, ResolveEventArgs args)
    {
        // Parse the assembly name
        var assemblyName = new AssemblyName(args.Name);

        // We only want to handle the dependency we care about.
        // In this example it's Newtonsoft.Json.
        if (!assemblyName.Name.Equals("Newtonsoft.Json"))
        {
            return null;
        }

        // Generally the version of the dependency you want to load is the higher one,
        // since it's the most likely to be compatible with all dependent assemblies.
        // The logic here assumes our module always has the version we want to load.
        // Also note the use of Assembly.LoadFrom() here rather than Assembly.LoadFile().
        return Assembly.LoadFrom(Path.Combine(s_modulePath, "Newtonsoft.Json.dll"));
    }
}

Note that you can technically register a scriptblock within PowerShell to handle an AssemblyResolve event, but:

  • An AssemblyResolve event may be triggered on any thread, something that PowerShell will be unable to handle.
  • Even when events are handled on the right thread, PowerShell’s scoping concepts mean that the event handler scriptblock must be written carefully to not depend on variables defined outside it.

There are situations where redirecting to a common version will not work:

  • When the dependency has made breaking changes between versions, or when dependent code relies on a functionality otherwise not available in the version you redirect to
  • When your module isn’t loaded before the conflicting dependency. If you register an AssemblyResolve event handler after the dependency has been loaded, it will never be fired. If you’re trying to prevent dependency conflicts with another module, this may be an issue, since the other module may be loaded first.

Use the dependency out of process

This solution is more for module users than module authors, but is a possible approach when confronted with a module that won’t work due to an existing dependency conflict.

Dependency conflicts occur because two versions of the same assembly are loaded into the same .NET process. So a simple solution is to load them into different processes, as long as you can still use the functionality from both together.

In PowerShell, there are several ways to achieve this.

Invoke PowerShell as a subprocess

A simple way to run a PowerShell command out of the current process is to just start a new PowerShell process directly with the command call:

pwsh -c 'Invoke-ConflictingCommand'

The main limitation here is that the child process returns its output as text or deserialized objects rather than live .NET objects, so reconstructing a structured result can be trickier or more error prone than with other options.
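If you do need structured data back from the subprocess, one workaround (a sketch, still using the hypothetical Invoke-ConflictingCommand from above) is to serialize on the way out and deserialize on the way in, for example with JSON:

```powershell
# Serialize the result to JSON inside the child process...
$json = pwsh -c 'Invoke-ConflictingCommand | ConvertTo-Json -Depth 5'

# ...then rehydrate it locally
$result = $json | ConvertFrom-Json
```

Note that the rehydrated result is a PSCustomObject with the original properties, not an instance of the original .NET type.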

The PowerShell job system

The PowerShell job system also runs commands out of process, by sending commands to a new PowerShell process and returning the results:

$result = Start-Job { Invoke-ConflictingCommand } | Receive-Job -Wait

In this case, you just need to be sure that any variables and state are passed in correctly.

The job system can also be slightly cumbersome when running small commands.
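One point worth showing is that local variables are not visible inside the job’s scriptblock by default; they can be passed through with the $using: scope modifier. A sketch, again using the hypothetical command and an illustrative -Path parameter:

```powershell
# Local state must be passed into the job explicitly
$inputPath = 'C:\data\input.csv'

$result = Start-Job {
    # $using: captures the variable from the calling session
    Invoke-ConflictingCommand -Path $using:inputPath
} | Receive-Job -Wait -AutoRemoveJob
```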

PowerShell remoting

When it’s available, PowerShell remoting can be a very ergonomic way to run commands out of process. With remoting you can create a fresh PSSession in a new process, call its commands over PowerShell remoting and then use the results locally with, for example, the other module with the conflicting dependencies.

An example might look like this:

# Create a local PowerShell session
# where the module with conflicting assemblies will be loaded
$s = New-PSSession

# Import the module with the conflicting dependency via remoting,
# exposing the commands locally
Import-Module -PSSession $s -Name ConflictingModule

# Run a command from the module with the conflicting dependencies
Invoke-ConflictingCommand

# Remove the session once it's no longer needed
Remove-PSSession $s

Implicit remoting to Windows PowerShell

Another option in PowerShell 7 is to use the -UseWindowsPowerShell flag on Import-Module. This will import the module through a local remoting session into Windows PowerShell:

Import-Module -Name ConflictingModule -UseWindowsPowerShell

Be aware, of course, that some modules may not work with Windows PowerShell, or may behave differently there.

When not to use out-of-process invocation

As a module author, out-of-process command invocation is harder to bake into a module, and may have edge cases that cause issues. In particular, remoting and jobs may not be available in all environments where your module needs to work. However, the general principle of moving the implementation out of process and allowing the PowerShell module to be a thinner client, may still be applicable.

Of course, for module users too there will be cases where out-of-process invocation won’t work:

  • When PowerShell remoting is unavailable, for example if you don’t have privileges to use it or it is not enabled.
  • When a particular .NET type is needed from output, for example to run methods on, or as input to another command. Commands run over PowerShell remoting emit deserialized output, meaning they return psobjects rather than strongly-typed .NET objects. This means that method calls and strongly typed APIs will not work with the output of commands imported over remoting.
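The second point can be seen directly by comparing an object produced locally with one that has crossed a remoting boundary (a sketch; Get-Process is used here just as an example of a command that returns method-bearing objects):

```powershell
$s = New-PSSession

# Locally, Get-Process returns live System.Diagnostics.Process objects
(Get-Process -Id $PID).pstypenames[0]
# System.Diagnostics.Process

# Over remoting, the same command returns a deserialized property snapshot;
# properties survive, but methods like Kill() are no longer available
(Invoke-Command -Session $s { Get-Process -Id $PID }).pstypenames[0]
# Deserialized.System.Diagnostics.Process

Remove-PSSession $s
```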

More robust solutions

The solutions above all share the limitation that there are scenarios and modules for which they won’t work. However, they also have the virtue of being relatively simple to implement correctly.

These next solutions we discuss are generally more robust, but also take somewhat more work to implement correctly and can introduce subtle bugs if not written carefully.

Loading through .NET Core Assembly Load Contexts

Assembly Load Contexts (ALCs) as an implementable API were introduced in .NET Core 1.0 to specifically address the need to load multiple versions of the same assembly into the same runtime.

Within .NET Core and .NET 5, they offer what is far and away the most robust solution to the problem of needing to load conflicting versions of an assembly. However, custom ALCs are not available in .NET Framework. This means that this solution will only work in PowerShell 6 and above.

Currently, the best example of using an ALC for dependency isolation in PowerShell is in PowerShell Editor Services (the language server for the PowerShell extension for Visual Studio Code), where an ALC is used to prevent PowerShell Editor Services’ own dependencies from clashing with those in PowerShell modules.

Implementing module dependency isolation with an ALC is conceptually difficult, but we will work through a minimal example here.

Imagine we have a simple module, only intended to work in PowerShell 7, whose source is laid out like this:

+ AlcModule.psd1
+ src/
    + TestAlcModuleCommand.cs
    + AlcModule.csproj

The cmdlet implementation looks like this:

using Shared.Dependency;

namespace AlcModule
{
    [Cmdlet(VerbsDiagnostic.Test, "AlcModule")]
    public class TestAlcModuleCommand : Cmdlet
    {
        protected override void EndProcessing()
        {
            // Here's where our dependency gets used
            Dependency.Use();
            // Something trivial to make our cmdlet do *something*
            WriteObject("done!");
        }
    }
}

The (heavily simplified) manifest, looks like this:

@{
    Author = 'Me'
    ModuleVersion = '0.0.1'
    RootModule = 'AlcModule.dll'
    CmdletsToExport = @('Test-AlcModule')
    PowerShellVersion = '7.0'
}

And the csproj looks like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Shared.Dependency" Version="1.0.0" />
    <PackageReference Include="Microsoft.PowerShell.Sdk" Version="7.0.1" PrivateAssets="all" />
  </ItemGroup>
</Project>

When we build this module, the generated output has the following layout:

AlcModule/
  + AlcModule.psd1
  + AlcModule.dll
  + Shared.Dependency.dll

In this example, our problem lies in the Shared.Dependency.dll assembly, which is our imaginary conflicting dependency. This is the dependency that we need to put behind an ALC so that we can use the module-specific version.

We seek to re-engineer the module so that:

  • Module dependencies are only ever loaded into our custom ALC, and not into PowerShell’s ALC, so there can be no conflict. Moreover, as we add more dependencies to our project, we don’t want to continuously add more code to keep loading working; instead we want reusable, generic dependency resolution logic.
  • Loading the module will still work as normal in PowerShell, meaning cmdlets and other types the PowerShell module system needs to see will be defined within PowerShell’s own ALC.

To mediate these two requirements, we must break our module up into two assemblies:

  • A cmdlets assembly, AlcModule.Cmdlets.dll, which contains the definitions of all the types PowerShell’s module system needs to load the module correctly: any implementations of the Cmdlet base class, and our IModuleAssemblyInitializer implementation, which sets up the event handler for AssemblyLoadContext.Default.Resolving so that AlcModule.Engine.dll is loaded through our custom ALC. Any types that are meant to be publicly exposed to PowerShell must also be defined here, since PowerShell 7 deliberately hides types defined in assemblies loaded in other ALCs. Finally, this assembly is also where our custom ALC definition lives. Beyond that, as little code as possible should live in it.
  • An engine assembly, AlcModule.Engine.dll, which handles all the actual implementation of the module. Types from this will still be available in the PowerShell ALC, but it will initially be loaded through our custom ALC, and its dependencies will only ever be loaded into the custom ALC. Effectively, this becomes a “bridge” between the two ALCs.

Using this bridge concept, our new assembly situation effectively will look like this:

Diagram representing AlcModule.Engine.dll bridging the two ALCs

To make sure the default ALC’s dependency probing logic will not resolve the dependencies to be loaded into the custom ALC, we will need to separate these two parts of the module in different directories. So our new module layout will look like this:

AlcModule/
  AlcModule.Cmdlets.dll
  AlcModule.psd1
  Dependencies/
  | + AlcModule.Engine.dll
  | + Shared.Dependency.dll

To see how our implementation now changes, we’ll start with the implementation of AlcModule.Engine.dll:

using Shared.Dependency;

namespace AlcModule.Engine
{
    public class AlcEngine
    {
        public static void Use()
        {
            Dependency.Use();
        }
    }
}

This is a fairly straightforward container for the dependency, Shared.Dependency.dll, but you should think of it as the .NET API for your functionality, which the cmdlets living in the other assembly will wrap for PowerShell.

The cmdlet in AlcModule.Cmdlets.dll now looks like this:

// Reference our module's Engine implementation here
using AlcModule.Engine;

namespace AlcModule.Cmdlets
{
    [Cmdlet(VerbsDiagnostic.Test, "AlcModule")]
    public class TestAlcModuleCommand : Cmdlet
    {
        protected override void EndProcessing()
        {
            AlcEngine.Use();
            WriteObject("done!");
        }
    }
}

At this point, if we were to load AlcModule and run Test-AlcModule, we would hit a FileNotFoundException when the default ALC tries to load AlcModule.Engine.dll to run EndProcessing(). This is good, since it means the default ALC can’t find the dependencies we want to hide.

So now we need to add the magic to AlcModule.Cmdlets.dll that helps it know how to resolve AlcModule.Engine.dll. First we must define our custom ALC that will resolve assemblies from our module’s Dependencies directory:

namespace AlcModule.Cmdlets
{
    internal class AlcModuleAssemblyLoadContext : AssemblyLoadContext
    {
        private readonly string _dependencyDirPath;

        public AlcModuleAssemblyLoadContext(string dependencyDirPath)
        {
            _dependencyDirPath = dependencyDirPath;
        }

        protected override Assembly Load(AssemblyName assemblyName)
        {
            // We do the simple logic here of
            // looking for an assembly of the given name
            // in the configured dependency directory
            string assemblyPath = Path.Combine(
                _dependencyDirPath,
                $"{assemblyName.Name}.dll");

            if (File.Exists(assemblyPath))
            {
                // The ALC must use inherited methods to load assemblies;
                // Assembly.Load*() won't work here
                return LoadFromAssemblyPath(assemblyPath);
            }

            // Returning null defers resolution to the default ALC
            return null;
        }
    }
}

Then, we need to hook our custom ALC up to the default ALC’s Resolving event (which is the ALC version of the AssemblyResolve event on Application Domains) that will be fired when EndProcessing() is called to try and find AlcModule.Engine.dll:

namespace AlcModule.Cmdlets
{
    public class AlcModuleResolveEventHandler : IModuleAssemblyInitializer, IModuleAssemblyCleanup
    {
        // Get the path of the dependency directory.
        // In this case we find it relative to the AlcModule.Cmdlets.dll location
        private static readonly string s_dependencyDirPath = Path.GetFullPath(
            Path.Combine(
                Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
                "Dependencies"));

        private static readonly AlcModuleAssemblyLoadContext s_dependencyAlc = new AlcModuleAssemblyLoadContext(s_dependencyDirPath);

        public void OnImport()
        {
            // Add the Resolving event handler here
            AssemblyLoadContext.Default.Resolving += ResolveAlcEngine;
        }

        public void OnRemove(PSModuleInfo psModuleInfo)
        {
            // Remove the Resolving event handler here
            AssemblyLoadContext.Default.Resolving -= ResolveAlcEngine;
        }

        private static Assembly ResolveAlcEngine(
            AssemblyLoadContext defaultAlc,
            AssemblyName assemblyToResolve)
        {
            // We only want to resolve the AlcModule.Engine.dll assembly here.
            // Because this will be loaded into the custom ALC,
            // all of *its* dependencies will be resolved
            // by the logic we defined for that ALC's implementation.
            //
            // Note that we are safe in our assumption that the name is enough
            // to distinguish our assembly here,
            // since it's unique to our module.
            // There should be no other AlcModule.Engine.dll on the system.
            if (!assemblyToResolve.Name.Equals("AlcModule.Engine"))
            {
                return null;
            }

            // Allow our ALC to handle the directory discovery concept
            //
            // This is where AlcModule.Engine.dll is loaded into our custom ALC
            // and then passed through into PowerShell's ALC,
            // becoming the bridge between both
            return s_dependencyAlc.LoadFromAssemblyName(assemblyToResolve);
        }
    }
}

With the new implementation laid out, we can now take a look at the sequence of calls that occurs when the module is loaded and Test-AlcModule is run:

Sequence diagram of calls using the custom ALC to load dependencies

Some points of interest are:

  • The IModuleAssemblyInitializer is run first when the module loads and sets the Resolving event
  • We don’t even load the dependencies until Test-AlcModule is run and its EndProcessing() method is called
  • When EndProcessing() is called, the default ALC does not find AlcModule.Engine.dll and fires the Resolving event
  • Our event handler here is what does the magic of hooking up the custom ALC to the default ALC, and only loads AlcModule.Engine.dll
  • Then, within AlcModule.Engine.dll, when AlcEngine.Use() is called, the custom ALC again kicks in to resolve Shared.Dependency.dll. Specifically it always loads our Shared.Dependency.dll, since it never conflicts with anything in the default ALC and only looks in our Dependencies directory.

Assembling the implementation, our new source code layout looks like this:

+ AlcModule.psd1
+ src/
  + AlcModule.Cmdlets/
  | + AlcModule.Cmdlets.csproj
  | + TestAlcModuleCommand.cs
  | + AlcModuleAssemblyLoadContext.cs
  | + AlcModuleInitializer.cs
  |
  + AlcModule.Engine/
  | + AlcModule.Engine.csproj
  | + AlcEngine.cs

AlcModule.Cmdlets.csproj looks like:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <ProjectReference Include="..\AlcModule.Engine\AlcModule.Engine.csproj" />
    <PackageReference Include="Microsoft.PowerShell.Sdk" Version="7.0.1" PrivateAssets="all" />
  </ItemGroup>
</Project>

AlcModule.Engine.csproj looks like this:

<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netcoreapp3.1</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Shared.Dependency" Version="1.0.0" />
  </ItemGroup>
</Project>

And so when we build the module, our strategy is:

  • Build AlcModule.Engine
  • Build AlcModule.Cmdlets
  • Copy everything from AlcModule.Engine into the Dependencies dir, and remember what we copied
  • Copy everything from AlcModule.Cmdlets that wasn’t in AlcModule.Engine into the base module dir

Since the module layout here is so crucial to dependency separation, here’s a build script to use from the source root:

param(
    # The .NET build configuration
    [ValidateSet('Debug', 'Release')]
    [string]
    $Configuration = 'Debug'
)

# Convenient reusable constants
$mod = "AlcModule"
$netcore = "netcoreapp3.1"
$copyExtensions = @('.dll', '.pdb')

# Source code locations
$src = "$PSScriptRoot/src"
$engineSrc = "$src/$mod.Engine"
$cmdletsSrc = "$src/$mod.Cmdlets"

# Generated output locations
$outDir = "$PSScriptRoot/out/$mod"
$outDeps = "$outDir/Dependencies"

# Build AlcModule.Engine
Push-Location $engineSrc
dotnet publish -c $Configuration
Pop-Location

# Build AlcModule.Cmdlets
Push-Location $cmdletsSrc
dotnet publish -c $Configuration
Pop-Location

# Ensure out directory exists and is clean
Remove-Item -Path $outDir -Recurse -ErrorAction Ignore
New-Item -Path $outDir -ItemType Directory
New-Item -Path $outDeps -ItemType Directory

# Copy manifest
Copy-Item -Path "$PSScriptRoot/$mod.psd1" -Destination $outDir

# Copy each Engine asset and remember it
$deps = [System.Collections.Generic.HashSet[string]]::new()
Get-ChildItem -Path "$engineSrc/bin/$Configuration/$netcore/publish/" |
    Where-Object { $_.Extension -in $copyExtensions } |
    ForEach-Object { [void]$deps.Add($_.Name); Copy-Item -Path $_.FullName -Destination $outDeps }

# Now copy each Cmdlets asset, not taking any found in Engine
Get-ChildItem -Path "$cmdletsSrc/bin/$Configuration/$netcore/publish/" |
    Where-Object { -not $deps.Contains($_.Name) -and $_.Extension -in $copyExtensions } |
    ForEach-Object { Copy-Item -Path $_.FullName -Destination $outDir }

So finally, we have a general way to use an Assembly Load Context to isolate our module’s dependencies that remains robust over time and as more dependencies are added.

Hopefully this example is informative, but naturally a fuller example would be more helpful. For that, go to this GitHub repository I created to give a full demonstration of how to migrate a module to use an ALC, while keeping that module working in .NET Framework and also using .NET Standard and PowerShell Standard to simplify the core implementation.

Assembly resolve handler for side-by-side loading with Assembly.LoadFile()

Another, possibly simpler, way to achieve side-by-side assembly loading is to hook up an AssemblyResolve event to Assembly.LoadFile(). Using Assembly.LoadFile() has the advantage of being the simplest solution to implement and working with both .NET Core and .NET Framework.

To show this, let’s take a look at a quick example of an implementation that combines ideas from our dynamic binding redirect example, and from the Assembly Load Context example above.

We’ll call this module LoadFileModule, which has a trivial command Test-LoadFile [-Path] <path>. This module takes a dependency on the CsvHelper assembly (version 15.0.5).

As with the ALC module, we must first split up the module into two pieces. First the part that does the actual implementation, LoadFileModule.Engine:

using CsvHelper;

namespace LoadFileModule.Engine
{
    public class Runner
    {
        public void Use(string path)
        {
            // Here's where we use the CsvHelper dependency
            using (var reader = new StreamReader(path))
            using (var csvReader = new CsvReader(reader, CultureInfo.InvariantCulture))
            {
                // Imagine we do something useful here...
            }
        }
    }
}

And then the part that PowerShell will load directly, LoadFileModule.Cmdlets. We use a very similar strategy as with the ALC module, where we isolate dependencies in a separate directory. But this time, the resolve event must load all of the assemblies in that directory, rather than just LoadFileModule.Engine.dll.

using LoadFileModule.Engine;

namespace LoadFileModule.Cmdlets
{
    // Actual cmdlet definition
    [Cmdlet(VerbsDiagnostic.Test, "LoadFile")]
    public class TestLoadFileCommand : Cmdlet
    {
        [Parameter(Mandatory = true)]
        public string Path { get; set; }

        protected override void EndProcessing()
        {
            // Here's where we use LoadFileModule.Engine
            new Runner().Use(Path);
        }
    }

    // This class controls our dependency resolution
    public class ModuleContextHandler : IModuleAssemblyInitializer, IModuleAssemblyCleanup
    {
        // We catalog our dependencies here to ensure we don't load anything else
        private static IReadOnlyDictionary<string, int> s_dependencies = new Dictionary<string, int>
        {
            { "CsvHelper", 15 },
        };

        // Set up the path to our dependency directory within the module
        private static string s_dependenciesDirPath = Path.GetFullPath(
            Path.Combine(
                Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
                "Dependencies"));

        // This makes sure we only try to resolve dependencies when the module is loaded
        private static bool s_engineLoaded = false;

        public void OnImport()
        {
            // Set up our event when the module is loaded
            AppDomain.CurrentDomain.AssemblyResolve += HandleResolveEvent;
        }

        public void OnRemove(PSModuleInfo psModuleInfo)
        {
            // Unset the event when the module is unloaded
            AppDomain.CurrentDomain.AssemblyResolve -= HandleResolveEvent;
        }

        private static Assembly HandleResolveEvent(object sender, ResolveEventArgs args)
        {
            var asmName = new AssemblyName(args.Name);

            // First we want to know if this is the top-level assembly
            if (asmName.Name.Equals("LoadFileModule.Engine"))
            {
                s_engineLoaded = true;
                return Assembly.LoadFile(Path.Combine(s_dependenciesDirPath, "LoadFileModule.Engine.dll"));
            }

            // Otherwise, if that assembly has been loaded, we must try to resolve its dependencies too
            if (s_engineLoaded
                && s_dependencies.TryGetValue(asmName.Name, out int requiredMajorVersion)
                && asmName.Version.Major == requiredMajorVersion)
            {
                string asmPath = Path.Combine(s_dependenciesDirPath, $"{asmName.Name}.dll");
                return Assembly.LoadFile(asmPath);
            }

            return null;
        }
    }
}

Finally, the layout of the module is also similar to the ALC module:

LoadFileModule/
  + LoadFileModule.Cmdlets.dll
  + LoadFileModule.psd1
  + Dependencies/
  |  + LoadFileModule.Engine.dll
  |  + CsvHelper.dll

With this structure in place, LoadFileModule now supports being loaded alongside other modules with a dependency on CsvHelper.

Because our handler will apply to all AssemblyResolve events across the Application Domain, we’ve needed to make some specific design choices here:

  • We only start handling general dependency loading after LoadFileModule.Engine.dll has been loaded, to narrow the window in which our event may interfere with other loading.
  • We push LoadFileModule.Engine.dll out into the Dependencies directory, so that it’s loaded by a LoadFile() call rather than by PowerShell. This means its own dependency loads will always raise an AssemblyResolve event, even if another CsvHelper.dll (for example) is loaded in PowerShell, meaning we have an opportunity to find the correct dependency.
  • We are forced to code a dictionary of dependency names and versions into our handler, since we must try only to resolve our specific dependencies for our module. In our particular case, it would be possible to use the ResolveEventArgs.RequestingAssembly property to work out whether CsvHelper is being requested by our module, but this wouldn’t work for dependencies of dependencies (for example if CsvHelper itself raised an AssemblyResolve event). There are other, harder ways to do this, such as comparing requested assemblies to the metadata of assemblies in the Dependencies directory, but here we’ve made this checking more explicit both to highlight the issue and to simplify the example.

Essentially this is the key difference between the LoadFile() strategy and the ALC strategy: the AssemblyResolve event must service all loads in the Application Domain, making it much harder to keep our dependency resolution from being tangled with other modules. However, the lack of ALCs in .NET Framework makes this option one worth understanding (even just for .NET Framework, while using an ALC in .NET Core).

Custom Application Domains

A final (and by some measures, extreme) option for assembly isolation is the use of custom Application Domains. Application Domains are only available in .NET Framework, and provide in-process isolation between parts of a .NET application. One use for them is to isolate assembly loads from each other within the same process.

However, Application Domains are serialization boundaries, meaning that objects in one Application Domain cannot be referenced and used directly by objects in another Application Domain. This can be worked around by implementing MarshalByRefObject, but when you don’t control the types (as is often the case with dependencies), it’s not possible to force an implementation here, meaning that the only solution is to make large architectural changes. The serialization boundary also has serious performance implications.

Because Application Domains have this serious limitation, are complicated to implement, and only work in .NET Framework, I won’t give an example of how you might use them here. While they’re worth mentioning as a possibility, they aren’t an investment I would recommend.


Solutions for dependency conflicts that don’t work for PowerShell

Finally, I should address some possibilities that come up when researching .NET dependency conflicts, which can look promising but generally won’t work for PowerShell.

These solutions have the common theme that they are changes in deployment configurations in an environment where you control the application and possibly the entire machine. This is because they are oriented toward scenarios like web servers and other applications deployed to server environments, where the environment is intended to host the application and is free to be configured by the deploying user.

They also tend to be very much .NET Framework focused, meaning they will not work with PowerShell 6 or above.

Note that if you know that your module will only be used in environments you have total control over, and only with Windows PowerShell 5.1 and below, some of these may be options.

In general, however, modules should not modify global machine state like this; doing so can break configurations and cause powershell.exe, other modules, or other dependent applications to stop working, or cause your module itself to fail in unexpected ways.

Static binding redirect with app.config to force using the same dependency version

.NET Framework applications can take advantage of an app.config file to configure some of the application’s behaviours declaratively. It’s possible to write an app.config entry that configures assembly binding to redirect assembly loading to a particular version.
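For reference, such a redirect looks like the following in app.config (shown only for context; the Newtonsoft.Json identity here is an illustrative example):

```xml
<configuration>
  <runtime>
    <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
      <dependentAssembly>
        <assemblyIdentity name="Newtonsoft.Json" publicKeyToken="30ad4fe6b2a6aeed" culture="neutral" />
        <!-- Redirect any requested version in the range to a single version -->
        <bindingRedirect oldVersion="0.0.0.0-12.0.0.0" newVersion="12.0.0.0" />
      </dependentAssembly>
    </assemblyBinding>
  </runtime>
</configuration>
```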

Two issues with this for PowerShell are:

  • .NET Core does not support app.config, so this is a powershell.exe-only solution.
  • powershell.exe is a shared application that lives under the System32 directory. This means it’s likely your module won’t be able to modify its contents on many systems, and even if it can, modifying the app.config could break an existing configuration or affect the loading of other modules.

Setting codebase with app.config

For the same reasons as with setting binding redirects in app.config, trying to configure the app.config codebase setting is generally not going to work in PowerShell modules.

Installing your dependencies to the Global Assembly Cache (GAC)

Another way to resolve dependency version conflicts in .NET Framework is to install dependencies to the GAC, so that different versions can be loaded side-by-side from the GAC.

Again for PowerShell modules, the chief issues here are:

  • The GAC only applies to .NET Framework, so this will not help in PowerShell 6 and above.
  • Installing assemblies to the GAC is a modification of global machine state and may cause side-effects in other applications or to other modules. It may also be difficult to do correctly, even when your module has the required access privileges (and getting it wrong could cause serious issues in other .NET applications machine-wide).

Solving the issue in PowerShell itself…

After reading through all of this and seeing the complexity not just of implementing an isolation solution, but making it work with the PowerShell module system, you may wonder why PowerShell hasn’t put a solution to this problem into the module system yet.

While it’s not something we’re planning to implement imminently, the facility of Assembly Load Contexts available in .NET 5 makes this something worth considering long term.

However, this would represent a large change in the module system, which is a very critical (and already complex) component of PowerShell. In addition, the diversity of PowerShell modules and module scenarios presents a serious challenge in terms of PowerShell correctly isolating modules consistently and without creating edge cases.

In particular, dependencies will be exposed if they are part of a module’s API. For example, if a PowerShell module converts YAML strings to objects and uses YamlDotNet to return objects of type YamlDotNet.YamlDocument, then its dependencies are exposed to the global context.

This can lead to type identity issues when instances of the exposed type are used with APIs expecting the same type (but from a different assembly); specifically you may see confusing messages like “Type YamlDotNet.YamlDocument cannot be cast to type YamlDotNet.YamlDocument“, because even though the two types have the same name, .NET regards them as coming from different assemblies and therefore being different.

In order to safely isolate a dependency used in the module API, API types need to either be a type already found in PowerShell and its public dependencies, like PSObject or System.Xml.XmlDocument, or a new type defined by the module to be an intermediary between the PowerShell context and the dependency context.

It’s likely that any module isolation strategy could break some module, and that module authors will need, to some extent, to understand the implications of dependency isolation and be responsible for deciding whether their module can have its dependencies isolated.

So with that in mind, I encourage you to contribute to the discussion on GitHub. Also take a look at the RFC proposed to address the issue.

Further reading

There’s plenty more to read on the topic of .NET assembly version dependency conflicts. Here are some nice jumping off points:

Final notes

In this blog post, we looked over various ways to solve the issue of having module dependency conflicts in PowerShell, identifying strategies that won’t work, simple strategies that sometimes work, and more complex strategies that are more robust.

In particular we looked at how to implement an Assembly Load Context in .NET Core to make module dependency isolation much easier in PowerShell 6 and up.

This is fairly complicated subject matter, so don’t worry if it doesn’t click immediately. Feel free to read the material here and linked, experiment with implementations (try stepping through it in a debugger), and also get in touch with us on GitHub or Twitter.

Cheers!

Rob Holt

Software Engineer

PowerShell Team

The post Resolving PowerShell Module Assembly Dependency Conflicts appeared first on PowerShell.

PowerShellGet 3.0 Preview 3

This post was originally published on this site

PowerShellGet 3.0 preview 3 is now available on the PowerShell Gallery. The focus of this release is the -RequiredResource parameter for Install-PSResource, which now accepts a hashtable, a JSON string, or a path to a .json file as input. For a full list of the issues addressed by this release please refer to this GitHub project.
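
As a sketch of the new input shapes (the resource names and key names below are illustrative; the exact schema comes from the preview design and may still change before GA):

```powershell
# Declare required resources as a hashtable: each key is a resource name,
# each value describes the version range and source repository.
# The 'Configuration' and 'Pester' entries are examples only.
Install-PSResource -RequiredResource @{
    Configuration = @{ version = '[1.3.1,2.0]';   repository = 'PSGallery' }
    Pester        = @{ version = '[4.4.2,4.7.0]'; repository = 'PSGallery' }
}

# The same declaration can live in a .json file and be passed by path:
Install-PSResource -RequiredResourceFile ./requiredResources.json
```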

Note: For background information on PowerShellGet 3.0 please refer to the blog post for the first preview release and the second preview release.

How to install the module

To install this version of PowerShellGet open any PowerShell console and run:

Install-Module -Name PowerShellGet -AllowPrerelease -Force

New Features of this release

New Features

  • RequiredResource parameter for Install-PSResource
  • RequiredResourceFile parameter for Install-PSResource
  • IncludeXML parameter in Save-PSResource

Bug Fix

  • Resolved paths in Install-PSResource and Save-PSResource

What is next

GitHub is the best place to track bugs and feature requests for this module. We use a combination of projects and labels on our GitHub repo to track issues for upcoming releases. The label Resolved-3.0 marks issues whose fixes we plan to ship before the module becomes generally available (GA). To track issues/features for particular preview and GA releases we are using GitHub projects which are titled to reflect the release.

The focus feature for our next preview release (preview 4) is publishing functionality.

How to Give feedback and Get Support

We cannot overstate how critical user feedback is at this stage in the development of the module. Feedback from preview releases helps inform design decisions without incurring breaking changes once the module is generally available and used in production. To help us make key decisions around the behavior of the module, please give us feedback by opening issues in our GitHub repository.

Sydney Smith, PowerShell Team

The post PowerShellGet 3.0 Preview 3 appeared first on PowerShell.

PowerShell 7 Video Series


As a part of our PowerShell 7 release, the PowerShell Team put together a series of videos explaining and demoing aspects of the release. The intent of these videos was for User Groups to host events celebrating and discussing PowerShell 7, however, in light of the current guidance against group gatherings, we have decided to make these videos available publicly on the PowerShell Team YouTube channel.

The content is broken down into 10 videos with a total runtime of 58 minutes. We have included links to the videos below, along with links to other related resources.

Helpful Links from the Videos

We hope you enjoy these videos! Please subscribe to the PowerShell Team YouTube channel for more video content from the PowerShell Team.


The post PowerShell 7 Video Series appeared first on PowerShell.

PowerShellGet 3.0 Preview 2


PowerShellGet 3.0 preview 2 is now available on the PowerShell Gallery.
The focus of this release is the Install-PSResource parameter, error messages, and warnings.
For a full list of the issues addressed by this release please refer to this GitHub project.

Note: For background information on PowerShellGet 3.0 please refer to the blog post for the first preview release.

How to install the module

To install this version of PowerShellGet open any PowerShell console and run:
Install-Module -Name PowerShellGet -AllowPrerelease -Force -AllowClobber

New Features of this release

  • Progress bar and -Quiet parameter for Install-PSResource
  • TrustRepository parameter for Install-PSResource
  • NoClobber parameter for Install-PSResource
  • AcceptLicense for Install-PSResource
  • Force parameter for Install-PSResource
  • Reinstall parameter for Install-PSResource
  • Improved error handling

What is next

GitHub is the best place to track bugs and feature requests for this module. We use a combination of projects and labels on our GitHub repo to track issues for upcoming releases. The label Resolved-3.0 marks issues whose fixes we plan to ship before the module becomes generally available (GA). To track issues/features for particular preview and GA releases we are using GitHub projects which are titled to reflect the release.

The major feature for our next preview release (preview 3) is allowing Install-PSResource to accept a path to a .psd1 or .json file (using -RequiredResourceFile), or a hashtable or JSON string as input (using -RequiredResource).

How to Give feedback and Get Support

We cannot overstate how critical user feedback is at this stage in the development of the module. Feedback from preview releases helps inform design decisions without incurring breaking changes once the module is generally available and used in production. To help us make key decisions around the behavior of the module, please give us feedback by opening issues in our GitHub repository.

Sydney Smith
PowerShell Team

The post PowerShellGet 3.0 Preview 2 appeared first on PowerShell.

PowerShell Gallery TLS Support


Summary

To provide the best-in-class encryption to our customers, the PowerShell Gallery has deprecated Transport Layer Security (TLS) versions 1.0 and 1.1 as of April 2020.

The Microsoft TLS 1.0 implementation has no known security vulnerabilities. But because of the potential for future protocol downgrade attacks and other TLS vulnerabilities, we are discontinuing support for TLS 1.0 and 1.1 in the PowerShell Gallery.

For information about how to remove TLS 1.0 and 1.1 dependencies, see the whitepaper Solving the TLS 1.0 problem.

More information

As of April 2020, TLS 1.2 is set to be the default for the PowerShell Gallery.

Please note that TLS 1.0 and 1.1 were already unsupported; what has changed is that the deprecation has now taken effect, and the PowerShell Gallery will no longer accept any connections using TLS 1.0 or 1.1.

We recommend that all client-server combinations use TLS 1.2 (or a later version) to maintain connection to the PowerShell Gallery.

Workaround

In your PowerShell session run:

[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12

Note: This will not permanently change your TLS configuration, but it will allow you to interact with the PowerShell Gallery for the duration of this session.
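
If you would rather not retype the workaround each session, one option (a sketch, not official guidance) is to append the same line to your PowerShell profile so it runs at startup:

```powershell
# Ensure the profile file exists, then add the TLS 1.2 workaround to it
# so every new Windows PowerShell session applies it automatically.
if (-not (Test-Path -Path $PROFILE)) {
    New-Item -Path $PROFILE -ItemType File -Force | Out-Null
}
Add-Content -Path $PROFILE -Value '[Net.ServicePointManager]::SecurityProtocol = [Net.SecurityProtocolType]::Tls12'
```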

Errors I Might See

If you are running an older version of TLS and try to interact with the PowerShell Gallery you may see error messages like:

Publishing

+ ...             Publish-PSArtifactUtility @PublishPSArtifactUtility_Param ...
+                 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : InvalidOperation: (:) [Write-Error], WriteErrorException
+ FullyQualifiedErrorId : FailedToCreateCompressedModule,Publish-PSArtifactUtility

Installing

+ ...            $null = PackageManagement\Install-Package @PSBoundParameters
+                        ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+ CategoryInfo          : ResourceUnavailable: (C:\Users\T-Ncho...anagement.nupkg:String) [Install-Package], Exception
+ FullyQualifiedErrorId : PackageFailedInstallOrDownload,Microsoft.PowerShell.PackageManagement.Cmdlets.InstallPackage

Concerns and Support

Please open an issue in our GitHub repository or contact our gallery support channel through cgadmin@microsoft.com if you have any concerns, challenges, or are unable to upgrade to TLS 1.2 or above.


Sydney Smith

PowerShell Team


The post PowerShell Gallery TLS Support appeared first on PowerShell.

Depending on the right PowerShell NuGet package in your .NET project


Alongside the pwsh executable packages published with each PowerShell release, the PowerShell team also maintains several NuGet packages that allow targeting PowerShell as an API platform in .NET.

As a .NET application that both provides APIs and expects to load .NET libraries implementing its own (binary modules), it’s essential that PowerShell be available in the form of a NuGet package.

Currently there are several NuGet packages that provide some representation of the PowerShell API surface area, and it hasn’t always been clear which to use for a particular project. This blog post sheds some light on a few common scenarios for PowerShell-targeting .NET projects and how to choose the right NuGet package for each.

Hosting vs referencing

Some .NET projects seek to write code to be loaded into a pre-existing PowerShell runtime (such as pwsh, powershell.exe, the PowerShell Integrated Console or the ISE), while others want to run PowerShell in their own applications.

  • Referencing is for when a project, usually a module, is intended to be loaded into PowerShell. It must be compiled against the APIs that PowerShell provides in order to interact with it, but the PowerShell implementation is supplied by the PowerShell process loading it in. For referencing, a project can use reference assemblies or the actual runtime assemblies as a compilation target, but must ensure that it does not publish any of these with its build.
  • Hosting is when a project needs its own implementation of PowerShell, usually because it is a standalone application that needs to run PowerShell. In this case, pure reference assemblies cannot be used, and instead a concrete PowerShell implementation must be depended upon. Because a concrete PowerShell implementation must be used, a specific version of PowerShell must be chosen for hosting; a single host application cannot multi-target PowerShell versions.

Publishing projects that target PowerShell as a reference

NOTE: We use the term publish in this blog post to refer to running dotnet publish, which places a .NET library into a directory with all of its dependencies, ready for deployment to a particular runtime.

In order to prevent publishing project dependencies that are just being used as compilation reference targets, it is recommended to set the PrivateAssets attribute:

<PackageReference Include="PowerShellStandard.Library" Version="5.1.0.0" PrivateAssets="all" />

If you forget to do this and use a reference assembly as your target, you may see issues related to using the reference assembly’s default implementation instead of the actual implementation. This may take the form of NullReferenceExceptions, since reference assemblies often mock the implementation API by simply returning null.

Key kinds of PowerShell-targeting .NET projects

While any .NET library or application can embed PowerShell, there are some common scenarios that use PowerShell APIs:

  • Implementing a PowerShell binary module PowerShell binary modules are .NET libraries loaded by PowerShell that must implement PowerShell APIs like the PSCmdlet or CmdletProvider types in order to expose cmdlets or providers respectively. Because they are loaded in, modules seek to compile against references to PowerShell without publishing it in their build. It’s also common for modules to want to support multiple PowerShell versions and platforms, ideally with a minimum of overhead of disk space, complexity or repeated implementation. (See about_Modules for more information about modules.)
  • Implementing a PowerShell Host A PowerShell Host provides an interaction layer for the PowerShell runtime. It is a specific form of hosting, where a PSHost is implemented as a new user interface to PowerShell. For example, the PowerShell ConsoleHost provides a terminal user interface for PowerShell executables, while the PowerShell Editor Services Host and the ISE Host both provide an editor-integrated partially graphical user interface around PowerShell. While it’s possible to load a host onto an existing PowerShell process, it’s much more common for a host implementation to act as a standalone PowerShell implementation that redistributes the PowerShell engine.
  • Calling into PowerShell from another .NET application As with any application, PowerShell can be called as a subprocess to run workloads. However, as a .NET application, it’s also possible to invoke PowerShell in-process to get back full .NET objects for use within the calling application. This is a more general form of hosting, where the application holds its own PowerShell implementation for internal use. Examples of this might be a service or daemon running PowerShell to manage machine state or a web application that runs PowerShell on request to do something like manage cloud deployments.
  • Unit testing PowerShell modules from .NET While modules and other libraries designed to expose functionality to PowerShell should be primarily tested from PowerShell (we recommend Pester), sometimes it’s necessary to unit test APIs written for a PowerShell module from .NET. This situation involves the module code trying to target a number of PowerShell versions, while testing should run it on specific, concrete implementations.

PowerShell NuGet packages at a glance

In this blog post, we’ll cover the following NuGet packages that expose PowerShell APIs:

  • PowerShellStandard.Library, a reference assembly that enables building a single assembly that can be loaded by multiple PowerShell runtimes.
  • Microsoft.PowerShell.SDK, the way to target and rehost the whole PowerShell SDK
  • The System.Management.Automation package, the core PowerShell runtime and engine implementation, that can be useful in minimal hosted implementations and for version-specific targeting scenarios.
  • The Windows PowerShell reference assemblies, the way to target and effectively rehost Windows PowerShell (PowerShell versions 5.1 and below).

NOTE: The PowerShell NuGet package is not a .NET library package at all, but instead provides the PowerShell dotnet global tool implementation. This should not be used by any projects, since it only provides an executable.

PowerShellStandard.Library

The PowerShell Standard library is a reference assembly that captures the intersection of the APIs of PowerShell versions 7, 6 and 5.1. This provides a compile-time-checked API surface to compile .NET code against, allowing .NET projects to target PowerShell versions 7, 6 and 5.1 without risking calling an API that won’t be there.

PowerShell Standard is intended for writing PowerShell modules, or other code only intended to be run after loading it into a PowerShell process. Because it is a reference assembly, PowerShell Standard contains no implementation itself, so provides no functionality for standalone applications.

Using PowerShell Standard with different .NET runtimes

PowerShell Standard targets the .NET Standard 2.0 target runtime, which is a façade runtime designed to provide a common surface area shared by .NET Framework and .NET Core. This allows targeting a single runtime to produce a single assembly that will work with multiple PowerShell versions, but has the following consequences:

  • The PowerShell loading the module or library must be running a minimum of .NET 4.6.1; .NET 4.6 and .NET 4.5.2 do not support .NET Standard. Note that a newer Windows PowerShell version does not mean a newer .NET Framework version; Windows PowerShell 5.1 may run on .NET 4.5.2.
  • In order to work with a PowerShell running .NET Framework 4.7.1 or below, the .NET 4.6.1 NETStandard.Library implementation is required to provide the netstandard.dll and other shim assemblies in older .NET Framework versions.

PowerShell 6+ provides its own shim assemblies for type forwarding from .NET Framework 4.6.1 (and above) to .NET Core. This means that as long as a module uses only APIs that exist in .NET Core, PowerShell 6+ can load and run it when it has been built for .NET Framework 4.6.1 (the net461 runtime target).

This means that binary modules using PowerShell Standard to target multiple PowerShell versions with a single published DLL have two options:

  1. Publishing an assembly built for the net461 target runtime. This involves:
    • Publishing the project for the net461 runtime
    • Also compiling against the netstandard2.0 runtime (without using its build output) to ensure that all APIs used are also present in .NET Core.
  2. Publishing an assembly built for the netstandard2.0 target runtime. This requires:
    • Publishing the project for the netstandard2.0 runtime
    • Taking the net461 dependencies of NETStandard.Library and copying them into the project assembly’s publish location so that the assembly is correctly type-forwarded in .NET Framework.

Note that to build PowerShell modules or libraries targeting older .NET Framework versions, it may be preferable to target multiple .NET runtimes. This will publish an assembly for each target runtime, and the correct assembly will need to be loaded at module load time (for example with a small psm1 as the root module).
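
As a sketch of that multi-runtime approach, a small root psm1 can select the correct assembly at import time (the netcoreapp2.1 and net461 folder names and the MyModule name here are assumptions about your build layout):

```powershell
# MyModule.psm1 (hypothetical root module): load the assembly built
# for the runtime currently hosting PowerShell.
# $PSEdition is 'Core' on PowerShell 6+, 'Desktop' on Windows PowerShell 5.1,
# and undefined on older Windows PowerShell versions.
$runtimeFolder = if ($PSEdition -eq 'Core') { 'netcoreapp2.1' } else { 'net461' }
Import-Module -Name (Join-Path $PSScriptRoot "$runtimeFolder/MyModule.dll")
```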

Testing PowerShell Standard projects in .NET

When it comes to testing your module in .NET test runners like xUnit, remember that compile-time checks can only go so far, so you must test your module against the relevant PowerShell platforms.

To test APIs built against PowerShell Standard in .NET, you should add Microsoft.PowerShell.SDK as a testing dependency with .NET Core (with the version set to match the desired PowerShell version), and the appropriate Windows PowerShell reference assemblies with .NET Framework.

For more information on PowerShell Standard and using it to write a binary module that works in multiple PowerShell versions, see this blog post. Also see the PowerShell Standard GitHub repository.

Microsoft.PowerShell.SDK

Microsoft.PowerShell.SDK is a meta-package that pulls together all of the components of the PowerShell SDK into a single NuGet package. A self-contained .NET application can use Microsoft.PowerShell.SDK to run arbitrary PowerShell functionality without depending on any external PowerShell installations or libraries.

NOTE: The PowerShell SDK just refers to all the component packages that make up PowerShell, and which can be used for .NET development with PowerShell.

A given Microsoft.PowerShell.SDK version contains the concrete implementation of the same version of the PowerShell application; version 7.0 contains the implementation of PowerShell 7.0 and running commands or scripts with it will largely behave like running them in PowerShell 7.0.

Running PowerShell commands from the SDK is mostly, but not totally, the same as running them from pwsh; for example, Start-Job currently depends on the pwsh executable being available, and so will not work with Microsoft.PowerShell.SDK by default.

Targeting Microsoft.PowerShell.SDK from a .NET application allows you to integrate with all of PowerShell’s implementation assemblies, such as System.Management.Automation, Microsoft.PowerShell.Management and other module assemblies.

Publishing an application targeting Microsoft.PowerShell.SDK will include all these assemblies, along with any dependencies PowerShell requires, as well as other assets that PowerShell requires in its build such as the module manifests for Microsoft.PowerShell.* modules and the ref directory required by Add-Type.

Given the completeness of Microsoft.PowerShell.SDK, it’s best suited for:

  • Implementation of PowerShell hosts.
  • xUnit testing of libraries targeting PowerShell reference assemblies.
  • Invoking PowerShell in-process from a .NET application.

Microsoft.PowerShell.SDK may also be used as a reference target when a .NET project is intended to be used as a module or otherwise loaded by PowerShell, but depends on APIs only present in a particular version of PowerShell. Note that an assembly published against a specific version of Microsoft.PowerShell.SDK will only be safe to load and use in that version of PowerShell; to target multiple PowerShell versions with specific APIs, multiple builds are required, each targeting their own version of Microsoft.PowerShell.SDK.

NOTE: The PowerShell SDK is only available for PowerShell versions 6 and up. To provide equivalent functionality with Windows PowerShell, use the Windows PowerShell reference assemblies described below.

System.Management.Automation

The System.Management.Automation package is the heart of the PowerShell SDK and exists on NuGet chiefly as an asset for Microsoft.PowerShell.SDK to pull in. However, it can also be used directly as a package for smaller hosting scenarios and version-targeting modules.

Specifically, the System.Management.Automation package may be a preferable provider of PowerShell functionality when:

  • You’re only looking to use PowerShell language functionality (in the System.Management.Automation.Language namespace) like the PowerShell parser, AST and AST visitor APIs (for example for static analysis of PowerShell).
  • You only wish to execute specific commands from the Microsoft.PowerShell.Core module and can execute them in a session state created with the CreateDefault2 factory method.
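
The first scenario can also be tried interactively, since the same System.Management.Automation assembly backs any running PowerShell; for example, parsing a script without executing it and listing the commands it invokes:

```powershell
# Parse a script into an AST, then walk the tree
# to collect every command invocation by name.
$tokens = $null
$errors = $null
$ast = [System.Management.Automation.Language.Parser]::ParseInput(
    'Get-ChildItem -Recurse | Where-Object Length -gt 1mb',
    [ref]$tokens,
    [ref]$errors)

$ast.FindAll({ param($node) $node -is [System.Management.Automation.Language.CommandAst] }, $true) |
    ForEach-Object { $_.GetCommandName() }   # Get-ChildItem, Where-Object
```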

Additionally, System.Management.Automation is a useful reference assembly when:

  • You wish to target APIs that are only present within a specific PowerShell version
  • You won’t be depending on types occurring outside the System.Management.Automation assembly (for example, types exported by cmdlets in Microsoft.PowerShell.* modules).

Windows PowerShell reference assemblies

For PowerShell versions 5.1 and older (Windows PowerShell), there is no SDK to provide an implementation of PowerShell, since Windows PowerShell’s implementation is a part of Windows.

Instead, the Windows PowerShell reference assemblies provide both reference targets and a way to rehost Windows PowerShell, acting the same as the PowerShell SDK does for versions 6 and up.

Rather than being differentiated by version, the Windows PowerShell reference assemblies ship as a separate package per Windows PowerShell version:

  • Microsoft.PowerShell.5.ReferenceAssemblies (Windows PowerShell 5.1)
  • Microsoft.PowerShell.4.ReferenceAssemblies (Windows PowerShell 4)
  • Microsoft.PowerShell.3.ReferenceAssemblies (Windows PowerShell 3)

Information on how to use the Windows PowerShell reference assemblies can be found here.

Real-world examples using these NuGet packages

Different PowerShell tooling projects target different PowerShell NuGet packages depending on their needs. Listed here are some notable examples.

PSReadLine

PSReadLine, the PowerShell module that provides much of PowerShell’s rich console experience, targets PowerShell Standard as a dependency rather than a specific PowerShell version, and targets the net461 .NET runtime in its csproj.

PowerShell 6+ supplies its own shim assemblies that allow a DLL targeting the net461 runtime to “just work” when loaded in (by redirecting binding to .NET Framework’s mscorlib.dll to the relevant .NET Core assembly).

This simplifies PSReadLine’s module layout and delivery significantly, since PowerShell Standard ensures the only APIs used will be present in both PowerShell 5.1 and PowerShell 6+, while also allowing the module to ship with only a single assembly.

The .NET 4.6.1 target does mean that Windows PowerShell running on .NET 4.5.2 and .NET 4.6 is not supported though.

PowerShell Editor Services

PowerShell Editor Services (PSES) is the backend for the PowerShell extension for Visual Studio Code, and is actually a form of PowerShell module that gets loaded by a PowerShell executable and then takes over that process to rehost PowerShell within itself while also providing Language Service Protocol and Debug Adapter features.

PSES provides concrete implementation targets for netcoreapp2.1 to target PowerShell 6+ (since PowerShell 7’s netcoreapp3.1 runtime is backwards compatible) and net461 to target Windows PowerShell 5.1, but contains most of its logic in a second assembly that targets netstandard2.0 and PowerShell Standard. This allows it to pull in dependencies required for .NET Core and .NET Framework platforms, while still simplifying most of the codebase behind a uniform abstraction.

Because it is built against PowerShell Standard, PSES requires a runtime implementation of PowerShell in order to be tested correctly. To do this, PSES’s xUnit tests pull in Microsoft.PowerShell.SDK and Microsoft.PowerShell.5.ReferenceAssemblies in order to provide a PowerShell implementation in the test environment.

As with PSReadLine, PSES cannot support .NET 4.6 and below, but it performs a check at runtime before calling any of the APIs that could cause a crash on the lower .NET Framework runtimes.

PSScriptAnalyzer

PSScriptAnalyzer, the linter for PowerShell, must target syntactic elements only introduced in certain versions of PowerShell. Because recognition of these syntactic elements is accomplished by implementing an AstVisitor2, it’s not possible to use PowerShellStandard and also implement AST visitor methods for newer PowerShell syntaxes.

Instead, PSScriptAnalyzer targets each PowerShell version as a build configuration, and produces a separate DLL for each of them. This increases build size and complexity, but allows:

  • Version-specific API targeting
  • Version specific functionality to be implemented with essentially no runtime cost
  • Total support for Windows PowerShell all the way down to .NET Framework 4.5.2

Summary

In this post, we’ve listed and discussed the NuGet packages available to target when implementing a .NET project that uses PowerShell, and the reasons you might have for using one over another.

If you’ve skipped to the summary, some broad recommendations are:

  • PowerShell modules should compile against PowerShell Standard if they only require APIs common to different PowerShell versions.
  • PowerShell hosts and applications that need to run PowerShell internally should target the PowerShell SDK for PowerShell 6+ or the relevant Windows PowerShell reference assemblies for Windows PowerShell.
  • PowerShell modules that need version-specific APIs should target the PowerShell SDK or Windows PowerShell reference assemblies for the required PowerShell versions, using them as reference assemblies (i.e. not publishing the PowerShell dependencies).

If you’re unsure about your scenario, feel free to get in touch by commenting, opening an issue or reaching out to us on social media.

Cheers!

Rob Holt

Software Engineer

PowerShell Team

The post Depending on the right PowerShell NuGet package in your .NET project appeared first on PowerShell.

Secret Management Preview 2 Release


We are excited to release a second preview of the Secret Management module. Thanks to the tremendous feedback we received on the first preview release, you will notice a number of breaking changes in this release. This is still a preview release, meaning that it is not feature complete, future releases may contain further breaking changes, and we are still iterating based on your feedback. It is also important to note that this version of the module is still Windows only; we are currently implementing Linux support, which we hope to make available in the next preview release (with macOS support after that). Please note that because of the breaking changes this release requires a complete replacement of the Secret Management module and any extension modules. Additionally, any secrets stored in the built-in local vault can no longer be retrieved and must be re-saved.

How to Install Secret Management Preview 2

If you did not install our first preview release, open any PowerShell console and run:

Install-Module Microsoft.PowerShell.SecretManagement -AllowPrerelease

If you installed our first preview release, first remove any secrets from the LocalDefaultVault. Based on feedback we changed the naming convention for secrets stored in CredMan, so previous secrets stored in the local vault will no longer be visible after the new version of the module is installed (although you can still view or remove the old secrets via the CredMan UI). Next, run Uninstall-Module Microsoft.PowerShell.SecretsManagement; this extra step is a result of the change we made to the module name. Finally, run Install-Module Microsoft.PowerShell.SecretManagement -AllowPrerelease to install the latest preview release of the module.
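
Condensed into commands, the upgrade path from preview 1 is:

```powershell
# 1. (After removing any secrets from the LocalDefaultVault)
#    uninstall the preview 1 module, which used the old pluralized name.
Uninstall-Module Microsoft.PowerShell.SecretsManagement

# 2. Install the renamed preview 2 module.
Install-Module Microsoft.PowerShell.SecretManagement -AllowPrerelease
```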

Changes in Secret Management Preview 2

New Module Name

We have removed the plural from the module name, which is now Microsoft.PowerShell.SecretManagement, in order to be consistent with the cmdlet names and to align with PowerShell naming conventions.

New Cmdlet Names

In addition to renaming the module, we have also removed plurality in the cmdlets. The available cmdlets in the module are now:

# Registering extension vaults

Register-SecretVault
Get-SecretVault
Unregister-SecretVault
Test-Vault # new cmdlet in this release

# Accessing secrets

Set-Secret # this cmdlet replaces Add-Secret
Get-Secret
Get-SecretInfo
Remove-Secret

You will notice we renamed the Add-Secret cmdlet to Set-Secret. This change was based on user feedback that the new name better conveys the intention of the cmdlet.

You will also notice that we have added a Test-Vault cmdlet, which allows vault extension owners to check that a vault is properly configured at registration time.
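As a sketch of how the vault registration cmdlets fit together: the extension module name, the -VaultParameters contents, and the exact Test-Vault parameter below are illustrative assumptions, not details confirmed by this release:

```powershell
# Register a hypothetical extension vault; -VaultParameters passes
# extension-specific configuration as a hashtable
Register-SecretVault -Name MyVault -ModuleName MyVault.Extension `
    -VaultParameters @{ Scope = 'CurrentUser' }

# Check that the vault is properly configured (new in this release)
Test-Vault -Vault MyVault

# Enumerate registered vaults, or remove the registration
Get-SecretVault
Unregister-SecretVault -Name MyVault
```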

Other Changes

  • Set-Secret now has a default parameter set that takes a SecureString as the secret input type. This way, Set-Secret always prompts safely for a SecureString by default. String secret types can still be passed via parameter or pipeline, but the default is SecureString.
PS> Set-Secret -Name MyStringToken

cmdlet Set-Secret at command pipeline position 1
Supply values for the following parameters:
SecureStringSecret: **********

# Set string secret directly
Set-Secret -Name MyStringToken -Secret $token
Set-Secret -Name MyStringToken -Secret 'MyToken'

# Set string secret via pipeline
$token | Set-Secret -Name MyStringToken -NoClobber

  • Added SecretInformation class used to return information from Get-SecretInfo in a uniform way.
  • Changed CredMan naming prefix to ps:SecretName.
  • Added vaultName to all vault extension functions.
  • Fixed additionalParameters parameter in SecretManagementExtension abstract classes.
  • Fixed return byte[] bug in example test script extension.

In coordination with these changes we have also updated our tests and example scripts.
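To illustrate the round trip with the renamed cmdlets and the new SecretInformation output, a small sketch; the exact SecretInformation property names shown in the comment are an assumption and may differ in this preview:

```powershell
# Store a string secret directly, bypassing the SecureString prompt
Set-Secret -Name MyToken -Secret 'example-value'

# Get-SecretInfo now returns SecretInformation objects describing each
# secret (e.g. name, type, owning vault) without exposing its value
Get-SecretInfo -Name MyToken | Format-List *

# Retrieve the secret value itself
Get-Secret -Name MyToken

# Clean up
Remove-Secret -Name MyToken
```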

Next Steps

As we move toward a GA release later this year, we are using a GitHub Milestone to track the issues we plan to fix. The two major work items we are currently focused on are:

  • Linux support (via keyring)
  • macOS support (via Keychain)

Support and Feedback

For support on the module, feedback, or reporting a bug, please open an issue in the PowerShell/modules repo with “(Secrets Management)” specified in the issue title.

Sydney Smith, PowerShell Team


The post Secret Management Preview 2 Release appeared first on PowerShell.

Visual Studio Code for PowerShell 7

This post was originally published on this site

We are excited to announce that we have released a major update to the PowerShell extension for Visual Studio Code. This release contains months of architectural work that first shipped in our PowerShell Preview extension in November of 2019, along with incremental bug fixes in the intervening months. If you are new to Visual Studio Code, this article is helpful for getting started. If you already use Visual Studio Code with the PowerShell extension, read on to find out what is new.

What’s new

ISE Compatibility Module

We took the documentation from our “How to replicate the ISE experience in Visual Studio Code” doc and turned it into a switch to make the process of using Visual Studio Code more familiar for Windows PowerShell ISE users.