WARNING: LRO: 977: cannot aggr pkt from port 0x5000002 as lro session port is 0x5000004

We have started receiving the warning: “WARNING: LRO: 977: cannot aggr pkt from port 0x5000002 as lro session port is 0x5000004”. The ESXi hosts are running 6.5 build 10884925. I searched through VMware’s knowledge base without success. As of yet I do not see any indications of a problem. Any information about the error would be greatly appreciated. We are using a Nimble array with HP ProLiant DL 380 G10, using…

Thanks,

ShineKnox

Using PSScriptAnalyzer to check PowerShell version compatibility

PSScriptAnalyzer version 1.18 was released recently, and ships with powerful new rules that can check PowerShell scripts for incompatibilities with other PowerShell versions and environments.

In this blog post, the first in a series, we’ll see how to use these new rules to check a script for problems running on PowerShell 3, 5.1 and 6.

Wait, what’s PSScriptAnalyzer?

PSScriptAnalzyer is a module providing static analysis (or linting) and some dynamic analysis (based on the state of your environment) for PowerShell. It’s able to find problems and fix bad habits in PowerShell scripts as you create them, similar to the way the C# compiler will give you warnings and find errors in C# code before it’s executed.

If you use the VSCode PowerShell extension, you might have seen the “green squigglies” and problem reports that PSScriptAnalyzer generates for scripts you author:

Image of PSScriptAnalyzer linting in VSCode with green squigglies

You can install PSScriptAnalyzer to use on your own scripts with:

Install-Module PSScriptAnalyzer -Scope CurrentUser

PSScriptAnalyzer works by running a series of rules on your scripts, each of which independently assesses some issue. For example, AvoidUsingCmdletAliases checks that aliases aren’t used in scripts, and MisleadingBacktick checks that backticks at the ends of lines aren’t followed by whitespace.
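
For a quick illustration, you can also analyze an inline snippet instead of a file; a one-liner that uses the gci and ? aliases produces a PSAvoidUsingCmdletAliases warning for each alias (a minimal example of ours, not from the rule documentation):

# Analyze script text directly; both aliases below are flagged by PSAvoidUsingCmdletAliases
Invoke-ScriptAnalyzer -ScriptDefinition 'gci . | ? Length -gt 10kb'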

For more information, see the PSScriptAnalyzer deep dive blog series.

Introducing the compatibility check rules

The new compatibility checking functionality is provided by three new rules:

  • PSUseCompatibleSyntax, which checks whether a syntax used in a script will work in other PowerShell versions.
  • PSUseCompatibleCommands, which checks whether commands used in a script are available in other PowerShell environments.
  • PSUseCompatibleTypes, which checks whether .NET types and static methods/properties are available in other PowerShell environments.

The syntax check rule simply requires a list of PowerShell versions you want to target, and will tell you if a syntax used in your script won’t work in any of those versions.

The command and type checking rules are more sophisticated and rely on profiles (catalogs of commands and types available) from commonly used PowerShell platforms. They require configuration to use these profiles, which we’ll go over below.

For this post, we’ll look at configuring and using PSUseCompatibleSyntax and PSUseCompatibleCommands to check that a script works with different versions of PowerShell. We’ll look at PSUseCompatibleTypes in a later post, although it’s configured very similarly to PSUseCompatibleCommands.
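
Before configuring anything, you can confirm that the compatibility rules are present in your installed copy of PSScriptAnalyzer (a quick check, assuming version 1.18 or later is installed):

# List the rules whose names mention compatibility
Get-ScriptAnalyzerRule | Where-Object RuleName -like '*Compatible*' | Select-Object RuleName, CommonName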

Working example: a small PowerShell script

Imagine we have a small (and contrived) archival script saved to .\archiveScript.ps1:

# Import helper module to get folders to archive
Import-Module -FullyQualifiedName @{ ModuleName = 'ArchiveHelper'; ModuleVersion = '1.1' }

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive'
$archiveBasePath = '\\ArchiveServer\DocumentArchive'

# Dictionary to collect hashes
$hashes = [System.Collections.Generic.Dictionary[string, string]]::new()
foreach ($path in $paths)
{
    # Get the hash of the file and turn it into a base64 string
    $hash = (Get-FileHash -LiteralPath $path).Hash

    # Add this file to the hash catalog
    $hashes[$hash] = $path

    # Now give the archive a unique name and zip it up
    $name = Split-Path -LeafBase $path
    Compress-Archive -LiteralPath $path -DestinationPath (Join-Path $archiveBasePath "$name-$hash.zip")
}

# Add the hash catalog to the archive directory
ConvertTo-Json $hashes | Out-File -LiteralPath (Join-Path $archiveBasePath "catalog.json") -NoNewline

This script was written in PowerShell 6.2, and we’ve tested that it works there. But we also want to run it on other machines, some of which run PowerShell 5.1 and some of which run PowerShell 3.0.

Ideally we will test it on those other platforms, but it would be nice if we could try to iron out as many bugs as possible ahead of time.

Checking syntax with PSUseCompatibleSyntax

The first and easiest rule to apply is PSUseCompatibleSyntax. We’re going to create some settings for PSScriptAnalyzer to enable the rule, and then run analysis on our script to get back any diagnostics about compatibility.

Running PSScriptAnalyzer is straightforward. It comes as a PowerShell module, so once it’s installed on your module path you just invoke it on your file with Invoke-ScriptAnalyzer, like this:

Invoke-ScriptAnalyzer -Path '.\archiveScript.ps1'

A very simple invocation like this one will run PSScriptAnalyzer using its default rules and configurations on the script you point it to.

However, because they require more targeted configuration, the compatibility rules are not enabled by default. Instead, we need to supply some settings to run the syntax check rule. In particular, PSUseCompatibleSyntax requires a list of the PowerShell versions you are targeting with your script.

$settings = @{
    Rules = @{
        PSUseCompatibleSyntax = @{
            # This turns the rule on (setting it to false will turn it off)
            Enable = $true

            # List the targeted versions of PowerShell here
            TargetVersions = @(
                '3.0',
                '5.1',
                '6.2'
            )
        }
    }
}

Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings

Running this will present us with the following output:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleSyntax               Warning      archiveScr 8     The constructor syntax
                                                 ipt.ps1          '[System.Collections.Generic.Dictionary[string,
                                                                  string]]::new()' is not available by default in
                                                                  PowerShell versions 3,4

This is telling us that the [dictionary[string, string]]::new() syntax we used won’t work in PowerShell 3. Better than that, in this case the rule has actually proposed a fix:

$diagnostics = Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings
$diagnostics[0].SuggestedCorrections
File              : C:\Users\roholt\Documents\Dev\sandbox\VersionedScript\archiveScript.ps1
Description       : Use the 'New-Object @($arg1, $arg2, ...)' syntax instead for compatibility with PowerShell versions 3,4
StartLineNumber   : 8
StartColumnNumber : 11
EndLineNumber     : 8
EndColumnNumber   : 73
Text              : New-Object 'System.Collections.Generic.Dictionary[string,string]'
Lines             : {New-Object 'System.Collections.Generic.Dictionary[string,string]'}
Start             : Microsoft.Windows.PowerShell.ScriptAnalyzer.Position
End               : Microsoft.Windows.PowerShell.ScriptAnalyzer.Position

The suggested correction is to use New-Object instead. The way this is suggested might seem slightly unhelpful here with all the position information, but we’ll see later why this is useful.
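
As an aside, if you do want PSScriptAnalyzer to apply its suggested corrections for you, Invoke-ScriptAnalyzer has a -Fix switch that rewrites the script file in place (we won’t run it here, since we’ll apply the change in VSCode later):

# Apply suggested corrections (such as the New-Object rewrite) directly to the script file
Invoke-ScriptAnalyzer -Path .\archiveScript.ps1 -Settings $settings -Fix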

This dictionary example is a bit artificial of course (since a hashtable would come more naturally), but having a spanner thrown into the works in PowerShell 3 or 4 because of a ::new() is not uncommon. The PSUseCompatibleSyntax rule will also warn you about classes, workflows and using statements depending on the versions of PowerShell you’re authoring for.
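
For instance, with '3.0' among the target versions, a snippet like the following (a contrived example of ours, not part of the archival script) would also be flagged, because class definitions only arrived in PowerShell 5.0:

# PSUseCompatibleSyntax warns that 'class' syntax is unavailable in PowerShell 3 and 4
class ArchiveEntry
{
    [string] $Path
    [string] $Hash
}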

We’re not going to make the suggested change just yet, since there’s more to show you first.

Checking command usage with PSUseCompatibleCommands

We now want to check the commands. Because command compatibility is a bit more complicated than syntax (since the availability of commands depends on more than what version of PowerShell is being run), we have to target profiles instead.

Profiles are catalogs of information taken from stock machines running common PowerShell environments. The ones shipped in PSScriptAnalyzer can’t always match your working environment perfectly, but they come pretty close (there’s also a way to generate your own profile, detailed in a later blog post). In our case, we’re trying to target PowerShell 3.0, PowerShell 5.1 and PowerShell 6.2 on Windows. We have the first two profiles, but in the last case we’ll need to target 6.1 instead. These targets are very close, so warnings will still be pertinent to using PowerShell 6.2. Later when a 6.2 profile is made available, we’ll be able to switch over to that.

We need to look under the PSUseCompatibleCommands documentation for a list of profiles available by default. For our desired targets we pick:

  • PowerShell 6.1 on Windows Server 2019 (win-8_x64_10.0.14393.0_6.1.3_x64_4.0.30319.42000_core)
  • PowerShell 5.1 on Windows Server 2019 (win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework)
  • PowerShell 3.0 on Windows Server 2012 (win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework)

The long names on the right are canonical profile identifiers, which we use in the settings:

$settings = @{
    Rules = @{
        PSUseCompatibleCommands = @{
            # Turns the rule on
            Enable = $true

            # Lists the PowerShell platforms we want to check compatibility with
            TargetProfiles = @(
                'win-8_x64_10.0.14393.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
                'win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework'
            )
        }
    }
}

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings $settings

There might be a delay the first time you execute this because the rules have to load the catalogs into a cache. Each catalog of a PowerShell platform contains details of all the modules and .NET assemblies available to PowerShell on that platform, which can be as many as 1700 commands with 15,000 parameters and 100 assemblies with 10,000 types. But once it’s loaded, further compatibility analysis will be fast. We get output like this:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleCommands             Warning      archiveScr 2     The parameter 'FullyQualifiedName' is not available for
                                                 ipt.ps1          command 'Import-Module' by default in PowerShell version
                                                                  '3.0' on platform 'Microsoft Windows Server 2012
                                                                  Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The command 'Get-FileHash' is not available by default in
                                                 ipt.ps1          PowerShell version '3.0' on platform 'Microsoft Windows
                                                                  Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 18    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version
                                                                  '5.1.17763.316' on platform 'Microsoft Windows Server
                                                                  2019 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 18    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 19    The command 'Compress-Archive' is not available by
                                                 ipt.ps1          default in PowerShell version '3.0' on platform
                                                                  'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 23    The parameter 'NoNewline' is not available for command
                                                 ipt.ps1          'Out-File' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'

This is telling us that:

  • Import-Module doesn’t support -FullyQualifiedName in PowerShell 3.0;
  • Get-FileHash doesn’t exist in PowerShell 3.0;
  • Split-Path doesn’t have -LeafBase in PowerShell 5.1 or PowerShell 3.0;
  • Compress-Archive isn’t available in PowerShell 3.0; and
  • Out-File doesn’t support -NoNewline in PowerShell 3.0.

One thing you’ll notice is that the Get-FoldersToArchive function is not being warned about. This is because the compatibility rules are designed to ignore user-provided commands; a command will only be marked as incompatible if it’s present in some profile and not in one of your targets.
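
You can see this for yourself by analyzing a snippet that defines and then calls its own function; the locally defined command is not flagged (a small illustration of ours, reusing the $settings from above):

# Get-FoldersToArchive is defined locally, so PSUseCompatibleCommands does not warn about it
Invoke-ScriptAnalyzer -Settings $settings -ScriptDefinition @'
function Get-FoldersToArchive { param([string]$RootPath) Get-ChildItem -LiteralPath $RootPath -Directory }
Get-FoldersToArchive -RootPath 'C:\Temp'
'@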

Again, we can change the script to fix these warnings, but before we do, I want to show you how to make this a more continuous experience; as you change your script, you want to know if the changes you make break compatibility, and that’s easy to do with the steps below.

Using a settings file for repeated invocation

The first thing we want is to make the PSScriptAnalyzer invocation more automated and reproducible. A nice step toward this is taking the settings hashtable we made and turning it into a declarative data file, separating out the “what” from the “how”.

PSScriptAnalyzer will accept a path to a PSD1 in the -Settings parameter, so all we need to do is turn our hashtable into a PSD1 file, which we’ll make ./PSScriptAnalyzerSettings.psd1. Notice we can merge the settings for both PSUseCompatibleSyntax and PSUseCompatibleCommands:

# PSScriptAnalyzerSettings.psd1
# Settings for PSScriptAnalyzer invocation.
@{
    Rules = @{
        PSUseCompatibleCommands = @{
            # Turns the rule on
            Enable = $true

            # Lists the PowerShell platforms we want to check compatibility with
            TargetProfiles = @(
                'win-8_x64_10.0.14393.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework',
                'win-8_x64_6.2.9200.0_3.0_x64_4.0.30319.42000_framework'
            )
        }
        PSUseCompatibleSyntax = @{
            # This turns the rule on (setting it to false will turn it off)
            Enable = $true

            # Simply list the targeted versions of PowerShell here
            TargetVersions = @(
                '3.0',
                '5.1',
                '6.2'
            )
        }
    }
}

Now we can run PSScriptAnalyzer again on the script using the settings file:

Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1

This gives the output:

RuleName                            Severity     ScriptName Line  Message
--------                            --------     ---------- ----  -------
PSUseCompatibleCommands             Warning      archiveScr 1     The parameter 'FullyQualifiedName' is not available for
                                                 ipt.ps1          command 'Import-Module' by default in PowerShell version
                                                                  '3.0' on platform 'Microsoft Windows Server 2012
                                                                  Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 9     The command 'Get-FileHash' is not available by default in
                                                 ipt.ps1          PowerShell version '3.0' on platform 'Microsoft Windows
                                                                  Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 12    The parameter 'LeafBase' is not available for command
                                                 ipt.ps1          'Split-Path' by default in PowerShell version
                                                                  '5.1.17763.316' on platform 'Microsoft Windows Server
                                                                  2019 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 13    The command 'Compress-Archive' is not available by
                                                 ipt.ps1          default in PowerShell version '3.0' on platform
                                                                  'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleCommands             Warning      archiveScr 16    The parameter 'NoNewline' is not available for command
                                                 ipt.ps1          'Out-File' by default in PowerShell version '3.0' on
                                                                  platform 'Microsoft Windows Server 2012 Datacenter'
PSUseCompatibleSyntax               Warning      archiveScr 6     The constructor syntax
                                                 ipt.ps1          '[System.Collections.Generic.Dictionary[string,
                                                                  string]]::new()' is not available by default in
                                                                  PowerShell versions 3,4

Now we don’t depend on any variables anymore, and we have a separate specification of the analysis we want. Using this, you could, for example, plug the check into a continuous integration environment to verify that changes to scripts don’t break compatibility.
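
A minimal sketch of such a gate, assuming the file layout used in this post, is to fail the build whenever the analysis returns any diagnostics:

# CI step: report and fail on any compatibility findings
$issues = Invoke-ScriptAnalyzer -Path ./archiveScript.ps1 -Settings ./PSScriptAnalyzerSettings.psd1
if ($issues)
{
    $issues | Format-Table RuleName, Line, Message -AutoSize | Out-String | Write-Host
    throw "PSScriptAnalyzer found $($issues.Count) compatibility issue(s) in archiveScript.ps1."
}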

But what we really want is to know that PowerShell scripts stay compatible as you edit them. That’s what the settings file is building to, and also where it’s easiest to make the changes you need to make your script compatible. For that, we want to integrate with the VSCode PowerShell extension.

Integrating with VSCode for on-the-fly compatibility checking

As explained at the start of this post, the VSCode PowerShell extension has built-in support for PSScriptAnalyzer. In fact, as of version 1.12.0, the PowerShell extension ships with PSScriptAnalyzer 1.18, meaning you don’t need to do anything other than create a settings file to do compatibility analysis.

We already have our settings file ready to go from the last step, so all we have to do is point the PowerShell extension to the file in the VSCode settings.

You can open the settings with Ctrl+, (use Cmd instead of Ctrl on macOS). In the Settings view, we want PowerShell > Script Analysis: Settings Path. In the settings.json view this is "powershell.scriptAnalysis.settingsPath". Entering a relative path here will find a settings file in our workspace, so we just put ./PSScriptAnalyzerSettings.psd1:

VSCode settings GUI with PSScriptAnalyzer settings path configured to "./PSScriptAnalyzerSettings.psd1"

In the settings.json view this will look like:

"powershell.scriptAnalysis.settingsPath": "./PSScriptAnalyzerSettings.psd1"

Now, opening the script in VSCode we see “green squigglies” for compatibility warnings:

VSCode window containing script, with green squigglies underneath incompatible code

In the problems pane, you’ll get a full description of all the incompatibilities:

VSCode problems pane, listing and describing identified compatibility issues

Let’s fix the syntax problem first. If you remember, PSScriptAnalyzer supplies a suggested correction to this problem. VSCode integrates with PSScriptAnalyzer’s suggested corrections and can apply them if you click on the lightbulb or with Ctrl+Space when the region is under the cursor:

VSCode suggesting New-Object instead of ::new() syntax

Applying this change, the script is now:

Import-Module -FullyQualifiedName @{ ModuleName = 'ArchiveHelper'; ModuleVersion = '1.1' }

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive'
$archivePath = '\\ArchiveServer\DocumentArchive'

$hashes = New-Object 'System.Collections.Generic.Dictionary[string,string]'
foreach ($path in $paths)
{
    $hash = (Get-FileHash -LiteralPath $path).Hash
    $hashes[$hash] = $path
    $name = Split-Path -LeafBase $path
    Compress-Archive -LiteralPath $path -DestinationPath (Join-Path $archivePath "$name-$hash.zip")
}

ConvertTo-Json $hashes | Out-File -LiteralPath (Join-Path $archivePath "catalog.json") -NoNewline

The other incompatibilities don’t have corrections; for now PSUseCompatibleCommands knows what commands are available on each platform, but not what to substitute with when a command isn’t available. So we just need to apply some PowerShell knowledge:

  • Instead of Import-Module -FullyQualifiedName @{...}, we use Import-Module -Name ... -Version ...;
  • Instead of Get-FileHash, we’re going to need to use .NET directly and write a function;
  • Instead of Split-Path -LeafBase, we can use [System.IO.Path]::GetFileNameWithoutExtension();
  • Instead of Compress-Archive, we’ll need to use more .NET methods in a function; and
  • Instead of Out-File -NoNewline, we can use New-Item -Value.

We end up with something like this (the specific implementation is unimportant, but we have something that will work in all versions):

Import-Module -Name ArchiveHelper -Version '1.1'

function CompatibleGetFileHash
{
    param(
        [string]
        $LiteralPath
    )

    try
    {
        $hashAlg = [System.Security.Cryptography.SHA256]::Create()
        $file = [System.IO.File]::Open($LiteralPath, 'Open', 'Read')
        $file.Position = 0
        $hashBytes = $hashAlg.ComputeHash($file)
        return [System.BitConverter]::ToString($hashBytes).Replace('-', '')
    }
    finally
    {
        $file.Dispose()
        $hashAlg.Dispose()
    }
}

function CompatibleCompressArchive
{
    param(
        [string]
        $LiteralPath,

        [string]
        $DestinationPath
    )

    if ($PSVersionTable.PSVersion.Major -le 3)
    {
        # PSUseCompatibleTypes identifies that [System.IO.Compression.ZipFile]
        # isn't available by default in PowerShell 3 and we have to do this.
        # We'll cover that rule in the next blog post.
        Add-Type -AssemblyName System.IO.Compression.FileSystem -ErrorAction Ignore
    }

    [System.IO.Compression.ZipFile]::CreateFromDirectory(
        $LiteralPath,
        $DestinationPath,
        'Optimal',
        <# includeBaseDirectory #> $true)
}

$paths = Get-FoldersToArchive -RootPath 'C:\Documents\DocumentsToArchive'
$archivePath = '\\ArchiveServer\DocumentArchive'

$hashes = New-Object 'System.Collections.Generic.Dictionary[string,string]'
foreach ($path in $paths)
{
    $hash = CompatibleGetFileHash -LiteralPath $path
    $hashes[$hash] = $path
    $name = [System.IO.Path]::GetFileNameWithoutExtension($path)
    CompatibleCompressArchive -LiteralPath $path -DestinationPath (Join-Path $archivePath "$name-$hash.zip")
}

$jsonStr = ConvertTo-Json $hashes
New-Item -Path (Join-Path $archivePath "catalog.json") -Value $jsonStr

You should notice that as you type, VSCode displays new analysis of what you’re writing and the green squigglies drop away. When we’re done we get a clean bill of health for script compatibility:

VSCode window with script and problems pane, with no green squigglies and no problems

This means you’ll now be able to use this script across all the PowerShell versions you need to target. Better still, you now have a configuration in your workspace, so compatibility is checked continually as you write more scripts. And if your compatibility targets change, all you need to do is update the configuration file in one place to point to your desired targets, at which point you’ll get analysis for your updated target platforms.
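
For example, if we later stopped targeting PowerShell 3.0, we would only remove its entries from PSScriptAnalyzerSettings.psd1 (the profile identifiers are the same ones used earlier in this post):

# PSScriptAnalyzerSettings.psd1 after dropping the PowerShell 3.0 targets
@{
    Rules = @{
        PSUseCompatibleCommands = @{
            Enable = $true
            TargetProfiles = @(
                'win-8_x64_10.0.14393.0_6.1.3_x64_4.0.30319.42000_core',
                'win-8_x64_10.0.17763.0_5.1.17763.316_x64_4.0.30319.42000_framework'
            )
        }
        PSUseCompatibleSyntax = @{
            Enable = $true
            TargetVersions = @('5.1', '6.2')
        }
    }
}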

Summary

Hopefully this blog post has given you some idea of the new compatibility rules that come with PSScriptAnalyzer 1.18.

We’ve covered how to set up and use the syntax compatibility checking rule, PSUseCompatibleSyntax, and the command checking rule, PSUseCompatibleCommands, both using a hashtable configuration and a settings PSD1 file.

We’ve also looked at using the compatibility rules with the PowerShell extension for VSCode, where they come by default from version 1.12.0.

If you’ve got the latest release of the PowerShell extension for VSCode (1.12.1), you’ll be able to set your configuration file and instantly get compatibility checking.

In the next blog post, we’ll look at how these rules and PSUseCompatibleTypes (which checks whether .NET types and static methods are available on target platforms) can be used to help you write scripts that work cross-platform across Windows and Linux, using both Windows PowerShell and PowerShell Core.


Rob Holt

Software Engineer

PowerShell Team

Protecting Against Ransomware

Original release date: April 11, 2019

What is ransomware?

Ransomware is a type of malware threat actors use to infect computers and encrypt computer files until a ransom is paid. (See Protecting Against Malicious Code for more information on malware.) After the initial infection, ransomware will attempt to spread to connected systems, including shared storage drives and other accessible computers.

If the threat actor’s ransom demands are not met (i.e., if the victim does not pay the ransom), the files or encrypted data will usually remain encrypted and unavailable to the victim. Even after a ransom has been paid to unlock encrypted files, threat actors will sometimes demand additional payments, delete a victim’s data, refuse to decrypt the data, or decline to provide a working decryption key to restore the victim’s access. The Federal Government does not support paying ransomware demands. (See the FBI’s ransomware article.)

How does ransomware work?

Ransomware identifies the drives on an infected system and begins to encrypt the files within each drive. Ransomware generally adds an extension to the encrypted files, such as .aaa, .micro, .encrypted, .ttt, .xyz, .zzz, .locky, .crypt, .cryptolocker, .vault, or .petya, to show that the files have been encrypted—the file extension used is unique to the ransomware type.

Once the ransomware has completed file encryption, it creates and displays a file or files containing instructions on how the victim can pay the ransom. If the victim pays the ransom, the threat actor may provide a cryptographic key that the victim can use to unlock the files, making them accessible.

How is ransomware delivered?

Ransomware is commonly delivered through phishing emails or via “drive-by downloads.” Phishing emails often appear as though they have been sent from a legitimate organization or someone known to the victim and entice the user to click on a malicious link or open a malicious attachment. A “drive-by download” is a program that is automatically downloaded from the internet without the user’s consent or often without their knowledge. It is possible the malicious code may run after download, without user interaction. After the malicious code has been run, the computer becomes infected with ransomware.

What can I do to protect my data and networks?

  • Back up your computer. Perform frequent backups of your system and other important files, and verify your backups regularly. If your computer becomes infected with ransomware, you can restore your system to its previous state using your backups.  
  • Store your backups separately. Best practice is to store your backups on a separate device that cannot be accessed from a network, such as on an external hard drive. Once the backup is completed, make sure to disconnect the external hard drive or separate device from the network or computer. (See the Software Engineering Institute’s page on Ransomware.)
  • Train your organization. Organizations should ensure that they provide cybersecurity awareness training to their personnel. Ideally, organizations will have regular, mandatory cybersecurity awareness training sessions to ensure their personnel are informed about current cybersecurity threats and threat actor techniques. To improve workforce awareness, organizations can test their personnel with phishing assessments that simulate real-world phishing emails.

What can I do to prevent ransomware infections?

  • Update and patch your computer. Ensure your applications and operating systems (OSs) have been updated with the latest patches. Vulnerable applications and OSs are the target of most ransomware attacks. (See Understanding Patches and Software Updates.)
  • Use caution with links and when entering website addresses. Be careful when clicking directly on links in emails, even if the sender appears to be someone you know. Attempt to independently verify website addresses (e.g., contact your organization’s helpdesk, search the internet for the sender organization’s website or the topic mentioned in the email). Pay attention to the website addresses you click on, as well as those you enter yourself. Malicious website addresses often appear almost identical to legitimate sites, often using a slight variation in spelling or a different domain (e.g., .com instead of .net). (See Using Caution with Email Attachments.)
  • Open email attachments with caution. Be wary of opening email attachments, even from senders you think you know, particularly when attachments are compressed files or ZIP files.
  • Keep your personal information safe. Check a website’s security to ensure the information you submit is encrypted before you provide it. (See Protecting Your Privacy.)
  • Verify email senders. If you are unsure whether or not an email is legitimate, try to verify the email’s legitimacy by contacting the sender directly. Do not click on any links in the email. If possible, use a previous (legitimate) email to ensure the contact information you have for the sender is authentic before you contact them.
  • Inform yourself. Keep yourself informed about recent cybersecurity threats and up to date on ransomware techniques. You can find information about known phishing attacks on the Anti-Phishing Working Group website. You may also want to sign up for CISA product notifications, which will alert you when a new Alert, Analysis Report, Bulletin, Current Activity, or Tip has been published.
  • Use and maintain preventative software programs. Install antivirus software, firewalls, and email filters—and keep them updated—to reduce malicious network traffic. (See Understanding Firewalls for Home and Small Office Use.)

How do I respond to a ransomware infection?

  • Isolate the infected system. Remove the infected system from all networks, and disable the computer’s wireless, Bluetooth, and any other potential networking capabilities. Ensure all shared and networked drives are disconnected, whether wired or wireless.
  • Turn off other computers and devices. Power off and segregate (i.e., remove from the network) the infected computer(s). Power off and segregate any other computers or devices that shared a network with the infected computer(s) and have not been fully encrypted by ransomware. If possible, collect and secure all infected and potentially infected computers and devices in a central location, making sure to clearly label any computers that have been encrypted. Powering off and segregating infected computers and computers that have not been fully encrypted may allow specialists to recover partially encrypted files. (See Before You Connect a New Computer to the Internet for tips on how to make a computer more secure before you reconnect it to a network.)
  • Secure your backups. Ensure that your backup data is offline and secure. If possible, scan your backup data with an antivirus program to check that it is free of malware.

Author: CISA

The Next Release of PowerShell – PowerShell 7

Recently, the PowerShell Team shipped the Generally Available (GA) release of PowerShell Core 6.2. Since that release, we’ve already begun work on the next iteration!

We’re calling the next release PowerShell 7, the reasons for which will be explained in this blog post.

Why 7 and not 6.3?

PowerShell Core usage has grown significantly in the last two years. In particular, the bulk of our growth has come from Linux usage, an encouraging statistic given our investment in making PowerShell viable cross-platform.

However, we can also clearly see that our Windows usage has not been growing as significantly, which is surprising given that PowerShell was popularized on the Windows platform. We believe this could be because existing Windows PowerShell users have automation that is incompatible with PowerShell Core due to unsupported modules, assemblies, and APIs. These folks are unable to take advantage of PowerShell Core’s new features, increased performance, and bug fixes. To address this, we are renewing our efforts towards a full replacement of Windows PowerShell 5.1 with our next release.

This means that Windows PowerShell and PowerShell Core users will be able to use the same version of PowerShell to automate across Windows, Linux, and macOS, and PowerShell 7 users will have a very high level of compatibility with the Windows PowerShell modules they rely on today.

We’re also going to take the opportunity to simplify our references to PowerShell in documentation and product pages, dropping the “Core” in “PowerShell 7”. The PSEdition will still reflect Core, but this will only be a technical distinction in APIs and documentation where appropriate.

Note that the major version bump does not imply that we will be making significant breaking changes. While we took the opportunity to make some breaking changes in 6.0, many of those were compromises needed to ensure compatibility on non-Windows platforms. Prior to that, Windows PowerShell historically updated its major version based on new versions of Windows rather than on Semantic Versioning.

.NET Core 3.0

PowerShell Core 6.1 brought compatibility with many built-in Windows PowerShell modules, and we estimate that PowerShell 7 can attain compatibility with more than 90% of the inbox Windows PowerShell modules by leveraging changes in .NET Core 3.0 that bring back many APIs required by modules built on the .NET Framework, allowing them to work with the .NET Core runtime. For example, we expect Out-GridView to come back (for Windows only, though)!

A significant effort for PowerShell 7 is porting the PowerShell Core 6 code base to .NET Core 3.0 and also working with Windows partner teams to validate their modules against PowerShell 7.

Support Lifecycle Changes

Currently, PowerShell Core is under the Microsoft Modern Lifecycle Policy. This means that PowerShell Core 6 is fix-forward: we produce servicing releases for security fixes and critical bug fixes, and you must install the latest stable version within 6 months of a new minor version release.

In PowerShell 7, we will align more closely with the .NET Core support lifecycle, enabling PowerShell 7 to have both LTS (Long Term Servicing) and non-LTS releases.

We will still have monthly Preview releases to get feedback early.

When do I get PowerShell 7?

The first Preview release of PowerShell 7 will likely be in May. Be aware, however, that this depends on completing integration and validation of PowerShell with .NET Core 3.0.

Since PowerShell 7 is aligned with the .NET Core timeline, we expect the generally available (GA) release to be some time after the GA of .NET Core 3.0.

What about shipping in Windows?

We are planning on eventually shipping PowerShell 7 in Windows as a side-by-side feature with Windows PowerShell 5.1, but we still need to work out some of the details on how you will manage this inbox version of PowerShell 7.

And since the .NET Core timeline doesn’t align with the Windows timeline, we can’t say right now when it will show up in a future version of Windows 10 or Windows Server.

What other features will be in PowerShell 7?

We haven’t closed on our feature planning yet, but expect another blog post relatively soon with a roadmap of our current feature level plans for PowerShell 7.

Steve Lee
https://twitter.com/Steve_MSFT
Principal Engineering Manager
PowerShell Team

The PowerShell Gallery is now more Accessible

Over the past few months, the team has been working hard to make the PowerShell Gallery as accessible as possible. This blog details why it matters and what work has been done.

Why making the PowerShell Gallery more accessible was a priority

Accessible products change lives and allow everyone to be included in our product. Accessibility is also a major component of striving toward Microsoft’s mission to “Empower every person and every organization on the planet to achieve more.” Improvements in accessibility mean improvements in usability, which makes the experience better for everyone. In doing accessibility testing for the Gallery, for example, we found that it was confusing for users to distinguish between “deleting” and “unlisting” packages. By clearly naming this action in the UI, we made the process of unlisting a package clearer for all package owners.

The steps taken to make the PowerShell Gallery more accessible

The first part of the process focused on bug generation and resolution. We used scanning technology to ensure that the Gallery alerts and helper texts were configured properly and were compatible with screen-reading technology. We used Keros scanning, Microsoft’s premier accessibility tool, to identify accessibility issues, and worked to triage and fix the detected issues.

For the second part of the process, we undertook a scenario-focused accessibility study. For the study, blind or visually impaired IT professionals went through core scenarios for using the Gallery. These scenarios included: finding packages, publishing packages, managing packages, and getting support. The majority of the scenarios focused on searching for packages as we believe this is the primary way customers interact with the Gallery. After the study concluded we reviewed the results and watched recordings of the participants navigating through our scenarios. This process allowed us to focus on improving our lowest performing scenarios by addressing specific usability improvements. After making these improvements we underwent a review by accessibility experts to assure we had high usability and accessibility.

Usability Improvements

  • Screen Reader Compatibility: Screen reader technologies make consuming web content accessible so we underwent thorough review, and improvement, to ensure that the Gallery was providing accurate, consistent, and helpful information to screen readers. Some examples of areas we improved:
    • Accurate Headers
    • Clearly labeled tables
    • Helpful tool tips
    • Labeled graph node points
  • Improved ARIA Tags: Accessible Rich Internet Applications (ARIA) is a specification that makes web content more accessible by passing helpful information to assistive technologies such as screen readers. We underwent a thorough review, and enhancement, of our ARIA tags to make sure they were as helpful as possible. One improvement we made, for example, was an ARIA description explaining how to use tags in the Gallery search bar.
  • Renamed UI elements to be more descriptive: Through our review we noticed we were generating some confusion by labeling the unlist button as “delete” and we worked to fix these types of issues.
  • Filters: We added filters for the operating system to make it easier to find compatible packages.
  • Results description: we made searching for packages more straightforward by displaying the total number of results and pages.
  • Page Scrolling: we made searching for packages easier by adding multi-page scrolling.

Reporting Issues

Our goal is to make the Gallery completely user friendly. If you encounter any issues in the PowerShell Gallery that make it less accessible/usable we would love to hear about it on our GitHub page. Please file an issue letting us know what we can do to make the Gallery even more accessible.
