What’s New with VMware vCloud Availability 3.5?

This post was originally published on this site

Our team has officially released VMware vCloud Availability 3.5 this week. This is a jam-packed release that delivers quite a bit of value based on Cloud Provider feedback and customer consumption. Download Page | Release Notes. In this post, I’m going to highlight the new technical additions that are now part of the VMware vCloud Availability …

The post What’s New with VMware vCloud Availability 3.5? appeared first on Clouds, etc..

Special Webcast: How to Perform a Security Investigation in Amazon Web Services (AWS) – November 14, 2019 1:00pm US/Eastern
Speakers: Kyle Dickinson and Nam Le

Do you have a plan in place describing how to conduct an effective investigation in AWS? What security controls, techniques, and data sources can you leverage when investigating and containing an incident in the cloud? Learn how to leverage different technologies to determine the source and timeline of the event, and the systems targeted to define a reliable starting point from which to begin your investigations.

In this recorded webcast, SANS instructor and cloud security expert Kyle Dickinson explains how to conduct an investigation in AWS from preparation through completion.

Attendees at this webcast will learn about:

  • Prerequisites for performing an effective investigation
  • Services that enable an investigation
  • How to plan an investigation
  • Steps to completing an investigation in AWS
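One data source that most AWS investigations lean on is AWS CloudTrail. As a hedged illustration of the "services that enable an investigation" point above, here is a minimal boto3 sketch that pulls recent `ConsoleLogin` events; the event name and time window are just examples, and AWS credentials are assumed to be configured:

```python
from datetime import datetime, timedelta, timezone


def login_lookup_params(hours=24, now=None):
    """Build CloudTrail lookup_events parameters for recent ConsoleLogin events."""
    end = now or datetime.now(timezone.utc)
    start = end - timedelta(hours=hours)
    return {
        "LookupAttributes": [
            {"AttributeKey": "EventName", "AttributeValue": "ConsoleLogin"}
        ],
        "StartTime": start,
        "EndTime": end,
    }


def recent_console_logins(hours=24):
    """Return matching CloudTrail events from the last `hours` hours."""
    import boto3  # local import so the helper above stays usable offline

    client = boto3.client("cloudtrail")
    events = []
    paginator = client.get_paginator("lookup_events")
    for page in paginator.paginate(**login_lookup_params(hours)):
        events.extend(page["Events"])
    return events
```

The same pattern works for other attributes (`Username`, `ResourceName`, `EventSource`), which is often the fastest way to establish the timeline and targeted systems the webcast describes.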

Register for this webcast to be among the first to receive the associated whitepaper written by incident response and forensics expert Kyle Dickinson.

vExpert Cloud Management October 2019 Blog Digest
VMware’s vExpert program is made up of industry professionals from around the vCommunity. The vExpert title is awarded to individuals who dedicate their time outside of work to sharing their experiences and useful information about VMware products with the rest of the community. In July of 2019, we spun off a sub-program called vExpert …

The post vExpert Cloud Management October 2019 Blog Digest appeared first on VMware Cloud Management.

VMware and the 2019 Tianfu Cup PWN Contest
We wanted to post a quick acknowledgement that VMware will have representatives in attendance at the 2019 Tianfu Cup PWN Contest in Chengdu, China to review any vulnerabilities that may be demonstrated during the contest. We would like to thank the organizers for inviting us to attend. Stay tuned for further updates. As always, please sign up …

The post VMware and the 2019 Tianfu Cup PWN Contest appeared first on Security & Compliance Blog.

Minimal Touch VDI Image Building With MDT, PowerCLI, and Chocolatey
Recently, Mark Brookfield posted a three-part series on the process he uses for building Windows 10 images in HobbitCloud (Part 1, Part 2, Part 3). Mark has put together a great series of posts that explain the tools and the processes that he is using in his lab, and it has inspired me to revisit …

Accelerate SQL Server Always On Deployments with AWS Launch Wizard
Customers sometimes tell us that while they are experts in their domain, their unfamiliarity with the cloud can make getting started more challenging and time-consuming. They want to be able to quickly and easily deploy enterprise applications on AWS without needing prior insider knowledge of the AWS platform and its best practices, so that they can accelerate their journey to the cloud.

Announcing AWS Launch Wizard for SQL Server
AWS Launch Wizard for SQL Server is a simple, intuitive, and free-to-use wizard-based experience that enables quick and easy deployment of high-availability SQL Server solutions on AWS. The wizard walks you through an end-to-end deployment of Always On Availability Groups using prescriptive guidance. After you answer a few high-level questions about the application, such as required performance characteristics, the wizard takes care of identifying, provisioning, and configuring matching AWS resources such as Amazon Elastic Compute Cloud (EC2) instances, Amazon Elastic Block Store (EBS) volumes, and an Amazon Virtual Private Cloud. Based on your selections, the wizard presents you with a dynamically generated cost estimate for the deployment; as you modify your resource selections, the estimate updates to help you match your budget.

Once you approve, AWS Launch Wizard for SQL Server provisions these resources and configures them to create a fully functioning, production-ready SQL Server Always On deployment in just a few hours. The created resources are tagged, making it easy to identify and work with them, and the wizard also creates AWS CloudFormation templates, providing you with a baseline for repeatable and consistent application deployments.

Subsequent SQL Server Always On deployments become faster and easier, as AWS Launch Wizard for SQL Server takes care of the required infrastructure on your behalf, determining the resources that match your application’s requirements, such as performance, memory, and bandwidth (you can modify the recommended defaults if you wish). If you want to bring your own SQL Server licenses, or have other custom requirements for the instances, you can also choose to use your own custom AMIs, provided they meet certain requirements (noted in the service documentation).

Using AWS Launch Wizard for SQL Server
To get started with my deployment, in the Launch Wizard Console I click the Create deployment button to start the wizard and select SQL Server Always On.


The wizard requires an AWS Identity and Access Management (IAM) role granting it permissions to deploy and access resources in my account. The wizard checks whether a role named AmazonEC2RoleForLaunchWizard exists in my account. If so, it is used; otherwise a new role is created with two AWS managed policies, AmazonSSMManagedInstanceCore and AmazonEC2RolePolicyforLaunchWizard, attached to it. Note that this one-time setup will typically be performed by an IAM administrator for your organization. However, the IAM user does not have to be an administrator; the CreateRole, AttachRolePolicy, and GetRole permissions are sufficient to perform these operations. After the role is created, the IAM administrator can delegate the application deployment process to another IAM user, who in turn must have the AWS Launch Wizard for SQL Server IAM managed policy, AmazonLaunchWizardFullaccess, attached.
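If an administrator prefers to pre-create the role rather than let the wizard do it, the setup described above can be sketched with boto3. This is a hedged sketch, not the wizard’s own code: the EC2 trust principal is an assumption based on the role’s name, and the exact ARN paths of the managed policies may differ in your partition.

```python
import json

# Role and managed-policy names as given in the post; ARN paths are assumptions.
WIZARD_ROLE = "AmazonEC2RoleForLaunchWizard"
MANAGED_POLICIES = [
    "arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore",
    "arn:aws:iam::aws:policy/AmazonEC2RolePolicyforLaunchWizard",
]


def ec2_trust_policy():
    """Trust policy allowing EC2 instances to assume the role (assumed)."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Principal": {"Service": "ec2.amazonaws.com"},
                "Action": "sts:AssumeRole",
            }
        ],
    }


def ensure_wizard_role():
    """Create the role if missing, then attach both managed policies."""
    import boto3
    from botocore.exceptions import ClientError

    iam = boto3.client("iam")
    try:
        iam.get_role(RoleName=WIZARD_ROLE)  # reuse an existing role
    except ClientError:
        iam.create_role(
            RoleName=WIZARD_ROLE,
            AssumeRolePolicyDocument=json.dumps(ec2_trust_policy()),
        )
    for arn in MANAGED_POLICIES:
        iam.attach_role_policy(RoleName=WIZARD_ROLE, PolicyArn=arn)
```

Running `ensure_wizard_role()` requires the same CreateRole, AttachRolePolicy, and GetRole permissions the post mentions.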

With the application type selected, I can proceed by clicking Next to start configuring my application settings, beginning with a deployment name and, optionally, an Amazon Simple Notification Service (SNS) topic that AWS Launch Wizard for SQL Server can use for notifications and alerts. In the connectivity options, I can choose to use an existing Amazon Virtual Private Cloud or have a new one created. I can also specify the name of an existing key pair (or create one); the key pair is used if I want to RDP into my instances or obtain the administrator password. For a new Virtual Private Cloud, I can also configure the IP address or range from which remote desktop access will be permitted:
Instances launched by AWS Launch Wizard for SQL Server are domain-joined to an Active Directory. I can select an existing AWS Managed AD or an on-premises AD, or have the wizard create a new AWS Managed Directory for my deployment:

The final application settings relate to SQL Server. This is also where I can specify a custom AMI if I want to bring my own SQL Server licenses or have other customization requirements. Here I’m just going to create a new SQL Server service account and use an Amazon-provided image with the license included. Note that if I choose to use an existing service account, it should be part of the Managed AD into which I am deploying:

Clicking Next takes me to a page where I define the infrastructure requirements of my application in terms of CPU, network performance, and memory. I can also select the type of storage (solid state vs. magnetic) and the required SQL Server throughput. The wizard recommends the resource types to launch, but I can override it and select specific instance and volume types, and I can set custom tags to apply to the resources that will be created:

The final section of this page shows me the cost estimate based on my selections. The data in this panel is dynamically generated from my prior selections, and I can go back and forth in the wizard, tuning my selections to match my budget:

When I am happy with my selections, clicking Next takes me to the wizard’s final Review page, where I can view a summary of my selections and acknowledge that AWS resources and AWS Identity and Access Management (IAM) permissions will be created on my behalf, along with the estimated cost shown in the estimator on the previous page. My final step is to click Deploy to start the deployment process. Status updates during deployment can be viewed on the Deployments page, with a final notification to inform me of completion.

Post-deployment Management
Once my application has been deployed, I can manage its resources easily. First, I can navigate to Deployments on the AWS Launch Wizard for SQL Server dashboard and, using the Actions dropdown, jump to the Amazon Elastic Compute Cloud (EC2) console, where I can manage the EC2 instances, EBS volumes, Active Directory, and so on. Using the same Actions dropdown, I can access SQL Server via the remote desktop gateway instance. If I want to manage future updates and patches to my application with AWS Systems Manager, another Actions option takes me to the Systems Manager dashboard. I can also use AWS Launch Wizard for SQL Server to delete deployments performed with the wizard; it rolls back all of the AWS CloudFormation stacks that the service created.

Now Available
AWS Launch Wizard for SQL Server is generally available and you can use it in the following AWS Regions: US East (N. Virginia), US East (Ohio), US West (N. California), US West (Oregon), Canada (Central), South America (Sao Paulo), Asia Pacific (Mumbai), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Seoul), Asia Pacific (Tokyo), EU (Frankfurt), EU (Ireland), EU (London), and EU (Stockholm). Support for the AWS regions in China, and for the GovCloud Region, is in the works. There is no additional charge for using AWS Launch Wizard for SQL Server, only for the resources it creates.

— Steve

AWS Data Exchange – Find, Subscribe To, and Use Data Products
We live in a data-intensive, data-driven world! Organizations of all types collect, store, process, and analyze data, and use it to inform and improve their decision-making processes. The AWS Cloud is well suited to all of these activities; it offers vast amounts of storage, access to any conceivable amount of compute power, and many different types of analytical tools.

In addition to generating and working with data internally, many organizations generate and then share data sets with the general public or within their industry. We made some initial steps to encourage this back in 2008 with the launch of AWS Public Data Sets (Paging Researchers, Analysts, and Developers). That effort has evolved into the Registry of Open Data on AWS (New – Registry of Open Data on AWS (RODA)), which currently contains 118 interesting datasets, with more added all the time.

New AWS Data Exchange
Today, we are taking the next step forward, and are launching AWS Data Exchange. This addition to AWS Marketplace contains over one thousand licensable data products from over 80 data providers. There’s a diverse catalog of free and paid offerings, in categories such as financial services, health care / life sciences, geospatial, weather, and mapping.

If you are a data subscriber, you can quickly find, procure, and start using these products. If you are a data provider, you can easily package, license, and deliver products of your own. Let’s take a look at Data Exchange from both vantage points, and then review some important details.

Let’s define a few important terms before diving in:

Data Provider – An organization that has one or more data products to share.

Data Subscriber – An AWS customer that wants to make use of data products from Data Providers.

Data Product – A collection of data sets.

Data Set – A container for data assets that belong together, grouped by revision.

Revision – A container for one or more data assets as of a point in time.

Data Asset – The actual data, in any desired format.

AWS Data Exchange for Data Subscribers
As a data subscriber, I click View product catalog and start out in the Discover data section of the AWS Data Exchange Console:

Products are available from a long list of vendors:

I can enter a search term, click Search, and then narrow down my results to show only products that have a Free pricing plan:

I can also search for products from a specific vendor, that match a search term, and that have a Free pricing plan:

The second one looks interesting and relevant, so I click on 5 Digit Zip Code Boundaries US (TRIAL) to learn more:

I think I can use this in my app, and want to give it a try, so I click Continue to subscribe. I review the details, read the Data Subscription Agreement, and click Subscribe:

The subscription is activated within a few minutes, and I can see it in my list of Subscriptions:

Then I can download the set to my S3 bucket, and take a look. I click into the data set, and find the Revisions:

I click into the revision, and I can see the assets (containing the actual data) that I am looking for:

I select the asset(s) that I want and click Export to Amazon S3. Then I choose a bucket and click Export to proceed:

This creates a job that will copy the data to my bucket (extra IAM permissions are required here; read the Access Control documentation for more info):
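The same export can be done programmatically with the boto3 `dataexchange` client. This is a sketch under stated assumptions: the data set, revision, and asset IDs and the bucket name are placeholders, and the key prefix is my own choice.

```python
def export_job_request(data_set_id, revision_id, asset_ids, bucket):
    """Build a Data Exchange EXPORT_ASSETS_TO_S3 job request (IDs are placeholders)."""
    return {
        "Type": "EXPORT_ASSETS_TO_S3",
        "Details": {
            "ExportAssetsToS3": {
                "DataSetId": data_set_id,
                "RevisionId": revision_id,
                "AssetDestinations": [
                    {"AssetId": a, "Bucket": bucket, "Key": f"dataexchange/{a}"}
                    for a in asset_ids
                ],
            }
        },
    }


def run_export(data_set_id, revision_id, asset_ids, bucket):
    """Create and start the export job; it copies the assets asynchronously."""
    import boto3

    dx = boto3.client("dataexchange")
    job = dx.create_job(
        **export_job_request(data_set_id, revision_id, asset_ids, bucket)
    )
    dx.start_job(JobId=job["Id"])
    return job["Id"]
```

The returned job ID can be polled with `get_job` until the copy finishes.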

The jobs run asynchronously and copy data from Data Exchange to the bucket. Jobs can be created interactively, as I just showed you, or programmatically. Once the data is in the bucket, I can access and process it in any desired way. I could, for example, use an AWS Lambda function to parse the ZIP file and use the results to update an Amazon DynamoDB table. Or I could run an AWS Glue crawler to get the data into my Glue catalog, run an Amazon Athena query, and visualize the results in an Amazon QuickSight dashboard.
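As a sketch of the Lambda idea, here is a minimal S3-triggered handler that inspects the exported ZIP and records its contents in DynamoDB. The `exported-assets` table and its attribute names are hypothetical, and I only record each member’s size for illustration:

```python
import io
import zipfile


def parse_zip(zip_bytes):
    """Return {member_name: size_in_bytes} for every file in the archive."""
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        return {info.filename: info.file_size for info in zf.infolist()}


def handler(event, context):
    """S3-triggered Lambda: fetch the exported ZIP and record its contents."""
    import boto3

    s3 = boto3.client("s3")
    table = boto3.resource("dynamodb").Table("exported-assets")  # hypothetical
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        for name, size in parse_zip(body).items():
            table.put_item(Item={"asset_key": f"{key}/{name}", "size": size})
```

In a real deployment the Lambda would be wired to the bucket’s `s3:ObjectCreated:*` notifications and would likely parse the file contents, not just the archive listing.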

Subscriptions can last from 1 to 36 months, with an auto-renew option; subscription fees are billed to my AWS account each month.

AWS Data Exchange for Data Providers
Now I am going to put on my “data provider” hat and show you the basics of the publication process (the User Guide contains a more detailed walk-through). In order to license data, I must agree to the terms and conditions, and my application must be approved by AWS.

After I apply and have been approved, I start by creating my first data set. I click Data sets in the navigation, and then Create data set:

I describe my data set, have the option to tag it, and then click Create:

Next, I click Create revision to create the first revision of the data set:

I add a comment, and have the option to tag the revision before clicking Create:

I can copy my data from an existing S3 location, or I can upload it from my desktop:

I choose the second option, select my file, and it appears as an Imported asset after the import job completes. I review everything, and click Finalize for the revision:

My data set is ready right away, and now I can use it to create one or more products:

The console outlines the principal steps:

I can set up public pricing information for my product:

AWS Data Exchange lets me create private pricing plans for individual customers, and it also allows my existing customers to bring their existing (pre-AWS Data Exchange) licenses for my products along with them by creating a Bring Your Own Subscription offer.

I can use the Data Subscription Agreement (DSA) provided by AWS Data Exchange, use it as the basis for my own, or upload an existing one:

I can use the AWS Data Exchange API to create, update, list, and manage data sets and their revisions. Functions include CreateDataSet, UpdateDataSet, ListDataSets, CreateRevision, UpdateAsset, and CreateJob.
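The provider-side flow from the screenshots above can be strung together with those API calls. A hedged boto3 sketch, with the data set name, bucket, and key as placeholders:

```python
def import_from_s3_request(data_set_id, revision_id, bucket, key):
    """Build an IMPORT_ASSETS_FROM_S3 job request (names are placeholders)."""
    return {
        "Type": "IMPORT_ASSETS_FROM_S3",
        "Details": {
            "ImportAssetsFromS3": {
                "DataSetId": data_set_id,
                "RevisionId": revision_id,
                "AssetSources": [{"Bucket": bucket, "Key": key}],
            }
        },
    }


def publish_first_revision(name, bucket, key):
    """Create a data set, add a revision, and import one asset from S3."""
    import boto3

    dx = boto3.client("dataexchange")
    ds = dx.create_data_set(
        AssetType="S3_SNAPSHOT", Name=name, Description=f"Data set for {name}"
    )
    rev = dx.create_revision(DataSetId=ds["Id"], Comment="initial revision")
    job = dx.create_job(**import_from_s3_request(ds["Id"], rev["Id"], bucket, key))
    dx.start_job(JobId=job["Id"])
    # Once the import job completes, the revision can be finalized:
    # dx.update_revision(DataSetId=ds["Id"], RevisionId=rev["Id"], Finalized=True)
    return ds["Id"], rev["Id"]
```

Finalizing the revision corresponds to the Finalize button shown in the console walkthrough.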

Things to Know
Here are a few things that you should know about Data Exchange:

Subscription Verification – The data provider can require additional information in order to verify my subscription. If so, the console asks me to supply the information, and the provider will review and approve or decline within 45 days:

Here is what the provider sees:

Revisions & Notifications – The Data Provider can revise their data sets at any time. The Data Consumer receives a CloudWatch Event each time a product that they are subscribed to is updated; this can be used to launch a job to retrieve the latest revision of the assets. If you are implementing a system of this type and need some test events, find and subscribe to the Heartbeat product:
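Wiring up that notification might look like the following CloudWatch Events (EventBridge) rule. This is a sketch under assumptions: the `source` and `detail-type` strings and the data set ID are illustrative, so check the Data Exchange documentation for the exact values your subscription emits before relying on them.

```python
DATA_SET_ID = "example-data-set-id"  # placeholder

# Assumed event pattern for Data Exchange revision updates.
revision_pattern = {
    "source": ["aws.dataexchange"],
    "detail-type": ["Revision Published To Data Set"],
    "resources": [DATA_SET_ID],
}


def create_rule(name="dataexchange-revision-updates"):
    """Register the pattern as a CloudWatch Events rule."""
    import json

    import boto3

    events = boto3.client("events")
    events.put_rule(Name=name, EventPattern=json.dumps(revision_pattern))
    # A target (e.g. the Lambda that pulls the new revision) would then be
    # attached with events.put_targets(Rule=name, Targets=[...]).
```

The Heartbeat product mentioned above is a convenient way to generate test events for a rule like this.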

Data Categories & Types – Certain categories of data are not permitted on AWS Data Exchange. For example, your data products may not include information that can be used to identify any person, unless that information is already legally available to the public. See the Publishing Guidelines for detailed guidance on which categories of data are permitted.

Data Provider Location – Data providers must either be a valid legal entity domiciled in the United States or in a member state of the EU.

Available Now
AWS Data Exchange is available now and you can start using it today. If you own some interesting data and would like to publish it, start here. If you are a developer, browse the product catalog and look for data that will add value to your product.

Jeff;


How to Configure a Syslog Server for NSX Manager, NSX Edge and NSX Controller Cluster
It is advisable to have a centralized location where you can store the logs generated by your infrastructure. Configuring a syslog server for the environment helps you achieve this; moreover, understanding what is going on in your network environment is crucial to the health of your system. Using a […]
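For NSX Manager, the syslog target can also be set through the appliance-management REST API rather than the UI. This is a hedged sketch: the endpoint path and JSON payload follow the NSX-V appliance-management API as I understand it, so verify both against the API guide for your NSX version, and note that `verify=False` is only for lab use with self-signed certificates.

```python
def syslog_payload(server, port=514, protocol="UDP"):
    """Payload for NSX Manager's appliance-management syslog endpoint (assumed shape)."""
    return {"syslogServer": server, "port": str(port), "protocol": protocol}


def configure_nsx_syslog(manager, user, password, server):
    """PUT the syslog configuration to NSX Manager (path is an assumption)."""
    import requests  # third-party: pip install requests

    url = f"https://{manager}/api/1.0/appliance-management/system/syslogserver"
    resp = requests.put(
        url,
        json=syslog_payload(server),
        auth=(user, password),
        verify=False,  # lab only; use CA-signed certs in production
    )
    resp.raise_for_status()
```

NSX Edge and Controller syslog settings use different endpoints, so this call covers only the Manager appliance.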

The post How to Configure a Syslog Server for NSX Manager, NSX Edge and NSX Controller Cluster appeared first on VMarena.