Web Searches For Archives, (Sun, Sep 14th)

Johannes wrote a diary entry "Increasing Searches for ZIP Files" where he analyzed the increase of requests for ZIP files (like backup.zip, web.zip, …) for our web honeypots.

I took a look at my logs, and noticed that too. But it's not only ZIP files, but other archives too:

  • zip
  • rar
  • 7z
  • gz
  • tar

I even had requests for .tar.zip files.

And when it comes to backup files, the following non-archive types are also popular requests:

  • backup.sql
  • backup.json
  • backup.bak
  • backup.sh

Looking at the User Agent Strings for these requests, none indicated that these scans were performed by researchers.

And comparing the source IPs of these requests with our researchers list: not a single match.
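
If you want to check your own web server logs for the same kind of probes, here is a minimal sketch (assuming a combined-format access log at /var/log/nginx/access.log; adjust the path and the extension list for your setup):

LOG=/var/log/nginx/access.log   # log path is an assumption, adjust for your server

# Extract the request path (7th field in the combined log format),
# keep archive/backup-style names, and count the most probed paths.
awk '{print $7}' "$LOG" \
  | grep -Ei '\.(zip|rar|7z|gz|tar|sql|bak)(\?|$)' \
  | sort | uniq -c | sort -rn | head -20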

So it's safe to say that these scans are done with malicious intent. Take Johannes' advice: don't keep these types of files on your web servers, and better yet, put a policy in place to prevent it.

 

Didier Stevens
Senior handler
blog.DidierStevens.com

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.

Announcing Amazon EC2 M4 and M4 Pro Mac instances

As someone who has been using macOS since 2001 and Amazon EC2 Mac instances since their launch 4 years ago, I’ve helped numerous customers scale their continuous integration and delivery (CI/CD) pipelines on AWS. Today, I’m excited to share that Amazon EC2 M4 and M4 Pro Mac instances are now generally available.

Development teams building applications for Apple platforms need powerful computing resources to handle complex build processes and run multiple iOS simulators simultaneously. As development projects grow larger and more sophisticated, teams require increased performance and memory capacity to maintain rapid development cycles.

Apple M4 Mac mini at the core
EC2 M4 Mac instances (known as mac-m4.metal in the API) are built on Apple M4 Mac mini computers and the AWS Nitro System. They feature the Apple silicon M4 chip with a 10-core CPU (four performance and six efficiency cores), a 10-core GPU, a 16-core Neural Engine, and 24 GB of unified memory, delivering enhanced performance for iOS and macOS application build workloads. When building and testing applications, M4 Mac instances deliver up to 20 percent better application build performance compared to EC2 M2 Mac instances.

EC2 M4 Pro Mac instances (mac-m4pro.metal in the API) are powered by the Apple silicon M4 Pro chip with a 14-core CPU, a 20-core GPU, a 16-core Neural Engine, and 48 GB of unified memory. These instances offer up to 15 percent better application build performance compared to EC2 M2 Pro Mac instances. The increased memory and computing power make it possible to run more tests in parallel using multiple device simulators.

Each M4 and M4 Pro Mac instance now comes with 2 TB of local storage, providing low-latency storage for improved caching and build and test performance.

Both instance types support macOS Sequoia version 15.6 and later as Amazon Machine Images (AMIs). The AWS Nitro System provides up to 10 Gbps of Amazon Virtual Private Cloud (Amazon VPC) network bandwidth and 8 Gbps of Amazon Elastic Block Store (Amazon EBS) storage bandwidth through high-speed Thunderbolt connections.

Amazon EC2 Mac instances integrate seamlessly with other AWS services.

Let me show you how to get started
You can launch EC2 M4 or M4 Pro Mac instances through the AWS Management Console, AWS Command Line Interface (AWS CLI), or AWS SDKs.

For this demo, let’s start an M4 Pro instance from the console. I first allocate a dedicated host to run my instances. On the AWS Management Console, I navigate to EC2, then Dedicated Hosts, and I select Allocate Dedicated Host.

Then, I enter a Name tag and I select the Instance family (mac-m4pro) and an Instance type (mac-m4pro.metal). I choose one Availability Zone and I clear Host maintenance.

EC2 Mac M4 - Dedicated hosts

Alternatively, I can use the command line interface:

aws ec2 allocate-hosts \
        --availability-zone-id "usw2-az4" \
        --auto-placement "off" \
        --host-recovery "off" \
        --host-maintenance "off" \
        --quantity 1 \
        --instance-type "mac-m4pro.metal"
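
To confirm that the host is ready before launching an instance onto it, I can check its state from the command line (a quick sketch; the host ID is illustrative):

# Check the state of the allocated Dedicated Host (expect "available")
aws ec2 describe-hosts \
        --host-ids "h-0e984064522b4b60b" \
        --query 'Hosts[0].State' \
        --output text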

After the dedicated host is allocated to my account, I select the host I just allocated, then I select the Actions menu and choose Launch instance(s) onto host.

Notice the console gives you, among other information, the Latest supported macOS versions for this type of host. In this case, it’s macOS 15.6.

EC2 Mac M4 - Dedicated hosts Launch 

On the Launch an instance page, I enter a Name. I select a macOS Sequoia Amazon Machine Image (AMI). I make sure the Architecture is 64-bit Arm and the Instance type is mac-m4pro.metal.

The rest of the parameters aren't specific to Amazon EC2 Mac: the network and storage configuration. When starting an instance for development use, make sure you select a volume of at least 200 GB. The default 100 GB volume size isn't sufficient to download and install Xcode.

EC2 Mac M4 - Dedicated hosts Launch Details

When ready, I select the orange Launch instance button at the bottom of the page. The instance rapidly appears as Running in the console. However, it might take up to 15 minutes before you can connect over SSH.

Alternatively, I can use this command:

# Note: the AMI ID depends on the Region; the security group ID and host ID depend on your configuration.
aws ec2 run-instances \
    --image-id "ami-000420887c24e4ac8" \
    --instance-type "mac-m4pro.metal" \
    --key-name "my-ssh-key-name" \
    --network-interfaces '{"AssociatePublicIpAddress":true,"DeviceIndex":0,"Groups":["sg-0c2f1a3e01b84f3a3"]}' \
    --tag-specifications '{"ResourceType":"instance","Tags":[{"Key":"Name","Value":"My Dev Server"}]}' \
    --placement '{"HostId":"h-0e984064522b4b60b","Tenancy":"host"}' \
    --private-dns-name-options '{"HostnameType":"ip-name","EnableResourceNameDnsARecord":true,"EnableResourceNameDnsAAAARecord":false}' \
    --count "1"
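
The instance ID returned by run-instances can then be used to wait for the instance and retrieve its public IP address for the SSH connection (a sketch; the instance ID is illustrative, and SSH may still take several more minutes to become available):

# Wait for the instance to be running, then print its public IP address
aws ec2 wait instance-running --instance-ids "i-0123456789abcdef0"

aws ec2 describe-instances \
    --instance-ids "i-0123456789abcdef0" \
    --query 'Reservations[0].Instances[0].PublicIpAddress' \
    --output text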

Install Xcode from the Terminal
After the instance is reachable, I can connect to it using SSH and install my development tools. I use xcodeinstall to download and install Xcode 16.4.

From my laptop, I open a session with my Apple developer credentials:

# on my laptop, with permissions to access AWS Secret Manager
» xcodeinstall authenticate -s eu-central-1                                                                                               

Retrieving Apple Developer Portal credentials...
Authenticating...
🔐 Two factors authentication is enabled, enter your 2FA code: 067785
✅ Authenticated with MFA.

I connect to the EC2 Mac instance I just launched. Then, I download and install Xcode:

» ssh ec2-user@44.234.115.119                                                                                                                                                                   

Warning: Permanently added '44.234.115.119' (ED25519) to the list of known hosts.
Last login: Sat Aug 23 13:49:55 2025 from 81.49.207.77

    ┌───┬──┐   __|  __|_  )
    │ ╷╭╯╷ │   _|  (     /
    │  └╮  │  ___|___|___|
    │ ╰─┼╯ │  Amazon EC2
    └───┴──┘  macOS Sequoia 15.6

ec2-user@ip-172-31-54-74 ~ % brew tap sebsto/macos
==> Tapping sebsto/macos
Cloning into '/opt/homebrew/Library/Taps/sebsto/homebrew-macos'...
remote: Enumerating objects: 227, done.
remote: Counting objects: 100% (71/71), done.
remote: Compressing objects: 100% (57/57), done.
remote: Total 227 (delta 22), reused 63 (delta 14), pack-reused 156 (from 1)
Receiving objects: 100% (227/227), 37.93 KiB | 7.59 MiB/s, done.
Resolving deltas: 100% (72/72), done.
Tapped 1 formula (13 files, 61KB).

ec2-user@ip-172-31-54-74 ~ % brew install xcodeinstall 
==> Fetching downloads for: xcodeinstall
==> Fetching sebsto/macos/xcodeinstall
==> Downloading https://github.com/sebsto/xcodeinstall/releases/download/v0.12.0/xcodeinstall-0.12.0.arm64_sequoia.bottle.tar.gz
Already downloaded: /Users/ec2-user/Library/Caches/Homebrew/downloads/9f68a7a50ccfdc479c33074716fd654b8528be0ec2430c87bc2b2fa0c36abb2d--xcodeinstall-0.12.0.arm64_sequoia.bottle.tar.gz
==> Installing xcodeinstall from sebsto/macos
==> Pouring xcodeinstall-0.12.0.arm64_sequoia.bottle.tar.gz
🍺  /opt/homebrew/Cellar/xcodeinstall/0.12.0: 8 files, 55.2MB
==> Running `brew cleanup xcodeinstall`...
Disable this behaviour by setting `HOMEBREW_NO_INSTALL_CLEANUP=1`.
Hide these hints with `HOMEBREW_NO_ENV_HINTS=1` (see `man brew`).
==> No outdated dependents to upgrade!

ec2-user@ip-172-31-54-74 ~ % xcodeinstall download -s eu-central-1 -f -n "Xcode 16.4.xip"
                        Downloading Xcode 16.4
100% [============================================================] 2895 MB / 180.59 MBs
[ OK ]
✅ Xcode 16.4.xip downloaded

ec2-user@ip-172-31-54-74 ~ % xcodeinstall install -n "Xcode 16.4.xip"
Installing...
[1/6] Expanding Xcode xip (this might take a while)
[2/6] Moving Xcode to /Applications
[3/6] Installing additional packages... XcodeSystemResources.pkg
[4/6] Installing additional packages... CoreTypes.pkg
[5/6] Installing additional packages... MobileDevice.pkg
[6/6] Installing additional packages... MobileDeviceDevelopment.pkg
[ OK ]
✅ file:///Users/ec2-user/.xcodeinstall/download/Xcode%2016.4.xip installed

ec2-user@ip-172-31-54-74 ~ % sudo xcodebuild -license accept

ec2-user@ip-172-31-54-74 ~ % 

EC2 Mac M4 - install xcode

Things to know
Select an EBS volume of at least 200 GB for development purposes. The default 100 GB volume size is not sufficient to install Xcode. I usually select 500 GB. When you increase the EBS volume size after the launch of the instance, remember to resize the APFS filesystem.
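
Here is a minimal sketch of that resize on the instance itself, using the standard macOS diskutil tooling (the disk and container identifiers vary, so inspect the output of diskutil list first):

# Identify the physical disk and the APFS container on the EBS boot volume
diskutil list physical external

# Repair the disk, then grow the APFS container to use all of the new space
# (replace disk3 and disk3s2 with the identifiers from the listing above)
yes | sudo diskutil repairDisk disk3
sudo diskutil apfs resizeContainer disk3s2 0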

Alternatively, you can choose to install your development tools and frameworks on the low-latency local 2 TB SSD drive available in the Mac mini. Note that the content of that volume is bound to the instance lifecycle, not the dedicated host: everything on the internal SSD storage is deleted when you stop and restart the instance.

The mac-m4.metal and mac-m4pro.metal instances support macOS Sequoia 15.6 and later.

You can migrate your existing EC2 Mac instances as long as the migrated instance runs macOS 15 (Sequoia): create a custom AMI from your existing instance and launch an M4 or M4 Pro instance from that AMI.
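
A sketch of that path from the command line (the instance ID and AMI name are illustrative):

# Create a custom AMI from the existing EC2 Mac instance
aws ec2 create-image \
    --instance-id "i-0123456789abcdef0" \
    --name "my-mac-sequoia-ami"

Then launch the M4 or M4 Pro instance from the resulting AMI ID with run-instances, as shown earlier.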

Finally, I suggest checking the tutorials I wrote to help you get started with Amazon EC2 Mac.

Pricing and availability
EC2 M4 and M4 Pro Mac instances are currently available in US East (N. Virginia) and US West (Oregon), with additional Regions planned for the future.

Amazon EC2 Mac instances are available for purchase as Dedicated Hosts through the On-Demand and Savings Plans pricing models. Billing for EC2 Mac instances is per second with a 24-hour minimum allocation period to comply with the Apple macOS Software License Agreement. At the end of the 24-hour minimum allocation period, the host can be released at any time with no further commitment.
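
Once that point is reached and no instances are running on the host, it can also be released from the command line (a sketch; the host ID is illustrative):

# Release the Dedicated Host after the 24-hour minimum allocation period
aws ec2 release-hosts --host-ids "h-0e984064522b4b60b"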

As someone who works closely with Apple developers, I’m curious to see how you’ll use these new instances to accelerate your development cycles. The combination of increased performance, enhanced memory capacity, and integration with AWS services opens new possibilities for teams building applications for iOS, macOS, iPadOS, tvOS, watchOS, and visionOS platforms. Beyond application development, Apple silicon’s Neural Engine makes these instances cost-effective candidates for running machine learning (ML) inference workloads. I’ll be discussing this topic in detail at AWS re:Invent 2025, where I’ll share benchmarks and best practices for optimizing ML workloads on EC2 Mac instances.

To learn more about EC2 M4 and M4 Pro Mac instances, visit the Amazon EC2 Mac Instances page or refer to the EC2 Mac documentation. You can start using these instances today to modernize your Apple development workflows on AWS.

— seb

Accelerate serverless testing with LocalStack integration in VS Code IDE

Today, we’re announcing LocalStack integration in the AWS Toolkit for Visual Studio Code that makes it easier than ever for developers to test and debug serverless applications locally. This enhancement builds upon our recent improvements to the AWS Lambda development experience, including the console to IDE integration and remote debugging capabilities we launched in July 2025, continuing our commitment to simplify serverless development on Amazon Web Services (AWS).

When building serverless applications, developers typically focus on three key areas to streamline their testing experience: unit testing, integration testing, and debugging resources running in the cloud. Although the AWS Serverless Application Model Command Line Interface (AWS SAM CLI) provides excellent local unit testing capabilities for individual Lambda functions, developers working with event-driven architectures that involve multiple AWS services, such as Amazon Simple Queue Service (Amazon SQS), Amazon EventBridge, and Amazon DynamoDB, need a comprehensive solution for local integration testing. Although LocalStack provides local emulation of AWS services, developers previously had to manage it as a standalone tool, requiring complex configuration and frequent context switching between multiple interfaces, which slowed down the development cycle.

LocalStack integration in AWS Toolkit for VS Code
To address these challenges, we’re introducing LocalStack integration so developers can connect AWS Toolkit for VS Code directly to LocalStack endpoints. With this integration, developers can test and debug serverless applications without switching between tools or managing complex LocalStack setups. Developers can now emulate end-to-end event-driven workflows involving services such as Lambda, Amazon SQS, and EventBridge locally, without needing to manage multiple tools, perform complex endpoint configurations, or deal with service boundary issues that previously required connecting to cloud resources.

The key benefit of this integration is that AWS Toolkit for VS Code can now connect to custom endpoints such as LocalStack, something that wasn’t possible before. Previously, to point AWS Toolkit for VS Code to their LocalStack environment, developers had to perform manual configuration and context switching between tools.

Getting started with LocalStack in VS Code is straightforward. Developers can begin with the LocalStack Free version, which provides local emulation for core AWS services ideal for early-stage development and testing. Using the guided application walkthrough in VS Code, developers can install LocalStack directly from the toolkit interface, which automatically installs the LocalStack extension and guides them through the setup process. When it’s configured, developers can deploy serverless applications directly to the emulated environment and test their functions locally, all without leaving their IDE.

Let’s try it out
First, I’ll update my copy of the AWS Toolkit for VS Code to the latest version. Once I’ve done this, I can see a new option when I go to Application Builder and click on Walkthrough of Application Builder. This allows me to install LocalStack with a single click.

Once I’ve completed the setup for LocalStack, I can start it up from the status bar and then I’ll be able to select LocalStack from the list of my configured AWS profiles. In this illustration, I am using Application Composer to build a simple serverless architecture using Amazon API Gateway, Lambda, and DynamoDB. Normally, I’d deploy this to AWS using AWS SAM. In this case, I’m going to use the same AWS SAM command to deploy my stack locally.

I just run `sam deploy --guided --profile localstack` from the command line and follow the usual prompts. Deploying to LocalStack using AWS SAM CLI provides the exact same experience I’m used to when deploying to AWS. In the screenshot below, I can see the standard output from AWS SAM, as well as my new LocalStack resources listed in the AWS Toolkit Explorer.
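
Under the hood, the localstack profile is a regular AWS CLI/SDK profile that points at the local endpoint. Here is a minimal sketch of how it could be configured by hand and used to verify the deployed resources (the endpoint, Region, and dummy credentials are assumptions; the guided setup in the toolkit can create the profile for you):

# Configure a profile that targets the LocalStack edge endpoint
aws configure set region us-east-1 --profile localstack
aws configure set aws_access_key_id test --profile localstack
aws configure set aws_secret_access_key test --profile localstack
aws configure set endpoint_url http://localhost:4566 --profile localstack

# After sam deploy, check the emulated resources
aws dynamodb list-tables --profile localstack
aws lambda list-functions --profile localstack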

I can even go into a Lambda function and edit the function code I’ve deployed locally!

Over on the LocalStack website, I can log in and take a look at all the resources I have running locally. In the screenshot below, you can see the local DynamoDB table I just deployed.

Enhanced development workflow
These new capabilities complement our recently launched console-to-IDE integration and remote debugging features, creating a comprehensive development experience that addresses different testing needs throughout the development lifecycle. AWS SAM CLI provides excellent local testing for individual Lambda functions, handling unit testing scenarios effectively. For integration testing, the LocalStack integration enables testing of multiservice workflows locally without the complexity of AWS Identity and Access Management (IAM) permissions, Amazon Virtual Private Cloud (Amazon VPC) configurations, or service boundary issues that can slow down development velocity.

When developers need to test using AWS services in development environments, they can use our remote debugging capabilities, which provide full access to Amazon VPC resources and IAM roles. This tiered approach frees up developers to focus on business logic during early development phases using LocalStack, then seamlessly transition to cloud-based testing when they need to validate against AWS service behaviors and configurations. The integration eliminates the need to switch between multiple tools and environments, so developers can identify and fix issues faster while maintaining the flexibility to choose the right testing approach for their specific needs.

Now available
You can start using these new features through the AWS Toolkit for VS Code by updating to v3.74.0. The LocalStack integration is available in all commercial AWS Regions except AWS GovCloud (US) Regions. To learn more, visit the AWS Toolkit for VS Code and Lambda documentation.

For developers who need broader service coverage or advanced capabilities, LocalStack offers additional tiers with expanded features. There are no additional costs from AWS for using this integration.

These enhancements represent another significant step forward in our ongoing commitment to simplifying the serverless development experience. Over the past year, we’ve focused on making VS Code the tool of choice for serverless developers, and this LocalStack integration continues that journey by providing tools for developers to build and test serverless applications more efficiently than ever before.

From YARA Offsets to Virtual Addresses, (Fri, Sep 5th)

YARA is an excellent tool that most of you probably already know and use daily. If you don't, search on isc.sans.edu; we have a bunch of diaries about it[1]. YARA is very powerful because you can search for arrays of bytes that represent executable code. In this case, you provide the hexadecimal representation of the binary machine code.

Exploit Attempts for Dassault DELMIA Apriso. CVE-2025-5086, (Wed, Sep 3rd)

When I am thinking about the security of manufacturing environments, I am usually focusing on IoT devices integrated into production lines. All the little sensors and actuators are often very difficult to secure. On the other hand, there is also "big software" that is used to manage manufacturing. One example is DELMIA Apriso by Dassault Systèmes. This type of Manufacturing Operations Management (MOM) or Manufacturing Execution System (MES) software ties everything together and promises to connect factory floors to ERP systems.

Now Open — AWS Asia Pacific (New Zealand) Region

Kia ora! Today, I’m pleased to share the general availability of the AWS Asia Pacific (New Zealand) Region with three Availability Zones and API name ap-southeast-6. With the new Region, customers can now run workloads and securely store data in New Zealand while serving end users with even lower latency.

The new AWS Asia Pacific (New Zealand) Region will help organizations run their applications and serve end users while maintaining data residency in New Zealand. The NZD $7.5 billion Amazon Web Services (AWS) investment to establish an AWS Region in New Zealand is expected to contribute NZD $10.8 billion to New Zealand’s gross domestic product (GDP), is estimated to create 1,000 new jobs annually, and will enable Kiwi organizations of all sizes to innovate and scale faster using the most secure and resilient infrastructure.

AWS in New Zealand
Since we opened our first office in New Zealand in 2013, we’ve been continuously expanding our infrastructure to better serve Kiwi customers:

Connectivity to the global AWS network – In 2016, AWS enhanced New Zealand’s connectivity to the AWS Global Infrastructure by establishing diverse, high-capacity subsea cable connections, improving network reliability and performance for customers.

Amazon CloudFront – In 2020, AWS expanded its infrastructure footprint in New Zealand by adding two Amazon CloudFront edge locations in Auckland.

AWS Local Zones – To further enhance its infrastructure offerings in New Zealand, AWS introduced an AWS Local Zone in Auckland in 2023, helping customers deliver applications that require single-digit millisecond latency.

AWS Direct Connect – In the same year, AWS also added a Direct Connect location in Auckland to help customers securely link their on-premises networks to AWS, resulting in lower networking costs and improved application performance. With this Region launch, AWS is adding another Direct Connect location in Auckland.

Let’s take a look at how AWS customers are leveraging AWS capabilities for diverse needs.

Security and compliance
The New Zealand government has a cloud first policy to encourage cloud adoption across the public sector. AWS supports 143 security standards and compliance certifications, including Payment Card Industry Data Security Standard (PCI DSS), Health Insurance Portability and Accountability Act (HIPAA) and Health Information Technology for Economic and Clinical Health (HITECH), Federal Risk and Authorization Management Program (FedRAMP), General Data Protection Regulation (GDPR), Federal Information Processing Standard (FIPS) 140-3, and National Institute of Standards and Technology (NIST) 800-171, helping customers satisfy compliance requirements around the globe and providing a secure cloud infrastructure.

MATTR, a New Zealand-based organization providing infrastructure and digital trust services to businesses and governments, sees significant benefits from the new Region. To learn more about how MATTR and other organizations like Kiwibank and Deloitte plan to use the AWS New Zealand Region, visit this news article.

Accelerating AI innovation in New Zealand
AWS delivers the most comprehensive set of capabilities for generative AI at every layer of the stack, including a choice of cutting-edge large language models (LLMs) for implementing generative AI with Amazon Bedrock, and the most capable generative AI assistant to transform how work gets done with Amazon Q.

New Zealand customers are already benefiting from the generative AI capabilities offered by AWS.

Thematic is a New Zealand-based global leader in customer intelligence and feedback analysis. Thematic uses generative AI to turn customer feedback data from multiple channels into curated, accurate, and reliable customer intelligence.

“Using Amazon Bedrock is just so incredibly easy that it just makes sense. Whenever we design a solution, we do test more than 10 large language models (LLMs). Consistently the ones offered by AWS are winning those competitions,” said Nathan Holmberg, CTO and Co-Founder, Thematic.

To learn more about how other customers like One NZ are utilizing generative AI, visit this article.

Building cloud skills together
Since signing a memorandum of understanding (MoU) with the New Zealand government in 2022, Amazon has trained more than 50,000 Kiwis toward our goal of 100,000. Amazon is committed to continuing to invest in cloud education through programs including AWS Academy, AWS Skills Builder, AWS Educate, and AWS re/Start. Organizations are using AWS to scale globally while investing in local talent development, supporting New Zealand’s growing demand for cloud expertise.

Xero, a global small business platform, helps customers supercharge their business by bringing together the most important small business tools, including accounting, payroll, and payments, on one platform. Leveraging AWS since 2016, Xero has scaled its platform globally, enhancing its features and enabling continual innovation.

“Amazon’s commitment to the New Zealand tech industry through their NZD $7.5B investment is promising. It’s a significant vote of confidence that will help connect New Zealand tech exporters with new global opportunities across the AWS ecosystem and the broader Amazon network,” says Bridget Snelling, Xero Country Manager, Aotearoa New Zealand.

Sustainable digital transformation
Through The Climate Pledge, Amazon is committed to reaching net-zero carbon across its business by 2040. AWS is committed to supporting New Zealand’s sustainability goals with efficient and responsible operations of its data centers in the country. The AWS Asia Pacific (New Zealand) Region is underpinned by renewable energy from day one through its agreement with Mercury New Zealand.

Sharesies, a wealth development platform, is using AWS to modernize operations while advancing sustainability goals.

“Sharesies is very supportive of storing customer data in-country and being able to use renewable energy,” says Sharesies Chief Technical Officer Richard Clark. “To do this in New Zealand on the AWS Cloud and have it fully powered by Mercury’s wind energy is a huge step forward. And very exciting!”

AWS partners in New Zealand
The AWS Partner Network (APN) in New Zealand includes a growing ecosystem of consulting and technology partners helping customers of all sizes design, architect, build, migrate, and manage their workloads on AWS. AWS Partners like Custom D, Grant Thornton Digital, MongoDB, and Parallo are actively supporting customers to deliver innovative solutions tailored to the unique needs of New Zealand organizations across various industries. With the new Region, these partners can now leverage the full capabilities of AWS cloud services locally.

AWS community in New Zealand
New Zealand is also home to one AWS Hero, 26 AWS Community Builders, 6 AWS User Groups and almost 9,000 community members across AWS User Groups in Auckland, Wellington, and Christchurch. If you’re interested in joining AWS User Groups New Zealand, visit their Meetup and social media pages.

Here’s what AWS Hero Arshad Zackeriya says about the new Region:

“The launch of the AWS Region in New Zealand is a game-changer for our country. It’s not just about a new set of data centers; it’s about unlocking the potential of New Zealand’s businesses and developer communities, allowing us to build a better, more connected Aotearoa for all.”

Available now
The AWS Asia Pacific (New Zealand) Region is the first infrastructure Region in New Zealand and sixteenth Region in Asia Pacific. With this launch, AWS now spans 120 Availability Zones within 38 geographic Regions around the world, with announced plans for 10 more Availability Zones and three more AWS Regions in the Kingdom of Saudi Arabia, Chile, and the European Sovereign Cloud.

The new Asia Pacific (New Zealand) Region is ready to support your business, and you can find a detailed list of the services available in this Region on the AWS Services by Region page. To learn more, visit the AWS Global Infrastructure page, and start building on ap-southeast-6!
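
If you want to confirm access to the new Region and list its three Availability Zones from the CLI, here is a minimal sketch (ap-southeast-6 is an opt-in Region, so it may need to be enabled for your account first through AWS Account Management):

# Enable the Region for the account if needed, then list its Availability Zones
aws account enable-region --region-name ap-southeast-6
aws ec2 describe-availability-zones --region ap-southeast-6 --output table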

Happy building!
Donnie

AWS Weekly Roundup: Amazon EC2, Amazon Q Developer, IPv6 updates, and more (September 1, 2025)

My LinkedIn feed was absolutely packed this week with pictures from the AWS Heroes Summit event in Seattle. It was heartwarming to see so many familiar faces and new Heroes coming together.

AWS Heroes Summit 2025

For those not familiar with the AWS Heroes program, it’s a global community recognition initiative that honors individuals who make outstanding contributions to the AWS community. These Heroes share their deep AWS knowledge through content creation, speaking at events, organizing community gatherings, and contributing to open-source projects.

The AWS Heroes Summit brings these exceptional community leaders together, providing a unique platform for knowledge exchange, networking, and collaboration. As someone who regularly interacts with Heroes through our AWS initiatives, I always find these summits invaluable – they offer deep technical discussions, early access to AWS roadmaps, and opportunities to provide direct feedback to AWS service teams. The insights and connections made at these events often translate into better resources and guidance for the broader AWS community.

Last week’s launches

In addition to this inspiring community, here are some AWS launches that caught my attention:

  • AWS expands Internet Protocol v6 (IPv6) support to AWS App Runner, AWS Client VPN, and RDS Data API — Three more AWS services now support IPv6 connectivity, helping you meet compliance requirements and removing the need to handle address translation between IPv4 and IPv6. AWS App Runner now supports IPv6-based inbound and outbound traffic on both public and private App Runner service endpoints. AWS Client VPN announced support for remote access to IPv6 workloads, allowing you to establish secure VPN connections to your IPv6-enabled VPC resources. Finally, RDS Data API now supports IPv6, enabling dual-stack configuration (IPv4 and IPv6) connectivity for your Aurora databases.
  • We launched two new instance families this week: the new storage-optimized I8ge and the general-purpose M8i instances — our I8ge instances, powered by AWS Graviton4 processors, deliver up to 60% better compute performance compared to their Graviton2-based predecessors. These instances feature third-generation AWS Nitro SSDs, providing up to 55% better real-time storage performance per TB and significantly lower I/O latency. With 120 TB of storage and sizes up to 48xlarge (including two metal options), they offer the highest storage density among AWS Graviton-based storage optimized instances. We also launched M8i and M8i-flex instances with custom Intel Xeon 6 processors. These instances deliver up to 15% better price-performance and 2.5x more memory bandwidth than their predecessors. M8i-flex instances are ideal for general-purpose workloads, available from large to 16xlarge. For demanding applications, you can choose from our SAP-certified M8i instances in 13 sizes, including 2 bare metal options and a new 96xlarge size.
  • Amazon EC2 Mac Dedicated hosts now support Host Recovery and Reboot-based host maintenance — you can enable two new capabilities for your EC2 Mac Dedicated Hosts: Host Recovery and Reboot-based Host Maintenance. Host Recovery automatically detects potential hardware issues on Mac Dedicated Hosts and seamlessly migrates Mac instances to a new replacement host, minimizing disruption to workloads. Reboot-based Host Maintenance automatically stops and restarts instances on replacement hosts when scheduled maintenance events occur, eliminating the need for manual intervention during planned maintenance windows.
  • Amazon Q Developer now supports MCP admin control — Administrators now have the ability to enable or disable the MCP functionality for all the Q Developer clients in their organization. When an administrator disables the functionality, users will not be allowed to add any MCP servers, nor will any previously defined servers be initialized.

Other AWS news

Here are some additional projects and blog posts that you might find interesting:

  • Mastering Amazon Q Developer with Rules — I read an interesting article about Amazon Q Developer’s rules feature this weekend that I want to share with you. What caught my attention is how it solves a pain point I often encounter when working with AI assistants – having to repeatedly explain my coding preferences and standards. With rules, you define your preferences once in Markdown files, and Amazon Q Developer automatically follows them for every interaction. I particularly like how transparent the system is, showing which rules it’s following, and how it helps maintain consistency across teams. Since implementing rules in my projects, I’ve seen more consistent code quality, all while reducing the cognitive load of having to repeatedly explain our standards.
  • Strategies for excelling across all four exam domains of the AWS Certified Machine Learning – Specialty certification. The AWS Training & Certification team, where I spent my first three years at AWS, shared how to prepare for the AWS Certified Machine Learning – Specialty certification, whether you’re starting from scratch or building upon existing AWS Certifications. They share the prerequisites and guidance to help you get ready for this certification and demonstrate your expertise in building ML solutions with AWS.
  • As is now our tradition after Prime Day, we shared the impressive metrics showing how AWS services scaled to support one of the world’s largest shopping events. Amazon Prime Day 2025 was the biggest ever, setting records for both sales volume and total items sold during the 4-day event. This year was particularly special as we saw a significant transformation in the Prime Day experience through advancements in our generative AI offerings, with customers using Alexa+, Rufus, and AI Shopping Guides to discover deals and get product information. The numbers are staggering – Amazon DynamoDB handled tens of trillions of API calls while maintaining high availability, delivering single-digit millisecond responses and peaking at 151 million requests per second. Amazon API Gateway processed over 1 trillion internal service requests—a 30 percent increase in requests on average per day compared to Prime Day 2024.

Upcoming AWS events
Check your calendars and sign up for these upcoming AWS events:

  • AWS Summits — Join free online and in-person events that bring the cloud computing community together to connect, collaborate, and learn about AWS. Register in your nearest city: Toronto (September 4), Los Angeles (September 17), and Bogotá (October 9).
  • AWS re:Invent 2025 — This flagship annual conference is coming to Las Vegas from December 1–5. The event catalog is now available. Mark your calendars for this not-to-be-missed gathering of the AWS community.
  • AWS Community Days — Join community-led conferences that feature technical discussions, workshops, and hands-on labs led by expert AWS users and industry leaders from around the world: Adria (September 5), Baltic (September 10), Aotearoa (September 18), South Africa (September 20), Bolivia (September 20), Portugal (September 27).

Join the AWS Builder Center to learn, build, and connect with builders in the AWS community. Browse here for upcoming in-person and virtual developer-focused events.

That’s all for this week. Check back next Monday for another Weekly Roundup!

— seb