Kickstart Your DShield Honeypot [Guest Diary], (Thu, Oct 3rd)

[This is a Guest Diary by Joshua Gilman, an ISC intern as part of the SANS.edu BACS program]

Introduction

Setting up a DShield honeypot is just the beginning. The real challenge lies in configuring all the necessary post-installation settings, which can be tedious when deploying multiple honeypots. Drawing from personal experience and valuable feedback from past interns at the Internet Storm Center (ISC), I developed DShieldKickStarter to automate this often repetitive and time-consuming process.

What is DShieldKickStarter?

DShieldKickStarter is not a honeypot deployment tool. Instead, it’s a post-installation configuration script designed to streamline the setup of a honeypot environment after the DShield honeypot software has been installed. The script ensures that honeypots run efficiently with minimal manual effort by automating essential tasks such as setting up log backups, PCAP capture, and installing optional analysis tools.

Key Features of DShieldKickStarter

•    Automated Log Backups: The script organizes, compresses, and password-protects honeypot logs to prevent accidental execution of malicious files.
•    PCAP Capture Setup: Using tcpdump, it captures network traffic while excluding specific ports, ensuring relevant data is logged.
•    Optional Tool Installation: Cowrieprocessor and JSON-Log-Country are included as optional tools. Both were invaluable during my internship for streamlining data analysis.
•    Helpful for Multiple Honeypots: This script is handy when managing several honeypots. It saves time by automating repetitive setup tasks.

Step-by-Step Breakdown

The script automates several critical tasks:
1.    Creating Directories and Setting Permissions 
            Ensures the necessary directory structures for logs, backups, and PCAP data are in place, with proper permissions to secure sensitive files.
2.    Installing Required Packages 
             Installs essential tools such as tcpdump, git, and python3-pip, streamlining the log and packet capture setup.
3.    Configuring Log Rotation and Backups 
             Automatically rotates logs and stores them with password protection. PCAP files and honeypot logs are archived daily, and older backups are cleaned to save space.
4.    Automating PCAP Capture 
             Sets up tcpdump to capture network traffic, excluding predefined ports so that only relevant traffic is recorded. The process is automated via cron jobs (a minimal sketch of these capture and backup steps follows this list).
5.    Optional Tool Integration 
             The script optionally installs cowrieprocessor and JSON-Log-Country, two tools that were extremely helpful during my internship. These streamline log processing and help categorize attack data for further analysis.
6.    SCP Option for Off-Sensor Backup 
             If enabled, the script supports SCP transfers to a remote server, automating the secure transfer of backups for off-sensor storage.
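
To make this concrete, here is a minimal sketch of the PCAP capture and daily backup steps, roughly along the lines of what the script automates. It is illustrative only, not the actual DShieldKickStarter code: the interface name, directory paths, excluded port, archive password, and remote host are all placeholder assumptions.

# rotating PCAP capture: start a new file every 24 hours and exclude the
# honeypot's administrative SSH port so only attacker traffic is recorded
$ sudo tcpdump -i eth0 -G 86400 -w /srv/pcap/honeypot-%Y%m%d.pcap 'not port 12222' &

# daily cron entries: archive logs and PCAPs into a password-protected zip,
# prune archives older than 30 days, and optionally copy them off-sensor via SCP
0 1 * * * zip -P infected -r /srv/backup/honeypot-$(date +\%F).zip /srv/log /srv/pcap && find /srv/backup -name '*.zip' -mtime +30 -delete
0 2 * * * scp /srv/backup/honeypot-$(date +\%F).zip analyst@backup-host:/data/honeypot-backups/

The real script wraps these steps with its own paths, prompts, and cleanup logic, so treat the lines above purely as a mental model of what gets automated.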

Who Benefits from This?

•    ISC Handlers and Interns: This tool provides a streamlined process for post-installation setup, allowing for faster honeypot deployment and data collection.
•    Cybersecurity Professionals: This tool's time-saving features can benefit anyone interested in setting up a DShield honeypot and contributing to threat intelligence efforts.

Tool Showcase

1. CowrieProcessor

Description

CowrieProcessor is a Python tool designed to process and summarize Cowrie logs, allowing for more accessible and detailed analysis. Cowrie logs can contain an overwhelming amount of data, since they track every interaction with the honeypot. CowrieProcessor condenses this data into a readable format, focusing on crucial elements like session details, IP addresses, commands entered by attackers, and malicious files downloaded during the session.

Usage and Benefits

The tool automates the parsing of Cowrie logs, providing a summary that includes key metrics such as session duration, attacker IPs, and the commands used during each attack. This is useful for quickly understanding attacker behavior without sifting through massive raw log files. With this, security teams can focus on actionable insights, such as blocking specific IPs or analyzing downloaded malware.
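
To illustrate the kind of condensation this automates, the hedged jq one-liner below pulls just the attacker commands out of Cowrie's JSON log; it is not cowrieprocessor itself, and the log path is an assumption that varies by installation.

# list timestamp, source IP, session ID, and each command an attacker entered
$ jq -r 'select(.eventid == "cowrie.command.input") | [.timestamp, .src_ip, .session, .input] | @tsv' /srv/cowrie/var/log/cowrie/cowrie.json | head

cowrieprocessor goes much further, correlating session duration, attacker IPs, commands, and downloaded files into a single readable summary.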

Screenshot Explanation

In the attached screenshot, CowrieProcessor provides a detailed view of a session from an attack on the honeypot. It shows session details, commands attempted by the attacker, and files downloaded, such as the malicious authorized_keys file. The easy-to-read output from CowrieProcessor highlights the attack flow, giving you insight into the malicious actor’s intentions.


CowrieProcessor output showing session details and malicious activities detected by the honeypot.

2. DShield SIEM (ELK)

Description

While DShield SIEM (ELK) is not included in the script, it plays a crucial role in further analysis and data visualization. ELK (Elastic Stack) enables the collection, processing, and real-time visualization of honeypot data, and provides a centralized platform to track attacker behavior, detect patterns, and generate insights through interactive dashboards.

Usage and Benefits

Using ELK, you can monitor key metrics such as the most frequent attacker IPs, session types, and the commands attackers use. ELK dashboards also provide the ability to create custom queries using Kibana Query Language (KQL), which allows you to filter logs by specific attributes like failed logins, session durations, or malicious file downloads.


ELK dashboard showing attack data, top IP addresses, session activity, and trends over time.

Screenshot Explanation

The attached screenshot shows a detailed ELK dashboard summarizing honeypot data. On the left side, the "Top 50 IP" table displays the most active attacking IPs, while the center pie charts break down the types of logs (honeypot, webhoneypot, etc.) and session activity. The bar chart on the right visualizes Cowrie activity over time, helping analysts track attack patterns. KQL can filter this data even further, focusing on specific attacks or malicious behaviors.

KQL (Kibana Query Language)

One of the standout features of ELK is the ability to leverage KQL for deep-dive investigations. For instance, if you want to search for all failed login attempts, you can use a KQL query like:
event.outcome: "login.failed"

This query will instantly filter your logs, allowing you to pinpoint where and when login attempts failed. Another useful query might be filtering by source IP to track all actions from a particular attacker:
source.ip: "45.148.10.242"

With KQL, you can quickly analyze data across large volumes of logs, making it easy to detect anomalies, potential threats, or patterns in attacker behavior.

[1] https://github.com/DShield-ISC/dshield
[2] https://github.com/iamjoshgilman/DShieldKickStarter
[3] https://github.com/jslagrew/cowrieprocessor
[4] https://github.com/justin-leibach/JSON-Log-Country
[5] https://github.com/bruneaug/DShield-SIEM
[6] https://www.elastic.co/guide/en/kibana/current/kuery-query.html
[7] https://www.sans.edu/cyber-security-programs/bachelors-degree/
———–
Guy Bruneau IPSS Inc.
My Handler Page
Twitter: GuyBruneau
gbruneau at isc dot sans dot edu

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.

Security related Docker containers, (Wed, Oct 2nd)

Over the last 9 months or so, I've been putting together some docker containers that I find useful in my day-to-day malware analysis and forensicating. I have been putting them up on hub.docker.com and decided I might as well let others know they were there. In a couple of cases, I just found it easier to create a docker container than try to remember to switch in and out of a Python virtualenv. In a couple of other cases, it avoids issues I've had with conflicting versions of installed packages. In every case, I'm tracking new releases so I can update my containers when new releases come out, and I usually do so within a couple of days of the new release. The ones that I have up at the moment are the following:

NICE DCV is now Amazon DCV with 2024.0 release

Today, NICE DCV has a new name. So long NICE DCV, welcome Amazon DCV. With the 2024.0 release, along with enhancements and bug fixes, NICE DCV is rebranded as Amazon DCV.

The new name is now also used to consistently refer to the DCV protocol powering AWS managed services such as Amazon AppStream 2.0 and Amazon WorkSpaces.

What is Amazon DCV
Amazon DCV is a high-performance remote display protocol. It lets you securely deliver remote desktops and application streaming from any cloud or data center to any device, over varying network conditions. By using Amazon DCV with Amazon Elastic Compute Cloud (Amazon EC2), you can run graphics-intensive applications remotely on EC2 instances. You can then stream the results to more modest client machines, which eliminates the need for expensive dedicated workstations.

Amazon DCV supports both Windows and major flavors of Linux operating systems on the server side, providing you flexibility to fit your organization’s needs. The client side that receives the desktops and application streams can be the native DCV client for Windows, Linux, or macOS, or a web browser. The DCV remote server and client transfer only encrypted pixels, not data, so no confidential data is downloaded from the DCV server. When you choose to use Amazon DCV on Amazon Web Services (AWS) with EC2 instances, you can take advantage of the 108 AWS Availability Zones across 33 geographic Regions, plus 31 Local Zones, allowing your remote streaming services to scale globally.

Since Amazon acquired NICE 8 years ago, we’ve witnessed a diverse range of customers adopting DCV. From general-purpose users visualizing business applications to industry-specific professionals, DCV has proven to be versatile. For instance, artists have employed DCV to access powerful cloud workstations for their digital content creation and rendering tasks. In the healthcare sector, medical imaging professionals have used DCV for remote visualization and analysis of patient data. Geoscientists have used DCV to analyze reservoir simulation results, while engineers in manufacturing have used it to visualize computational fluid dynamics experiments. The education and IT support industries have benefited from collaborative sessions in DCV, in which multiple users can share a single desktop.

Notable customers include Quantic Dream, an award-winning game development studio that has harnessed DCV to create high-resolution, low-latency streaming services for their artists and developers. Tally Solutions, an enterprise resource planning (ERP) services provider, has employed DCV to securely stream its ERP software to thousands of customers. Volkswagen has used DCV to provide remote access to computer-aided engineering (CAE) applications for over 1,000 automotive engineers. Amazon Kuiper, an initiative to bring broadband connectivity to underserved communities, has used DCV for designing complex chips.

Within AWS, DCV has been adopted by several services to provide managed solutions to customers. For example, AppStream 2.0 uses DCV to offer secure, reliable, and scalable application streaming. Additionally, since 2020, Amazon WorkSpaces Streaming Protocol (WSP), which is built on DCV and optimized for high performance, has been available to Amazon WorkSpaces customers. Today, we’re also phasing out the WSP name and replacing it with DCV. Going forward, you will have DCV as a primary protocol choice in Amazon WorkSpaces.

What’s new with version 2024.0
Amazon DCV 2024.0 introduces several fixes and enhancements for improved performance, security, and ease of use. The 2024.0 release now supports the latest Ubuntu 24.04 LTS, bringing the latest security updates and extended long-term support to simplify system maintenance. The DCV client on Ubuntu 24.04 has built-in support for Wayland, offering better graphical rendering efficiency and enhanced application isolation. Additionally, DCV 2024.0 now enables the QUIC UDP protocol by default, allowing clients to benefit from an optimized streaming experience. The release also introduces the capability to blank the Linux host screen when a remote user is connected, preventing local access and interaction with the remote session.
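
If you want to control this manually rather than rely on the default, QUIC is toggled in the server's dcv.conf. The snippet below reflects the setting names as I understand them from the DCV documentation; verify them against the documentation for your version.

# /etc/dcv/dcv.conf -- enable the QUIC (UDP) transport alongside TCP
[connectivity]
enable-quic-frontend=true
quic-port=8443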

How to get started
The easiest way to test DCV is to spin up a WorkSpaces instance from the WorkSpaces console, selecting one of the DCV-powered bundles, or to create an AppStream 2.0 session. For this demo, however, I want to show you how to install the DCV server on an EC2 instance.

I installed the DCV server on two servers running on Amazon EC2, one running Windows Server 2022 and one running Ubuntu 24.04. I also installed the client on my macOS laptop. The client and server packages are available to download on our website. For both servers, make sure the security group authorizes inbound connections on UDP and TCP port 8443, the default port DCV uses.
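
If you manage the security group from the CLI, the inbound rules can be added with commands like the following; the security group ID and source CIDR are placeholders, and you should restrict the source to your own client address.

$ aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol tcp --port 8443 --cidr 203.0.113.10/32
$ aws ec2 authorize-security-group-ingress --group-id sg-0123456789abcdef0 --protocol udp --port 8443 --cidr 203.0.113.10/32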

The Windows installation is straightforward: start the msi file, select Next at each step and voilà. It was installed in less time than it took me to write this sentence.

The installation on Linux deserves a bit more care. Amazon Machine Images (AMI) for EC2 servers don’t include any desktop or graphical components. As a prerequisite, I had to install the X Window System and a window manager, and configure X to let users connect and start a graphical user interface session on the server. Fortunately, all these steps are well documented. Here is a summary of the commands I used.

# install desktop packages 
$ sudo apt install ubuntu-desktop

# install a desktop manager 
$ sudo apt install gdm3

# reboot
$ sudo reboot

After the reboot, I installed the DCV server packages:

# Install the server 
$ sudo apt install ./nice-dcv-server_2024.0.17794-1_amd64.ubuntu2404.deb
$ sudo apt install ./nice-xdcv_2024.0.625-1_amd64.ubuntu2404.deb

# (optional) install the DCV web viewer to allow clients to connect from a web browser
$ sudo apt install ./nice-dcv-web-viewer_2024.0.17794-1_amd64.ubuntu2404.deb

Because my server had no GPU, I also followed these steps to install the X11 dummy driver and configure X11 to use it.
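
A typical dummy-driver setup looks roughly like this. The package name is the Ubuntu one; the xorg.conf values are illustrative, so use the exact configuration from the DCV documentation.

# install the dummy video driver
$ sudo apt install xserver-xorg-video-dummy

# minimal xorg.conf pointing X at the dummy driver
$ sudo tee /etc/X11/xorg.conf > /dev/null <<'EOF'
Section "Device"
    Identifier "DummyDevice"
    Driver "dummy"
    VideoRam 256000
EndSection
Section "Monitor"
    Identifier "DummyMonitor"
    HorizSync 5.0-1000.0
    VertRefresh 5.0-200.0
EndSection
Section "Screen"
    Identifier "DummyScreen"
    Device "DummyDevice"
    Monitor "DummyMonitor"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Virtual 1920 1080
    EndSubSection
EndSection
EOF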

Then, I started the service:

$ sudo systemctl enable dcvserver.service 
$ sudo systemctl start dcvserver.service 
$ sudo systemctl status dcvserver.service 

I created a user at the operating system level and assigned a password and a home directory. Then, I checked my setup on the server before trying to connect from the client.

$ sudo dcv list-sessions
There are no sessions available.

$ sudo dcv create-session console --type virtual --owner seb

$ sudo dcv list-sessions
Session: 'console' (owner:seb type:virtual)

Once my server configuration was ready, I started the DCV client on my laptop. I only had to enter the IP address of the server and the username and password of the user to initiate a session.

DCV Client: enter the IP address of the server, then the username and password.

On my laptop, I opened a new DCV client window and connected to the other EC2 server. After a few seconds, I was able to remotely work with the Windows and the Ubuntu machine running in the cloud.

Two DCV client windows on macOS, connected to the Windows and Ubuntu servers.

In this example, I focus on installing Amazon DCV on a single EC2 instance. However, when building your own service infrastructure, you may want to explore the other components that are part of the DCV offering: Amazon DCV Session Manager, Amazon DCV Access Console, and Amazon DCV Connection Gateway.

Pricing and availability
Amazon DCV is free of charge when used on AWS. You only pay for the usage of AWS resources or services, such as EC2 instances, Amazon WorkSpaces desktops, or Amazon AppStream 2.0. If you plan to use DCV with on-premises servers, check the list of license resellers on our website.

Now go build your own servers with DCV.

— seb

Hurricane Helene Aftermath – Cyber Security Awareness Month, (Tue, Oct 1st)

For a few years now, October has been "National Cyber Security Awareness Month". This year, it is a good opportunity for a refresher on some scams that tend to happen around disasters like Hurricane Helene. The bigger the disaster, the more attractive it is to scammers.

Fake Donation Sites

Hurricane Katrina was the first event that triggered many fake donation websites. Since then, the number of fake donation websites has decreased somewhat, partly due to law enforcement attention and hopefully due to people becoming more aware of these scams. These scams either pretend to be a new charity/group attempting to help or impersonate an existing reputable charity. People in affected areas need help. Please only donate to groups you are familiar with and who were active before the event.

AI Social Media Posts

I believe these posts are mostly created to gain social media followers, perhaps with the intent to later reel them into some scam. They often post dramatic images created with AI tools or copied from legitimate accounts. Some may just be interested in the monetization schemes social media and video sites offer. Do not amplify these accounts. Strictly speaking, they are not "fake news," but legitimate news sources who go out to take pictures and gather information need the exposure more than these fake accounts. Often, the fake accounts exaggerate the impact of the event and, in some cases, reduce the credibility of legitimate recovery efforts.

Malware

Attackers may use the event as a pretense to trick victims into opening attachments. In the past, we have seen e-mails and websites that spread malware claiming to include videos or images of the event. These attachments turn out to be executables installing malware.

Fake Assistance Scams

In the aftermath of a disaster, organizations often provide financial aid through loans. Scammers will apply for these loans using stolen identities traded online. It may take several months for the victim to become aware of this, often when they face a request to repay the loan. Sadly, there is not much, if anything, you can do to protect yourself from these scams. The intent of the assistance is to be quick and unbureaucratic and to "sort things out later". You may have to prove that someone else used your information to apply for the loan.

"Grandparent Scam"

In this scam, a caller will pretend to be a relative or close friend asking for money. These scams have become more convincing because scammers can often identify individuals in the disaster area and use them as a pretense to extort money. The caller may claim to be the individual (often they use SMS or other text messaging services), or they may claim to represent a police department or a hospital. Do not respond to any demands for money. Notify your local police department. If you are concerned, try to reach out to the agency calling you using a published number (note that Google listings can be fake). Due to the conditions in affected areas, the local authorities may be unable to respond, but your local law enforcement agency may be able to assist. They often have a published "non-emergency" number you can use instead of 911. Individuals in the affected area may not be reachable due to spotty power and cell service availability.

Final Word

Please let us know if we missed anything. A final word on some disaster preparedness items with an "IT flavor":

Broken high voltage power line wire touching cable TV and phone lines.

  1. Have a plan to get out, and if you can get out: get out. You should not stay in the affected area unless you are part of the recovery effort.
  2. Cellular networks fail. Cellular networks tend to work pretty well during smaller disasters, but they need power, towers, and other infrastructure, which will fail in large-scale disasters. Satellite connectivity quickly becomes your only viable option (if you have power). If you have a phone with satellite emergency calling (for example, a recent iPhone), they offer a "demo mode" to familiarize you with the feature.
  3.    If you are lucky enough to already have a Starlink setup, bring the antenna inside before the storm and disconnect the equipment from power to avoid spikes destroying it.
  4. Disconnect as many electric devices from outlets as possible during a power outage (or before power outages are expected). Power outages often come with power spikes and other irregular power events that can destroy sensitive electronics. Do not plug them back in until power is restored and stable.
  5. Even a downed phone or cable TV line can be energized. You may not see the high voltage line that is also down and touches the cable TV line. I took the picture on the right this weekend in my neighborhood of a high-voltage line touching the cable TV and phone line.

 


Johannes B. Ullrich, Ph.D. , Dean of Research, SANS.edu

(c) SANS Internet Storm Center. https://isc.sans.edu Creative Commons Attribution-Noncommercial 3.0 United States License.