Categories
Blog Posts

Why Malware Analysis? Lessons I Learned Through Hands-On Experience

Why I chose my project

When I was first looking through the options for my Senior Capstone Project, the malware analysis project immediately caught my eye. My degree concentration is in cybersecurity, so I thought it would be the most applicable choice for me. It was also the clear choice based on the interests I have developed through my years at Oregon State. My all-time favorite class up to the point of choosing this project was CS 373 – Defense Against the Dark Arts, which introduced anti-malware and computer forensic principles and methodologies, so I saw this project as a natural continuation of those fields.

It turns out that my initial reasoning was correct, and this has by far been my favorite class of my senior year. I have really enjoyed getting hands-on experience analyzing malware, and my knowledge of both what to look for and how to use the various analysis tools has grown substantially. As I reflect on everything I have learned, I can best summarize my insights in five major takeaways.

The five major takeaways

1. VM creation and handling

The very first step of this project (after the planning phase) was to create a VM environment isolated from the host network and then install all the necessary analysis tools, as well as the malware samples, onto it. Until this course, my only experience with VMs was working with machines that came pre-built for whichever course required them.

2. Static analysis is essential for an impactful dynamic analysis

Across the various samples I analyzed this term, I consistently found that the stronger my static analysis was, the easier and more conclusive my findings were in dynamic analysis. This initial phase helped me identify key indicators, like API calls, suspicious strings, and file structure, which guided my approach during the malware's execution. Essentially, static analysis was my guide for what to look for when I finally executed the program and began monitoring its behavior in real time.
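To give a flavor of the kind of static pass described above, here is a minimal string extractor in Python. It is only a toy stand-in for real tools like the `strings` utility; the sample bytes and keyword list are made up for illustration:

```python
import re

def extract_strings(data: bytes, min_len: int = 6) -> list[str]:
    """Pull printable ASCII runs out of raw bytes, like the `strings` utility."""
    pattern = rb"[\x20-\x7e]{%d,}" % min_len
    return [m.decode("ascii") for m in re.findall(pattern, data)]

def flag_suspicious(strings: list[str]) -> list[str]:
    """Flag strings that often hint at malware capability (illustrative keywords only)."""
    keywords = ("http", "CreateRemoteThread", "RegSetValue", "cmd.exe", ".dll")
    return [s for s in strings if any(k.lower() in s.lower() for k in keywords)]

# Example: scan an in-memory blob standing in for a binary sample.
blob = (b"\x00\x01MZ\x90"
        + b"http://198.51.100.7/beacon\x00"
        + b"\xff\xfeCreateRemoteThread\x00junk")
found = extract_strings(blob)
print(flag_suspicious(found))  # the URL and the API name stand out
```

Indicators surfaced this way (a C2-looking URL, a process-injection API name) are exactly the kind of leads that tell you what to watch for once the sample is running.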

3. Malware analysis requires a wide tech stack

I quickly found out when I began analyzing the samples that I always needed to add another tool to get a complete grasp of the software. No single tool or analysis technique will paint a comprehensive picture of a piece of malware's intent.

4. Simulated networks are vital for finding malware intent

Without a tool like FakeNet to simulate network traffic, my isolated VM, being disconnected from the internet, would have been insufficient for analyzing samples that rely on a command-and-control (C2) server. FakeNet mimicked the expected network responses, so the malware behaved as if it had successfully connected to its C2 server, which led to further findings about the malware's intent and functionality.
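The core idea behind a traffic simulator like FakeNet can be sketched in a few lines: answer any incoming connection with a canned response so the sample believes its check-in succeeded. This is a loose, hypothetical illustration of the concept, not how FakeNet itself is implemented:

```python
import socket
import threading

def fake_c2_listener(host: str = "127.0.0.1") -> tuple[str, int]:
    """Start a toy listener that answers one connection with a canned HTTP
    reply, loosely mimicking how a fake-network tool satisfies a C2 check-in."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))  # port 0: let the OS pick a free port
    srv.listen(1)

    def serve() -> None:
        conn, _ = srv.accept()
        conn.recv(1024)  # read the sample's request (contents ignored here)
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nOK")
        conn.close()
        srv.close()

    threading.Thread(target=serve, daemon=True).start()
    return srv.getsockname()

# The "malware" side now believes it reached its C2 server.
addr = fake_c2_listener()
with socket.create_connection(addr) as c:
    c.sendall(b"GET /beacon HTTP/1.1\r\nHost: evil.example\r\n\r\n")
    print(c.recv(1024).decode())
```

Because the sample receives a plausible 200 response instead of a connection failure, it proceeds to its next stage, which is what makes its full behavior observable in the lab.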

5. Documentation is key to a proper analysis

As I investigated the malware samples, I made a habit of documenting everything I found. This proved extremely useful as I continued my study of the sample. I was always able to look back at what I had documented which made tracking patterns within the sample and understanding how everything tied together much easier.

Categories
Blog Posts

The Highs and Lows of My Malware Analysis Tech Stack

At this point in the term, I have gotten to use all of the technologies I installed in my VMware environment through practical malware analysis. In this blog post, I want to share the difficulties and successes I have had throughout the process thus far, though I am sure there are many more to come!

The Most Difficult Technology: VMware Workstation

Ironically, the technology that has caused me the most headaches, and from what I can discern, many members of my group as well, is the very foundation on which our analysis is done: VMware Workstation. Going into this course, my experience with VMware, and VMs in general, was limited to pre-built VMs designed for a particular class I was taking at OSU. This project was totally different in that regard: I had to construct the VM from scratch and install all of the necessary analysis tools onto it. Seems simple enough, right?

Well, that’s what I thought, but let me tell you, I have had many long troubleshooting sessions trying to get everything to work properly. The primary challenge was finding tools compatible with the operating systems my VMs run, and then successfully transferring and installing them onto their respective machines. Since most of the malware analysis tools I am familiar with are much more modern, I had to go through the tedious process of checking whether they had compatible older versions and, if not, finding viable alternatives.

After that headache, I ran into another problem, specifically on my Windows XP VM: VMware Tools doesn’t have an automatic installation process for operating systems older than Windows Vista. This meant I needed a different way of transferring files than I was familiar with. I ended up going the hardware route, loading all the tools I needed onto a USB flash drive and then attaching that flash drive to the VM, where I could install its contents onto the machine.

Despite the initial difficulties, I appreciate the experience of working with VMware. I am much more comfortable navigating the VM-building process now, and I expect significantly less frustration in the future.

The Most Enjoyable Technology: Wireshark

On the flip side, the technology I have enjoyed working with the most so far is Wireshark. Unlike with VMware, I have a lot of experience with Wireshark, having used it in multiple courses at Oregon State as well as in personal research. Because of this, I was able to jump right in without any concerns.

For this project, my primary focus with Wireshark has been analyzing malware behavior and detecting potential command-and-control (C2) traffic. Understanding how malware communicates over a network has proven essential in identifying the threats a sample poses, and for that Wireshark has been the best tool for the job.

How Wireshark Works

Wireshark is a network protocol analyzer that captures network data packets in real time. This allows for deep inspection of a network’s traffic, which can lead to the detection of unusual behavior. In my experience, typical Wireshark usage goes as follows:

  1. Packet Capture – As mentioned above, Wireshark records all of the inbound and outbound traffic on a given network interface.
  2. Filtering & Analysis – Once a capture is complete, depending on the nature of the network, it may contain a very large number of packets. To analyze them, you can use Wireshark’s display filters, which let you narrow the packets down in various ways; for example, you can show only HTTP traffic.
  3. Reconstructing Sessions – When suspicious packets are detected, as in my case during this project, Wireshark offers easy ways of following the traffic to get a full picture. One is following TCP streams, which stitches together all of the packets belonging to a single session to paint a full picture of a given connection.
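The capture → filter → follow-stream workflow above can be mimicked on plain data. This toy model uses hypothetical packet records (not real Wireshark output) to show the filtering and session-grouping steps; grouping by the unordered endpoint pair is a simplified version of what "Follow TCP Stream" does:

```python
from collections import defaultdict

# Hypothetical captured packets: (src, dst, protocol, payload).
packets = [
    ("10.0.0.5:49152", "203.0.113.9:80", "HTTP", b"GET /cmd HTTP/1.1"),
    ("10.0.0.5:49153", "10.0.0.1:53",    "DNS",  b"evil.example A?"),
    ("203.0.113.9:80", "10.0.0.5:49152", "HTTP", b"HTTP/1.1 200 OK"),
]

def display_filter(pkts, proto):
    """Step 2: keep only packets of one protocol, like a Wireshark display filter."""
    return [p for p in pkts if p[2] == proto]

def follow_streams(pkts):
    """Step 3: stitch packets into sessions keyed by their endpoint pair,
    a much-simplified stand-in for Wireshark's 'Follow TCP Stream'."""
    streams = defaultdict(list)
    for src, dst, _, payload in pkts:
        streams[frozenset((src, dst))].append(payload)
    return {" <-> ".join(sorted(k)): b" | ".join(v) for k, v in streams.items()}

http_only = display_filter(packets, "HTTP")
print(follow_streams(http_only))  # one session: the request and its reply joined
```

Seeing the request and its response side by side is what turns a pile of packets into a readable conversation, which is exactly why stream-following is so useful when hunting for C2 traffic.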

Final Thoughts

While each technology in my malware analysis stack has had its learning curve, I am glad to have the opportunity to sharpen my skills and gain a better understanding of how malware analysis is done in a hands-on way. I look forward to moving forward with this project and honing my skills with these technologies.


Clean Code, Clean Workflow

After reading through the first chapter of Robert Martin’s book, Clean Code: A Handbook of Agile Software Craftsmanship, I came away with a few key takeaways that I would like to highlight, which may prove useful to programmers, particularly those who strive to write code that is not only functional but also maintainable and scalable.

Code degradation – A personal shortcoming

One area I seek to improve on as I continue my programming journey is what Martin calls “The Boy Scout Rule”. The Boy Scouts have a rule to “leave the campground cleaner than you found it”, and Martin applies the same rule to writing code. He argues that code needs to be kept clean over time, and that it is a common problem to see code degrade as a project ages.

In past courses here at OSU, I have had a couple of projects that lasted most of the term. I started off great, adhering to all the principles of code cleanliness I knew at the time. However, as the term progressed and deadlines loomed, my code got sloppier as I resorted to quick fixes while ignoring the root causes of issues.

My biggest problem area is the length of my functions. I will often continuously add to a single function rather than break it down into smaller, more manageable pieces. I found a small example of this in my coding project from my Software Engineering I class:

This is a brief snippet of a damage calculator I was making for a video game I play. To avoid clogging this post with code, I omitted two other equip functions, armor and ring, which both had the same error messages. Looking back, I could have made my code much cleaner by creating an equip_item function that the equip_weapon, ability, etc. functions passed their information to, saving a lot of unnecessary repetition.
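The original snippet did not survive into this post, so here is a hypothetical reconstruction of the pattern being described: several per-slot equip functions that repeated the same validation and error messages, collapsed into one shared equip_item helper. All names (Character, equip_item, the item fields) are illustrative, not the actual project code:

```python
class Character:
    def __init__(self):
        self.slots = {"weapon": None, "ability": None, "armor": None, "ring": None}

    # Before the refactor, equip_weapon, equip_ability, equip_armor, and
    # equip_ring each duplicated this validation. Now one helper owns it.
    def equip_item(self, slot: str, item: dict) -> str:
        if slot not in self.slots:
            return f"Error: unknown slot '{slot}'."
        if item.get("type") != slot:
            return f"Error: {item.get('name', 'item')} cannot go in the {slot} slot."
        self.slots[slot] = item
        return f"Equipped {item['name']} in the {slot} slot."

    # Thin wrappers keep the original call sites working.
    def equip_weapon(self, item): return self.equip_item("weapon", item)
    def equip_armor(self, item):  return self.equip_item("armor", item)

c = Character()
print(c.equip_weapon({"type": "weapon", "name": "Iron Sword"}))
print(c.equip_armor({"type": "ring", "name": "Gold Ring"}))  # wrong slot -> error
```

The payoff is exactly the Boy Scout Rule in miniature: when an error message needs to change, it changes in one place instead of four.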

Moving Forward

Reflecting on my past code after reading about Martin’s Boy Scout Rule, I felt a “click” as I recognized some of the shortcomings of my previous programming experiences. The idea of leaving the code base cleaner than I found it makes a lot of sense and gives me a visible goal to work towards as I write code.

References

Martin, Robert C. Clean Code: A Handbook of Agile Software Craftsmanship. 1st edition, Pearson, 2008.


Paving the way for Controlled Chaos

Over the last few weeks, since my last blog post, my group and I have been hammering away at building a solid virtual environment: one that is isolated from our host systems, yet appears authentic to any malware thanks to simulated traffic from other virtual machines on the same network.

Setting Up the Lab

Before we can analyze any malware, we want to ensure that no harm comes to our host machines or our home networks. The first step is creating a virtual environment via a hypervisor, in our case VMware. These environments are often called “detonation chambers” or dynamic execution environments because they provide a safe, controlled space to execute and observe malware and better understand its behavior.

To lay the groundwork for this detonation chamber, we created a Windows XP VM and cloned it, producing a second identical machine. We then set both of their network adapters to VMnet7, which prevents any communication between the virtual and host networks. We tested this from CMD by pinging our host machines from the VM, which showed no connection, and then pinging the cloned machine, which showed that the two VMs could communicate with each other.
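The same isolation check we did by hand with ping can be scripted. This sketch uses a TCP connection attempt instead of ICMP (so it needs no privileges), and the two addresses are hypothetical stand-ins for a host machine and the peer VM on VMnet7:

```python
import socket

def reachable(host: str, port: int = 80, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# Hypothetical lab addresses: from inside the isolated VM, the host machine
# should be unreachable, while the cloned analysis VM should answer.
print("host reachable:", reachable("192.168.1.10"))
print("peer VM reachable:", reachable("192.168.70.129"))
```

A connection refusal still proves the host is reachable at the network layer, so a fully isolated lab should see timeouts or "network unreachable" errors, not refusals, when probing the host.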

Building our Arsenal

Now that we had created the virtual machines that will host the malware, our group’s next goal was to build our toolkit by installing a variety of analysis tools on them. While I can’t cover every tool we will be using in this blog post, I will go into some detail on a few of what I think are the most important:

  • FakeNet: Simulates network traffic, which helps trick the malware into believing it has access to the internet.
  • Process Explorer: A system-monitoring tool that lets you examine the files and registry keys the malware interacts with.
  • Wireshark: Captures and analyzes packets, allowing us to read further into what the malware intends to do.

Looking forward

As the current term comes to a close and all our preparatory work is put to the test in the next, I am confident in the groundwork our group has laid: a strong, isolated virtual environment with all the necessary tools in our toolkit. The next term will certainly be exciting as we begin our deep analysis of the malware samples we have selected. Part of me is still a bit nervous, though, as this is my first time working with malware in a more free-flowing environment; my only experience so far has been in a tightly structured course where we were given guides on what to do.


The Project is in Motion

Project Updates & Where We are Now

Wow, this term sure has been flying by. So far this project has consisted of a lot of planning and preparatory work so that we can really get into the heavy stuff next term. I have absolutely no complaints about my group or the project as a whole, and I am satisfied with where we are so far. This last week we put together a draft of our design document, which lays out all the plans and goals for our project and how we intend to complete them.

Seeing our ideas take shape into a more structured plan has felt incredibly rewarding. It’s one thing to have ideas in our heads, but seeing them organized into a roadmap gives my team and me confidence and a mutual understanding that will help us be more unified and efficient as the project continues.

What is to Come

As I mentioned, now that we have a design document, we have a roadmap for the whole project in front of us. In the coming weeks we will be building on the project’s foundation. Our project is unique in that it is not a “coding project” in the same sense as others in this course; it is meant to analyze malware using both static and dynamic analysis strategies. The malware will be examined within a VM network that we will build using VMware Workstation.

The majority of our design focus for V0.0.1 and V0.0.2 will be creating the virtual environment of VMs and downloading the necessary analysis tools for our testing. In our environment setup we will ensure that the various VMs can communicate with one another so as to simulate normal network traffic. Once we have the environment set up, we will download our analysis tools. The full list is in our design document, but to name a few, we will install FakeNet, Wireshark, and Process Explorer.

These tools will allow us to monitor the malware’s behavior from different angles. FakeNet, for example, will simulate network services and capture any outgoing communication attempts made by the malware, giving us insight into its communication patterns. Wireshark will enable us to capture and analyze network packets, allowing us to trace any connections the malware tries to establish. Process Explorer will give us a detailed view of the processes and system resources the malware interacts with, helping us identify suspicious activity at the system level.

Upon the completion of the environment, we should have a strong foundation for the project. With a solid foothold now in place we can look forward to the next term where we will begin our static and dynamic analysis of the malware.


Blog Post #1

About Me

Hello everyone, and welcome to my blog! My name is Skylar Eade, and I am an undergraduate at Oregon State University studying computer science. I live in Albany, Oregon, just 20 minutes from the Oregon State campus. Outside of computer science, one of my biggest hobbies is hiking, which my dog, a German shepherd husky mix, often joins me on.

What initially got me interested in computer science was my childhood obsession with video games. Once I discovered YouTube, I began doing lots of research into how computers work, their different parts, and how games were created on them, so it’s safe to say I have been interested for quite some time.

Favorite Project Options

After browsing through the available projects, I would have to say my top two choices are Malware Analysis and Algorithm Stock Market Trading Strategies. I feel my skills are best suited for these projects, and both are topics that interest me. Malware Analysis really calls out to me, as my concentration is in cybersecurity, so I would love some extra hands-on experience analyzing malware. The Algorithm Stock Market Trading Strategies project also looks quite interesting, with the goal of implementing different trading strategies and algorithms to test and evaluate their efficacy.