Clean Code, Clean Workflow

After reading the first chapter of Robert Martin’s book, Clean Code: A Handbook of Agile Software Craftsmanship, I came away with a few key takeaways I would like to highlight, which may prove useful to programmers, particularly those who strive to write code that is not only functional but also maintainable and scalable.

Code degradation – A personal shortcoming

One area I seek to improve as I continue my programming journey is what Martin calls “The Boy Scout Rule”. The Boy Scouts have a rule to “leave the campground cleaner than you found it”, and Martin applies the same rule to writing code. He argues that code must be actively kept clean, because it is all too common to watch a codebase degrade over time.

In past courses here at OSU, I had a couple of occasions where a project lasted the whole term. I started off great, adhering to all the principles of code cleanliness I knew at the time; however, as the term progressed and deadlines loomed, my code grew sloppier as I resorted to quick fixes while ignoring the root causes of issues.

My biggest problem area is the length of my functions. I will often continuously add to a single function rather than break it down into smaller, more manageable pieces. I found a small example of this in my coding project from my Software Engineering I class; here it is:
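I’ve swapped in generic stat and item names here, but the shape is the same: each equip function repeats the same validation and error messages.

    # Simplified sketch with generic names, not the original code.
    def equip_weapon(character, weapon):
        if weapon is None:
            print("Error: no weapon selected.")
            return
        if character["level"] < weapon["required_level"]:
            print("Error: level too low to equip this weapon.")
            return
        character["weapon"] = weapon
        character["attack"] += weapon["attack"]

    def equip_ability(character, ability):
        if ability is None:
            print("Error: no ability selected.")
            return
        if character["level"] < ability["required_level"]:
            print("Error: level too low to equip this ability.")
            return
        character["ability"] = ability
        character["power"] += ability["power"]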

This is a brief snippet of a damage calculator I was making for a video game I play. To avoid clogging this post with code, I omitted two other equip functions, armor and ring, both of which had the same forms of error messages. Looking back, I could have made my code much cleaner by creating an equip_item function that was passed its information from the equip_weapon, ability, and other equip functions, which would save a lot of unnecessary repetition.
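Roughly, that refactor might look something like the sketch below; the slot and stat names are again hypothetical:

    def equip_item(character, item, slot, stat):
        # Shared validation and error messages for every equippable item.
        if item is None:
            print(f"Error: no {slot} selected.")
            return
        if character["level"] < item["required_level"]:
            print(f"Error: level too low to equip this {slot}.")
            return
        character[slot] = item
        character[stat] += item[stat]

    # Each equip function becomes a thin wrapper around the helper.
    def equip_weapon(character, weapon):
        equip_item(character, weapon, slot="weapon", stat="attack")

    def equip_armor(character, armor):
        equip_item(character, armor, slot="armor", stat="defense")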

Moving Forward

Reflecting on my past code after reading about Martin’s Boy Scout Rule, I felt a “click” as I realized some of the shortcomings of my previous programming experiences. The idea of working to leave the code base cleaner than I found it makes a lot of sense, and it gives me a visible goal to work toward as I write my code.

References

Martin, Robert C. Clean Code: A Handbook of Agile Software Craftsmanship. 1st edition, Pearson, 2008.

Paving the way for Controlled Chaos

Over the last few weeks, since my last blog post, my group and I have been hammering away at developing a solid virtual environment: one isolated from our host systems, yet seemingly authentic to any malware thanks to simulated traffic from other virtual machines connected to the same network.

Setting Up the Lab

Before we analyze any malware, we want to ensure that no harm can be done to our host machines or spread across our home networks. The first step is creating a virtual environment via a hypervisor, in our case VMware. These environments are often called “detonation chambers” or dynamic execution environments, because they provide a safe, controlled space in which to execute and observe malware and better understand its behavior.

To lay the groundwork for this isolation chamber, we built a Windows XP VM and cloned it, giving us a second identical machine. We then set both of their network adapters to VMnet7, which prevents any communication between the virtual and host networks. We tested this from CMD by pinging our host machines from a VM, which showed no connection, and then pinging the cloned machine, which showed that the VMs could communicate with each other.
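Scripted, that isolation check might look something like the sketch below. The two addresses are placeholders for our host machine and the cloned VM, and -n is the Windows ping flag for how many echo requests to send:

    import subprocess

    def can_reach(host: str) -> bool:
        # Send one Windows-style ping; checking for "TTL=" guards against
        # Windows reporting success on "destination unreachable" replies.
        result = subprocess.run(["ping", "-n", "1", host],
                                capture_output=True, text=True)
        return result.returncode == 0 and "TTL=" in result.stdout

    # Placeholder addresses; substitute the real host and clone IPs.
    print("host reachable:", can_reach("192.168.1.10"))  # expect False when isolated
    print("clone reachable:", can_reach("10.0.0.12"))    # expect True on the same VMnet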

Building our Arsenal

Now that we had created the virtual machines that will host the malware, our group’s next goal was to build our toolkit by installing a variety of analysis tools on them. While I can’t cover all of the tools we will be utilizing within this blog post, I will go into some detail on a select few that I think are the most important:

  • FakeNet: Simulates network services and traffic, which helps trick the malware into believing it has access to the internet.
  • Process Explorer: A system-monitoring tool that lets you examine the files and registry keys the malware interacts with.
  • Wireshark: Captures and analyzes network packets, allowing us to read further into what the malware intends to do (a rough sketch of this kind of triage follows this list).
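As a small example of the packet triage a capture enables, the sketch below summarizes where a sample tried to call out. It assumes the scapy library is installed and that a run was already exported as malware_run.pcap; both the library choice and the file name are my own assumptions, not a fixed part of our workflow:

    from collections import Counter
    from scapy.all import DNS, DNSQR, IP, rdpcap

    # Hypothetical capture file exported after letting the sample run.
    packets = rdpcap("malware_run.pcap")

    # Tally the destination IPs the sample contacted.
    destinations = Counter(pkt[IP].dst for pkt in packets if IP in pkt)

    # Tally the domains it looked up, which often hint at C2 infrastructure.
    lookups = Counter(
        pkt[DNSQR].qname.decode(errors="replace")
        for pkt in packets
        if DNS in pkt and pkt.haslayer(DNSQR)
    )

    print("Top destinations:", destinations.most_common(5))
    print("Top DNS lookups:", lookups.most_common(5))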

Looking forward

As the current term comes to a close and all our preparatory work is put to the test in the next, I am confident in the groundwork our group has laid in setting up a strong, isolated virtual environment with all the necessary tools in our toolkit. The next term will certainly be exciting as we begin our deep analysis of the malware samples we have selected, but part of me is still a bit nervous, as this is my first time working with a piece of malware in a more free-flowing environment; my only experience thus far has been in a tightly structured course where we were given guides on what to do.