Clear, Clean, and Document

The best code has clearly defined variables; has been cleaned up by removing excessive debug statements and commented-out, unused code snippets; and clearly documents its behavior, along with the “what”s and “why”s behind coding decisions. We have all heard these principles stressed throughout our coding education and experience, but when they are not applied, the negative impact on a sprint schedule and resource allocation can be extreme.

This past week I had the opportunity to evaluate some test failures around a chemical solubility model and a solver-type function. The design takes mixed-composition waste as input, uses thermodynamics and reaction kinetics as internal methods per the requirement-driven design specification, and feeds that information, along with the chemical compositions, into a solver function that generates a convergent recipe of chemicals to add to the input tank to produce the desired end product. Normally this type of “meaty” test failure is something I love biting into and reworking. But this code, and the documentation with it, was so cumbersome, so full of “dead” code, and so riddled with convoluted and sometimes circular pathways that I wanted to scream and pull my hair out at least twice every day. This blog started out as an end-of-the-workday rant, but I am done “scream typing” and am ready to learn from these lessons and look logically at some examples below.

  • Clearly Defined Variables

There are so many ways that people create variable names: a limitless combination of letters, numbers, and cases (camel, snake, etc.). But each variable needs to mean something to the reader. I don’t mean the computer that parses the program; I mean the person doing code reviews or trying to debug and/or walk through the code. In the code I was working with this week there were variables named xMin, xMax, xCurrent, xLeft, xRight, xLast, xNext, xRoot1, xRoot2, xfnMin, xfkMin, xNextLast, etc. You can see the pattern the author was using. If these variables were all part of a calculation for “x”, and if each were clearly defined as to what it means or how it is used, this might be OK. But in this instance “x” meant nothing. Left was not related to Right. Next and Last appeared in two separate conditionals that were never related. And these values were all reused continuously through ~300 lines of code. Don’t do this. Make the variables relatable. Make the variable names mean something. Max and Min were the only two remotely close to being related and also used how one might expect. Decoding these variable names ended up taking three people and lots of whiteboard time to break the code into understandable pieces.
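
To make the point concrete, here is a minimal sketch of what this looks like in practice. The real code is not mine to share, so this assumes a bisection-style root finder (which names like xLeft/xRight/xRoot suggest); both the before and after versions are illustrative, not the actual solver.

```python
# Before: names like the ones in the code I debugged. What is xMin here --
# a lower bound on x, or a tolerance? You cannot tell without tracing it.
def solve(f, xLeft, xRight, xMin):
    while xRight - xLeft > xMin:
        xCurrent = (xLeft + xRight) / 2
        if f(xLeft) * f(xCurrent) <= 0:
            xRight = xCurrent
        else:
            xLeft = xCurrent
    return (xLeft + xRight) / 2

# After: the same routine with names that carry their meaning.
def find_root_bisection(f, lower_bound, upper_bound, tolerance):
    """Find a root of f on [lower_bound, upper_bound] by bisection."""
    while upper_bound - lower_bound > tolerance:
        midpoint = (lower_bound + upper_bound) / 2
        if f(lower_bound) * f(midpoint) <= 0:
            upper_bound = midpoint   # sign change in lower half: root is there
        else:
            lower_bound = midpoint   # otherwise the root is in the upper half
    return (lower_bound + upper_bound) / 2
```

Both functions behave identically, but only one of them can be reviewed without a whiteboard.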

  • Clean Code

As the input to the solver function mentioned above, the calling methods copy the chemical composition of a single tank and manipulate that copy with reactions, temperature partitioning, and phase changes from liquid to solid (for example, the settling of precipitates from the reactions). The correlations used to do all of this come from various temperature- and composition-dependent calculations based on the calculation used for the final tank end product. Over the 20+ years I have worked at my company there have been 18 different correlations, each manipulating copied tank compositions, and sometimes making copies of copies. All 18 were STILL in the code I was working through this week. Some were even circular, calling a correlation with a different input to get a different pathway through the muck. Ninety-five percent of this was dead code. Only it wasn’t technically dead, because a conditional was used to throw the dead code into one branch and the current correlation into the other. In the end we were able to cut these 18 correlations down to 4, with a very linear and clearly defined pathway. Clean up your old code. If something isn’t going to be used, get rid of it. Nowadays there is so much storage, so many backups, and so many ways to go back and retrieve a bit of code, if for some reason you couldn’t recreate it better by writing it anew.

  • Document With Comments

Comments – none of the code mentioned above had any. You might have expected a lot of unneeded commenting to go along with all the circular references and legacy code, but there wasn’t any. And back when most of these correlations were first drafted, the requirements documentation was severely lacking. The hours spent at the whiteboard trying to understand what the code was intended to do, what our client needed it to do, and what to cut out or leave in could have been used elsewhere. Use comments smartly and clearly, without big explanations; those belong in requirements, user stories, or the other tools for documenting such things. Give simple descriptions of the what: “Used to generate a lower limit for the ALOH3 target molarity”, or “Pathway for ALOH4 dependent on sodium convergence of root1 and root2”. Maybe these are not the best descriptions for this class, but anyone looking at the chemistry of the tank will understand that ALOH3 is being adjusted toward some target, that the target depends on sodium, and that the end product will be ALOH4 at root1 or root2. Comments, and documentation in general, matter.
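
Here is roughly what those two example comments might sit on top of. The functions, coefficients, and thresholds are hypothetical stand-ins (the real chemistry is not something I can reproduce here); the point is that one short “what” line, plus a one-line “why” where a decision is made, is enough.

```python
def aloh3_lower_limit(sodium_molarity):
    # Used to generate a lower limit for the ALOH3 target molarity.
    # Why: the solver needs a floor so the recipe cannot undershoot the target.
    return 0.1 * sodium_molarity  # illustrative coefficient

def aloh4_pathway(root1, root2, sodium_molarity):
    # Pathway for ALOH4 dependent on sodium convergence of root1 and root2.
    if abs(root1 - root2) < 1e-6:
        return root1  # roots converged: one pathway, either root will do
    # Why: when the roots differ, sodium level decides which root is physical
    return root1 if sodium_molarity > 1.0 else root2
```

Two lines of comment per function, and a reviewer at least knows what to compare the code against.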

So what were my big ‘lessons learned’ this week? I learned that each time I sit down to write code, I should plan the layout of the program so it stays clear of cluttered comments and junk code. I learned that even programs that may rarely be looked at again need clear variable names, so a reader can understand how the input parameters, internal methods, and structure work together. Without this, code tracing and debugging waste time that would never have been needed to identify the bugs. And lastly, I learned that even when something seems clear to me, when I think a coding task is trivial or intuitive, or when I assume anyone with a similar background will easily follow along, comments are still the best way to explain a bit of code to future developers. Always strive to comment and document the functionality, intent, and reasoning, even if it is only one line with a requirement number or a bug ticket number; anything that can be used to identify it later is part of following good coding practices.

Adjusting To The Unexpected

Last week I talked about finding the right tool for the job. This week I am going to talk about adapting to unexpected behavior in our processes. I strongly believe the first step to this is avoiding hard set paradigms with regards to software development. Twice, in the last couple of weeks, I have been exposed to opportunities to adapt to unexpected behavior. The first of these opportunities was in my job and is not a success story. I am going to contrast that with an opportunity to adjust in my Senior Project team.

For many years my company has been running a customizable simulation software package that captures a relatively large amount of data. For perspective, approximately 200 GB of data is usually generated per simulation to capture what is needed for acceptance testing. The unexpected behavior we observed was that our simulation tool would seemingly abort toward the later portion of the simulation. All the great minds got together and decided that the simulation tool must be bad and that we needed to yank it out of acceptance testing. It took almost an entire extra sprint to evaluate the code looking for coding errors, memory leaks, and excessive function calls. There were lots of passionate discussions, and our processes came under high scrutiny. But no issues were found. The answer instead was to run several simulations, each writing out a portion of the data, so we could package it all up and give it to our customer for their final acceptance testing. Today, about an hour before quitting time, after we had already shipped it to the customer, I realized what the problem was. This release of the simulation tool had several significant requirement changes that pushed the simulation from a 60-year simulation to a 130-year simulation. This was expected. But what had never crossed anyone’s mind through all of our established development processes was: “Can our process for data collection and extraction handle this much larger volume of data?” We had grown hard set in our beliefs about our simulation tool and how it behaved. We did not adapt well to the unexpected behavior, and it cost us a lot of time, much of which was spent running in circles and, at times, throwing blame and questioning the capabilities of our entire team.

Opposite to that was the opportunity my Senior Project team had to evaluate and adjust to an unexpected behavior. Before I describe it, I should mention that my partner was the one who caught this and immediately began adapting. For our project we are parsing PDF files for specific wine data: for example, the percent alcohol, winery name, type and year of wine, how it was made, etc. Each of these PDFs is like a flyer for a given type/year of wine. Within a single winery the format and fonts are usually fairly consistent, but from one winery to the next there are often no similarities. As an example, some may say “ALC 13.0%”, while another winery might say the same thing as “… thirteen percent alcohol by volume”. For most of the customer-provided PDFs, extracting the information was easy by taking advantage of the spacing on the PDFs. But for some things this didn’t work. Instead of wrestling with the code to try to force it to work, my partner did three things that impressed me: first, he stepped back to define the problem; then he solicited feedback (from the customer, me, the internet, etc.); and lastly, he looked at possibly using a different approach altogether to address the problems.
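
To give a feel for the “redefine the problem” step: instead of fighting the PDF layout, you can normalize the extracted text and match several formats for the same fact. This is a sketch of that idea, not our project’s actual code; the patterns and the word-to-number table are assumptions, using the two alcohol formats quoted above.

```python
import re

# Illustrative: only the number words we have seen so far; extend as needed.
WORD_TO_NUMBER = {"twelve": 12.0, "thirteen": 13.0, "fourteen": 14.0}

def extract_abv(text):
    """Return alcohol-by-volume as a float, or None if no format matches."""
    text = text.lower()
    # Format 1: "ALC 13.0%" or "alcohol: 13%"
    match = re.search(r"alc(?:ohol)?\D{0,3}(\d{1,2}(?:\.\d)?)\s*%", text)
    if match:
        return float(match.group(1))
    # Format 2: "thirteen percent alcohol by volume"
    match = re.search(r"(\w+)\s+percent\s+alcohol", text)
    if match and match.group(1) in WORD_TO_NUMBER:
        return WORD_TO_NUMBER[match.group(1)]
    return None
```

Each new winery format becomes one more pattern, rather than another round of wrestling with spacing.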

Maybe the first example feels heavy to me because I lived the experience, whereas the second I was able to observe. Whatever the reason, I feel like I learned a valuable lesson. I know there are several factors at play in the different outcomes of these two cases: the length of time spent locked into a way of thinking, the size of the project, and the flexibility to try new tools, seek outside help, and adjust. Whatever the differences between them, having had both experiences and reflecting on them has really highlighted the importance of striving to always be adaptable, being willing to step back and look at a problem from multiple perspectives, and considering more than just the obvious design elements that might be impacted.

Big Bites With a Little Spoon

Having the right tool, or set of tools, for a job is critical for minimizing frustration and maximizing production and efficiency. The first time I remember being taught this was when I started learning to shovel snow. I say learning because it was a process, and sometimes that process was very frustrating and taught me a lesson. My grandpa had sent me out to shovel his driveway with a flat-nose shovel. It did OK, but it took a long time to scoop, toss, step forward… over and over again. I didn’t realize it at the time, but he had only intended me to use the flat-nose shovel to loosen the compacted parts left by the car driving through. He had set out the regular snow shovel, but I didn’t bother to stop and ask why. It just never occurred to me to consider swapping tools for the different parts of the job.

When I was almost done, my grandpa came back out, asked me why I was using the little flat-nose shovel instead of the large snow-pusher type, and showed me the difference between the two. Both shovels did the job, but they did not do the job equally. I very much felt like I had been trying to take big bites with a little spoon. As an adult I have three snow shovels: a flat, heavy-duty one for breaking up compact snow and ice; a lighter one with a long handle for pushing a few inches of snow from one side of the driveway to the other; and a scoop shovel with a special tension handle for scooping and throwing deep snow quickly. I appreciate the lesson my grandpa taught me about using the right tool; my knees and back appreciate it more the older I get. Like shoveling snow, picking the right tools for a software development project is important for optimizing production and efficiency and minimizing frustration.

Asana is the project management tool my group is using to track tasks, schedule resources, plan sprints, and manage the overall workflow of our Wine Data Lake senior project. Jira is another project management tool, one I use daily in my job, where it schedules sprints, manages tasks, tracks bugs, and handles overall issue tracking and project workflows. Both products sound similar, but they are very different in usability and user experience. This may be directly correlated to the specifics of how each is used or, as in the snow-shoveling situation, to the size and type of the job.

Before getting too far into using Asana, I recognized right away how easy it is to use. I downloaded the desktop version and within an hour had several tasks scheduled and notifications set. I was excited by how easy it was to change items, set dependencies, modify the appearance, and set up and filter notifications. We are following a very basic workflow. Conversely, the way I use Jira in my job involves different workflows for every type of task, bug, story, etc. Time estimates are based on a point system; some tasks overlap in points, and some are tracked from start to finish. Each ticket has a parent epic and belongs to a very specific project, and each of these things can be linked to each other, to a person, to a requirements repository, to software lifecycle processes and control points, etc. I don’t think I can fully describe the massive spiderweb of information and controls being managed with the Jira software, but I hope I have given a feel for the way I am using both tools. I am using the free basic version of Asana; Jira also has a free version, but I am using the enterprise version at work. Both tools have strengths and weaknesses. While I would love to use something like Asana for work, I have come to appreciate that the two tools are better suited to different jobs.

The important thing I have taken away from this is to look around online, check reviews, weigh strengths and weaknesses, solicit feedback from other users, and maybe play around with free versions of software tools before committing to one for a project. When we first started discussing tools for this senior project, I immediately leaned toward “use what you know” and pushed for Jira. But I am glad my group member suggested and pushed for Asana, because Jira is like a snowblower: it works great for clearing driveways and sidewalks when the snow is deep, but not very well when there is just a little snow and/or ice. Conversely, the way we are using Asana, it can handle the ice and the small volume of snow; it is, essentially, the perfect-sized spoon for the perfect-sized bite.

My Love For Computer Science

Welcome back. Project assignments have been made and my teammate and I are in the beginning phases of creating a Project Plan. While the project is getting off the ground, I wanted to take this opportunity to talk about my love affair with computer science. As is the case with many gamers and programmers, my love affair with computer science began with video games.

When I was eight my family was relocating, and we stayed in my aunt and uncle’s basement for a little while. My cousin, Robert, and I were the same age but very different. I liked sports, building forts, exploring, and being outside as much as I could. He liked books, computers, and science experiments. All his trinkets and gadgets bored me, all except his newly released Atari 2600. It was kept downstairs, right outside my room, and it called to me. At first, when my cousin played, he would offer to let me play too. I declined, but I was curious. Eventually he quit asking, and my curiosity grew stronger. It sounded cool. It looked cool. But I would rather be outside tromping through the dirt and building forts than playing a game on the TV. Eventually I started waking up in the middle of the night to play it. I quickly became hooked, and my cousin and I eventually became friends. He was the one who taught me what a computer was and what a computer programmer did. The following summer I wrote in my journal, “When I grow up I want to be a PE Teacher and a Computer Programmer”. It’s funny to look back on the writings and thoughts of 8-year-old me, but that kid definitely had some strong ideas.

Through the years, more video games fed my interest: Pac-Man, Frogger, Tron, even the pinball machines at the local pizzeria fascinated me. My first exposure to a real computer was in high school. It was 1989, and I was in an advanced math class when my teacher approached me and asked if I might be interested in learning computers; there weren’t very many girls interested, and I could be a leader and draw more people to her computer classes. I agreed, and the following quarter I was coding in BASIC and FORTRAN. My first year of college I took a couple more programming classes (some derivatives of BASIC and FORTRAN), but then had to drop out because I was living on my own and couldn’t afford tuition. Nor did I have the discipline to do the hard work. Instead, I spent a lot of time playing Nintendo games and working part-time.

That was almost exactly 30 years ago. I eventually did go back to college, but I pursued an engineering degree because my grandpa was an engineer and told me I couldn’t do it. That was some serious reverse psychology, but it worked. In college I developed a passion for programming and for using simulation tools to solve problems. And that is as far as my love affair with computer science took me, until I started the online program here at Oregon State University. It is surreal to me that my 8-year-old self knew, and that it took me another 30 years to fulfill that childhood dream. I’m excited for this Capstone Course, excited for my Senior Project (which I will discuss next week), and very excited to graduate and complete a degree in Computer Science.