Reinventing the wheel

I made dis

Just like that, we’re already wrapping up week 8 of our 10-week (more like 9-week) project. We’ve made pretty good progress on Asteroid Crusaders, and although we’re not exactly 80% done, next week still holds great promise for productivity and I’m feeling a little reflective. While not the biggest project I’ve worked on, Asteroid Crusaders may rank among the more complex, with many different parts that have to come together and some mind-bending simultaneous programming in a curly-brace-and-semicolon language next to a whitespace language. Part of the complexity arose because we had to come up with the overall design of the project in a relatively short amount of time and then had a fairly tight time budget to get it all implemented. In other words, there was no time to step back and reevaluate the approach. And that’s how I wound up doing something I knew I would never want to do: write a game in JavaScript.

One of the interesting things about programming is that at any given point in the coding process we are redoing something that has already been done a thousand times before, and likely done better. There is nothing new in Asteroid Crusaders, especially regarding the theme, and the implementation is just the combination of tons of little Stack Overflow posts (noted in the code where reliance was particularly heavy). It’s only in the sum of the parts that the originality comes into focus, hopefully. And now that I have the great advantage of hindsight, I know one thing for certain: I should have leveraged much more of that sweet code perfected by someone else.

the evolution

I can’t even begin to enumerate the code written by others that went into our version of Asteroids, starting with Python and JS and going all the way down into multitudes of libraries. But there are some things I did myself: I wrote a WebSocket server in Python using TCP sockets. I wrote a (very simple) physics simulation in Python. I wrote a game in JavaScript.
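
For the curious, the fiddliest part of writing a WebSocket server on top of plain TCP sockets is the opening handshake, where the server hashes the client’s Sec-WebSocket-Key to prove it actually speaks the protocol. Here is a minimal sketch of just that step in Python – illustrative only, not the actual Asteroid Crusaders server code:

import base64
import hashlib

# Magic GUID from RFC 6455; every conforming server appends this to the
# client's Sec-WebSocket-Key before hashing.
WS_GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"

def accept_key(client_key: str) -> str:
    """Compute the Sec-WebSocket-Accept value for a handshake response."""
    digest = hashlib.sha1((client_key + WS_GUID).encode("ascii")).digest()
    return base64.b64encode(digest).decode("ascii")

# Sample key/response pair straight out of RFC 6455:
print(accept_key("dGhlIHNhbXBsZSBub25jZQ=="))  # s3pPLMBiTxaQ9kYGzzhZRbK+xOo=

Echo that value back in the HTTP 101 response and everything after that is just reading and writing frames on the same socket.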

At the start of the project we made a significant decision on how to implement the WebGL layer. We had the option of using raw WebGL and all that fun linear algebra or using a library. We went with PIXI JS, a 2D-focused WebGL library. I think it is actually written in TypeScript, which makes a lot of sense. But I started using it with JavaScript, and by about week 3, when I started to realize this was a Bad Idea, there was no time to go back.

We are supposed to spend about 10 hours per week on this project. Today I ran into a bug, client side. I was trying to add a new feature, which involved some changes that broke a previously working feature. After some time I found that one specific line of code was responsible for the break – I could comment it out and things would work fine, or I could leave it in and the problem would arise. The problem was that this line of code made no sense.

5 hours later

Maybe you can see where this is going. That line of code was indeed broken, but it was also silently stopping the rest of the function from executing, which is what caused the bug. Thanks, JavaScript, for your ‘stay alive at all costs’ attitude. And that’s why next time I do this I am going to use Unity – AAA game engine, built-in WebGL export, statically typed scripting, built-in networking.

What is this going to cost, anyway?

Ain’t nothing cheap about spaceships

As I wrap up this degree with the final capstone course (that I hopefully won’t have to retake), I find myself asking the exact same question I did when I was applying to the program – how much will it cost? This time I’m not talking about tuition – that last payment was four days ago and I’ll never think of it again – I’m talking about the cost of the cloud.

AsteroidCrusaders runs on AWS. So far we have three EC2 t2.micro instances (front end, back end, game server), a hosted zone, and some tables in DynamoDB. SSL certs are free these days if you’re willing to do a little maintenance every three months. There are a few other random costs like Secrets, but those only add up to a few cents. Our biggest single expense was $12 for asteroidcrusaders.com, which I think covers a year. So, if we didn’t have a free tier account, all of that would probably add up to 10-20 bucks per month.

But what happens if people actually play the game? We will have to leverage more of everything from AWS. And at some point, we’ll start to care about how much the infrastructure costs per user. We would need that number (hypothetically) to determine how many ads we have to saturate the page with or how much to charge for the very fine privilege of flying an orange triangle around instead of the boring old green one. There is plenty that goes into these calculations that I have no ability to predict right now, much like the memory leaks in my game code clogging up our EC2 game servers. Hopefully our EC2 usage efficiencies will go up in the near future. There is one cost, however, that I haven’t yet mentioned which will scale in a very predictable way: data transfer cost.

As we come up on the midpoint demonstration, AsteroidCrusaders runs all simulations on the server. Why? Because that makes it simple to synchronize all the clients, and we need something to demonstrate. So 30 times per second, the game server collects all of the inputs from the connected players, does some trigonometry that had me reaching for my 8th grader’s math book, and feeds back to every connected client the new positions of all the triangles in space. What the game server does not do yet is check for collisions. We’re only at the course midpoint, after all.
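
To make that concrete, here is roughly the shape of one server tick in Python. The names, data structures, and constants are illustrative stand-ins, not our actual game server code:

import math

TICK_RATE = 30  # server updates per second

def tick(objects, inputs, dt=1.0 / TICK_RATE):
    """One server tick: apply player inputs, then advance every object.

    objects maps object id -> {"x", "y", "rotation", "speed"};
    inputs maps player id -> {"turn": -1, 0, or 1, "thrust": 0 or 1}.
    """
    for player_id, control in inputs.items():
        ship = objects[player_id]
        ship["rotation"] += control["turn"] * 0.1       # radians per tick
        ship["speed"] += control["thrust"] * 5.0 * dt   # crude acceleration

    for obj in objects.values():
        # the 8th-grade trigonometry: project speed along the heading
        obj["x"] += math.cos(obj["rotation"]) * obj["speed"] * dt
        obj["y"] += math.sin(obj["rotation"]) * obj["speed"] * dt

    return objects  # this state gets serialized and broadcast to every client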

The point is that AWS is going to charge us for each of the 30 position updates per second sent from the EC2 game server back to the clients. After a free GB on the house, it’s going to cost about $0.10 per GB for a good while until bulk pricing starts to bring that back down. To plug in some numbers: for each object in the game world the game server sends two 32-byte numbers for the x and y coordinates, 8 bytes for the object id, a byte for its rotation, and perhaps 10 other bytes for random flags (we will omit all the bytes for the JSON structure we currently use and pretend we’re better than that). So that is 83 bytes per object in space, sent to each connected client 30 times per second. With 50 asteroids and 3 players, that would be 83 * 53 * 3 * 30 = 395,910 bytes per second leaving AWS and getting charged. If those players play for an hour they will have used 1.43 GB, or $0.14!
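
Here is the same back-of-the-envelope math as a few lines of Python, in case you want to plug in your own numbers (the $0.10 per GB egress rate is the rough figure from above, not a quote from our bill):

# Rough outbound data transfer estimate for the current broadcast scheme.
bytes_per_object = 2 * 32 + 8 + 1 + 10   # x, y, object id, rotation, flags = 83
objects = 50 + 3                          # asteroids plus player ships
players = 3                               # connected clients receiving updates
ticks_per_second = 30

bytes_per_second = bytes_per_object * objects * players * ticks_per_second
print(bytes_per_second)                   # 395910

gb_per_hour = bytes_per_second * 3600 / 1e9
print(round(gb_per_hour, 2))              # 1.43

dollars_per_hour = gb_per_hour * 0.10     # roughly $0.10 per GB out of AWS
print(round(dollars_per_hour, 2))         # 0.14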

OK, we’re not quite at tuition dollar amounts yet, but I’m still glad I took the time to figure this out before seeing the AWS bill (go ahead, ask me about my student loan interest). We have actually taken steps to control this cost, such as making sure a player can’t idle for too long in the game world and dropping the idea of letting players shoot thousands of missiles. Lasers are much cooler anyway.

Two appspec.yml files, one repo

I probably had it coming after my previous post ranting about DevOps.

Among the first tasks I had for our capstone project Asteroid Crusaders was setting up continuous deployment from our GitHub repo to our AWS cloud. The goal was to use GitHub Actions to make any change to the main branch automatically run on our EC2 instance. We aren’t using ECS for this particular case because, as far as we can tell, that would require a load balancer we just don’t need. Hence, we were going to use AWS CodeDeploy. To use CodeDeploy you set up an application and a deployment group that targets servers identified by a tag, and then CodeDeploy follows the instructions in the appspec.yml file in the repository to launch your code. You can trigger that deployment from GitHub Actions and give it a commit hash (once the proper IAM roles are sorted).

Aside: before your EC2 instance can use CodeDeploy, you have to ssh into the instance, install wget, then use wget to download the CodeDeploy agent and install it. Which is ridiculous.

Anyway, appspec.yml files look like this:

version: 0.0
os: linux
files:
  - source: backend/
    destination: /home/ec2-user/backend/
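# hooks tell the CodeDeploy agent which scripts from the repo to run at each deployment lifecycle event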
hooks:
  AfterInstall:
    - location: backend/deploy/setup.sh
      timeout: 300
      runas: root
  ApplicationStart:
    - location: backend/deploy/run.sh
      timeout: 300
      runas: root

The problem with this for Asteroid Crusaders is that we have two EC2 instances, one for the frontend and one for the backend, plus a single Git repo with folders dividing up the code for each – and CodeDeploy reads the file ‘appspec.yml’ at the root of the repo. How does one get a particular script to run on a particular server? A quick trip to Stack Overflow told me the solution is to have two appspec files – e.g. appspec-frontend.yml and appspec-backend.yml – and rename the appropriate one in a job in the CD process before kicking off the corresponding CodeDeploy deployment.

So I happily added my separate appspec files and a ‘mv backend/deploy/appspec-backend.yml appspec.yml’ step to my GitHub Action and thought I would be good to go.

Nope.

The problem with that plan is that CodeDeploy takes a commit hash, and the only hash my GitHub Action knew about was the one that triggered it – my mv command was not committed anywhere. So I had to commit after the mv, and then find the hash of that commit. This is where I learned that GitHub Actions can’t commit to a protected branch, like when you require a pull request to merge into main. So I have to mv the appspec file, then commit to an unprotected branch, then get the hash, then send that hash to CodeDeploy using the slightly obtuse GitHub Actions variable system.

Here is my final, functional GitHub Action for the backend:

name: Backend CI/CD

on:
  push:
    branches: [ main ]
    paths:
    - 'backend/**'
  pull_request:
    branches: [ main ]
    paths:
    - 'backend/**'

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

jobs:     
  setup-repo:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
    
      - name: Pick appspec file
        id: appsec-set
        run: |
          git config --global user.email "GithubCD@nowhere.com"
          git config --global user.name "GithubCD"
          git remote set-url origin https://sntwo:${{ github.token }}@github.com/sntwo/asteroidcrusaders.git
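          # delete any previous deployment branch, recreate it from the triggering commit, and swap in the backend appspec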
          git push -d origin auto-backend-deployment
          git checkout -b auto-backend-deployment
          mv backend/deploy/appspec-backend.yml appspec.yml
          git add appspec.yml
          git commit -m 'production backend deployment'
          git push --set-upstream origin auto-backend-deployment
          git fetch
      - name: get sha
        id: psha-set
        run: |
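          # grab the 40-character commit SHA from the tip of the deployment branch and expose it as a step output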
          echo "::set-output name=psha::$(git log origin/auto-backend-deployment | head -1 | tail -c 41)"
      
      #Push to AWS
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-west-2

      - name: Create CodeDeploy Deployment
        id: deploy
        run: |
          aws deploy create-deployment \
            --application-name backend-server-git \
            --deployment-group-name backend-server-group \
            --deployment-config-name CodeDeployDefault.OneAtATime \
            --github-location repository=${{ github.repository }},commitId=${{steps.psha-set.outputs.psha}}

Dev(st)Ops: a rant

“DevOps is the combination of cultural philosophies, practices, and tools that increases an organization’s ability to deliver applications and services at high velocity: evolving and improving products at a faster pace than organizations using traditional software development and infrastructure management processes. This speed enables organizations to better serve their customers and compete more effectively in the market.” (https://aws.amazon.com/devops/what-is-devops/)

Yes, that’s an outhouse

This post is about DevOps. It begins in Africa.

Burkina Faso is a land-locked country in West Africa on the southern edge of the Sahara. It is a very poor country. After graduating college with a B.S. in general science, which basically meant I had no job prospects but maybe could have gone to medical school if I had better grades, I found myself in Burkina Faso as a Peace Corps volunteer, teaching science. I vividly remember flying into the country for the first time. It was nighttime, and as we felt our jet start the final descent, I looked out the window and saw an enormous patchwork of fires. They were campfire-style fires, not forest fires, and especially not electric city lights as you would expect to see from a plane about to land in the United States. Although dark, it was still early in the evening (Burkina Faso is close to the equator and the sun goes down early), and people were cooking dinner. That’s how poor Burkina Faso is.

After landing, I vividly remember getting off the plane (via stairs), almost being bowled over by the stifling heat, and wondering if it was from the jet engine next to us. It wasn’t. That’s how hot Burkina Faso is.

There are many stories I could tell about how the heat made life miserable, but one stands out for the purpose of this post. Malaria is a huge problem in West Africa. Mosquito nets are supposed to be able to cheaply prevent malaria. Have you tried to sleep under a net when the temperature is over 100°F? It’s horrible. You need all the air circulation you can get. That’s why mosquito nets haven’t solved malaria.

Everyone in Burkina Faso had a mosquito net, of course, which they never used because they were too hot to sleep under. Bill Gates or some other charitable entity had kindly bought a net for everyone. Charity has spent vast amounts of money in Burkina Faso.

Another item charity bought was speed bumps. With poorly maintained vehicles, no rules to speak of, and nonexistent medical care, the mortality rate due to traffic accidents was quite high. Therefore, at some point before I arrived in country, vast strips of speed bumps were installed with charity dollars on the major thoroughfares leading into the capital city of Ouagadougou. Like, hundreds of yards of speed bumps, and they were very sharp, axle-breaking bumps. Anecdotes are not data, but I saw multiple accidents at these speed bumps. There were no signs to mark where they started, and of course there were no street lights. Another item charity dollars bought in Burkina Faso was outhouses.

Today I work for a very large company that has been around for quite some time. I work with ‘big data’ for online marketing. As a marketer, I am not considered to be in a very technical role – we have a DevOps team to help with the technical stuff. What does DevOps do for us marketers? In my opinion, they provide the mosquito nets, speed bumps, and outhouses.

It makes sense to start with the outhouse analogy. I’m not talking about a honeypot to catch hackers. Burkina Faso has next to no potable water or plumbing, and sanitation is an enormous issue. Outhouses really do have the potential to vastly increase the quality of life for people living there. Vast numbers of outhouses were paid for and built by charity dollars. There was one major problem with the implementation: after the contractors that built all of these extremely nice outhouses left, the locals in charge immediately went to each and every one and slapped a padlock on it. They didn’t want the pristine outhouses to be ruined by use.

At work, IT has been building on-prem servers for 50 years and spawned DevOps sometime in the last two decades. But now we have the cloud. A few months ago I provisioned a private Factorio server on AWS for my son and me to play on. It took about 20 minutes to set up and it costs $9 per month, billed to my credit card. At work, it takes about 3 months to get a cloud resource up and running. We aren’t allowed to just do it ourselves – we open a ticket and submit an architecture diagram and then escalate the ticket after it doesn’t get a response in 5 business days and then amend the architecture diagram 5 times and then we do a security review and revise some more, and then finally we get the resource. Guess what happens when the project requirements inevitably change.

So I would argue that in my situation DevOps has put the lock on the proverbial outhouse, and prevents me from doing something I know how to do. I have the DevOps counterparts in my head for speed bumps (obvious) and mosquito nets (maybe slightly less obvious?) but that would take another post. Oh and about DevOps and security? Let’s just say that would be yet another separate blog post, but I’m probably not allowed to post that one.

To be continued . . .

I was a cheat*

* but not in the way that violates the OSU Code of Conduct

My best friend Steven got the original Nintendo system first, basically when it became available in the United States. The NES was groundbreaking – no kid had ever had something that offered this level of gameplay before. When I went over to Steven’s house (and when we weren’t forbidden from being inside), we would play Contra, that game where you would jump through the jungle shooting evil commandos and weird aliens with swinging tentacles. I think we were seven.

up, up, down, down, left, right, left, right, a, b… how does it go again?

One great thing about Contra was that two players could play cooperatively at the same time, which made it that much more awesome. But even with a buddy helping you, there is one thing that still sticks in my mind – Contra was hard. You didn’t have any of these gentle-making modern inventions like ‘HP’. No, take one single-pixel bullet to your character’s hit box and you are out one of your three lives. And did I mention this game was a bullet hell? Certainly no first-grader in today’s gentler generation of gamers would last more than a minute in that environment, and Steven and I probably shouldn’t have been able to either, except for one thing: Steven knew the Konami code.

The Konami code was a combination of inputs, entered in the short time the intro screen was sliding in, that would give you 30 lives. It was also more than that – the Konami code was my first lesson that the mode of operation of a program could be changed in non-obvious ways. It was definitely a cheat, it might have been sort of a hack (although in those pre-internet days these codes were also revenue generators that helped sell gaming magazines), and it definitely wasn’t coding, but it got me into coding, sort of, eventually.

After discovering the Konami code it was only a matter of time before I started looking for hidden ways to gain forbidden power in every video or computer game. In middle school we discovered Doom, and not long after, someone was told by someone else to type in ‘IDDQD’ to get immortality. Not long after that we found out about ‘IDKFA’, which gave you all the keys, weapons, and ammo, and then suddenly we were actually trying to hack the game by typing in random codes to see what we could do. I hadn’t taken CS 225 at that point, so the idea that I just had to systematically try all 26^5 five-letter codes didn’t come up. I don’t think we ever discovered any new codes ourselves.

By that time the internet was a thing, and with the power of search engines we were soon using hex editors to give our saved games infinite money in games like SimCity and Civilization. We had leveled up our cheating and were doing things that I would not understand until I took CS 271.

It’s a mob!

It wasn’t long until the untold power of the hex editor actually made gaming way less fun. Everything was too cheap and easy – even a 7th grader could tell that dominating the simple AI with its 8K of RAM using a hex editor was pretty lame. I might have actually played a game without cheating a few times.

Luckily the rest of the world was marching on in gaming technology, and we got internet multiplayer games in the form of text-based role-playing games (Multi-User Dungeons, MUDs) that normally had anywhere from 1 to 200 players pressing ‘n’, ‘e’, ‘s’, or ‘w’ to navigate between text-paragraph rooms and kill other players as well as non-player characters, which were actually called mobs, which was short for mobile objects, and did I mention that when my son got into Minecraft and started talking about ‘mobs’ for the same reason I almost fainted . . . and I digress.

The point about multiplayer games is that cheating was definitely back on the table. It was way more fun to dominate another human being using fair or unfair methods than the stupid local AI. In my defense, most MUDs, and definitely the only ones I played, were created and run by unpaid volunteers (the ‘Immortals’ or ‘Imms’) whose primary compensation was the ability to program their cheat characters so they could slaughter the rest of us that were merely playing the game. However, because the implementation team was somewhat amateurish (or didn’t care), there were always tons of probably unintended exploits in these games to, well, exploit.

My MUD of choice was called SneezyMUD, and I played it for a few years and might have had some lower grades in college than I should have had due to that game. SneezyMUD eventually shut down due to a huge fight among the staff and player base over, you guessed it, a massive cheating scandal (I had nothing to do with it, I swear), and my grades picked up and I graduated with a chemistry degree. A while later, the head Imm open-sourced the code onto Bitbucket, and another fan of the game fixed up the code, opened up a new server, and kept the development open source on GitHub (https://www.github.com/sneezymud/sneezymud). I found out, started playing the game again, realized the code base was an interesting source of cheat potential and fun history, and started delving into the spaghetti of C++ that had been written over 25 years.

To jump to the end of the story, using the source code of SneezyMUD to my advantage was fun, but much like in the days of using a hex editor on save files, it became unsatisfying. I actually became annoyed with some of the exploits I discovered, and with a lot of help from the current SneezyMUD staff I managed to submit some pull requests to the project to fix things up. Simple fixes evolved into feature additions, and to my surprise I wound up learning a lot about open-source software development. That led to me leaving my chemistry job to work for a friend who had a web app startup, enrolling in the OSU online CS program, and all of a sudden, it seems, I’m in my last class, CS 467, the capstone.

My capstone project? An online multiplayer game (definitely not a MUD). Right now I’m obviously not sure how it will turn out, but I really hope that someone will like it enough to try to cheat on it.