In my previous blog post, I briefly touched on the fact that since I took CS 340 (Introduction to Databases) last quarter, I was probably the best equipped to handle the database aspects of my team’s capstone project. My other two team members had last worked on databases almost a year ago. I didn’t mind this, though, to be honest. While I wasn’t a fan of how the class was laid out, I very much enjoyed the content of CS 340. Databases just made sense to me.
In CS 340, we focused on relational databases using SQL, with MariaDB as the database management system (DBMS). This was also roughly the extent of what one of my teammates had worked with. However, my other teammate has quite a bit of industry experience and suggested we use PostgreSQL, since it’s one of the most widely used (and open-source) DBMSs. I agreed, since it would be a good opportunity to expand my technical knowledge a bit, and the learning curve wouldn’t be too steep considering I already had experience with a different SQL-based DBMS. However, we plan to deploy our web app via Google Cloud Platform (GCP), so we had to weigh the benefits of PostgreSQL against Datastore (GCP’s NoSQL database). While the integration with Datastore would have been almost seamless, we felt that its learning curve would be too high.
With this green light, I went ahead and drafted the entity-relationship (ER) diagram along with a database outline/schema. After peer reviews by my teammates, I was ready to begin building the database. Although this might elicit some boos from more experienced software engineers and command line aficionados, I mostly used the PostgreSQL client’s GUI to construct the database. Being able to create, edit, and visualize the different entities and their relationships was a big help. The only downside was that almost all of the official PostgreSQL documentation describes the command line interface (CLI), so I had to fumble around for a bit. It wasn’t too bad, though.
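To give a sense of what the GUI was doing under the hood, here’s roughly the kind of DDL it generates when you draw out tables and relationships. These tables are placeholders for this post, not our actual schema:

```sql
-- Hypothetical tables for illustration; the names are placeholders,
-- not the entities from our real capstone schema.
CREATE TABLE users (
    user_id    SERIAL PRIMARY KEY,
    username   VARCHAR(50) NOT NULL UNIQUE,
    created_at TIMESTAMP NOT NULL DEFAULT NOW()
);

-- A second table with a foreign key, showing the kind of one-to-many
-- relationship the GUI lets you draw visually.
CREATE TABLE posts (
    post_id SERIAL PRIMARY KEY,
    user_id INTEGER NOT NULL REFERENCES users(user_id) ON DELETE CASCADE,
    title   TEXT NOT NULL,
    body    TEXT
);
```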
The biggest trouble came when I finished the database. Luckily, one of my teammates volunteered to handle setting up the database on GCP; all I had to do was create a dump file for the database. Should be easy, right? The PostgreSQL GUI had an option to export the database, so I thought it would only take a minute or two. After going through the various export options, I downloaded a SQL dump of the database. But whenever I opened the file, it was either empty or full of unreadable characters. I thought this was strange, so I tried importing the dump back into the PostgreSQL client to make sure something more wasn’t going on behind the scenes. To no one’s surprise, the import failed. The online resources I found only covered the CLI dump functionality, so I decided to try that route instead.
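For reference, the CLI route looks something like this. The database and user names here are placeholders for this post; yours will differ:

```bash
# Dump the database to a plain-text SQL file.
# "capstone_db" and "app_user" are placeholder names.
pg_dump -U app_user -d capstone_db -f capstone_dump.sql

# Sanity check: restore the dump into a scratch database.
createdb -U app_user capstone_scratch
psql -U app_user -d capstone_scratch -f capstone_dump.sql
```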
No luck. The commands were not being recognized, so I figured it was the PATH variable. I checked it, and everything seemed fine. I scoured the internet for help, even going to page two of the Google search results (desperate, I know). Nothing worked, though. I was losing my mind over why this wasn’t working; I spent hours trying to figure it out. I figured all my work had been in vain and I’d just have to create the database on GCP itself, but that would create other problems, such as hosting costs. Come 2 AM, I decide to check the PATH variable one last time…
…
I had misspelled one word. Flipped two letters by accident. It immediately worked; no problem whatsoever. I wish I could say this was my first battle with PATH, and I know it won’t be my last.
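For anyone who ends up in the same spot, the whole debugging loop boils down to a few commands. The install directory below is just an example; where PostgreSQL’s binaries actually live depends on your OS and version:

```bash
# Is pg_dump actually reachable from the shell?
which pg_dump

# Eyeball the PATH for typos; one flipped letter is enough to break it.
echo "$PATH"

# Example fix in ~/.bashrc (or your shell's equivalent); the install
# directory shown here is illustrative, not universal.
export PATH="/usr/lib/postgresql/15/bin:$PATH"
```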