Uploading Images to AWS S3

With our backend endpoints and user authentication in place, it’s now time to revisit S3. Amazon’s Simple Storage Service lets developers store whatever they need in the cloud.

Our project utilizes multiple resources with the hope of staying on the free tiers for each. The frontend is hosted on Vercel, while our backend is running on my personal server here at home. Although it’s handy to have our backend hosted locally, I don’t want to see major delays in communication with the frontend.

In my limited experience using S3, I didn’t need to worry about delays since my code was running on the same domain. There are a few options I’ve come across to handle file uploads to S3.

Use Multer to stream the file to an S3 bucket:

This is the traditional approach that I have used before. It requires reading the file contents, storing the data in a temporary location on the server, and streaming the file to S3. Before handling the file, you can easily check for a valid JWT in the request headers. If all of our code was running on my server, this is probably the solution I would use.

Pre-signed POST to upload directly from client-side:

With this option, the client sends a request for an “upload URL,” which is “pre-signed” with the necessary data (AWS credentials, file field requirements, user authentication, etc.) for the file to be uploaded directly to S3. Our backend would then never touch the file stream, and this approach allows finer control over restricting the file size and type.

The downside to this approach is the higher level of complexity. AWS Lambda functions and Amazon’s API Gateway are often referenced in tutorials and guides. It may be possible to use our own server to pre-sign a POST for upload to S3, but with only a few weeks left in the quarter, this option could become too time-consuming.

Pre-signed URLs:

This approach is very similar to a pre-signed POST in that the client has to request access to upload a file to S3. Pre-signed URLs use a PUT request to S3 but follow a similar pattern of requesting the URL from your own server or through API Gateway and Lambda functions. With pre-signed URLs, you lose flexibility and some security: you can’t check the file size, and once the URL is generated, anyone who has it can use it repeatedly until it expires. Implementing this option appears to be much simpler than the pre-signed POST.

I’ll need more time to research these options before implementing file upload routes. Our frontend uses the Next.js framework, which I have not had as much time to work with so far. The built-in server functionality in Next.js could be the best way to handle file uploads to S3!

Post-Graduate Blogging

As the weeks progress through the Capstone quarter, I’ve started to think about blogging in a post-graduate sense. Short of creating a full-blown custom solution, there are many off-the-shelf options available such as WordPress, LiveJournal, and Tumblr. I recently became aware of Static Site Generators such as Jekyll, Next.js, Hugo, and Gatsby, which are somewhere in the middle of these two camps and leverage templating languages as part of the “glue” holding everything together.

Along with these solutions are hosting providers such as GitHub Pages and Cloudflare, which provide hosting for free or at minimal cost and offer clever integrations to kick off builds and publish content when a new commit is pushed to the main branch. These solutions seem to provide the right balance of flexibility and management overhead, as they can be completely serverless if desired, which also brings inherent benefits for security and scalability. I’ve learned that many developers on GitHub take advantage of this offering to showcase their portfolios.

By generating static assets from a series of templates, metadata, and content material, you eliminate the need for a dedicated backend in most cases. There are no databases to manage, and when publishing to a hosting provider there are no frontend servers to manage either. Your pre-generated pages are all neatly bundled and shipped off to a server, or collection of servers in the case of CDNs like GitHub Pages and Cloudflare, for blazing-fast load times!

I’ve been tinkering around with this for the past week and have started to get a personal blog put together. Landing on Cloudflare Pages for content hosting and Jekyll for the static content generation, I’m able to make updates and add new content fairly easily by pushing my changes to the main branch after verifying locally. I can see how this approach could work well for a collaborative blog, as anyone with access to the repository could add new pages or posts to the site by pushing commits via Git for publishing.

Local development is easy too: with a couple of YAML configuration files, you can specify overrides for easier debugging. Any change on disk also causes Jekyll to rebuild immediately, so you can iterate quickly on design and content. I’m using a spare Linux machine as my development environment, which I connect to from Visual Studio Code over SSH.
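As a sketch of what those overrides can look like (file name and values are illustrative), Jekyll merges configs left to right, so a small dev-only file layered over the main `_config.yml` is enough:

```yaml
# _config_dev.yml — hypothetical local overrides layered over _config.yml.
# Later files in the --config list win, so these only apply in development.
url: "http://localhost:4000"
incremental: true    # rebuild only changed files for faster iteration
show_drafts: true    # render _drafts/ posts locally
```

Running `bundle exec jekyll serve --config _config.yml,_config_dev.yml --livereload` then serves the site locally with these settings, while the production build on Cloudflare Pages sees only the main config.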

If you’re looking for a lightweight blogging solution that can grow with you long-term, I recommend you check out some of the static site generators on jamstack.org.