New site architectural overview
As I promised in the intro, here’s a breakdown of the software and processes used to construct and deliver the site:
- Hugo: flexible static site generation from markdown
- Gitlab: version control for site source, continuous delivery system
- Amazon Web Services (AWS): static content hosting, load balancing, content distribution, certificate management
In a nutshell:
- The source repo lives in Gitlab.
- I update the site by updating/creating new markdown files for posts and projects.
- The changes are pushed to Gitlab, which kicks off the Gitlab CI runner.
- In most of my other projects, this would run tests. But my website doesn’t have any tests (yet?), so it’s just a matter of building and publishing.
- The Gitlab CI runner launches a Docker container, which builds the static content from the markdown.
- It pushes the static content up to an Amazon S3 bucket which hosts the site.
- Amazon Route 53 DNS sends cgbaker.net traffic to Amazon CloudFront, which terminates SSL and caches the content stored in the S3 bucket.
- Magic happens in your browser.
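The build-and-publish steps above amount to a very short `.gitlab-ci.yml`. This is only a sketch of that shape, not my exact script; the image name and bucket are placeholders:

```yaml
# Sketch of the CI config described above.
# The builder image and bucket name are placeholders.
image: example/hugo-aws-builder:latest

deploy:
  stage: deploy
  script:
    # Build the static site from the markdown sources.
    - hugo
    # Push the generated content to the S3 bucket that hosts the site.
    - aws s3 sync public/ s3://example-bucket --delete
  only:
    - master
```

The `--delete` flag keeps the bucket in lockstep with the build output, so removed posts don’t linger as orphaned objects.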
When I set out last week (on vacation) to design the site, I knew I wanted a site with static content hosted in S3, using Gitlab CI (❤️❤️❤️) for deployment. When I started Googling around for answers, I found the following post: http://www.rjocoleman.io/post/gitlab-s3-deployment/. I looked into Hugo, decided it would do what I wanted, and then ripped off the instructions at that site.
So, what’s different from Robert Coleman’s excellent 👍 setup? Not a lot. I have http://cgbaker.net 301 redirecting (per this post) to http://www.cgbaker.net. I have CloudFront configured to redirect HTTP to HTTPS. I’ve started tweaking the Tranquilpeak theme to my liking (initially, adding support for a project archetype in addition to the native post archetype).
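For reference, that apex-domain 301 can be done entirely in S3, by pointing an empty bucket named for the apex domain at the www hostname. A sketch using the AWS CLI (treat the exact flags as something to verify against the AWS docs rather than gospel):

```shell
# Sketch: configure an empty S3 website bucket to 301-redirect every
# request for the apex domain to the www hostname.
aws s3api put-bucket-website \
  --bucket cgbaker.net \
  --website-configuration '{"RedirectAllRequestsTo":{"HostName":"www.cgbaker.net","Protocol":"http"}}'
```

Route 53 then aliases the apex record to that bucket’s website endpoint, and the redirect happens before CloudFront ever sees the request.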
I also use a custom Docker image based on Alpine Linux with git, hugo, and aws-cli built in. The image is small (fast to download at build time) and has all the necessary tools baked in. The source for the image is available here. It is configured at Docker Hub as an automated build, so that whenever I update the repo in GitHub (they don’t support Gitlab yet), the image is automatically built and made available in the Docker Hub.
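The image is roughly along these lines. This is a sketch, not the actual Dockerfile; the Hugo version and release URL are placeholders, so check the image repo for the real thing:

```dockerfile
# Sketch of an Alpine-based build image with git, hugo, and aws-cli.
# The Hugo version and release URL below are placeholders.
FROM alpine:3.4

# git for fetching the site source and theme, python/pip for the AWS CLI.
RUN apk add --no-cache git ca-certificates py-pip \
    && pip install awscli

# Install a pinned Hugo release (version is a placeholder).
ADD https://github.com/gohugoio/hugo/releases/download/v0.XX/hugo_0.XX_Linux-64bit.tar.gz /tmp/
RUN tar -xzf /tmp/hugo_0.XX_Linux-64bit.tar.gz -C /usr/local/bin hugo \
    && rm /tmp/hugo_0.XX_Linux-64bit.tar.gz
```

Keeping everything baked into one small image means the CI runner spends its time building the site, not installing tools.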
Feel free to post questions in the comments below.
Update: I updated the CI script to explicitly invalidate the CloudFront distribution’s cache on each site update, so that I can keep the TTL at a more reasonable length. This was inspired by a post from the StormPath team.
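The invalidation is one extra line in the deploy step; a sketch, with a placeholder distribution ID:

```shell
# Invalidate all cached paths in the CloudFront distribution after each
# deploy, so a longer TTL doesn't delay newly published content.
# EXAMPLE123 is a placeholder distribution ID.
aws cloudfront create-invalidation \
  --distribution-id EXAMPLE123 \
  --paths "/*"
```

Invalidating `/*` is the blunt-but-simple option; AWS bills per invalidation path, but a whole-distribution wildcard counts as a single path, which is fine at one deploy at a time.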