This website uses Hugo as a static site generator. I first learned about Hugo when Let’s Encrypt announced they had migrated their website over to it. I had always wanted to build a personal website with a static site generator and had previously tried Jekyll and Hexo, but both had their quirks and I never finished deploying anything.
Hugo, by contrast, was pretty straightforward: head over to https://gohugo.io/getting-started/quick-start/ and install it by downloading a single binary. No tedious setup of gems, npm packages, pip requirements, or language frameworks. It just works out of the box!
After following the steps I hit the first blocker: of course I had chosen a different theme for my quick start. Unfortunately, the theme required some config values to be set, otherwise it just failed with cryptic error messages. Fortunately, it only took a few minutes to fix by reading the theme’s readme file. Reminder to future self: read the readme :) Afterwards I was able to create the about page and a short “Hello World!” post.
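The fix boiled down to adding the values the theme expected to config.toml. A minimal sketch of what such a config looks like — the theme name and the params key here are hypothetical, the actual keys came from the theme’s readme:

```toml
baseURL = "https://www.fabiangruber.de/"
languageCode = "en-us"
title = "My Blog"
theme = "some-theme"              # hypothetical theme name

[params]
  description = "Personal blog"   # hypothetical value the theme reads
```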
In good old fashion I created a few bash scripts that deployed the generated files from the /public directory to an AWS S3 bucket with the help of aws-cli. Just remember to set the --acl public-read flag when syncing files, otherwise you’ll end up with 403 errors.
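A sketch of what such a deploy script can look like — the bucket name is assumed, and the DRY_RUN switch is my addition for illustration, not part of the original scripts:

```shell
#!/usr/bin/env bash
set -euo pipefail

# deploy_site -- build the site and sync /public to an S3 bucket (sketch).
deploy_site() {
  local bucket="s3://www.fabiangruber.de"   # assumed bucket name

  # --acl public-read is required, otherwise S3 answers with 403s;
  # --delete removes files from the bucket that no longer exist locally.
  local cmd=(aws s3 sync public/ "$bucket" --acl public-read --delete)

  if [[ "${DRY_RUN:-0}" == "1" ]]; then
    echo "would run: ${cmd[*]}"
    return 0
  fi

  hugo -d public     # regenerate the site into /public
  "${cmd[@]}"
}
```

Calling `deploy_site` runs the build and sync; `DRY_RUN=1 deploy_site` only prints the sync command, which is handy for checking the flags before touching the bucket.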
Next step: Cloudflare DNS. I’ve been using Cloudflare’s free CDN service for quite a while because it offers a convenient way to manage my DNS entries and provides free SSL certificates out of the box. So I pointed the www CNAME entry at the public AWS path and pressed F5. All I got was a spinning wheel for 15 seconds, followed by an HTTP 522 error. I had forgotten to deactivate “Full” SSL mode in the “Crypto” tab: Cloudflare tried to request the S3 bucket via HTTPS, which of course isn’t available there. Changing the setting to “Flexible” fixed it.
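For reference, the record in question looks roughly like this in zone-file notation — the S3 website endpoint on the right is a placeholder, since the exact hostname depends on the bucket’s region:

```
www  300  IN  CNAME  www.fabiangruber.de.s3-website.eu-central-1.amazonaws.com.
```

With “Flexible” SSL, Cloudflare terminates HTTPS for visitors and talks plain HTTP to this origin, which is exactly what an S3 website endpoint supports.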
I’m hosting the source code and posts in a private gitlab.com repository, which also lets me use GitLab’s excellent CI/CD system. So I ditched the bash script approach and went with a GitLab CI definition instead.

Update, 3rd May 2019: I recently moved this blog over to GitLab Pages, which removes the step of publishing the generated files to S3. It’s a free service by GitLab and works well for me. Here’s the updated version of .gitlab-ci.yml:
stages:
  - deploy

pages:
  image: monachus/hugo:v0.55.3
  stage: deploy
  script:
    - hugo -d public
  environment:
    name: pages
    url: https://www.fabiangruber.de
  artifacts:
    paths:
      - public/
  only:
    - master
For reference, this is the previous version that built the site and synced it to the S3 bucket:

stages:
  - build
  - deploy

build:
  image: ubuntu:xenial
  stage: build
  before_script:
    - apt-get update && apt-get -qq -y install wget
    - wget "https://github.com/gohugoio/hugo/releases/download/v0.37.1/hugo_0.37.1_Linux-64bit.deb"
    - dpkg -i hugo_0.37.1_Linux-64bit.deb
  script:
    - hugo
  artifacts:
    paths:
      - public/

s3:
  image: python:2-stretch
  stage: deploy
  before_script:
    - pip install awscli --quiet
  script:
    - aws s3 sync public/ s3://www.fabiangruber.de --acl public-read --delete
  environment:
    name: production
    url: https://www.fabiangruber.de
  only:
    - master
  dependencies:
    - build
To deploy this post, all I have to do is commit the changes and push them to the master branch of my GitLab repository. The GitLab CI runner builds the new HTML and deploys it to production, and after a few minutes the changes are online.