Deploy Hugo Website From Github To AWS S3
When you need to host a static website generated by Hugo, AWS S3 is a very handy, simple, and cheap solution. In my case I use AWS CloudFront to handle the HTTPS connection, AWS S3 to store the website, and AWS Route53 to handle my domain. I'll explain this setup in a future post.
We could easily run the deployment manually from our computer (see the sketch below), but why not use GitHub Actions? That way we just push our changes to the git repo and the static website gets updated without any manual work.
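For reference, here is roughly what that manual process looks like. This is only a sketch: `<aws-s3-bucket-name>` and `<aws-cloudfront-distribution-id>` are placeholders, and it assumes Hugo and the AWS CLI are installed and configured locally.

```bash
# A minimal sketch of the manual deployment that the workflow below automates.
# Replace <aws-s3-bucket-name> and <aws-cloudfront-distribution-id> with your own values.

# 1. Build the static site into the public/ folder
hugo --minify

# 2. Sync the generated files to the S3 bucket, removing stale objects
aws s3 sync public/ "s3://<aws-s3-bucket-name>" --delete

# 3. Optionally invalidate the CloudFront cache so changes show up immediately
aws cloudfront create-invalidation \
  --distribution-id "<aws-cloudfront-distribution-id>" \
  --paths "/*"
```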
Set Up Our GitHub Action
1. Create a `.github` folder in your project root folder:

```bash
mkdir .github
```

This is where we'll place all our GitHub configuration.
2. Create a `.github/workflows` folder to hold our workflows:

```bash
mkdir .github/workflows
```
3. Create the `build` workflow file:

```bash
touch .github/workflows/build.yml
```
4. Update the workflow file with the following setup:
```yaml
name: Hugo Build and Deploy to S3

on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]

jobs:
  build:
    name: Build and Deploy
    runs-on: ubuntu-latest
    steps:
      - name: Check out master
        uses: actions/checkout@master
      - name: Build and deploy
        uses: AlbertMorenoDEV/deploy-hugo-to-s3-action@v0.0.5
        with:
          hugo-version: 0.89.0
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```
Let’s explain this file piece by piece.
This is the name of the workflow as it will appear in the GitHub UI; choose whatever name you prefer:

```yaml
name: Hugo Build and Deploy to S3
```
Here we are telling GitHub to execute this workflow on every push or pull request to the `master` branch:
```yaml
on:
  push:
    branches: [ master ]
  pull_request:
    branches: [ master ]
```
ℹ️ Remember to change `master` to `main` if that is your working branch.
Each workflow can be made up of several jobs; in our case we'll have just one, named `build`:
```yaml
jobs:
  build:
    name: Build and Deploy
    ...
```
We tell GitHub to run this job on an Ubuntu instance:
```yaml
jobs:
  build:
    ...
    runs-on: ubuntu-latest
    ...
```
Each job can have several steps. First we need to download the code with a checkout:
```yaml
jobs:
  build:
    ...
    steps:
      - name: Check out master
        uses: actions/checkout@master
      ...
```
Then we finally build the static website and upload it to AWS S3 using the Deploy Hugo To S3 Action:
```yaml
jobs:
  build:
    ...
    steps:
      ...
      - name: Build and deploy
        uses: AlbertMorenoDEV/deploy-hugo-to-s3-action@v0.0.5
        with:
          hugo-version: 0.89.0
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
```
ℹ️ Notice that you'll need to set up two new secrets in the repository: `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`. These are the credentials of an AWS IAM user with enough permissions to upload to S3 and invalidate the CloudFront cache. I recommend the following policy for this user:
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "S3WriteAccess",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject",
        "s3:GetObject",
        "s3:ListBucket",
        "s3:DeleteObject",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::<aws-s3-bucket-name>",
        "arn:aws:s3:::<aws-s3-bucket-name>/*"
      ]
    },
    {
      "Sid": "CloudFrontCacheInvalidationAccess",
      "Effect": "Allow",
      "Action": [
        "cloudfront:GetInvalidation",
        "cloudfront:CreateInvalidation"
      ],
      "Resource": [
        "arn:aws:cloudfront::<aws-account-id>:distribution/<aws-cloudfront-distribution-id>"
      ]
    }
  ]
}
```
Notice that this policy is just a template: replace the placeholders with your own bucket name, AWS account ID, and CloudFront distribution ID. Creating the IAM user and registering the secrets can also be scripted, as sketched below.
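If you prefer to do this part from the command line, here is a minimal sketch. It assumes the AWS CLI and the GitHub CLI (`gh`) are installed and authenticated; the user name `hugo-deployer`, the policy name, and `<owner>/<repo>` are placeholders you should adapt:

```bash
# A sketch of creating the IAM user and wiring the repository secrets.
# "hugo-deployer" and the policy name are arbitrary example names.

# Create a dedicated IAM user and attach the policy above (saved as policy.json)
aws iam create-user --user-name hugo-deployer
aws iam put-user-policy \
  --user-name hugo-deployer \
  --policy-name hugo-deploy-to-s3 \
  --policy-document file://policy.json

# Generate an access key pair for that user; note the values from the output
aws iam create-access-key --user-name hugo-deployer

# Store the credentials as repository secrets used by the workflow
# (you'll be prompted to paste each value)
gh secret set AWS_ACCESS_KEY_ID --repo <owner>/<repo>
gh secret set AWS_SECRET_ACCESS_KEY --repo <owner>/<repo>
```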