Deploying static sites to Amazon S3 with Bitbucket Pipelines

Atlassian recently released a new feature for their hosted Bitbucket product called Pipelines. In this post, we will show you how we use Pipelines to deploy a static site to Amazon S3.

Requirements

  • AWS access keys with permission to write to an existing S3 bucket.
  • A Bitbucket repository containing the codebase for a static website (we have used the sample project from here).

Steps involved

  • S3 bucket configuration
  • Pipeline configuration
  • Codebase configuration

S3 Bucket Configuration

  • Create an S3 bucket (an AWS CLI equivalent is sketched after this list).
  • Generate access keys that grant access to that specific bucket.
  • Open the S3 bucket properties:
    1. Enable static website hosting.
    2. Point the index document to index.html.
    3. Save.
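
For reference, the same bucket setup can also be scripted with the AWS CLI. This is a minimal sketch: the bucket name my-static-site and the region us-east-1 are placeholders, and the console steps above achieve the same result.

    # Create the bucket (bucket name and region are placeholders)
    aws s3 mb s3://my-static-site --region us-east-1

    # Enable static website hosting and point it at index.html
    aws s3 website s3://my-static-site --index-document index.html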

Pipeline Configuration

  • Go to the settings of the repository.
  • Enable Pipelines.
    1. Here, we have chosen the Node.js Docker image template.

Now we have to edit the pipeline YAML file (bitbucket-pipelines.yml) so that every commit to the master branch triggers the pipeline commands that push the static files to S3.

Some other use cases for Pipelines

  1. Testing – run operations like npm install, npm run tests, Sonar code-quality checks, etc.
  2. Monitoring – notify users by email when a build succeeds or fails.
  3. Deployment – deploy code to servers.

Let's see what the pipeline file (bitbucket-pipelines.yml) looks like.
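
A minimal sketch of such a file, based on the setup described in this post, is shown below; the Node image tag is an assumption rather than a verbatim copy of the original file.

    # bitbucket-pipelines.yml (minimal sketch; the image tag is an assumption)
    image: node:6.9.4

    pipelines:
      default:                  # runs on every branch unless overridden below
        - step:
            script:
              - npm install
              - npm test
      branches:
        master:                 # runs only on commits to the master branch
          - step:
              script:
                - npm install
                - npm install -g gulp
                - gulp publish  # pushes the static files to S3 (see gulpfile.js later in this post)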

image is the name of the Docker image to be used.
default pipeline steps are executed on all branches unless specified otherwise. We can also run our tests here.
branches pipeline steps are executed only on the specified branches.

The steps under the master branch deploy our static website to S3. We make this happen with the help of the Node modules gulp and gulp-awspublish.

Codebase Configuration

  • Let's create a branch from master and name it develop.
  • The two modules listed below will serve our purpose:
    1. gulp – an automated task runner.
    2. gulp-awspublish – publishes code to S3.
  • Install those modules using the commands below:
    1. npm install gulp -g
    2. npm install gulp gulp-awspublish
  • Configure gulpfile.js (a sketch is shown after this list).
  • Create a task named “publish”, then configure awspublish.
  • Define the source folder to publish.
  • Commit the code on the develop branch; you can see the pipeline run automatically based on the “default” configuration (we are getting close to our goal 🙂).
  • Define the AWS credentials as environment variables in the Bitbucket repository settings, under Pipelines -> Environment variables.
  • Raise a pull request from develop to master [include a reviewer]. If everything is OK, merge the PR; the code will be merged into master and the pipeline will run the steps specified for the master branch.
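
A minimal sketch of the gulpfile.js described above could look like the following; the bucket name, region, and the ./public source folder are placeholders, and the AWS credentials are picked up by the AWS SDK from the environment variables defined in the repository settings.

    // gulpfile.js - minimal sketch (bucket name, region and source folder are placeholders)
    var gulp = require('gulp');
    var awspublish = require('gulp-awspublish');

    gulp.task('publish', function () {
      // Credentials are read by the AWS SDK from the Pipelines environment
      // variables AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
      var publisher = awspublish.create({
        region: 'us-east-1',
        params: { Bucket: 'my-static-site' }
      });

      // Publish everything under ./public to the bucket and report each upload
      return gulp.src('./public/**/*')
        .pipe(publisher.publish())
        .pipe(awspublish.reporter());
    });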

The newly changed code is automatically deployed to S3.

Maintenance can be done by the teams building the library, with very little DevOps time required.

Overall, Pipelines is easy to get along with. It’s a welcome addition to Bitbucket and if you already work on the command line, it’s very easy to get things working.