In parts 1 and 2 of this series, I covered the basics of creating a serverless API using the serverless framework and also securing the API using Auth0 and AWS API Gateway. In this part, I will detail the process to use Bitbucket pipelines to automatically build and deploy the API upon code check-in.
CI / CD Process Flow
The code for our awesome API will be checked into a GIT repository hosted on Bitbucket. Once the pipeline is configured, it will run with every push and depending on the branch that was changed, an automated build and deployment will be kicked off.
Bitbucket Pipeline Configuration
The following bitbucket-pipelines.yml file, placed in the root of the repository, details the Bitbucket pipeline steps to be followed upon code check-in.
```yaml
image: node:6.9.4

pipelines:
  branches:
    master:
      - step:
          script:
            # Builds and deploys to the production environment
            - cd my-serverless-api
            - npm install
            - npm install serverless -g
            - serverless config credentials --provider aws --key $AWS_ACCESS_KEY_ID --secret $AWS_SECRET_ACCESS_KEY
            - sls deploy --stage prod
    development:
      - step:
          script:
            # Builds and deploys to the development environment
            - cd my-serverless-api
            - npm install
            - npm install serverless -g
            - serverless config credentials --provider aws --key $AWS_ACCESS_KEY_ID --secret $AWS_SECRET_ACCESS_KEY
            - sls deploy --stage dev
```
Let’s have a quick look at what we did in the yml file above.
- We start by using a Node image to give us a container with Node.js installed.
- We add configuration for both the development and master (production) branches.
- The steps for both branches are the same, apart from the actual deploy step, which specifies a --stage parameter. Remember, in part 1 we added the dev stage, which creates an API and Lambda function with a dev suffix. This helps with supporting multiple environments and versions of our API.
- In my case, I checked in the root folder of the project, not just the function folder, so the pipeline needs to perform a cd command first to get into the function folder.
- We then run the npm install command to install all the dependencies, and also install the serverless framework globally via npm.
- The serverless framework is then configured with AWS credentials passed in as environment variables. The same credentials can be used as in part 1.
- Finally, the API is deployed with sls deploy.
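As a reminder of how the --stage parameter flows through, a stage set at deploy time overrides the default stage defined in serverless.yml. The sketch below is illustrative only; the service and function names are assumptions, not the exact configuration from parts 1 and 2:

```yaml
# Illustrative serverless.yml - service and function names are examples only
service: my-serverless-api

provider:
  name: aws
  runtime: nodejs6.10
  # Default stage; overridden by "sls deploy --stage prod" in the pipeline
  stage: dev

functions:
  hello:
    handler: handler.hello
    events:
      - http:
          path: hello
          method: get
```

With this in place, the same codebase produces separately suffixed API Gateway endpoints and Lambda functions for dev and prod.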
Go ahead and make some changes to your API, then commit and push the changes to your branch. On push, the Bitbucket pipeline will run and deploy your API to AWS, completely removing the need for manual deployment and streamlining the development process.
This process can be further optimised with automated unit testing and role-based branching and merging strategies, but that is a subject for another day.
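As a quick sketch of the testing idea, a pipeline step could run the unit tests before deploying, failing the build if any test fails. This assumes a test script is defined in package.json; it is an illustrative extension, not part of the series so far:

```yaml
# Illustrative: run tests before deploying (assumes "npm test" is defined)
development:
  - step:
      script:
        - cd my-serverless-api
        - npm install
        - npm test  # the pipeline stops here if any unit test fails
        - npm install serverless -g
        - serverless config credentials --provider aws --key $AWS_ACCESS_KEY_ID --secret $AWS_SECRET_ACCESS_KEY
        - sls deploy --stage dev
```

Because each script command must exit successfully for the step to continue, a failing test suite automatically blocks the deployment.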