AWS Partner Network (APN) Blog
AWS Sample Integrations for Atlassian Bitbucket Pipelines
Editor’s note: For the latest information on Atlassian Bitbucket, visit the Atlassian website.
By Josh Campbell, Partner Solutions Architect at AWS
Today, APN Partner and AWS DevOps Competency Partner Atlassian announced the beta of Bitbucket Pipelines, which allows customers to trigger build, test, and deploy actions every time they commit code to Bitbucket Cloud. Bitbucket is a source code control service that hosts Git and Mercurial repositories. Here at AWS, we have built a number of sample integrations to demonstrate how customers can use this new feature to deploy their code changes from Bitbucket to update their running applications on AWS.
As an example, customers can now deploy functions to AWS Lambda, Docker containers to Amazon EC2 Container Service, or an application to AWS Elastic Beanstalk, all by simply pushing code to a Bitbucket repository. Take a look at sample configuration scripts on Bitbucket to get an idea of how you can take advantage of this new Bitbucket functionality and automate code deployments to your AWS applications.
Using Atlassian Bitbucket Pipelines with AWS
You can enable Bitbucket Pipelines on a Bitbucket repository by choosing the new Pipelines icon in the menu bar. The commits page in your repository also gains a new column called “Builds,” where you can see the result of the Pipelines actions run on each commit, and the Pipelines page shows further information about those runs.
Once you enable Bitbucket Pipelines, you’ll need to include a YAML configuration file called bitbucket-pipelines.yml that details the actions to take for your branches. The configuration file describes a set of build steps to run for each branch in Bitbucket, giving you the flexibility to limit certain build steps to certain branches or to take different actions for specific branches. For example, you might want a deployment to AWS Lambda to happen only when a commit is made on the “master” branch.
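To illustrate, here is a minimal sketch of a bitbucket-pipelines.yml file; the image tag, script names, and the deploy_to_lambda.py deployment script are placeholders, not part of the published samples:

```yaml
# bitbucket-pipelines.yml -- a minimal sketch; names below are placeholders
image: python:3.5.1            # Docker image the build steps run in

pipelines:
  default:                     # runs for commits on any branch without its own section
    - step:
        script:
          - python run_tests.py
  branches:
    master:                    # runs only for commits on the master branch
      - step:
          script:
            - python run_tests.py
            - python deploy_to_lambda.py   # hypothetical deployment script
```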
Under the hood, Bitbucket Pipelines uses a Docker container to perform the build steps. You can specify any Docker image that is accessible by Bitbucket, including private images if you provide credentials to access them. The container starts up and then runs the build steps in the order specified in your configuration file. Note that creating your own Docker image preloaded with all the tools and libraries your build steps require can significantly speed up build time.
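As a sketch of what this can look like, a private image can be referenced at the top of the configuration file with credentials held in Pipelines environment variables; the image name and variable names below are placeholders:

```yaml
# Using a private Docker image for builds; the image name and the
# environment variable names are placeholders you would replace.
image:
  name: my-account/my-build-image:latest
  username: $DOCKER_HUB_USERNAME
  password: $DOCKER_HUB_PASSWORD
  email: $DOCKER_HUB_EMAIL
```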
The build steps specified in the configuration file are nothing more than shell commands executed in the Docker container. You can therefore run scripts, in any language supported by the Docker image you choose, as part of the build steps. These scripts can be stored either directly in your repository or in an Internet-accessible location such as Amazon S3. To support the launch of Bitbucket Pipelines, AWS has published sample scripts, written in Python with the boto3 SDK, that help you get started integrating with several AWS services, including AWS Lambda, AWS Elastic Beanstalk, AWS CodeDeploy, and AWS CloudFormation. You can try these samples out in three easy steps:
- Copy the Python script into your repository.
- Incorporate the bundling and execution of the script in your YAML configuration file (see the sketch after this list).
- Configure any environment variables required by the scripts.
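Putting the three steps together, a pipeline section that runs one of the deployment scripts might look like the following sketch. Here deploy_to_lambda.py stands in for whichever sample script you copied into your repository, and the AWS credential variables are ones you would configure in the Pipelines settings (boto3 reads AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION from the environment):

```yaml
pipelines:
  branches:
    master:
      - step:
          script:
            # Install the AWS SDK for Python that the sample scripts use
            - pip install boto3
            # Run the sample script copied into the repository root
            # (deploy_to_lambda.py is a placeholder name); it picks up
            # AWS credentials from the environment variables configured
            # in the Pipelines settings.
            - python deploy_to_lambda.py
```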
Detailed instructions on how to use these samples are provided in the README file in each sample’s repository. More information on using Bitbucket Pipelines can be found in Atlassian’s official documentation.
Check out Atlassian’s blog post on the Bitbucket Pipelines beta >>