For the last two and a half years, I have been doing a lot of development around automation, tools, microservices, and serverless applications. Before that, I was developing applications and services in PHP. Back then, I used a traditional set of tools to integrate, test, and deploy my code: GitHub, Travis, Chef (AWS OpsWorks), and a whole lot of manual setup.
Recently I was wondering what it would take to launch a PHP application while taking advantage of some new and old technologies that I have had the opportunity to learn and master, both at work and in my own time.
The objective of this post is to define a Continuous Integration/Continuous Deployment pipeline and the application infrastructure using the following technologies:
- AWS CloudFormation to define the infrastructure as code
- CodePipeline as the primary orchestration mechanism
- CodeBuild to run tests and install dependencies
- S3 to save deployment assets
- SNS as the notification mechanism
- GitHub webhooks, protected branches and status checks
Infrastructure as Code
All the infrastructure is defined as CloudFormation templates, and the code is hosted in the GitHub repository "Elastic Beanstalk CICD".
Continuous integration refers to the practice of merging code into a central repository or shared branch (which I refer to as the integration branch) after a series of code quality checks have passed. There are many methodologies that enable this process, including Git Flow and GitHub Flow.
Some of these checks include but are not limited to:
- Unit tests
- Code reviews
- Security validation
- Code analysis
Enabling Required GitHub Status Checks:
Check the article "Enabling required status checks" for more information.
AWS CodeBuild is a simple build service, and unlike some of its more popular counterparts, you only pay for the time the service is in use. Another advantage of CodeBuild is that you can define custom Docker images for your build container, hosted either on Docker Hub or on Elastic Container Registry (ECR). This can be very powerful and quite simple to do if you have a bit of experience with Docker.
To add a GitHub check, we are going to use CodeBuild and trigger it every time code is pushed to the GitHub repository and every time there is a pull request. At this time I do not know of a way to automatically set up the GitHub webhook without creating a CloudFormation custom resource, so I am just going to define the CodeBuildTest resource and set up the webhook manually.
A CodeBuild definition is straightforward:
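The exact template from the repository is not reproduced here, but a minimal sketch of the CodeBuildTest resource might look like the following. The project name, role, repository parameter, image name, and environment variable are placeholders, not the actual values from the repository:

```yaml
# Sketch of a CodeBuild project for running tests on push/pull request.
# Names, the image, and the role are illustrative assumptions.
CodeBuildTest:
  Type: AWS::CodeBuild::Project
  Properties:
    Name: !Sub '${AWS::StackName}-test'
    ServiceRole: !GetAtt CodeBuildRole.Arn
    Source:
      Type: GITHUB
      Location: !Ref GitHubRepositoryUrl
      BuildSpec: buildspec-test.yml      # custom BuildSpec file name
    Environment:
      Type: LINUX_CONTAINER
      ComputeType: BUILD_GENERAL1_SMALL
      Image: your-account/php-build      # placeholder custom PHP image
      EnvironmentVariables:
        - Name: APP_ENV
          Value: test
    Artifacts:
      Type: NO_ARTIFACTS                 # tests do not produce deploy assets
```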
There are a few things to notice here:
- The environment variables passed to code build can be used to perform custom actions
- We specify a custom BuildSpec file name; this enables us to define a different series of steps for testing the code, while the default buildspec.yml is used to define the steps to build the code.
buildspec-test.yml is defined as follows:
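The actual file in the repository may differ, but a minimal buildspec-test.yml for a PHP project could look like this (the Composer and PHPUnit commands are assumptions about the project's tooling):

```yaml
# buildspec-test.yml -- illustrative sketch for running the test suite.
version: 0.2
phases:
  install:
    commands:
      # install dev dependencies so the test tooling is available
      - composer install --no-interaction
  build:
    commands:
      # run the unit tests; a non-zero exit fails the GitHub status check
      - vendor/bin/phpunit
```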
Since CodeBuild does not have an image to run PHP, I have created a PHP-Build image that contains several tools required to prepare a PHP application to be deployed.
Continuous Deployment is the process in which teams reliably and continuously release code to a production environment. In this instance, every change that passes the automated tests and is merged into the integration branch is automatically deployed.
The main orchestration mechanism is going to be AWS CodePipeline. In CodePipeline we can define different types of steps, but for the sake of this example the following process is defined:
This step listens for code changes in the GitHub repository. The code is downloaded into an S3 bucket. CodePipeline then triggers CodeBuild to start the build process.
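As a rough sketch, the source stage of the pipeline might be defined like this in the CloudFormation template; the parameter names and action name are assumptions:

```yaml
# Sketch of the pipeline's Source stage (GitHub provider).
# Parameter names (GitHubUser, GitHubToken, etc.) are placeholders.
- Name: Source
  Actions:
    - Name: GitHubSource
      ActionTypeId:
        Category: Source
        Owner: ThirdParty
        Provider: GitHub
        Version: '1'
      Configuration:
        Owner: !Ref GitHubUser
        Repo: !Ref GitHubRepository
        Branch: !Ref GitHubBranch
        OAuthToken: !Ref GitHubToken
      OutputArtifacts:
        - Name: SourceOutput   # handed to the build stage via S3
```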
CodeBuild is used once again, but this time the steps differ from those of the automated test run. In this instance, the CodeBuild "phases" include:
Install: install required dependencies such as OS packages, updates and more.
Pre Build: set up the environment to run the tests, build the code, generate configuration, etc.
Build: build your code or run unit tests.
Post Build: perform any cleanup steps.
There is an additional optional section called artifacts. While it is optional, it is essential to use it here to pass the code prepared during the build phase to the next step in the pipeline.
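Putting the phases and the artifacts section together, a buildspec.yml along these lines could be used; every command below is a placeholder for whatever your project actually needs:

```yaml
# buildspec.yml -- illustrative sketch of the build-stage phases.
version: 0.2
phases:
  install:
    commands:
      - apt-get update -y                # OS packages and updates
  pre_build:
    commands:
      # production dependencies only, optimized autoloader
      - composer install --no-dev --optimize-autoloader
  build:
    commands:
      - vendor/bin/phpunit               # build the code or run tests
  post_build:
    commands:
      - rm -rf tests                     # cleanup: drop files not needed in production
artifacts:
  files:
    - '**/*'                             # package everything for the deploy stage
```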
This is the act of publishing the latest version of the repository, as prepared by CodeBuild, to the application server!
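A sketch of the corresponding deploy stage in the pipeline is shown below; it assumes the Elastic Beanstalk application and environment names are imported from the other stack's exports, which is one plausible wiring:

```yaml
# Sketch of the pipeline's Deploy stage (Elastic Beanstalk provider).
# The ImportValue names are assumptions about the EB stack's exports.
- Name: Deploy
  Actions:
    - Name: DeployToElasticBeanstalk
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: ElasticBeanstalk
        Version: '1'
      Configuration:
        ApplicationName: !ImportValue EBApplicationName
        EnvironmentName: !ImportValue EBEnvironmentName
      InputArtifacts:
        - Name: BuildOutput   # the artifacts produced by the build stage
```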
Elastic Beanstalk is a Platform as a Service offering from AWS. It makes it easy to launch and scale applications developed in various programming languages including Java, .Net, PHP, Node.js, Python and more.
This post focuses on PHP, but it can be modified to work with other platforms.
The template defines parameters to set up the following options
- Application Name (This must match the code-pipeline application name)
- Solution Stack Version (The AWS version of the stack. See Supported Platforms for more information)
- Instance type
- Environment Type (single instance or auto-scaling)
- Min and Max number of instances
- Document Root
- PHP version
- PHP Memory Limit
- Environment variables for a database (user, pass, host, DB name)
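To give a feel for the shape of the template, here is a sketch of a few of those parameters; the names, defaults, and allowed values are illustrative, not the exact ones in the repository:

```yaml
# A few of the Elastic Beanstalk template parameters, sketched.
Parameters:
  ApplicationName:
    Type: String
    Description: Must match the CodePipeline application name
  EnvironmentType:
    Type: String
    AllowedValues: [SingleInstance, LoadBalanced]
    Default: SingleInstance
  InstanceType:
    Type: String
    Default: t3.micro
  DocumentRoot:
    Type: String
    Default: /public
```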
The diagram shows the application setup when enabling load balancing, but it can be deployed as a single instance. It also shows an RDS instance, but the CloudFormation template does not define it.
The template defines an S3 bucket, the files bucket. Its purpose is to provide a place to store any assets required by the application, such as images, CSS, JS, and others. Ideally, the bucket would sit behind a CloudFront distribution, but that is left as an exercise for the reader.
The stack generates and exports a few values:
- The EB application name
- The EB environment name
- The artifacts bucket
- The files output bucket
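Those exports might be declared along these lines; the logical IDs and export names are placeholders:

```yaml
# Sketch of the stack's Outputs section with cross-stack exports.
Outputs:
  ApplicationName:
    Value: !Ref Application
    Export:
      Name: !Sub '${AWS::StackName}-ApplicationName'
  EnvironmentName:
    Value: !Ref Environment
    Export:
      Name: !Sub '${AWS::StackName}-EnvironmentName'
  FilesBucket:
    Value: !Ref FilesBucket
    Export:
      Name: !Sub '${AWS::StackName}-FilesBucket'
```

Exporting these values lets other stacks, such as the pipeline stack, reference them with `Fn::ImportValue` instead of hard-coding names.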
I believe it is essential to remove as many barriers as possible so developers can safely put their code into production, get quick feedback on their work, and respond faster when issues arise.
Setting up a CI/CD pipeline process early on in the life of any application can foster this process.
At the same time, it is vital for every organization to invest in good engineering practices such as testing and code reviews, to mention just a couple; it is everyone's responsibility to ensure the code is held to a very high standard, and no single team (QA or DevOps) can carry that responsibility alone.
I hope this pipeline can serve as a template for starting new projects that take advantage of several tools and best practices for CI/CD, but it is by no means exhaustive, and you are encouraged to improve upon it.