Team Insights: Deploying to AWS via Azure DevOps

Cloud-based continuous deployment services are popular because they are flexible and reliable, and they remove the need to maintain in-house build servers. This motivated us at Code4Nord to look for a way to deploy our project to Amazon Web Services (AWS) automatically, without needing a local server.

The starting configuration for the project was a TeamCity continuous integration build server, from which the package was uploaded to Amazon S3 (Amazon Simple Storage Service) and then deployed to an Amazon EC2 instance with AWS CodeDeploy. From this configuration we wanted to remove TeamCity and replace it with a cloud-based build agent, while experimenting a little rather than adopting Amazon's entire continuous integration and continuous deployment (CI/CD) suite.

For the proof of concept, I first started configuring a Docker Cloud build, but the second build attempt went wrong: after being stuck for two hours, cancelling it took another eight. With little time to spare for this proof of concept, I decided to move on to another provider: Azure DevOps.

Azure DevOps fit our needs very well, and I was able to configure the build fairly easily, even though our project is split across two separate solutions: an Angular 7 front-end and an ASP.NET Web API 2 back-end.

Setting up a build in Azure DevOps

The first step after creating the project was to create a Pipeline: I selected Bitbucket Cloud as the source and provided the repository and branch for which I wanted to configure the build.

Next, I selected ASP.NET as a template, and my pipeline was created with six predefined tasks for the agent to run after each push to the connected repository:

  1. NuGet (the package manager for .NET) and NuGet Package Restore were added to prepare the packages needed to build the back-end project. The only thing I had to provide here that was not out-of-the-box was the path to the SLN file, so that NuGet Restore could get the list of required packages.
  2. Build solution, which needed the path to the SLN file as well.
  3. Test Assemblies. No additional settings were required for this step: it takes the unit tests from the solution and runs them after the Build solution task completes successfully.
  4. Publish symbols path and Publish Build Artifacts were added automatically, but I removed them, as our configuration had different needs.
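Although I assembled these tasks in the classic editor, the back-end steps above can be sketched in pipeline YAML. This is only an illustration: the task names and versions come from the standard ASP.NET template, while the solution path and agent image are assumptions, not our actual values.

```yaml
# Sketch of the back-end build steps (ASP.NET template); paths are illustrative.
pool:
  vmImage: 'windows-latest'

steps:
- task: NuGetToolInstaller@1        # NuGet, the package manager for .NET

- task: NuGetCommand@2              # NuGet Package Restore
  inputs:
    restoreSolution: 'Backend/MyProject.sln'   # path to the SLN file

- task: VSBuild@1                   # Build solution
  inputs:
    solution: 'Backend/MyProject.sln'
    configuration: 'Release'

- task: VSTest@2                    # Test Assemblies: runs the solution's unit tests
  inputs:
    platform: 'Any CPU'
    configuration: 'Release'
```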

Following these steps, the back-end build completed successfully. I continued configuring the pipeline by adding the tasks required to build the front-end. Since the Angular 7 project is built with the Angular CLI (command line interface), I needed two npm tasks:

  1. Use a custom command to install the @angular/cli (I simply set the type to “custom”, and the command to “install @angular/cli -g”)
  2. Run the install of all packages needed by the project, providing the name of the folder that contains the package.json file.

The front-end build configuration was completed by one last item in the pipeline: a Command Line task, which executes the build command: ng build --prod -c dev.
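As a sketch, these three front-end steps could look like the following in pipeline YAML, assuming the Angular project lives in a folder named Frontend (the folder name and working directories are illustrative assumptions):

```yaml
# Sketch of the front-end build steps; folder names are illustrative.
- task: Npm@1                 # custom command: install the Angular CLI globally
  inputs:
    command: 'custom'
    customCommand: 'install @angular/cli -g'

- task: Npm@1                 # install the packages listed in package.json
  inputs:
    command: 'install'
    workingDir: 'Frontend'    # folder containing package.json

- task: CmdLine@2             # build the Angular app with the dev configuration
  inputs:
    script: 'ng build --prod -c dev'
    workingDirectory: 'Frontend'
```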

Because the "ng build" command places the build output in a dist folder in the front-end's root, I needed a Publish Artifact task to copy the entire "dist" into the folder generated by the back-end build process from the first section of the pipeline. This way, the next (and last) step can upload everything in one batch to S3, from where AWS CodeDeploy continues the deployment process.
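One way to sketch this copy step is with the built-in CopyFiles task and the standard artifact staging directory; both the source and target paths here are assumptions about our layout, not the actual configuration:

```yaml
# Illustrative: gather the Angular output next to the back-end package
# so a single folder can be uploaded to S3 in the final step.
- task: CopyFiles@2
  inputs:
    SourceFolder: 'Frontend/dist'
    TargetFolder: '$(Build.ArtifactStagingDirectory)/dist'
```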

The final task in the pipeline was the AWS CodeDeploy Deployment. Since AWS CodeDeploy is configured to take an archive from S3 in order to deploy to the EC2 instance, I created an S3 bucket to host these archives. The Azure DevOps pipeline needs the name of this bucket so it knows where to upload the files from the workspace, i.e. the ones generated by both the front-end and back-end builds. For this configuration, I set the Deployment Revision Source of the pipeline task to "Folder or archive file in the workspace". The other option would have been to use an archive already in S3, but in our case the files to upload are in the workspace, not in S3. So I provided the source folder of the files in the workspace and the name of the S3 bucket I created to host them.

The Azure DevOps pipeline is one of the main components of our project's continuous deployment procedure. We use continuous deployment so our clients have access to the newest features and fixes as soon as they have been implemented and approved during code review. Client feedback and active communication are among the most important elements of this project's development, and the best way for clients to see our progress is to interact with the project constantly.

Next, I had to provide our AWS region, and I had to create an AWS Service Connection to be used as credentials, which required an Access Key ID and a Secret Access Key from the AWS Management Console. To finish configuring this last task, I needed details about our CodeDeploy setup in AWS: the names of the Application and Deployment Group created to deploy to our EC2 instance, with the S3 bucket I specified in the pipeline task as the source.
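Putting the pieces together, the deployment task can be sketched as below. This assumes the AWS Toolkit for Azure DevOps extension is installed (it provides the CodeDeploy task and the AWS Service Connection type); every value is a placeholder, and the input names are my best rendering of the extension's fields rather than a verified configuration:

```yaml
# Illustrative sketch; requires the AWS Toolkit for Azure DevOps extension.
# All names and values below are placeholders, not our real setup.
- task: CodeDeployDeployApplication@1
  inputs:
    awsCredentials: 'aws-service-connection'   # holds Access Key ID + Secret Access Key
    regionName: 'eu-west-1'
    applicationName: 'my-application'
    deploymentGroupName: 'my-deployment-group'
    deploymentRevisionSource: 'workspace'      # "Folder or archive file in the workspace"
    revisionBundle: '$(Build.ArtifactStagingDirectory)'
    bucketName: 'my-deployment-bucket'         # S3 bucket hosting the archives
```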

Deployment seen from Azure DevOps

Deployment seen from AWS Management Console

That was it! Our build was up and running on our EC2 instance. A simple way of swapping out a crucial element of our deployment process (TeamCity) for a new one, with an additional bonus: 1,800 free build minutes per month.

About This Project: I’m writing this as a software developer on this project. Our project focuses on business management by providing an interface for tracking companies’ financial and legal data retrieved from multiple providers. The project also offers the possibility to store and organize official documents which then can be linked to various functionalities, to help the users have a good overview of a company’s status.
