Overview of AWS CodePipeline
AWS CodePipeline is a continuous delivery service that automates software releases. Every time the code changes, CodePipeline automatically builds, tests, and deploys the application, allowing developers to deliver new updates quickly.
It reduces development time by cutting the release process down to only a few manual steps, and it makes it easy to configure the different stages of a software release.
Top Features of CodePipeline
- Access Control: With AWS IAM we can manage user activity and control who is allowed to make changes to the release process.
- Notifications: Amazon SNS can be integrated to receive notifications; each notification contains status messages about pipeline events.
- Fast Delivery: New features can be released to users rapidly, as soon as a new release is ready.
- Easy Integration: Pre-built or custom plugins can be used at any step of the release process.
- Quick Start: CodePipeline connects to existing systems and tools, so a continuous delivery pipeline can be set up quickly.
What is Monorepo?
Many organizations use GitHub as their source code repository. When the source code of multiple applications, services, and libraries is stored in a single repository, that repository is called a monorepo.
A monorepo is a centralized place where the whole codebase lives in one repository, and everyone on the team works against the latest version of the code for all projects.
Benefits of using Monorepo:
- Shared Code: Because the code is stored in a single repository, everyone on the team shares the latest version, and logic can be shared between the frontend and backend.
- Ease of maintenance: When a shared library is updated, the update reaches all the applications that use it.
- Atomic changes: A change that spans several projects can be made in a single commit, without coordinating across multiple teams or repositories.
- Code reuse: Different projects can use the common code stored in the repository.
- Faster code review: Code changes are easier to track and review when everything is stored in a single repository.
By default, the release pipeline is invoked whenever the code repository changes. When GitHub is the pipeline source, CodePipeline uses a webhook to detect changes on the remote branch and start the pipeline. With a monorepo on GitHub, however, a change to any folder in the repository produces the same event: CodePipeline receives it at the repository level, regardless of which folder changed.
The above diagram describes how GitHub events invoke a service-specific pipeline in a monorepo, with a Lambda function evaluating the triggered event.
- Add customizations to start pipelines based on external factors: Custom code can decide whether a pipeline should be triggered, so we can implement our own logic for starting pipelines.
- Have multiple pipelines with a single source: When multiple pipelines use a single repository, changes can be routed to only the relevant pipelines.
- Avoid reacting to unimportant files: When a changed file does not affect the application, triggering the pipeline can be skipped.
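The routing behaviors above can be sketched as a small function that maps changed file paths to the pipelines that should run. This is a minimal illustration; the folder-to-pipeline mapping and ignore patterns are hypothetical examples, not values from the article:

```python
from fnmatch import fnmatch

# Hypothetical mapping from monorepo folders to pipeline names.
FOLDER_TO_PIPELINE = {
    "frontend/": "frontend-pipeline",
    "backend/": "backend-pipeline",
}

# Changes matching these patterns should not trigger any pipeline.
IGNORE_PATTERNS = ["*.md", "docs/*"]

def pipelines_to_start(changed_files):
    """Return the set of pipelines affected by the changed file paths."""
    pipelines = set()
    for path in changed_files:
        if any(fnmatch(path, pattern) for pattern in IGNORE_PATTERNS):
            continue  # unimportant file: do not react
        for folder, pipeline in FOLDER_TO_PIPELINE.items():
            if path.startswith(folder):
                pipelines.add(pipeline)
    return pipelines
```

A push touching only `README.md` yields an empty set (no pipeline runs), while a push touching both `frontend/` and `backend/` starts both pipelines.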
Sample Architecture Diagram
The above figure describes the following steps:
- Changes to the source code are committed and pushed to the GitHub repo.
- GitHub's webhook fires on the push event.
- API Gateway authenticates the GitHub webhook push event and then invokes the Lambda function.
- Once invoked, the Lambda function reads the configuration files stored in the S3 bucket.
- The Lambda function starts the appropriate CodePipeline.
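The configuration file stored in S3 could, for example, take a shape like the following. This is a hypothetical schema for illustration; the article does not prescribe a format. Each rule maps a folder in the monorepo to the pipeline that should be started, alongside patterns to ignore:

```json
{
  "ignore": ["*.md", "docs/*"],
  "rules": [
    { "folder": "frontend/", "pipeline": "frontend-pipeline" },
    { "folder": "backend/",  "pipeline": "backend-pipeline" }
  ]
}
```

Keeping this mapping in S3 rather than hard-coding it in the Lambda means new services can be onboarded by editing a file instead of redeploying the function.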
The solution consists of the following parts:
- Amazon API Gateway: Receives and authenticates the GitHub webhook push event and forwards it to the Lambda function, which evaluates the event and starts the pipeline.
- Amazon S3: The S3 bucket stores the pipeline-specific configuration files.
- AWS CodeBuild: Runs the build stage of the pipeline.
CodePipeline automatically triggers the pipeline to release the latest version of the source code. We can also run the latest version manually by choosing Release change in the console.
Creating the Lambda Function
The Lambda function is responsible for authenticating and evaluating the events. From the GitHub event payload it determines which files were added, changed, or deleted, and performs the appropriate action:
- Start a single pipeline, based on which folder changed in GitHub.
- Start multiple pipelines, when the change affects more than one project.
- Ignore the change, when only non-relevant files were modified.
S3 stores the project configuration details; the Lambda function reads the configuration file and decides which action to take when a file path from the GitHub event matches a configured rule.
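A sketch of how such a handler might pull the changed file paths out of a GitHub push payload. The `commits[].added/modified/removed` fields are part of GitHub's push event payload; the inline folder-to-pipeline mapping and pipeline names are hypothetical, and in the article's design this configuration would be read from S3 instead:

```python
import json

def changed_files_from_push(payload):
    """Collect every file added, modified, or removed across the pushed commits."""
    files = set()
    for commit in payload.get("commits", []):
        for key in ("added", "modified", "removed"):
            files.update(commit.get(key, []))
    return files

def handler(event, context):
    # API Gateway delivers the webhook body as a JSON string.
    payload = json.loads(event["body"])
    changed = changed_files_from_push(payload)

    # Hypothetical mapping; per the article, load this from the S3 bucket.
    rules = {"frontend/": "frontend-pipeline", "backend/": "backend-pipeline"}
    to_start = {pipeline
                for path in changed
                for folder, pipeline in rules.items()
                if path.startswith(folder)}

    # boto3 is available in the Lambda runtime; start each matched pipeline.
    import boto3
    codepipeline = boto3.client("codepipeline")
    for name in to_start:
        codepipeline.start_pipeline_execution(name=name)

    return {"statusCode": 200, "body": json.dumps(sorted(to_start))}
```

If `to_start` is empty (only non-relevant files changed), the handler simply returns without starting anything, which implements the "ignore the change" case.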
Creating the GitHub Webhook
Webhooks allow external services to be notified of certain events in GitHub. Here, a webhook is created for push events: whenever files are committed and pushed to the repository, GitHub sends a POST request to the specified URL.
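GitHub signs each webhook delivery with the shared webhook secret and sends the result in the `X-Hub-Signature-256` header, which is how the backend can authenticate the push event. A minimal verification sketch for the receiving side (the function name is our own):

```python
import hashlib
import hmac

def verify_github_signature(secret: bytes, body: bytes, signature_header: str) -> bool:
    """Check a GitHub X-Hub-Signature-256 header against the raw request body."""
    expected = "sha256=" + hmac.new(secret, body, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences.
    return hmac.compare_digest(expected, signature_header)
```

Requests whose signature does not match should be rejected before any pipeline logic runs, so that only genuine GitHub events can trigger a release.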
In the above example, two pipelines monitor the same GitHub source code repository. Based on the GitHub event, the Lambda function decides which pipeline to run and ignores unimportant files. Together, S3, API Gateway, and Lambda serve as the logic that invokes the pipelines.
CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner and a Microsoft Gold Partner, helping people develop cloud knowledge and helping businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.
Drop a query if you have any questions regarding GitHub integration, AWS CodePipeline, or DevOps best practices, and I will get back to you quickly. To get started, go through our Expert Advisory page and Managed Services Package, CloudThat's offerings.
Can multiple projects be put in a single repository?
Yes. Multiple projects can be put in a single repository, typically in separate folders, as in the monorepo pattern described above.
When to use monorepo and when multi-repo?
A monorepo works well when the whole team benefits from seeing and sharing every change to the codebase in one place. A multi-repo setup creates a separate repository for each team or project, which gives teams more isolation and independent release cycles.