In the world of software development, code review and approval are important processes to ensure the quality, security, and functionality of the software being developed. However, managers charged with overseeing these critical processes often face numerous challenges, including:
- Lack of technical experience – Managers may not have deep technical knowledge of the programming language used or may not have been involved in software engineering for an extended period. This results in a knowledge gap that can make it difficult for them to accurately assess the impact and robustness of proposed code changes.
- Time limitations – Code review and approval can be a time-consuming process, especially on larger or complex projects. Managers must balance the thoroughness of the review and the pressure to meet project schedules.
- Change request volume – Dealing with a high volume of change requests is a common challenge for managers, especially if they oversee multiple teams and projects. Similar to the time constraint challenge, managers must be able to handle those requests efficiently so as not to slow down project progress.
- Manual effort – Code review requires manual effort on the part of managers, and the lack of automation can make it difficult to scale the process.
- Documentation – Proper documentation of the code review and approval process is important for transparency and accountability.
With the rise of generative artificial intelligence (AI), managers can now take advantage of this transformative technology and integrate it with AWS tools and services to streamline the review and approval process in a way that was not previously possible. In this post, we explore a solution that offers an integrated end-to-end deployment workflow, incorporating automated change analysis and summarization along with approval workflow functionality. We use Amazon Bedrock, a fully managed service that makes foundation models (FMs) from leading AI startups and Amazon available via an API, so you can choose from a wide range of FMs to find the model that best suits your use case. With the Amazon Bedrock serverless experience, you can get started quickly, privately customize FMs with your own data, and integrate and deploy them into your applications using AWS tools without having to manage any infrastructure.
Solution overview
The following diagram illustrates the architecture of the solution.
The workflow consists of the following steps:
- A developer submits new code changes to their code repository (such as AWS CodeCommit), which automatically triggers the start of an AWS CodePipeline deployment.
- The application code goes through the build process, vulnerability scanning, and unit testing using your preferred tools.
- AWS CodeBuild pulls the repository and runs a git show command to extract the code differences between the current commit and the previous commit. This produces line-by-line output indicating the code changes made in this version.
- CodeBuild saves the result to an Amazon DynamoDB table with additional reference information:
- CodePipeline Execution ID
- AWS Region
- CodePipeline Name
- CodeBuild build number
- Date and Time
- State
- Amazon DynamoDB Streams captures data modifications made to the table.
- The DynamoDB stream triggers an AWS Lambda function to process the captured log.
- The function invokes the Anthropic Claude v2 model in Amazon Bedrock through the InvokeModel API. The code differences, along with a prompt, are provided as input to the model for analysis, and a summary of the code changes is returned as output.
- The model output is saved back to the same DynamoDB table.
- The administrator is notified through Amazon Simple Email Service (Amazon SES) that the summary of the code changes is available and that their approval is required for the deployment.
- The administrator reviews the email and provides their decision (approve or reject) along with review comments through the CodePipeline console.
- Amazon EventBridge captures the approval decision and review comments, triggering a Lambda function to save them back to DynamoDB.
- If approved, the pipeline deploys the application code using your preferred tools. If rejected, the workflow ends and the deployment does not continue.
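As an illustration of the model invocation step above, the Lambda function's call to Amazon Bedrock can be sketched as follows. This is a minimal sketch, not the solution's actual code: the function names and the prompt wording are assumptions (the post's actual prompt is shown at the end of the walkthrough), and error handling is omitted. It assumes the Claude v2 text-completion request format, which frames the prompt in Human/Assistant turns and caps the response with max_tokens_to_sample.

```python
import json


def build_prompt(code_diff):
    # Claude v2 text completions expect a "\n\nHuman: ... \n\nAssistant:" frame.
    # The wording here is a placeholder, not the post's actual prompt.
    return (
        "\n\nHuman: Summarize the following code changes for a reviewer:\n"
        f"{code_diff}\n\nAssistant:"
    )


def summarize_diff(code_diff, client=None):
    if client is None:
        # Deferred import so the sketch can be exercised without AWS credentials.
        import boto3
        client = boto3.client("bedrock-runtime")
    response = client.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({
            "prompt": build_prompt(code_diff),
            "max_tokens_to_sample": 500,
        }),
    )
    # The response body is a stream containing JSON with a "completion" field.
    return json.loads(response["body"].read())["completion"]
```

The summary string returned here is what gets written back to the DynamoDB table and emailed to the approver.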
In the following sections, you will deploy the solution and verify the end-to-end workflow.
Prerequisites
To follow the instructions in this solution, you need the following prerequisites:
Deploy the solution
To deploy the solution, complete the following steps:
- Choose Launch Stack to launch a CloudFormation stack in the us-east-1 Region.
- For Email address, enter an email address that you have access to. The summary of the code changes will be sent to this email address.
- For Model ID, leave the default anthropic.claude-v2, which is the Anthropic Claude v2 model.
Deploying the template will take about 4 minutes.
- When you receive an email from Amazon SES to verify your email address, choose the link provided to authorize your email address.
- You will receive an email titled “Summary of Changes” for the initial commit of the sample repository to CodeCommit.
- In the AWS CloudFormation console, navigate to the Outputs tab of the deployed stack.
- Copy the value of RepoCloneURL. You need it to access the sample code repository.
Test the solution
You can test the end-to-end workflow by assuming the role of a developer and pushing some code changes. A set of sample code has been prepared in CodeCommit. To access the CodeCommit repository, enter the following commands in your IDE:
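The exact commands are not reproduced here; as a sketch, the typical HTTPS setup for a CodeCommit repository looks like the following, assuming the AWS CLI is installed and configured. Replace <RepoCloneURL> with the value copied from the stack's Outputs tab.

```shell
# Use the AWS CLI credential helper for CodeCommit HTTPS authentication
git config --global credential.helper '!aws codecommit credential-helper $@'
git config --global credential.UseHttpPath true

# Clone the sample repository created by the stack
git clone <RepoCloneURL>
```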
You'll find the following directory structure for an AWS Cloud Development Kit (AWS CDK) application that creates a Lambda function to perform bubble sort on a string of integers. The Lambda function can be accessed through a publicly available URL.
Make the following three changes to the application code.
- To enhance the function to support both the quick sort and bubble sort algorithms, take a parameter to allow selection of which algorithm to use, and return both the algorithm used and the sorted array in the output, replace all contents of
lambda/index.py
with the following code:
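The repository's actual file is not reproduced here; the following is a minimal sketch of what lambda/index.py might contain after the change. It assumes the function URL passes the input as numbers and algorithm query string parameters; those parameter names, and the overall shape of the handler, are assumptions for illustration.

```python
import json


def bubble_sort(nums):
    # Classic in-place bubble sort over a copy of the input.
    arr = list(nums)
    for i in range(len(arr)):
        for j in range(len(arr) - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr


def quick_sort(nums):
    # Simple recursive quick sort using the first element as the pivot.
    if len(nums) <= 1:
        return list(nums)
    pivot, rest = nums[0], nums[1:]
    return (quick_sort([n for n in rest if n < pivot])
            + [pivot]
            + quick_sort([n for n in rest if n >= pivot]))


def handler(event, context):
    # Parse a comma-separated string of integers and an optional
    # "algorithm" selector ("bubble" by default, or "quick").
    params = event.get("queryStringParameters") or {}
    nums = [int(n) for n in params.get("numbers", "").split(",") if n]
    algorithm = params.get("algorithm", "bubble")
    sorter = quick_sort if algorithm == "quick" else bubble_sort
    return {
        "statusCode": 200,
        "body": json.dumps({"algorithm": algorithm, "sorted": sorter(nums)}),
    }
```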
- To reduce the function timeout setting from 10 minutes to 5 seconds (because we don't expect the function to run more than a few seconds), update line 47 in
my_sample_project/my_sample_project_stack.py
as follows:
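The post's original snippet is not included here; as a hypothetical illustration, assuming the stack sets the function timeout with the aws_cdk Duration class, the updated argument might look like this:

```python
# Hypothetical: reduce the Lambda timeout from 10 minutes to 5 seconds
timeout=Duration.seconds(5),
```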
- To restrict the invocation of the function using IAM for greater security, update line 56 in
my_sample_project/my_sample_project_stack.py
as follows:
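Again as a hypothetical illustration, assuming the stack exposes the function through a Lambda function URL defined with the aws_cdk.aws_lambda module, the updated argument might look like this:

```python
# Hypothetical: require AWS IAM authentication for the function URL
auth_type=_lambda.FunctionUrlAuthType.AWS_IAM,
```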
- Push the code changes by entering the following commands:
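The commands are not reproduced here; a typical sequence would look like the following (the commit message is illustrative):

```shell
git add -A
git commit -m "Add quick sort, reduce timeout, require IAM auth for function URL"
git push
```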
This starts the CodePipeline deployment workflow from steps 1 to 9, as described in the solution overview. When invoking the Amazon Bedrock model, we provide the following prompt: