Understanding the target architecture of the DevSecOps pipeline – Implementing DevSecOps with AWS

Mike Naughton | August 9th, 2024


A pipeline definition in AWS CodePipeline consists of at least two stages, each containing one or more actions. We covered the constructs of AWS CodePipeline in detail in Chapter 5, Rolling Out a CI/CD Pipeline. Typical stages that come to mind when we think of the entire life cycle of software delivery are source, build, test, and deploy. In the exercise that follows, we will mainly focus on the first two stages and understand how DevSecOps practices can be introduced into a Docker image delivery workflow, as an example.

We developed and deployed a Python-based To-Do list manager application in the previous chapters. We will reuse the application code and the corresponding Dockerfile. Our main focus will be on introducing security assessments that can be integrated into the build phase of this application, leading to the creation of a secure Docker image. Figure 9.2 highlights the different types of security scans we will run:

Figure 9.2 – Different stages of the DevSecOps pipeline

Let’s begin by outlining the steps that are executed in each stage of the pipeline:

  1. New code changes: Almost every CI/CD pipeline has an automated trigger for an event of interest. In this case, we are interested in new commits being pushed by a developer into the CodeCommit repository that contains the application code.
  2. CodePipeline execution: Soon after a git commit is registered in the CodeCommit repository, the pipeline execution begins. As shown in the preceding diagram, Source is the first stage, and this is where CodePipeline pulls the most recent version of the code changes and makes them available in subsequent stages and actions.
  3. Application validation: The second stage in the pipeline defines the actions that assess the code and its dependencies for security vulnerabilities. This stage consists of two actions that execute in parallel:

I. Software Composition Analysis (SCA): This is a CodePipeline action within the application validation stage. We leverage CodeBuild to provide a temporary compute environment in which we install the Dependency-Check utility from OWASP. After installation, we validate the Python code dependencies defined in the requirements.txt file. If you have not worked with Python before, the requirements.txt file lists all the dependencies – that is, the libraries and packages used by a Python project.

Dependency-Check is an SCA tool that supports a variety of programming languages and detects publicly disclosed vulnerabilities in a project's dependencies. It does this by checking whether a Common Platform Enumeration (CPE) identifier exists for each dependency; when a match is found, it generates a comprehensive report containing links to the corresponding CVE entries.
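To make this more concrete, the following is a minimal CodeBuild buildspec sketch for the SCA action. It assumes a build image with Java available (Dependency-Check runs on the JVM); the tool version, project name, CVSS threshold, and report location are illustrative assumptions, not values taken from the book's code base:

```yaml
version: 0.2

phases:
  install:
    commands:
      # Download and unpack the Dependency-Check CLI (version is illustrative).
      - wget -q https://github.com/jeremylong/DependencyCheck/releases/download/v9.0.9/dependency-check-9.0.9-release.zip
      - unzip -q dependency-check-9.0.9-release.zip
  build:
    commands:
      # Scan the Python dependencies declared in requirements.txt.
      # --enableExperimental activates the Python (pip) analyzers, and
      # --failOnCVSS fails the build, and hence the pipeline action,
      # when a vulnerability at or above the given score is found.
      - >-
        ./dependency-check/bin/dependency-check.sh
        --project todo-app
        --scan requirements.txt
        --enableExperimental
        --failOnCVSS 7
        --format HTML
        --out reports

artifacts:
  files:
    - reports/**/*
```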

II. Static Application Security Testing (SAST): In this CodePipeline action, we use the Bandit utility, which scans Python code to bring security issues to the surface. Bandit creates an abstract syntax tree (AST) from all the Python code files it scans and then runs security validation plugins against this AST. The output is a report that highlights security issues, along with the suggested actions for the developers.
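A similarly small buildspec sketch for the SAST action could look like the following; the report file name is an assumption:

```yaml
version: 0.2

phases:
  install:
    commands:
      # Bandit is distributed on PyPI.
      - pip install bandit
  build:
    commands:
      # Recursively scan the application source. Bandit returns a non-zero
      # exit code when it reports issues, which fails this pipeline action.
      - bandit -r . -f html -o bandit-report.html

artifacts:
  files:
    - bandit-report.html
```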

  4. Dockerfile validation: This is the third stage of the pipeline and is triggered after the application code scans are green. As you might notice, we are following an inside-out principle: first scanning the application itself, and then tackling the abstractions on top of it, one level at a time. This also optimizes the usage of pipeline resources, since there is no point in testing the Docker image before ensuring a good security posture for the code that will be added to it.

For Dockerfile linting and security analysis, we make use of hadolint. In addition to providing basic linting capabilities, hadolint allows security professionals to enforce organization-specific policies. For example, you can define the trusted registries that base images must be pulled from, or suppress false positives. The full list of hadolint rules is available at https://github.com/hadolint/hadolint#rules.
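As an illustration, a .hadolint.yaml file along the following lines could sit next to the Dockerfile; the rule code and registry addresses are placeholders, not values from the book's repository:

```yaml
ignored:
  # DL3008 flags unpinned apt-get package versions; listing it here is an
  # example of suppressing a finding the team has reviewed and accepted.
  - DL3008

trustedRegistries:
  # Base images referenced in FROM instructions must come from these registries.
  - docker.io
  - "123456789012.dkr.ecr.us-east-1.amazonaws.com"
```

The CodeBuild action can then simply run hadolint Dockerfile; hadolint picks up this configuration from the working directory and exits with a non-zero status on violations, failing the stage.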

  5. Dockerfile build: This is the final stage of the pipeline, where a Docker image is built and then published to an Elastic Container Registry (ECR) repository for use in deployment activities.
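A buildspec sketch for this final stage could follow the usual ECR publishing pattern shown below. The account ID, Region, and repository name are placeholders, and the CodeBuild project is assumed to have privileged mode enabled so that Docker commands can run:

```yaml
version: 0.2

phases:
  pre_build:
    commands:
      # Authenticate the Docker client against the target ECR registry.
      - >-
        aws ecr get-login-password --region us-east-1 |
        docker login --username AWS --password-stdin
        123456789012.dkr.ecr.us-east-1.amazonaws.com
  build:
    commands:
      # Build the image from the validated Dockerfile.
      - docker build -t todo-app .
      # Tag it for ECR using the commit ID that CodeBuild exposes.
      - >-
        docker tag todo-app
        123456789012.dkr.ecr.us-east-1.amazonaws.com/todo-app:$CODEBUILD_RESOLVED_SOURCE_VERSION
  post_build:
    commands:
      # Publish the image so that downstream deployment stages can consume it.
      - docker push 123456789012.dkr.ecr.us-east-1.amazonaws.com/todo-app:$CODEBUILD_RESOLVED_SOURCE_VERSION
```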

The workflow we are building can easily be extended with deployment actions specific to your software. Ideally, you would add further stages that consume the artifacts (Docker images) produced by this pipeline and deploy them onto an environment that can run Docker containers, such as Elastic Container Service (ECS) or Elastic Kubernetes Service (EKS).

Having understood the overall flow and what we are trying to achieve with the DevSecOps pipeline, let’s outline the two components of the code base that we will be working with.
