Building Secure CI/CD Pipelines for DoD Applications

The Department of War has embraced DevSecOps as the standard approach for developing and delivering software to warfighters. But building a CI/CD pipeline that satisfies both engineering best practices and the rigorous security requirements of DoD environments is a fundamentally different challenge from standing up a pipeline for a commercial SaaS application. This article walks through the essential components of a secure CI/CD pipeline for DoD applications, the security tooling that must be integrated at each stage, and the Authority to Operate (ATO) considerations that shape pipeline architecture.

The DevSecOps Imperative in Defense

Traditional software delivery in defense environments followed a waterfall model: long development cycles, monolithic releases, and security assessments conducted at the end of the process. This approach produced software that was often outdated by the time it reached operations, and security vulnerabilities discovered late in the cycle caused costly rework and schedule delays.

The DoD Enterprise DevSecOps Reference Design, published by the Department of War’s Chief Information Officer, established a new paradigm. DevSecOps integrates security into every phase of the software delivery lifecycle — from code commit through production deployment. The goal is continuous delivery of secure, mission-capable software at the speed of relevance.

Pipeline Architecture: Core Components

A DoD-compliant CI/CD pipeline typically includes these core components, each serving a specific function in the secure delivery chain.

Source Code Management (Git): All code, infrastructure definitions, and configuration must reside in a version-controlled repository. GitLab is the most widely adopted SCM platform in DoD DevSecOps environments, though GitHub Enterprise and Bitbucket are also used. Repository access must be controlled through role-based permissions, and all commits should be signed to ensure code provenance. Branch protection rules enforce code review requirements before merges to protected branches.
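To make the signing requirement concrete, here is a minimal sketch of how a pipeline stage might flag unsigned commits. It parses output in the shape produced by `git log --format="%H %G?"`, where git's `%G?` format specifier prints `N` for an unsigned commit; the log text is embedded as a sample rather than fetched from a real repository.

```python
# Minimal sketch: flag unsigned commits from `git log --format="%H %G?"` output.
# In a real pipeline this text would come from something like:
#   git log --format="%H %G?" origin/main..HEAD
# Here it is an embedded sample for illustration.

def unsigned_commits(log_output: str) -> list[str]:
    """Return commit hashes whose signature status is 'N' (no signature)."""
    bad = []
    for line in log_output.strip().splitlines():
        commit, status = line.split()
        if status == "N":  # %G? prints N when the commit carries no signature
            bad.append(commit)
    return bad

sample = """\
a1b2c3d G
d4e5f6a G
0f9e8d7 N
"""

print(unsigned_commits(sample))  # ['0f9e8d7']
```

A stage like this would exit nonzero when the returned list is non-empty, blocking the merge until the offending commits are re-signed.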

Build Automation (Jenkins/GitLab CI): Automated build systems compile source code, execute unit tests, and produce deployable artifacts. Jenkins remains a workhorse in many DoD environments due to its extensive plugin ecosystem and flexibility. GitLab CI/CD offers a more integrated experience for teams already using GitLab for source control. Build agents must run on hardened infrastructure and produce reproducible builds to support audit requirements.
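Reproducibility is auditable precisely because two independent builds of the same commit can be compared byte for byte. A minimal sketch of that comparison, using SHA-256 digests over artifact bytes (the byte strings below stand in for real build outputs):

```python
import hashlib

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of an artifact's bytes."""
    return hashlib.sha256(data).hexdigest()

def builds_reproducible(artifact_a: bytes, artifact_b: bytes) -> bool:
    """Two builds are reproducible if their artifacts hash identically."""
    return sha256_digest(artifact_a) == sha256_digest(artifact_b)

# Illustrative byte strings standing in for real build artifacts:
print(builds_reproducible(b"app-1.0.jar-bytes", b"app-1.0.jar-bytes"))  # True
print(builds_reproducible(b"app-1.0.jar-bytes", b"app-1.0-rebuilt"))    # False
```

In practice the digest of each release artifact would also be recorded alongside the build metadata, so an auditor can later verify that the deployed binary matches what the pipeline produced.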

Static Application Security Testing (SAST): Tools like SonarQube and Fortify analyze source code for security vulnerabilities, code quality issues, and compliance violations before the code is compiled. SAST scans should execute automatically on every commit or merge request, and findings above defined severity thresholds should block the pipeline. SonarQube’s quality gates provide a configurable mechanism for enforcing code quality standards.
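A pipeline stage can enforce the quality gate by querying SonarQube after analysis completes. The sketch below parses a response shaped like SonarQube's `api/qualitygates/project_status` endpoint; the exact field names are an assumption based on that API and should be verified against your SonarQube version, and the sample response is embedded rather than fetched.

```python
import json

# Sample shaped like SonarQube's quality-gate API response
# (GET /api/qualitygates/project_status?projectKey=...); fields here
# are illustrative -- confirm against your SonarQube version's docs.
sample_response = json.dumps({
    "projectStatus": {
        "status": "ERROR",
        "conditions": [
            {"metricKey": "new_security_rating", "status": "ERROR"},
            {"metricKey": "new_coverage", "status": "OK"},
        ],
    }
})

def quality_gate_passed(response_body: str) -> bool:
    """The pipeline stage passes only when the overall gate status is OK."""
    return json.loads(response_body)["projectStatus"]["status"] == "OK"

def failed_conditions(response_body: str) -> list[str]:
    """Metric keys that caused the gate to fail, for the build log."""
    conditions = json.loads(response_body)["projectStatus"]["conditions"]
    return [c["metricKey"] for c in conditions if c["status"] == "ERROR"]

print(quality_gate_passed(sample_response))  # False
print(failed_conditions(sample_response))    # ['new_security_rating']
```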

Dynamic Application Security Testing (DAST): OWASP ZAP, Burp Suite, and similar tools test running applications for vulnerabilities that SAST cannot detect — such as injection flaws, authentication weaknesses, and session management issues. DAST scans should run against deployed instances in staging environments before promotion to production.
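A promotion gate on DAST findings can be as simple as filtering the scan report by risk level. The sketch below assumes a report shaped like OWASP ZAP's traditional JSON report (a `site` list containing `alerts` with a `riskcode` field, where 3 denotes High); that layout is an assumption to verify against the report your ZAP version actually emits, and the report is embedded as a sample.

```python
import json

# Sample shaped like OWASP ZAP's JSON report; the "site"/"alerts"/"riskcode"
# layout is an assumption -- verify against your ZAP version's output.
sample_report = json.dumps({
    "site": [{
        "@name": "https://staging.example.mil",
        "alerts": [
            {"name": "SQL Injection", "riskcode": "3"},
            {"name": "X-Content-Type-Options Header Missing", "riskcode": "1"},
        ],
    }]
})

def high_risk_alerts(report_body: str, threshold: int = 3) -> list[str]:
    """Return alert names at or above the risk threshold (3 = High)."""
    findings = []
    for site in json.loads(report_body)["site"]:
        for alert in site["alerts"]:
            if int(alert["riskcode"]) >= threshold:
                findings.append(alert["name"])
    return findings

print(high_risk_alerts(sample_report))  # ['SQL Injection']
```

Any non-empty result would block promotion from staging until the findings are remediated or formally accepted.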

Container Security in the Pipeline

Containerized deployments are the norm in modern DoD applications, and container security must be baked into the pipeline at multiple points.

Base Image Management: The DoD maintains hardened container base images through Iron Bank, a repository of DoD-approved container images that have undergone rigorous security scanning and hardening. Pipelines should pull base images exclusively from Iron Bank or equivalent approved registries. Using unapproved base images introduces unvetted code into the software supply chain.

Container Scanning: Tools such as Twistlock (now Prisma Cloud), Anchore, and Trivy scan container images for known vulnerabilities in operating system packages and application dependencies. Container scans should execute after image build and before the image is pushed to the production registry. Images with critical or high-severity vulnerabilities must be remediated before deployment.
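The remediation gate described above can be sketched as a small check over the scanner's JSON output. The sample below follows the shape of Trivy's report (`trivy image --format json ...`), with `Results` entries containing `Vulnerabilities`; field names should still be confirmed against your Trivy version, and the scan output is embedded for illustration.

```python
import json

# Sample shaped like Trivy's JSON output (trivy image --format json ...);
# field names follow Trivy's report schema -- verify against your version.
sample_scan = json.dumps({
    "Results": [{
        "Target": "registry.example.mil/app:1.0",
        "Vulnerabilities": [
            {"VulnerabilityID": "CVE-2024-0001", "Severity": "CRITICAL"},
            {"VulnerabilityID": "CVE-2024-0002", "Severity": "MEDIUM"},
        ],
    }]
})

BLOCKING_SEVERITIES = {"CRITICAL", "HIGH"}

def blocking_vulnerabilities(scan_body: str) -> list[str]:
    """CVE IDs that should block the image from reaching the registry."""
    blocked = []
    for result in json.loads(scan_body)["Results"]:
        for vuln in result.get("Vulnerabilities") or []:
            if vuln["Severity"] in BLOCKING_SEVERITIES:
                blocked.append(vuln["VulnerabilityID"])
    return blocked

print(blocking_vulnerabilities(sample_scan))  # ['CVE-2024-0001']
```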

Runtime Security: Container runtime security tools monitor deployed containers for anomalous behavior — unexpected network connections, privilege escalation attempts, or file system modifications. Falco and Sysdig are commonly deployed in DoD Kubernetes environments for this purpose.

Infrastructure as Code and Configuration Management

Infrastructure definitions (Terraform, CloudFormation) and configuration management (Ansible, Puppet) must be treated as code — version controlled, peer reviewed, and tested. Security scanning tools such as Checkov and tfsec analyze infrastructure-as-code templates for misconfigurations, such as overly permissive security groups, unencrypted storage volumes, or missing logging configurations. These scans should be integrated into the same pipeline that processes application code.
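Integrating IaC scanning into the same pipeline means the build fails when a template check fails. The sketch below gates on output shaped like Checkov's JSON report (`checkov -d . -o json`); the `results`/`failed_checks`/`summary` layout is an assumption to confirm against your Checkov version, and the output is embedded as a sample with an illustrative check ID.

```python
import json

# Sample shaped like Checkov's JSON output (checkov -d . -o json); the
# "results"/"failed_checks" layout is an assumption -- confirm against
# the output of your Checkov version.
sample_output = json.dumps({
    "results": {
        "failed_checks": [
            {"check_id": "CKV_AWS_24", "resource": "aws_security_group.ssh_open"},
        ],
        "passed_checks": [
            {"check_id": "CKV_AWS_3", "resource": "aws_ebs_volume.data"},
        ],
    },
    "summary": {"passed": 1, "failed": 1},
})

def iac_scan_passed(output_body: str) -> bool:
    """Fail the pipeline if any infrastructure-as-code check failed."""
    return json.loads(output_body)["summary"]["failed"] == 0

def failed_check_ids(output_body: str) -> list[str]:
    """Check IDs of failed policies, for the build log."""
    return [c["check_id"]
            for c in json.loads(output_body)["results"]["failed_checks"]]

print(iac_scan_passed(sample_output))   # False
print(failed_check_ids(sample_output))  # ['CKV_AWS_24']
```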

ATO Considerations for CI/CD

The pipeline itself is a system that requires authorization. Under the Risk Management Framework (RMF), the CI/CD pipeline and its components — Jenkins, GitLab, SonarQube, container registries, artifact repositories — must be included in the system’s security boundary and assessed against applicable NIST SP 800-53 controls.

The DoD’s Continuous Authority to Operate (cATO) model aligns naturally with DevSecOps pipelines. Rather than a point-in-time assessment, cATO relies on continuous monitoring, automated security testing, and real-time visibility into the security posture of deployed applications. A well-instrumented CI/CD pipeline provides much of the evidence needed to support a cATO by demonstrating that security controls are continuously enforced through automated testing.

Key ATO artifacts that the pipeline should generate automatically include Software Bill of Materials (SBOM), vulnerability scan results with remediation status, code quality and security scan reports, deployment audit logs, and configuration compliance assessments. Automating the generation of these artifacts reduces the administrative burden on development teams and accelerates the authorization process.
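To illustrate one of those artifacts, here is a minimal sketch of the shape of a CycloneDX-format SBOM. Real pipelines generate SBOMs with dedicated tools such as Syft or CycloneDX build plugins; this only shows the document structure, and the package names below are hypothetical.

```python
import json

# Minimal sketch of a CycloneDX-style SBOM assembled from a dependency
# list; real pipelines use tools like Syft or CycloneDX plugins. The
# package names here are hypothetical.
def build_sbom(dependencies: list[tuple[str, str]]) -> dict:
    """Assemble a minimal CycloneDX-format SBOM document."""
    return {
        "bomFormat": "CycloneDX",
        "specVersion": "1.5",
        "components": [
            {"type": "library", "name": name, "version": version}
            for name, version in dependencies
        ],
    }

sbom = build_sbom([("spring-core", "6.1.3"), ("log4j-core", "2.22.1")])
print(json.dumps(sbom, indent=2))
```

Because the SBOM is regenerated on every build, the authorization package always reflects the dependencies actually shipped rather than a stale, hand-maintained inventory.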

Zapata Technology’s DevSecOps Approach

Zapata Technology has designed and implemented secure CI/CD pipelines for multiple Department of War programs, integrating security tooling at every stage while maintaining the development velocity that mission owners demand. Our software engineering services include pipeline architecture, security toolchain integration, cATO support, and ongoing pipeline operations. We work within customer-accredited environments and tailor pipeline configurations to meet the specific security requirements of each program.

Conclusion

Building a secure CI/CD pipeline for DoD applications requires more than plugging security tools into a standard pipeline template. It demands a deliberate architecture that integrates security at every stage, generates the artifacts needed for authorization, and supports the continuous monitoring that modern ATO models require. Organizations that invest in building these capabilities early will deliver more secure software faster — and spend less time and money navigating the authorization process.

Frequently Asked Questions

What is the difference between DevOps and DevSecOps?

DevOps focuses on integrating software development and IT operations to improve delivery speed and reliability. DevSecOps extends this by embedding security practices and automated security testing at every stage of the pipeline — from code commit through production deployment. In Department of War environments, DevSecOps is not optional; it is a requirement for achieving and maintaining Authority to Operate (ATO). Zapata Technology’s software engineering services implement DevSecOps as a core practice.

Can CI/CD pipelines operate on classified networks?

Yes. CI/CD pipelines can and do operate on classified networks at IL4, IL5, and IL6. The key requirements are that all pipeline components — source control, build servers, artifact repositories, security scanners, and container registries — must be deployed within the accredited security boundary and meet applicable STIG hardening requirements. Air-gapped environments require special considerations for dependency management and tool updates, as external repositories are not accessible.

What tools are approved for DoD DevSecOps?

The DoD DevSecOps reference design includes tools such as GitLab for source control and CI/CD orchestration, SonarQube for static code analysis, Fortify for application security testing, Twistlock (Prisma Cloud) and Anchore for container scanning, and Iron Bank for hardened container base images. The specific toolset may vary by program and classification level. Zapata Technology helps programs select and integrate the right toolchain through our software engineering services.
