How to ensure Secure DevOps and good Software Quality

What is DevOps

I have seen many interpretations of the term DevOps. There are organizations where teams would never get access to the production environment. The work is split across different teams (and sometimes across different organizations): the development team creates a new release candidate, and the operations team deploys it to production. This is sometimes necessary in sensitive domains. But the disadvantage is that trouble management involves multiple teams, and they need to communicate with each other. The development team knows its software, while the operations team is experienced with the runtime environment. Finding problems and bugs therefore needs collaborative interaction between these teams.

Usually it’s more efficient when one cross-functional team takes responsibility for the whole product lifecycle, following the principle ‘you build it, you run it’. That does not mean every team member needs skills in every domain, but the team as a whole should have all the necessary skills. This has several advantages:

  • The team can learn from bugs, problems and incidents, which leads to a better design for the software.
  • The team can solve all issues regarding the product and its runtime environment without needing to coordinate with other teams.
  • The team is able to deliver without external dependencies.
  • The team is motivated to design a maintainable and stable product because they have to maintain it themselves.

The question is: how can a team take full responsibility for the software and for the whole runtime environment without the risk of issues and security vulnerabilities? Software infrastructure and software libraries are increasingly becoming targets for attacks. But there are many tools and methods that help to establish a secure software development process (also called Secure DevOps) and reduce the risk of potential security vulnerabilities.

In this post I want to share my experience of how to ensure Secure DevOps and good software quality.

Secure DevOps & Software Quality with Static Code Analysis

There are many reasons for static code analysis. Some of them are:

  • The code is formatted in the way the team agreed on
  • Some bugs can be identified by static code analysis
  • Complex and redundant hotspots are identified
  • Hard-to-read code and known anti-patterns are identified
  • Potential security vulnerabilities (for example hard-coded secrets) are identified
  • Architecture violations can be identified

A commonly used tool for static code analysis is SonarQube. It supports many languages and rulesets that come from a big community.
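
As a sketch of how such an analysis can be wired up: SonarQube is typically driven by a small configuration file that the sonar-scanner CLI reads. The project key, name, paths and server URL below are only example values.

```properties
# sonar-project.properties — read by the sonar-scanner CLI
# (project key, name, paths and URL are example values)
sonar.projectKey=my-product
sonar.projectName=My Product
sonar.sources=src
sonar.tests=test
# URL of the SonarQube server (here: a local instance)
sonar.host.url=http://localhost:9000
```

Running `sonar-scanner` in the project root then uploads the analysis results to the configured server.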

Measuring Test Coverage for good Software Quality

It’s clear that a good software product should have automated tests to ensure the software still works after changes and refactorings. Test coverage can be a good indicator to measure the amount of testing, but this metric should be used pragmatically. There are many kinds of coverage, such as path coverage, branch coverage and condition coverage, to name a few. In my opinion, it’s a waste of time to cover 100% of the source code. The product team should discuss which elements of a product need a high test coverage. Some indicators could be:

  • Complex hotspots
  • Important use cases
  • Cost of damage
  • Amount of changes
  • Security-sensitive parts like authentication and authorisation

SonarQube helps to measure the test coverage. With exclusion patterns it’s possible to exclude code that should not be considered. For example, it’s a waste of time to test accessor methods without logic, or generated code.
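
In SonarQube this kind of exclusion can be expressed with coverage exclusion patterns, for example in sonar-project.properties (the glob patterns below are illustrative):

```properties
# Exclude generated code and simple data classes from coverage measurement
# (the patterns are example values for a typical Java project)
sonar.coverage.exclusions=**/generated/**,**/*Dto.java
```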

Quality Gates

It’s possible to define quality gates. They contain rules that define whether the gate is passed or not. For example, critical bugs from the static code analysis, a bad security rating or a low test coverage can break the quality gate. If the deployment pipeline is configured with a build breaker, the pipeline fails if the quality gate is not passed.
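
One common way to implement such a build breaker is to let the scanner wait for the quality gate result and return a non-zero exit code if the gate is red; in recent SonarQube versions this is done with the `sonar.qualitygate.wait` parameter. The CI job below uses GitLab CI syntax as an assumed example setup:

```yaml
# Example CI job (GitLab CI syntax, assumed setup): the scanner waits for
# the quality gate result and exits non-zero if the gate is not passed,
# which breaks the pipeline step
sonar-analysis:
  image: sonarsource/sonar-scanner-cli
  script:
    - sonar-scanner -Dsonar.qualitygate.wait=true
```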

Dependency Check as part of Secure DevOps

Another big part of Secure DevOps is the verification of external libraries. More and more libraries contain security vulnerabilities that get exploited by attackers. One tool that helps here is Nexus IQ. It can be integrated into the build pipelines: during the build process the Nexus IQ auditor checks all libraries, creates a report for each build and displays license issues and security vulnerabilities. It also recommends newer or older versions of an affected library in which the vulnerability doesn’t exist. I would recommend configuring the pipeline in a way that it breaks in the case of critical findings.
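
A sketch of how such a check can run in a pipeline, assuming the Sonatype IQ CLI jar is available on the build agent (application ID, server URL, credentials and scan target are placeholders):

```yaml
# Example CI job (GitLab CI syntax): scan the build artifacts with the
# Sonatype/Nexus IQ CLI; the job fails when policy violations are found
dependency-check:
  script:
    - java -jar nexus-iq-cli.jar -s https://nexus-iq.example.com -a "$IQ_USER:$IQ_PASSWORD" -i my-product build/libs/
```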

Secure the Runtime Infrastructure as part of Secure DevOps

It’s really important to secure the runtime infrastructure. Especially in public cloud environments this is absolutely necessary. We are human, and everybody makes mistakes. But luckily there are also many ways to massively reduce the risk of potential security vulnerabilities.

There are two ways to achieve that.

  1. Static code analysis

Many infrastructure-as-code tools like Terraform and Pulumi provide the ability to test the infrastructure code. This can help to ensure a high level of security. For example, it can be verified that databases have correctly configured firewalls, that anonymous access is not enabled and that secure transport protocols are used. Many potential security risks can be covered by the corresponding tests.
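
As an illustration, the Terraform resource below (Azure provider, example names) shows the kind of settings such tests and static scanners typically verify; a scanner like tfsec or checkov would flag a resource that allows public access or weak transport security:

```hcl
# Example Azure SQL server (azurerm provider, illustrative names):
# infrastructure tests can assert on settings like these
resource "azurerm_mssql_server" "db" {
  name                          = "example-sqlserver"
  resource_group_name           = azurerm_resource_group.main.name
  location                      = azurerm_resource_group.main.location
  version                       = "12.0"
  administrator_login           = var.admin_login
  administrator_login_password  = var.admin_password
  minimum_tls_version           = "1.2"  # enforce secure transport
  public_network_access_enabled = false  # no anonymous/public access
}
```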

  2. Policies

Another way is to use the capabilities of the runtime infrastructure itself. For example, Azure provides the ability to define policies, and there are hundreds of built-in policies that can be configured. One policy can ensure that each database is protected by a firewall: if a user forgets to configure a firewall — whether via CLI, API or UI, or let’s say in any kind of way — the provisioning of the corresponding resource fails. As an alternative to the existing policies, it’s also possible to create individual custom policies.
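
A custom Azure policy definition is written as a JSON policy rule. The simplified sketch below (the alias is an assumption for illustration) would deny provisioning of SQL servers that allow public network access:

```json
{
  "if": {
    "allOf": [
      { "field": "type", "equals": "Microsoft.Sql/servers" },
      { "field": "Microsoft.Sql/servers/publicNetworkAccess", "notEquals": "Disabled" }
    ]
  },
  "then": { "effect": "deny" }
}
```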


There are many ways to ensure Secure DevOps and good software quality. It’s possible to analyze the code for potential bugs, maintainability issues and potential security vulnerabilities. The relevant modules should be covered by automated tests, and quality gates ensure that the agreed test coverage and software quality are achieved. It’s also possible to identify vulnerable libraries. DevOps is possible, and instead of splitting development and operations into two teams, it’s in most cases better to give the full responsibility to one cross-functional product team. Security and infrastructure specialists can use tools like Terraform or Azure Policies to make sure that the runtime infrastructure has a good security level. The risk of human mistakes can be reduced massively.

All these actions should be integrated into the pipeline. If problems occur, the pipeline should break to prevent the deployment of vulnerable software.

It should be clear that it’s impossible to reduce the risk of security problems to zero. But these actions can help to reduce the risk massively. For me this is the current baseline for a Secure DevOps environment, but there are many more tools and methods on the market. And this field is growing massively in the face of both potential threats and the tools for preventing them.