IBM Cloud Patterns

Continuous Integration with Jenkins

Use Jenkins or Tekton to automate your continuous integration build process for your code

In IBM Garage Method, one of the Develop practices is continuous integration. The Developer Environment uses a Jenkins or Tekton pipeline to automate continuous integration.

What is continuous integration

Continuous integration is a software development practice in which software is built and tested regularly by the team in an automated fashion. This quote helps explain it:

Continuous Integration is a software development practice where members of a team integrate their work frequently, usually each person integrates at least daily - leading to multiple integrations per day. Each integration is verified by an automated build (including test) to detect integration errors as quickly as possible. Many teams find that this approach leads to significantly reduced integration problems and allows a team to develop cohesive software more rapidly.

– Martin Fowler

What is Jenkins

Jenkins is a self-contained, open source automation server that can be used to automate all sorts of tasks related to building, testing, and delivering or deploying software. It is a perfect tool for helping manage continuous integration tasks for a wide range of software components.

Jenkins Pipeline (or simply “Pipeline”) is a suite of plugins that supports implementing and integrating continuous delivery pipelines into Jenkins.

A continuous delivery pipeline is an automated expression of your process for getting software from version control right through to your users and customers.

Jenkins Pipeline provides an extensible set of tools for modeling simple-to-complex delivery pipelines “as code.” The definition of a Jenkins Pipeline is typically written into a text file (called a Jenkinsfile) that in turn is checked into a project’s source control repository.
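As a sketch, a minimal declarative Jenkinsfile might look like the following. The stage names and shell commands here are illustrative placeholders, not the ones used by the Code Patterns:

```groovy
// Jenkinsfile - minimal declarative pipeline (illustrative sketch)
pipeline {
    agent any                     // run on any available build agent
    stages {
        stage('Build') {
            steps {
                sh './mvnw package'   // hypothetical build command
            }
        }
        stage('Test') {
            steps {
                sh './mvnw test'      // hypothetical test command
            }
        }
    }
}
```

Because the Jenkinsfile lives in source control alongside the application, changes to the pipeline are versioned and reviewed like any other code change.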

Pipelines

Pipelines chain together a set of stages or steps to automate parts of the software delivery process. This automation can be tailored to the specific project requirements.

You can read more about Jenkins Pipelines in the Jenkins documentation.

Stages

Pipelines are defined in a Jenkinsfile that sits in the root of your application code and defines a number of stages. Each of the Code Patterns includes a Jenkinsfile whose stages are configured to build, test, package, and deploy the application code. Each stage can use the secrets and config maps that were previously configured during the Development cluster setup.

Developer Tools Pipeline

To keep applications compatible with both Kubernetes and OpenShift, the same Jenkinsfile is used when registering pipelines on either platform. The Docker images are also built from UBI base images so that the resulting containers can run on both platforms.

These are the stages in the pipeline and a description of what each stage does. Some stages are required; the optional stages can be deleted, or are ignored if the tool supporting the stage is not installed. Together they represent a typical production pipeline flow for a cloud-native application.

  • Setup: Clones the code into the pipeline
  • Build: Runs the build commands for the code
  • Test: Validates the unit tests for the code
  • Publish pacts: Publishes any pact contracts that have been defined
  • Sonar scan: Runs a sonar code scan of the source code and publishes the results to SonarQube
  • Verify environment: Verifies that the OpenShift or IKS environment is configured correctly
  • Build image: Builds the code into a Docker image and stores it in the IBM Cloud Image registry
  • Deploy to DEV env: Deploys the tagged Docker image to the dev namespace using its Helm chart
  • Health Check: Validates the health endpoint of the deployed application
  • Package Helm Chart: Stores the tagged version of the Helm chart in Artifactory
  • Trigger CD Pipeline: A GitOps stage that updates the build number in the designated Git repo and triggers ArgoCD to deploy to the test environment
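As a rough sketch, the stages above could be expressed in a declarative Jenkinsfile along these lines. The stage bodies, commands, and environment variables (IMAGE, APP, APP_URL) are placeholders, not the actual implementation shipped with the Code Patterns:

```groovy
// Illustrative skeleton only - the real Jenkinsfile in each Code Pattern
// reads its configuration from the secrets and configMaps in the namespace.
pipeline {
    agent any
    stages {
        stage('Setup')             { steps { checkout scm } }                  // clone the code into the pipeline
        stage('Build')             { steps { sh 'npm ci && npm run build' } }  // build command varies by pattern
        stage('Test')              { steps { sh 'npm test' } }                 // run the unit tests
        stage('Build image')       { steps { sh 'docker build -t "$IMAGE:$BUILD_NUMBER" .' } }
        stage('Deploy to DEV env') { steps { sh 'helm upgrade --install "$APP" chart/ -n dev' } }
        stage('Health Check')      { steps { sh 'curl -fsS "$APP_URL/health"' } }  // hypothetical health endpoint
    }
}
```

Optional stages such as the Sonar scan or pact publishing would be added as further `stage` blocks in the same way, each guarded by whether the supporting tool is installed.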

Registering Pipelines

  • The Code Patterns are a good place to start to see how the Jenkinsfile and Dockerfile should be configured for use in a Jenkins CI pipeline. To register your Git repo, use the CLI. The CLI adds a new command, kubectl pipeline, which automates a number of manual steps you would otherwise perform in Jenkins, including managing secrets, webhooks, and pipeline registration.

  • By default, the pipeline will register into the current namespace and will copy all the configMaps and secrets from the tools namespace to this namespace. This means the pipeline can execute, knowing it has access to the key information that enables it to integrate with both the cloud platform and the various development tools.

Registering a Pipeline in a new namespace

  • You can use any namespace you want to register a pipeline. The kubectl sync command creates a new namespace for your team, copies the necessary secrets and configMaps into it, and configures the build agent pods to run in that namespace.

    kubectl sync dev-team-one --dev
  • You can support multiple squads, teams, pairs, or students working in the same Development cluster by giving each their own namespace to work in for CI activities.

  • Create a template app and clone it to your Cloud Shell environment

  • Then register the code with the Jenkins environment using the following command

    kubectl pipeline
  • You will be prompted for your GitHub personal access token

  • The command completes by registering the application code as a Jenkins pipeline

Continuous deployment

In addition to continuous integration, the Developer Environment also supports continuous delivery using Artifactory and ArgoCD.

Running Application

Once the Jenkins or Tekton pipeline has completed successfully, you can validate that your app has been deployed.

  • Open the Kubernetes Dashboard or OpenShift Console and select the {new-namespace} project

  • Get the hostname for the application from the ingress

    kubectl endpoints -n {new-namespace}
  • You can use the oc command to get the endpoints of the deployed application

    oc endpoints -n {new-project}
  • Open the application URL in your browser for testing

  • For more information read the Pipeline Test Instructions

Once you become familiar with deploying code into OpenShift using Tekton, read about how you can manage code deployment with Continuous Delivery using ArgoCD and Artifactory.