CI/CD With Jenkins: A Gentle Introduction - Spot.io


What Is Jenkins? 

Jenkins is a powerful, open-source automation tool primarily used for continuous integration and continuous delivery (CI/CD). It is an independent community project, now hosted under the Continuous Delivery Foundation, with commercial distributions and support available from vendors such as CloudBees. It is written in Java, and its functionality can be extended through plugins. Jenkins allows developers to automate various stages of their build pipeline. It is a server-based system that runs in a servlet container, either its own embedded container or an external one such as Apache Tomcat.

In many organizations, Jenkins is an enabler of continuous integration and continuous delivery: 

  • Continuous integration is a practice that encourages developers to integrate their code into a shared repository several times a day. Each check-in is then verified by an automated build, allowing teams to detect problems early. 
  • Continuous delivery is a series of practices designed to ensure that code can be rapidly and safely deployed to production by delivering every change to a production-like environment and ensuring business applications and services function as expected through rigorous automated testing.


Why Use Jenkins for CI/CD 

Extensibility

One of the key reasons why Jenkins is a popular choice for CI/CD is its extensibility. Jenkins comes with a rich ecosystem of plugins – over 1,500 of them! These plugins help extend its functionality, allowing it to integrate with almost all common DevOps tools, including Docker, Git, and Maven.

The plugin architecture also allows for easy customization. If there isn’t a plugin that meets your specific need, you can write your own. This flexibility makes Jenkins a robust and adaptable tool, able to handle complex CI/CD pipelines.

Pipeline as Code

Another significant advantage of Jenkins is that it allows a delivery pipeline to be coded and versioned, just like the application that it builds.

With Jenkins Pipeline (a suite of plugins provided by the makers of Jenkins), the continuous delivery pipeline can be defined in a text file called a ‘Jenkinsfile.’ This file can be written in either of two Groovy-based syntaxes, Scripted or Declarative, and is version-controlled with the rest of the project source code.

This approach brings several benefits: the pipeline can be reviewed, iterated upon, and audited. Changes to the pipeline can be versioned and tested, and the pipeline can be shared across multiple projects.
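As a sketch of what such a versioned pipeline definition looks like, here is a minimal Jenkinsfile in the Scripted (Groovy) syntax; the stage names are illustrative:

```groovy
// Minimal Jenkinsfile in Scripted syntax -- lives at the root of the repository
node {
    stage('Checkout') {
        checkout scm   // check out the same repository that contains this Jenkinsfile
    }
    stage('Build') {
        echo 'Building..'
    }
}
```

Because this file is committed alongside the application code, a change to the pipeline goes through the same review and version-control workflow as any other change.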

Flexible Configuration

Jenkins can be configured via its web interface, which includes on-the-fly error checks and inline help. There’s no need to tweak XML manually (but it is possible for advanced users).

With its controller/agent architecture (historically called master/agent), Jenkins can also easily distribute work across multiple machines, helping drive builds, tests, and deployments across multiple platforms faster. This is particularly beneficial for large projects or projects that need to be tested in different environments.
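As a hedged sketch, a Declarative Jenkinsfile can pin individual stages to differently labeled agents; the labels 'linux' and 'windows' below are assumptions and must match labels configured on your own agents:

```groovy
pipeline {
    agent none                         // no global agent; each stage selects its own
    stages {
        stage('Build on Linux') {
            agent { label 'linux' }    // runs on any agent labeled 'linux'
            steps { echo 'Building on a Linux agent..' }
        }
        stage('Test on Windows') {
            agent { label 'windows' }  // runs on any agent labeled 'windows'
            steps { echo 'Testing on a Windows agent..' }
        }
    }
}
```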

Mature Project

Lastly, Jenkins is a mature project. It began life as Hudson in 2004 and was renamed Jenkins in 2011, when the community forked the project away from Oracle. Over the years, it has been heavily tested and has proven its reliability and stability in numerous production environments.

Its maturity also means it has a vast and active community of users and contributors who can provide support, share experiences and best practices. This not only helps in resolving any issues that you may come across, but it also means that the project is continually evolving and improving.

Learn more in our detailed guide to Azure DevOps pipeline

Creating a CI/CD Pipeline Using Jenkins 

We’ll review the general steps involved in creating a CI/CD pipeline with Jenkins. We’ll provide links to further resources you can use to learn about the stages in more detail.

Download and Install Jenkins

The first step in creating a CI/CD pipeline with Jenkins is to download and install the software. Visit the official Jenkins download page and choose the appropriate version for your operating system. Jenkins is distributed as native installers and packages for common platforms, as well as a generic .war file.

Installation then depends on the operating system. On Windows, run the downloaded installer. On Linux, you can install the distribution package, or run the generic .war file directly with the command java -jar jenkins.war. After installation, the Jenkins server will be up and running on the default port 8080.
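For example, on any machine with a supported Java runtime, the generic .war can be started directly; the port flag below is optional, since 8080 is the default:

```shell
# Start Jenkins from the generic WAR file (requires a supported Java runtime)
java -jar jenkins.war --httpPort=8080
```

Once started, the setup wizard is available at http://localhost:8080.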

Learn more: https://www.jenkins.io/doc/book/installing/ 

Start and Configure Jenkins

After successfully installing Jenkins, the next step is to start and configure it. The configuration process begins with unlocking Jenkins. This is done by retrieving the automatically generated administrative password from a file in the Jenkins home directory.

After unlocking, the customization process starts with the installation of suggested plugins. Plugins are an essential part of Jenkins as they extend its functionality. After installing the plugins, you’ll create and configure your first administrative user. At this point the Jenkins instance is ready to be used.
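For example, on a typical Linux package install, the generated password can be read from the secrets directory under the Jenkins home; the path below is common for package installs but varies by installation method:

```shell
# Print the one-time password used to unlock the Jenkins setup wizard
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
```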


Creating a New Pipeline

In Jenkins, a pipeline is a user-defined series of stages, typically executed in sequence, that model the build, test, and deploy process. Creating a new pipeline is straightforward: click New Item on the Jenkins dashboard, name the pipeline, and select Pipeline as the project type.

After creating the pipeline, it needs to be set up. This setup involves defining the pipeline structure in a file called Jenkinsfile. This file is written in a domain-specific language based on Groovy and defines all the stages and steps of the pipeline. 

Below is an example of a simple Jenkinsfile with three pipeline stages, shared in the Jenkins documentation.

Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Building..'
            }
        }
        stage('Test') {
            steps {
                echo 'Testing..'
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploying....'
            }
        }
    }
}

Learn more: https://www.jenkins.io/doc/pipeline/tour/hello-world/ 

Automating Tests

Automating tests is a crucial aspect of CI/CD with Jenkins. It ensures that any change or addition to the source code doesn’t break the existing functionality of the software. This is done by writing test cases and then configuring Jenkins to automatically run these tests every time a change is made.

Automated testing involves integration with testing frameworks like JUnit or Selenium. The results of these tests are then reported back to Jenkins, which determines if the build is successful or if it has failed.
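As a hedged sketch for a Maven project, a test stage might run the unit tests and publish the JUnit results so Jenkins can mark the build as passed or failed; the report path assumes Maven Surefire's default output location:

```groovy
stage('Test') {
    steps {
        sh 'mvn test'    // run the project's unit tests
    }
    post {
        always {
            // publish JUnit XML results; Jenkins uses them to decide pass/fail
            // and to render test trends in the UI
            junit 'target/surefire-reports/*.xml'
        }
    }
}
```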

Learn more: https://www.jenkins.io/doc/developer/testing/ 

Managing Artifacts

Artifacts in Jenkins are the files generated as a result of a build, such as executable files, log files, and reports. Managing these artifacts is important because they provide valuable information about the build.

Artifact management in Jenkins involves two key concepts:

  • Archiving allows the storage of artifacts for future reference
  • Fingerprinting is used to track the usage of artifacts across different builds or projects
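Both concepts map to a single built-in step. As a sketch, the artifact path 'target/*.jar' below is an assumption for a Maven project:

```groovy
post {
    success {
        // archive build outputs and record fingerprints so their use
        // can be traced across other builds and projects
        archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
    }
}
```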


Monitoring and Logging

Monitoring is crucial for maintaining the health and efficiency of the CI/CD pipeline. Jenkins offers real-time monitoring through the Jenkins dashboard, which shows a view of all ongoing and completed build tasks. Additionally, users can configure email or SMS notifications for build results and critical alerts, ensuring immediate awareness of any issues.

In addition to monitoring via the dashboard, Jenkins logs every action taken during the build and deployment process, providing a detailed record of events. These logs are invaluable for troubleshooting failures or performance bottlenecks. Jenkins also allows integration with external logging tools like the ELK Stack (Elasticsearch, Logstash, Kibana) or Splunk, enabling more sophisticated log analysis and visualization.
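For example, a Declarative pipeline can send an email when a build fails using the built-in mail step; the recipient address below is a placeholder, and the step requires the Mailer plugin plus a configured SMTP server:

```groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps { echo 'Building..' }
        }
    }
    post {
        failure {
            // notify the team immediately when a build fails
            mail to: 'team@example.com',
                 subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
                 body: "See ${env.BUILD_URL} for details."
        }
    }
}
```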


CI/CD for Kubernetes with Spot by NetApp

Continuous delivery has entered a new phase as more and more applications migrate to microservices, with Kubernetes as the container orchestrator of choice for many. Kubernetes enables agility and faster software development cycles, but as release frequency increases, supporting delivery at large scale becomes complex and inefficient.

Spot by NetApp introduced Ocean CD as part of the Ocean suite for Kubernetes to address the specific challenges of modern delivery release cycles. Ocean CD provides complete deployment and verification automation in one fully managed solution, making it easy for users to execute deployments with high confidence. Key features of Ocean CD include:

Out-of-the-box progressive delivery strategies 

Canary and blue/green strategies are easy to define, automate, and customize. Developers commit code using any CI tool; Ocean CD detects the deployments and automatically initiates the assigned rollout strategy.

Continuous verification automation

Ensure stability and quality of deployments even as release frequency increases. Routine verifications of deployments are conducted automatically, based on metrics from monitoring tools like Datadog and New Relic.

Automatic rollback 

When issues are detected, Ocean CD initiates safe rollbacks and automatically tunes infrastructure to meet changing requirements of workloads. Continuous improvements are made to application deployments based on metrics collected during verification processes. 

To learn more about Ocean for Continuous Delivery, read our blog post or visit the product page.