
Jenkins Shared Libraries: Reusable Pipeline Code at Scale

Sarah Chen · 23 min read

Once you have more than a handful of Jenkins pipelines, you will notice the same patterns appearing everywhere: the same Docker login step, the same Slack notification block, the same deployment logic copy-pasted across repositories. A bug fix in one place means updating dozens of Jenkinsfiles across dozens of repos. Shared libraries solve this by letting you extract common pipeline code into a versioned, testable library that every Jenkinsfile can import.

This guide covers everything you need to build production-grade shared libraries: directory structure, global variables, class-based libraries, full pipeline templates, testing strategies, versioning approaches, security considerations, and the debugging techniques you will need when things go wrong.

Why Shared Libraries Matter

The problems shared libraries solve are real and painful:

  • Code duplication -- Fixing a bug in your deployment step means updating 40 Jenkinsfiles across 40 repos. Miss one, and it breaks in production at 2 AM.
  • Inconsistency -- Team A's notification step works differently from Team B's because they diverged months ago. Now nobody knows which version is correct.
  • Untested pipeline code -- Without shared libraries, pipeline logic lives in Jenkinsfiles that are never unit tested. You only find out about bugs when a build fails.
  • Onboarding friction -- New teams have to learn pipeline patterns from scratch instead of calling well-documented library functions.
  • Security gaps -- Each team implements credential handling differently. Some do it well. Some hardcode tokens in Jenkinsfiles.

With shared libraries, your Jenkinsfiles become thin orchestration layers that call well-tested, centrally maintained functions. A three-line Jenkinsfile can replace eighty lines of pipeline code.

When to Introduce Shared Libraries

Do not build a shared library for one pipeline. The overhead is not worth it. Introduce shared libraries when you see:

  • The same pipeline pattern used in 3+ repositories
  • Teams copying Jenkinsfiles between repos and diverging over time
  • Common steps (Docker builds, deployments, notifications) with subtle variations
  • New teams asking "how do I set up CI/CD for my project?"

Directory Structure

A shared library is a Git repository with a specific layout. Jenkins expects this structure and loads code from these directories automatically.

jenkins-shared-library/
+-- vars/
|   +-- deployToK8s.groovy          # Global variable / custom step
|   +-- deployToK8s.txt             # Help text (shows in Jenkins docs)
|   +-- notifySlack.groovy
|   +-- notifySlack.txt
|   +-- dockerBuildPush.groovy
|   +-- dockerBuildPush.txt
|   +-- standardPipeline.groovy
|   +-- standardPipeline.txt
+-- src/
|   +-- com/
|       +-- example/
|           +-- Docker.groovy       # Groovy class
|           +-- GitUtils.groovy
|           +-- SlackNotifier.groovy
|           +-- Constants.groovy
+-- resources/
|   +-- deploy-template.yaml        # Resource files (templates, configs)
|   +-- helm-values-template.yaml
|   +-- config.json
+-- test/
|   +-- groovy/
|       +-- DeployToK8sTest.groovy  # Unit tests
|       +-- NotifySlackTest.groovy
|       +-- DockerBuildPushTest.groovy
+-- Jenkinsfile                     # CI for the library itself
+-- build.gradle                    # Test runner configuration
+-- README.md

Each directory has a specific purpose:

| Directory | Purpose | How It Is Accessed |
|---|---|---|
| vars/ | Global variables and custom steps. Each .groovy file becomes a pipeline step. | Called directly: deployToK8s() |
| src/ | Groovy classes with standard package structure. Compiled when the library loads. | Imported: import com.example.Docker |
| resources/ | Non-Groovy files -- templates, configs, scripts. Loaded at runtime. | Accessed: libraryResource('deploy-template.yaml') |
| test/ | Unit tests. Not loaded by Jenkins. Run locally or in the library's own CI. | Run: ./gradlew test |

Design Decision: vars/ vs src/

| Factor | vars/ (Global Variables) | src/ (Classes) |
|---|---|---|
| Complexity | Simple steps, 1-2 functions | Complex logic, multiple methods |
| Access to pipeline context | Direct (implicit this) | Must pass steps object explicitly |
| Testability | Moderate (mock pipeline context) | High (standard unit testing) |
| Discoverability | Shows in Pipeline Syntax reference | Requires documentation |
| Serialization | Automatic | Must implement Serializable |

Rule of thumb: Start with vars/ for everything. Move to src/ when a step grows complex enough to benefit from OOP patterns, or when you need logic that is independent of the Jenkins pipeline context.

Creating Global Variables

Files in vars/ are the simplest and most common way to expose functionality. Each file defines a global variable that becomes available as a pipeline step. The filename becomes the step name.
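Each step can ship a companion .txt file in vars/ with the same base name; Jenkins renders its contents in the Pipeline Syntax "Global Variables Reference" page. As a sketch, the help file for the notifySlack step shown below might look like this (the body can be plain text or simple HTML):

```text
Sends a build-status notification to Slack.

Parameters:
  channel            Slack channel to post to (default: #ci-cd)
  status             Build status string (default: currentBuild.currentResult)
  additionalMessage  Extra line appended to the message (optional)

Example:
  notifySlack(status: 'FAILURE', channel: '#alerts')
```

Keeping these files current is cheap documentation: teams discover your steps from inside Jenkins instead of digging through the library repo.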

Simple Custom Step

// vars/notifySlack.groovy
def call(Map config = [:]) {
    def channel = config.channel ?: '#ci-cd'
    def status = config.status ?: currentBuild.currentResult ?: 'UNKNOWN'
    def color = 'warning'

    switch (status) {
        case 'SUCCESS':
            color = 'good'
            break
        case 'FAILURE':
            color = 'danger'
            break
        case 'UNSTABLE':
            color = 'warning'
            break
        case 'ABORTED':
            color = '#808080'
            break
    }

    def duration = currentBuild.durationString?.replace(' and counting', '') ?: 'unknown'
    def branch = env.GIT_BRANCH ?: env.BRANCH_NAME ?: 'unknown'

    def message = [
        "*${status}*: <${env.BUILD_URL}|${env.JOB_NAME} #${env.BUILD_NUMBER}>",
        "Branch: `${branch}`",
        "Duration: ${duration}",
    ].join('\n')

    if (config.additionalMessage) {
        message += "\n${config.additionalMessage}"
    }

    slackSend(channel: channel, color: color, message: message)
}

Usage in a Jenkinsfile:

@Library('my-shared-lib') _

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make build'
            }
        }
    }
    post {
        success {
            notifySlack(status: 'SUCCESS')
        }
        failure {
            notifySlack(status: 'FAILURE', channel: '#alerts')
        }
        unstable {
            notifySlack(
                status: 'UNSTABLE',
                additionalMessage: 'Some tests failed. Check the report.'
            )
        }
    }
}

Custom Step with Body Closure

You can create steps that wrap other steps, similar to withCredentials or timeout. (One caution: the Docker Pipeline plugin already provides a built-in step named withDockerRegistry, so if that plugin is installed, pick a non-colliding name for your library step to avoid confusion.)

// vars/withDockerRegistry.groovy
def call(Map config, Closure body) {
    def registry = config.registry ?: error("'registry' parameter is required")
    def credentialsId = config.credentialsId ?: error("'credentialsId' parameter is required")

    withCredentials([usernamePassword(
        credentialsId: credentialsId,
        usernameVariable: 'DOCKER_USER',
        passwordVariable: 'DOCKER_PASS'
    )]) {
        sh "echo \$DOCKER_PASS | docker login ${registry} -u \$DOCKER_USER --password-stdin"
        try {
            body()
        } finally {
            sh "docker logout ${registry}"
        }
    }
}

Usage:

withDockerRegistry(registry: 'registry.example.com', credentialsId: 'docker-creds') {
    sh 'docker push registry.example.com/my-app:latest'
    sh 'docker push registry.example.com/my-app:v1.2.3'
}
// Docker is automatically logged out here, even if the push fails

Docker Build and Push Step

Here is a production-grade step that handles the full Docker workflow with validation, error handling, and useful defaults:

// vars/dockerBuildPush.groovy
def call(Map config) {
    // Validate required parameters
    def image = config.image ?: error("'image' parameter is required")
    def tag = config.tag ?: env.GIT_COMMIT?.take(8) ?: 'latest'
    def dockerfile = config.dockerfile ?: 'Dockerfile'
    def context = config.context ?: '.'
    def registry = config.registry ?: ''
    def credentialsId = config.credentialsId ?: 'docker-registry'
    def buildArgs = config.buildArgs ?: [:]
    def additionalTags = config.additionalTags ?: []
    def push = config.push != false  // Default true

    def fullImage = registry ? "${registry}/${image}" : image
    def primaryTag = "${fullImage}:${tag}"

    // Build the --build-arg flags
    def buildArgsStr = buildArgs.collect { k, v -> "--build-arg ${k}=${v}" }.join(' ')

    // Build the image
    echo "Building Docker image: ${primaryTag}"
    sh "docker build ${buildArgsStr} -f ${dockerfile} -t ${primaryTag} ${context}"

    // Apply additional tags
    def allTags = [primaryTag]
    additionalTags.each { extraTag ->
        def fullExtraTag = "${fullImage}:${extraTag}"
        sh "docker tag ${primaryTag} ${fullExtraTag}"
        allTags.add(fullExtraTag)
    }

    // Push if requested and registry is configured
    if (push && registry) {
        withCredentials([usernamePassword(
            credentialsId: credentialsId,
            usernameVariable: 'REG_USER',
            passwordVariable: 'REG_PASS'
        )]) {
            sh "echo \$REG_PASS | docker login ${registry} -u \$REG_USER --password-stdin"
            allTags.each { tagName ->
                echo "Pushing: ${tagName}"
                sh "docker push ${tagName}"
            }
            sh "docker logout ${registry}"
        }
    }

    // Return the primary tag for downstream use
    return primaryTag
}

Usage:

stage('Build & Push') {
    steps {
        script {
            def imageTag = dockerBuildPush(
                image: 'my-service',
                registry: 'registry.example.com',
                credentialsId: 'registry-creds',
                buildArgs: [APP_ENV: 'production', BUILD_DATE: new Date().format('yyyy-MM-dd')],
                additionalTags: ['latest', env.BRANCH_NAME]
            )
            echo "Primary image: ${imageTag}"
        }
    }
}

Kubernetes Deployment Step

// vars/deployToK8s.groovy
def call(Map config) {
    def image = config.image ?: error("'image' parameter is required")
    def namespace = config.namespace ?: 'default'
    def deployment = config.deployment ?: error("'deployment' parameter is required")
    def container = config.container ?: deployment
    def kubeConfigId = config.kubeConfigId ?: error("'kubeConfigId' parameter is required")
    def timeout = config.timeout ?: 300
    def verify = config.verify != false  // Default true

    withCredentials([file(credentialsId: kubeConfigId, variable: 'KUBECONFIG')]) {
        // Update the image
        sh """
            kubectl set image deployment/${deployment} \
              ${container}=${image} \
              --namespace=${namespace}
        """

        // Wait for rollout to complete
        if (verify) {
            echo "Waiting for rollout to complete (timeout: ${timeout}s)..."
            def rolloutStatus = sh(
                script: """
                    kubectl rollout status deployment/${deployment} \
                      --namespace=${namespace} \
                      --timeout=${timeout}s
                """,
                returnStatus: true
            )

            if (rolloutStatus != 0) {
                // Get pod status for debugging
                echo "Rollout failed. Fetching pod status..."
                sh """
                    kubectl get pods -l app=${deployment} \
                      --namespace=${namespace} -o wide
                    kubectl describe deployment/${deployment} \
                      --namespace=${namespace}
                """
                error("Deployment rollout failed for ${deployment} in ${namespace}")
            }

            // Show final state
            sh """
                kubectl get deployment/${deployment} \
                  --namespace=${namespace} -o wide
            """
        }
    }
}

Usage:

stage('Deploy Staging') {
    steps {
        deployToK8s(
            image: "registry.example.com/my-service:${GIT_COMMIT.take(8)}",
            deployment: 'my-service',
            namespace: 'staging',
            kubeConfigId: 'kubeconfig-staging',
            timeout: 300
        )
    }
}

Resource File Loading

The resources/ directory holds templates and configuration files that your library steps can load at runtime:

// vars/deployWithHelm.groovy
def call(Map config) {
    def chart = config.chart ?: error("'chart' parameter is required")
    def release = config.release ?: config.chart
    def namespace = config.namespace ?: 'default'
    def values = config.values ?: [:]

    // Load the base values template from the library's resources
    def baseValues = libraryResource('helm-values-template.yaml')

    // Write it to the workspace and overlay custom values
    writeFile file: 'base-values.yaml', text: baseValues

    // Build the --set flags from the values map
    def setFlags = values.collect { k, v -> "--set ${k}=${v}" }.join(' ')

    sh """
        helm upgrade --install ${release} ${chart} \
          --namespace ${namespace} \
          --create-namespace \
          -f base-values.yaml \
          ${setFlags} \
          --wait --timeout 10m
    """
}

Class-Based Libraries in src/

For more complex logic, use classes in the src/ directory. These follow standard Groovy/Java packaging conventions and give you proper object-oriented structure.

Git Utilities Class

// src/com/example/GitUtils.groovy
package com.example

class GitUtils implements Serializable {

    def steps

    GitUtils(steps) {
        this.steps = steps
    }

    String getShortCommit() {
        return steps.sh(script: 'git rev-parse --short HEAD', returnStdout: true).trim()
    }

    String getBranchName() {
        return steps.sh(script: 'git rev-parse --abbrev-ref HEAD', returnStdout: true).trim()
    }

    String getLastTag() {
        return steps.sh(
            script: 'git describe --tags --abbrev=0 2>/dev/null || echo "none"',
            returnStdout: true
        ).trim()
    }

    List getChangedFiles(String baseBranch = 'main') {
        def output = steps.sh(
            script: "git diff --name-only origin/${baseBranch}...HEAD",
            returnStdout: true
        ).trim()
        return output ? output.split('\n').toList() : []
    }

    boolean hasChangesIn(String path, String baseBranch = 'main') {
        def changed = getChangedFiles(baseBranch)
        return changed.any { it.startsWith(path) }
    }

    Map getChangesByDirectory(String baseBranch = 'main') {
        def changed = getChangedFiles(baseBranch)
        def result = [:]
        changed.each { file ->
            def dir = file.contains('/') ? file.split('/')[0] : '.'
            if (!result.containsKey(dir)) {
                result[dir] = []
            }
            result[dir].add(file)
        }
        return result
    }

    String getCommitMessage() {
        return steps.sh(script: 'git log -1 --pretty=%B', returnStdout: true).trim()
    }

    boolean commitMessageContains(String text) {
        return getCommitMessage().toLowerCase().contains(text.toLowerCase())
    }
}

Constants Class

// src/com/example/Constants.groovy
package com.example

class Constants {
    static final String DOCKER_REGISTRY = 'registry.example.com'
    static final String SLACK_CHANNEL_CI = '#ci-cd'
    static final String SLACK_CHANNEL_ALERTS = '#ci-alerts'
    static final String SLACK_CHANNEL_DEPLOYS = '#deployments'

    static final Map DEFAULT_TIMEOUTS = [
        build: 15,
        test: 20,
        deploy: 10,
        pipeline: 45
    ]

    static final Map ENVIRONMENTS = [
        dev: [
            namespace: 'development',
            kubeConfigId: 'kubeconfig-dev',
            autoApprove: true
        ],
        staging: [
            namespace: 'staging',
            kubeConfigId: 'kubeconfig-staging',
            autoApprove: true
        ],
        production: [
            namespace: 'production',
            kubeConfigId: 'kubeconfig-prod',
            autoApprove: false
        ]
    ]
}

Using Classes in a Jenkinsfile

@Library('my-shared-lib') _
import com.example.GitUtils
import com.example.Constants

pipeline {
    agent any
    stages {
        stage('Analyze Changes') {
            steps {
                script {
                    def git = new GitUtils(this)
                    def changes = git.getChangesByDirectory()

                    echo "Changed directories: ${changes.keySet()}"

                    if (git.hasChangesIn('frontend/')) {
                        echo 'Frontend changes detected -- will build frontend'
                    }
                    if (git.hasChangesIn('backend/')) {
                        echo 'Backend changes detected -- will build backend'
                    }
                    if (git.commitMessageContains('[skip ci]')) {
                        echo 'Skipping CI as requested in commit message'
                        currentBuild.result = 'NOT_BUILT'
                        return
                    }
                }
            }
        }
        stage('Deploy') {
            steps {
                script {
                    def envConfig = Constants.ENVIRONMENTS['staging']
                    deployToK8s(
                        image: "${Constants.DOCKER_REGISTRY}/my-app:${env.GIT_COMMIT.take(8)}",
                        deployment: 'my-app',
                        namespace: envConfig.namespace,
                        kubeConfigId: envConfig.kubeConfigId
                    )
                }
            }
        }
    }
}

The Serializable Requirement

The Serializable interface is important. Jenkins pipelines can be paused and resumed (during input steps, agent reconnections, or controller restarts), and any objects in scope must be serializable. Without it, you will get NotSerializableException errors.

If your class contains non-serializable fields, mark them as transient:

class MyHelper implements Serializable {
    def steps
    transient def httpClient  // Won't be serialized

    // Re-initialize transient fields when needed
    private def getClient() {
        if (httpClient == null) {
            httpClient = new URL('https://api.example.com').openConnection()
        }
        return httpClient
    }
}

Loading Libraries in a Jenkinsfile

Configuring the Library in Jenkins

Before any Jenkinsfile can use the library, configure it in Jenkins:

  1. Go to Manage Jenkins, then System (or Configure System)
  2. Scroll to Global Pipeline Libraries
  3. Add a library with:
    • Name: my-shared-lib
    • Default version: main (branch name)
    • Allow default version to be overridden: checked
    • Include @Library changes in job recent changes: checked
    • Retrieval method: Modern SCM, then Git
    • Project repository: https://github.com/your-org/jenkins-shared-library.git
    • Credentials: Select appropriate credentials

| Configuration Option | Recommended Setting | Reason |
|---|---|---|
| Load implicitly | Unchecked | Explicit is better -- teams should know they are using a library |
| Allow default version override | Checked | Teams can test library changes without affecting others |
| Include changes in recent changes | Checked | See library updates in the build changelog |
| Cache fetched versions | Checked | Faster pipeline starts |
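If you manage Jenkins with the Configuration as Code (JCasC) plugin, the same library can be declared in your jenkins.yaml instead of through the UI. A sketch, assuming a Git credential with ID github-creds already exists on the controller:

```yaml
unclassified:
  globalLibraries:
    libraries:
      - name: "my-shared-lib"
        defaultVersion: "main"
        implicit: false
        allowVersionOverride: true
        includeInChangesets: true
        retriever:
          modernSCM:
            scm:
              git:
                remote: "https://github.com/your-org/jenkins-shared-library.git"
                credentialsId: "github-creds"
```

Declaring the library as code keeps the configuration reviewable and reproducible across Jenkins instances.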

The @Library Annotation

// Load default version (configured in Jenkins)
@Library('my-shared-lib') _

// Load a specific branch
@Library('my-shared-lib@develop') _

// Load a specific tag (recommended for production stability)
@Library('my-shared-lib@v2.1.0') _

// Load a specific commit
@Library('my-shared-lib@abc1234') _

// Load multiple libraries
@Library(['my-shared-lib@main', 'other-lib@v1.0']) _

The trailing underscore _ is required when using @Library at the top level. Groovy annotations must be attached to some statement, and the underscore is an unused placeholder that gives the annotation something to annotate -- nothing needs to be imported explicitly, because the contents of vars/ are loaded automatically. (You can also attach the annotation to a real import statement, as in the class-based example above.)

Dynamic Loading

You can also load libraries dynamically inside a pipeline:

pipeline {
    agent any
    stages {
        stage('Setup') {
            steps {
                script {
                    // Decide which version to load based on the branch
                    def libVersion = env.BRANCH_NAME == 'main' ? 'v2.0.0' : 'develop'
                    library "my-shared-lib@${libVersion}"
                }
            }
        }
        stage('Use Library') {
            steps {
                notifySlack(status: 'STARTED')
            }
        }
    }
}

Dynamic loading is useful when you need to decide which library version to load at runtime, or when the library version depends on project configuration.

Folder-Level Libraries

You can configure shared libraries at the folder level, not just globally. This lets different teams or projects use different libraries or different default versions:

  1. Navigate to the Jenkins folder that contains your jobs
  2. Click Configure
  3. Add a library under Pipeline Libraries

Folder-level libraries take precedence over global libraries with the same name.

Creating a Standard Pipeline Template

One of the most powerful patterns is creating a full pipeline template that teams call with minimal configuration. This is the "golden path" approach: you define the standard way to build, test, and deploy, and teams opt in by calling your template.

// vars/standardPipeline.groovy
def call(Map config) {
    // Validate required config
    def imageName = config.imageName ?: error("'imageName' is required")

    // Apply defaults
    def buildImage = config.buildImage ?: 'node:20-alpine'
    def testCommand = config.testCommand ?: 'npm ci && npm test'
    def lintCommand = config.lintCommand ?: 'npm run lint'
    def buildCommand = config.buildCommand ?: 'npm run build'
    def registry = config.registry ?: 'registry.example.com'
    def registryCreds = config.registryCreds ?: 'registry-creds'
    def slackChannel = config.slackChannel ?: '#ci-cd'
    def timeout = config.timeout ?: 30
    def runLint = config.runLint != false
    def deployToStaging = config.deployToStaging != false
    def deployToProduction = config.deployToProduction != false

    pipeline {
        agent none

        options {
            timeout(time: timeout, unit: 'MINUTES')
            timestamps()
            buildDiscarder(logRotator(numToKeepStr: '20'))
            disableConcurrentBuilds(abortPrevious: true)
        }

        environment {
            REGISTRY = registry
            IMAGE_NAME = imageName
            IMAGE_TAG = "${GIT_COMMIT.take(8)}"
        }

        stages {
            stage('Lint') {
                when {
                    expression { return runLint }
                    beforeAgent true
                }
                agent { docker { image buildImage } }
                steps {
                    sh 'npm ci'
                    sh lintCommand
                }
            }

            stage('Test') {
                agent { docker { image buildImage } }
                steps {
                    sh testCommand
                }
                post {
                    always {
                        junit allowEmptyResults: true, testResults: '**/test-results/*.xml'
                    }
                }
            }

            stage('Build Image') {
                when {
                    anyOf { branch 'main'; branch 'develop' }
                    beforeAgent true
                }
                agent { label 'docker' }
                steps {
                    script {
                        dockerBuildPush(
                            image: imageName,
                            registry: registry,
                            credentialsId: registryCreds,
                            additionalTags: [env.BRANCH_NAME, 'latest']
                        )
                    }
                }
            }

            stage('Deploy Staging') {
                when {
                    allOf {
                        branch 'develop'
                        expression { return deployToStaging }
                    }
                    beforeAgent true
                }
                agent { label 'deploy' }
                steps {
                    deployToK8s(
                        image: "${registry}/${imageName}:${env.GIT_COMMIT.take(8)}",
                        deployment: imageName,
                        namespace: 'staging',
                        kubeConfigId: 'kubeconfig-staging'
                    )
                }
            }

            stage('Deploy Production') {
                when {
                    allOf {
                        branch 'main'
                        expression { return deployToProduction }
                    }
                    beforeAgent true
                }
                input {
                    message "Deploy ${imageName} to production?"
                    ok 'Deploy'
                    submitter 'release-team,admin'
                }
                agent { label 'deploy' }
                steps {
                    deployToK8s(
                        image: "${registry}/${imageName}:${env.GIT_COMMIT.take(8)}",
                        deployment: imageName,
                        namespace: 'production',
                        kubeConfigId: 'kubeconfig-prod'
                    )
                }
            }
        }

        post {
            failure {
                notifySlack(status: 'FAILURE', channel: slackChannel)
            }
            success {
                notifySlack(status: 'SUCCESS', channel: slackChannel)
            }
            fixed {
                notifySlack(
                    status: 'SUCCESS',
                    channel: slackChannel,
                    additionalMessage: 'Build is green again!'
                )
            }
        }
    }
}

A team's entire Jenkinsfile becomes:

@Library('my-shared-lib@v2.0.0') _

standardPipeline(
    imageName: 'user-service',
    buildImage: 'node:20-alpine',
    testCommand: 'npm ci && npm test -- --coverage',
    slackChannel: '#team-platform'
)

Three lines of configuration instead of eighty lines of pipeline code.

Extending the Standard Pipeline

For teams that need customization beyond what the template offers, provide escape hatches:

// vars/standardPipeline.groovy (extended version)
def call(Map config) {
    // ... existing setup ...

    // Allow teams to inject additional stages
    def preTestSteps = config.preTestSteps ?: null
    def postDeploySteps = config.postDeploySteps ?: null

    pipeline {
        // ... existing pipeline ...

        stages {
            stage('Pre-Test') {
                when {
                    expression { return preTestSteps != null }
                    beforeAgent true
                }
                agent { docker { image config.buildImage ?: 'node:20-alpine' } }
                steps {
                    script {
                        preTestSteps()
                    }
                }
            }
            // ... rest of stages ...
        }
    }
}

Usage:

standardPipeline(
    imageName: 'my-service',
    preTestSteps: {
        sh 'npm run generate-types'
        sh 'npm run db:migrate'
    }
)

Versioning Strategies

Shared libraries are code and should be versioned like code. The versioning strategy you choose depends on your organization's size, risk tolerance, and release cadence.

Branch-Based Versioning

| Branch | Purpose | Who Uses It |
|---|---|---|
| main | Stable, production-ready | All pipelines by default |
| develop | Latest features, may break | Teams testing new library features |
| feature/* | Experimental changes | Library developers only |

Simple and works for small organizations. The risk: a bad merge to main breaks every pipeline simultaneously.

Tag-Based Versioning

A more rigorous approach for organizations with many teams is to cut immutable release tags:

v1.0.0 -- Initial release
v1.1.0 -- Added Kubernetes deployment step
v1.2.0 -- Added Helm deployment step
v1.2.1 -- Fixed timeout bug in deployToK8s
v2.0.0 -- Breaking change: renamed parameters in dockerBuildPush

Teams pin to specific versions and upgrade on their own schedule:

@Library('my-shared-lib@v1.2.1') _

This prevents a library update from breaking 50 pipelines simultaneously. Use semantic versioning:

  • Patch (1.2.x): Bug fixes, no API changes. Safe to auto-upgrade.
  • Minor (1.x.0): New features, backward-compatible. Review and test before upgrading.
  • Major (x.0.0): Breaking changes. Requires Jenkinsfile modifications.
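Cutting a release is just an annotated Git tag on the library repo. The commands below demonstrate the workflow in a throwaway scratch repository (in real use you would commit to the actual library checkout and push the tag to origin):

```shell
set -e

# Scratch repo standing in for the shared-library checkout
repo="$(mktemp -d)"
cd "$repo"
git init -q
git config user.email "ci@example.com"
git config user.name "CI"

# Commit the fix, then cut an annotated patch-release tag
echo "fix: honor timeout in deployToK8s" > CHANGELOG.md
git add CHANGELOG.md
git commit -qm "Fix timeout handling in deployToK8s"
git tag -a v1.2.1 -m "Patch release: timeout fix"

# In a real repo you would now run: git push origin v1.2.1
git tag --list 'v*'   # prints v1.2.1
```

Pipelines can then pin to the new tag with @Library('my-shared-lib@v1.2.1') _, and because annotated tags are immutable references, the pinned version never changes underneath them.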

Migration Strategy for Breaking Changes

When you need to make breaking changes:

  1. Release the breaking change as a new major version
  2. Keep the old version supported for a transition period
  3. Provide a migration guide in the changelog
  4. Notify teams via Slack or email
  5. Set a deprecation deadline
  6. Remove old version support after deadline
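One way to soften step 2 is a transition shim inside the step itself: accept the old parameter for one major version and warn loudly. A hypothetical excerpt for the dockerBuildPush rename described in the changelog below:

```groovy
// vars/dockerBuildPush.groovy (hypothetical v2.x transition shim)
def call(Map config) {
    // Accept the pre-2.0 'creds' parameter, but warn on every build that uses it
    if (config.creds && !config.credentialsId) {
        echo "DEPRECATED: 'creds' was renamed to 'credentialsId' in v2.0.0. " +
             "Support for 'creds' will be removed in v3.0.0."
        config.credentialsId = config.creds
    }
    // ... rest of the step as before ...
}
```

The warning appears in every build log, so teams see the deadline approaching without anything breaking immediately.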

Maintaining a Changelog

## v2.0.0 (2026-03-15)
### Breaking Changes
- `dockerBuildPush`: renamed `creds` parameter to `credentialsId`
- `deployToK8s`: `environment` parameter renamed to `namespace`

### Migration
- Update `creds: 'my-creds'` to `credentialsId: 'my-creds'`
- Update `environment: 'staging'` to `namespace: 'staging'`

## v1.3.0 (2026-03-01)
### Added
- `deployWithHelm` step for Helm-based deployments
- `notifySlack` now supports `additionalMessage` parameter

### Fixed
- `deployToK8s` timeout was being ignored

Testing Shared Libraries

Untested shared libraries become a liability. When the library breaks, it breaks every pipeline that uses it. Use the JenkinsPipelineUnit framework to test your library functions locally before merging.

Setup with Gradle

// build.gradle
plugins {
    id 'groovy'
}

repositories {
    mavenCentral()
}

dependencies {
    implementation 'org.codehaus.groovy:groovy-all:3.0.19'
    testImplementation 'com.lesfurets:jenkins-pipeline-unit:1.19'
    testImplementation 'junit:junit:4.13.2'
}

sourceSets {
    main {
        groovy {
            srcDirs = ['src', 'vars']
        }
    }
    test {
        groovy {
            srcDirs = ['test/groovy']
        }
    }
}

test {
    testLogging {
        events 'passed', 'skipped', 'failed'
        showStandardStreams = true
    }
}

Writing Tests for Global Variables

// test/groovy/NotifySlackTest.groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test
import static org.junit.Assert.*

class NotifySlackTest extends BasePipelineTest {

    def slackCalls = []

    @Override
    @Before
    void setUp() throws Exception {
        super.setUp()
        // Register pipeline steps as allowed methods
        helper.registerAllowedMethod('slackSend', [Map.class]) { Map params ->
            slackCalls.add(params)
        }
    }

    private void setUpBuildContext(Map overrides = [:]) {
        binding.setVariable('currentBuild', [
            currentResult: overrides.result ?: 'SUCCESS',
            durationString: overrides.duration ?: '2 min 30 sec and counting',
            previousBuild: overrides.previousBuild ?: null
        ])
        binding.setVariable('env', [
            JOB_NAME: overrides.jobName ?: 'test-job',
            BUILD_NUMBER: overrides.buildNumber ?: '42',
            GIT_BRANCH: overrides.branch ?: 'main',
            BRANCH_NAME: overrides.branchName ?: null,
            BUILD_URL: overrides.buildUrl ?: 'http://jenkins/job/test-job/42/'
        ])
    }

    @Test
    void 'should send success notification with green color'() {
        setUpBuildContext(result: 'SUCCESS')
        def script = loadScript('vars/notifySlack.groovy')

        script.call(status: 'SUCCESS')

        assertEquals(1, slackCalls.size())
        assertEquals('good', slackCalls[0].color)
        assertTrue(slackCalls[0].message.contains('SUCCESS'))
    }

    @Test
    void 'should send failure notification with red color'() {
        setUpBuildContext(result: 'FAILURE')
        def script = loadScript('vars/notifySlack.groovy')

        script.call(status: 'FAILURE')

        assertEquals(1, slackCalls.size())
        assertEquals('danger', slackCalls[0].color)
    }

    @Test
    void 'should default to ci-cd channel'() {
        setUpBuildContext()
        def script = loadScript('vars/notifySlack.groovy')

        script.call()

        assertEquals('#ci-cd', slackCalls[0].channel)
    }

    @Test
    void 'should use custom channel when provided'() {
        setUpBuildContext()
        def script = loadScript('vars/notifySlack.groovy')

        script.call(channel: '#team-alerts')

        assertEquals('#team-alerts', slackCalls[0].channel)
    }

    @Test
    void 'should include additional message when provided'() {
        setUpBuildContext()
        def script = loadScript('vars/notifySlack.groovy')

        script.call(additionalMessage: 'Deployed to staging')

        assertTrue(slackCalls[0].message.contains('Deployed to staging'))
    }

    @Test
    void 'should strip "and counting" from duration'() {
        setUpBuildContext(duration: '5 min and counting')
        def script = loadScript('vars/notifySlack.groovy')

        script.call()

        assertFalse(slackCalls[0].message.contains('and counting'))
        assertTrue(slackCalls[0].message.contains('5 min'))
    }
}

Testing Class-Based Libraries

// test/groovy/GitUtilsTest.groovy
import com.lesfurets.jenkins.unit.BasePipelineTest
import com.example.GitUtils
import org.junit.Before
import org.junit.Test
import static org.junit.Assert.*

class GitUtilsTest extends BasePipelineTest {

    def shellOutputs = [:]

    @Override
    @Before
    void setUp() throws Exception {
        super.setUp()
        // Class-based libraries receive the steps object through their
        // constructor, so the tests below inject a map of mock step
        // closures directly -- no registerAllowedMethod needed here.
    }

    @Test
    void 'getShortCommit returns trimmed output'() {
        shellOutputs['git rev-parse --short HEAD'] = 'abc1234\n'

        def mockSteps = [sh: { Map params -> shellOutputs[params.script] ?: '' }]
        def git = new GitUtils(mockSteps)

        assertEquals('abc1234', git.getShortCommit())
    }

    @Test
    void 'hasChangesIn detects changes in a directory'() {
        shellOutputs['git diff --name-only origin/main...HEAD'] = '''frontend/src/App.js
frontend/package.json
backend/go.mod'''

        def mockSteps = [sh: { Map params -> shellOutputs[params.script] ?: '' }]
        def git = new GitUtils(mockSteps)

        assertTrue(git.hasChangesIn('frontend/'))
        assertTrue(git.hasChangesIn('backend/'))
        assertFalse(git.hasChangesIn('infrastructure/'))
    }
}

Run tests:

./gradlew test

What to Test

| Test Category | Examples |
| --- | --- |
| Input validation | Missing required parameters should throw clear errors |
| Default values | Verify defaults are applied when parameters are omitted |
| Conditional logic | Test each branch in if/else and switch statements |
| Edge cases | Empty strings, null values, special characters in names |
| Error handling | Verify graceful failures and helpful error messages |
| Output format | Slack messages contain expected fields, tags are formatted correctly |
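As a concrete example, an input-validation test should assert that a step fails fast with a clear message when a required parameter is missing. This is a sketch assuming a hypothetical `vars/deployApp.groovy` step that requires an `env` parameter; adapt the step name and error message to your own library.

```groovy
// test/groovy/DeployAppTest.groovy -- illustrative only; assumes a
// hypothetical vars/deployApp.groovy that requires an 'env' parameter
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test
import static org.junit.Assert.*

class DeployAppTest extends BasePipelineTest {

    @Override
    @Before
    void setUp() throws Exception {
        super.setUp()
    }

    @Test
    void 'should fail fast when env parameter is missing'() {
        def script = loadScript('vars/deployApp.groovy')
        try {
            script.call([:])  // no 'env' supplied
            fail('expected an error for the missing parameter')
        } catch (Exception e) {
            // The error message should tell the caller exactly what to fix
            assertTrue(e.message.contains('env'))
        }
    }
}
```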

CI for the Library Itself

Create a Jenkinsfile in the shared library repo to test on every push:

pipeline {
    agent { docker { image 'gradle:7-jdk17' } }

    stages {
        stage('Test') {
            steps {
                sh './gradlew test'
            }
            post {
                always {
                    junit '**/build/test-results/**/*.xml'
                }
            }
        }
    }
}

Security Considerations

Shared libraries run with elevated trust by default. A library loaded through the global configuration runs outside the Groovy sandbox, meaning it can execute any Java/Groovy code without script approval.

Trust Levels

| Library Source | Trust Level | Sandbox | Use Case |
| --- | --- | --- | --- |
| Global Pipeline Libraries | Trusted | No sandbox | Core infrastructure libraries |
| Folder-level libraries | Configurable | Optional | Team-specific libraries |
| Dynamically loaded (untrusted) | Untrusted | Sandboxed | Third-party or experimental libraries |
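For the untrusted case, a Jenkinsfile can load a library dynamically with the `library` step instead of relying on global configuration, which keeps the loaded code inside the sandbox. A minimal sketch (the library name, repo URL, and credentials ID are placeholders):

```groovy
// Jenkinsfile -- load a library dynamically so it runs sandboxed.
// 'experimental-lib', the remote URL, and 'github-token' are placeholders.
library identifier: 'experimental-lib@v0.3.0',
        retriever: modernSCM([
            $class: 'GitSCMSource',
            remote: 'https://github.com/example/experimental-lib.git',
            credentialsId: 'github-token'
        ])

// Steps from the library are now available, but run in the sandbox:
// unapproved method calls will require in-process script approval.
```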

Security Best Practices

Restrict who can commit to the library repository. A malicious commit to a trusted shared library can access all credentials, all agents, and the Jenkins controller itself.

Use branch protection rules. Require pull request reviews and status checks before merging to main.

Avoid passing raw credentials through library functions. Instead, accept credential IDs and let the library use withCredentials internally:

// BAD: Accepting raw secrets
def call(String username, String password) {
    sh "docker login -u ${username} -p ${password}"  // Exposed in logs
}

// GOOD: Accepting credential IDs
def call(String credentialsId) {
    withCredentials([usernamePassword(
        credentialsId: credentialsId,
        usernameVariable: 'USER',
        passwordVariable: 'PASS'
    )]) {
        sh 'echo $PASS | docker login -u $USER --password-stdin'
    }
}

Audit library usage. Monitor which pipelines use which library versions and which steps they call.
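A quick starting point for auditing is the Script Console. This sketch lists the globally configured libraries and their pinned default versions; it assumes the Pipeline: Shared Groovy Libraries plugin, which provides the `GlobalLibraries` class.

```groovy
// Run in Manage Jenkins -> Script Console (requires admin access).
import org.jenkinsci.plugins.workflow.libs.GlobalLibraries

GlobalLibraries.get().libraries.each { lib ->
    // 'implicit' libraries are loaded into every pipeline automatically
    println "${lib.name} -> default version: ${lib.defaultVersion}, " +
            "implicit load: ${lib.implicit}"
}
```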

Troubleshooting Shared Libraries

Common Errors and Solutions

"No such DSL method" after adding a new step: Jenkins caches library code. Restart Jenkins, or use the build's Replay feature to force a fresh load. Also verify the file lives in vars/ and that the filename matches the step name exactly (it is case-sensitive).

NotSerializableException: Your class in src/ does not implement Serializable, or it holds a non-serializable field. Add implements Serializable and mark non-serializable fields as transient.

"Scripts not permitted to use method": This happens when a library loaded in sandbox mode tries to use an unapproved method. Either approve the method in Manage Jenkins, then In-process Script Approval, or configure the library as trusted.

Library changes not taking effect: Jenkins caches library versions. If you are developing actively, use @Library('my-shared-lib@branch') _ and clear the library cache or restart Jenkins.

"Cannot find matching method" errors: Usually a parameter type mismatch. Groovy is loosely typed, but Jenkins steps often expect specific types. Use Map config for flexibility and cast values explicitly when needed.
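A defensive pattern for avoiding these mismatches is to normalize parameter types at the top of the step, so callers can pass strings, GStrings, or numbers interchangeably. A minimal sketch, using a hypothetical `buildImage` step:

```groovy
// vars/buildImage.groovy -- illustrative step name
def call(Map config = [:]) {
    // Normalize types up front: GString -> String, numeric-ish -> int
    String image    = config.image as String
    int timeoutMins = (config.timeout ?: 30) as int
    boolean push    = (config.push ?: false) as boolean

    if (!image) {
        error 'buildImage: "image" parameter is required'
    }
    // ... rest of the step uses consistently typed values
}
```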

Debugging Techniques

// In your vars/ step, add verbose logging
def call(Map config = [:]) {
    echo "DEBUG: notifySlack called with config: ${config}"
    echo "DEBUG: currentBuild.currentResult = ${currentBuild.currentResult}"
    echo "DEBUG: env.JOB_NAME = ${env.JOB_NAME}"
    // ... rest of step
}

For class-based libraries, use the Script Console (Manage Jenkins, then Script Console) to test snippets:

import com.example.GitUtils
def git = new GitUtils(this)
println git.getShortCommit()

Best Practices Summary

| Practice | Why |
| --- | --- |
| Keep vars/ steps focused -- one step, one job | Easier to test, document, and reuse |
| Use Map parameters with defaults | Self-documenting, backward-compatible |
| Document every step with a .txt file | Shows in Jenkins pipeline syntax reference |
| Version with tags, follow semver | Prevents library updates from breaking all pipelines |
| Test your library in CI | Catch bugs before they hit production pipelines |
| Accept credential IDs, not raw secrets | Security and auditability |
| Run the library's own CI with its own steps | Dogfooding catches usability issues |
| Provide a standard pipeline template | Reduces onboarding from hours to minutes |
| Maintain a changelog | Teams need to know what changed and when |
| Restrict commit access to the library repo | A compromised library compromises everything |

Shared libraries are the key to scaling Jenkins across an organization. They turn pipeline code from a copy-paste mess into a maintained, tested, versioned asset. Start with one or two common steps, prove the value, then expand from there.
