Rob Moorman

Founder, technology consultant, architect and full stack developer
May 02


Continuous integration and delivery with Jenkins Pipelines and Blue Ocean

A quick walkthrough how to speed up your development with the Jenkins Pipeline and Blue Ocean plugins

At Moor Interactive we continuously improve our development and deployment workflows in order to provide a reliable process for getting our code to production as fast as possible. Over the past decade we have worked intensively with Jenkins as our CI/CD platform. In recent years we also used Travis CI and Drone a lot, since they let us store everything in Git (the thing we missed about Jenkins and its job configurations).

Last week we took some time to look at the first public version of the Jenkins Blue Ocean plugin (in combination with Pipelines) and to decide whether it's worth using instead of focusing more and more on platforms like Travis CI and Drone.

In this blog post we take you through the basic concepts of Blue Ocean and Pipelines and show how to set up your own platform with Docker Compose.

Blue Ocean is a project that rethinks the user experience of Jenkins, modelling and presenting the process of software delivery by surfacing information which is important to development teams with as few clicks as possible.

Dockerize Jenkins Blue Ocean

First create the Dockerfile and docker-compose.yaml configuration files. We also need a docker.conf file that tells Jenkins to use our local Docker host (see also Docker for Mac and the official installation guide).


FROM jenkinsci/blueocean:latest
MAINTAINER R. Moorman <>
USER root
COPY docker.conf /etc/init.d/docker


version: '2'
services:
  jenkins:
    build: .
    ports:
      - 8080:8080
    volumes:
      - ./jenkins_home:/var/jenkins_home
      - /var/run/docker.sock:/var/run/docker.sock


DOCKER_OPTS="-H tcp:// -H unix:///var/run/docker.sock --tls=false"

Now run docker-compose up and you should be ready to visit your Jenkins platform at http://localhost:8080/.
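In practice that boils down to something like the following; the service name `jenkins` matches the compose file above, and running detached with `-d` is just a preference. Jenkins writes its initial admin password into the mounted volume, which saves you digging through the logs:

```
# Build the image and start Jenkins in the background
docker-compose build
docker-compose up -d

# The initial admin password needed on first login lives in the mounted volume
cat ./jenkins_home/secrets/initialAdminPassword
```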

Jenkins login

Jenkins login screen after installation

Creating a first Pipeline

After switching to the Blue Ocean view you will be able to create a new Pipeline. You'll be asked which repository you want to use and whether you want to start the Pipeline editor or automatically scan your projects for Jenkinsfiles. The Pipeline editor comes in quite handy for making small adjustments and playing around a bit. However, we want to write the Jenkinsfile ourselves, so that's what we're going to do.

In most of our projects we use Python, so we want our Jenkinsfile to check out some code, execute the flake8 linting tool and report its test results.
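For the reporting step below, the project's requirements.txt needs flake8 itself plus a converter that turns flake8 output into a JUnit XML report. A minimal sketch (unpinned; pin versions as you see fit):

```
flake8
flake8-junit-report
```

The flake8-junit-report package provides the `flake8_junit` command used in the Jenkinsfile below.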

This is the Jenkinsfile (check out the whole repository for the complete example):

pipeline {
  agent {
    docker 'python:3.6.1'
  }
  stages {
    stage('Install dependencies') {
      steps {
        sh 'pip install -r requirements.txt'
      }
    }
    stage('Test') {
      steps {
        sh 'flake8 src/ --exit-zero --output-file flake8-output.txt'
        sh 'flake8_junit flake8-output.txt flake8-output.xml'
      }
    }
  }
  post {
    always {
      junit 'flake8-output.xml'
    }
    failure {
      echo 'Failed!'
    }
    success {
      echo 'Done!'
    }
  }
}
Jenkins tests

Test results from our Jenkinsfile

If we break down our configuration file, we come across some very powerful Pipeline features:


Agents

The agent tells Jenkins where and how to execute the Pipeline, or a subset thereof. As we are very familiar with Travis CI and Drone, the Docker agent works very well for us. In our scenario we want to run our code on Python 3.6.1, so we take the official python:3.6.1 container. Mixing and matching different agents throughout your Jenkinsfile gives you the flexibility to build complex and reusable build configurations.
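Mixing agents could look something like the sketch below. The stage names, images and commands are just examples; the pattern is `agent none` at the top level, with each stage declaring its own Docker agent:

```groovy
pipeline {
  // No global agent; every stage picks its own container
  agent none
  stages {
    stage('Backend tests') {
      agent {
        docker 'python:3.6.1'
      }
      steps {
        sh 'flake8 src/'
      }
    }
    stage('Frontend build') {
      agent {
        docker 'node:8'
      }
      steps {
        sh 'npm run build'
      }
    }
  }
}
```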

Stages and steps

Splitting up your build into multiple stages and steps not only gives you a visualization in Blue Ocean, but also lets you configure post actions per stage/step. An example of this would be always publishing Cobertura reports during the test stage.

When a step succeeds, the Pipeline moves on to the next step. When a step fails, the Pipeline fails and further steps won't be executed.
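The per-stage post actions mentioned above can be sketched like this, assuming a recent version of the Declarative Pipeline plugin; the test command and report file name are just examples:

```groovy
stage('Test') {
  steps {
    sh 'pytest --junitxml=results.xml'
  }
  post {
    always {
      // Publish the test report even when the stage fails
      junit 'results.xml'
    }
  }
}
```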

We did some tests with conditional steps to automate our deployments and handle pull requests. This results in a single, very declarative Jenkinsfile. For us this is a major improvement over our previous build configurations.

stage('Deploy') {
  when {
    branch 'develop'
  }
  steps {
    sh 'sh ./etc/'
  }
}

Post actions and notifications

As we said, it's possible to execute actions when things fail and/or succeed. The plugin comes with quite a lot of these actions by default, like sending a message to a Slack channel or publishing test results (e.g. junit).
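Assuming the Slack Notification plugin is installed and configured for your Jenkins instance, a failure notification could look something like this (the channel name is just an example):

```groovy
post {
  failure {
    // Requires the Slack Notification plugin to be configured
    slackSend channel: '#builds',
              color: 'danger',
              message: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
  }
}
```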

In the classic view of Jenkins most of them probably work fine. However, in the Blue Ocean view we noticed only a few are compatible yet. Although Jenkins offers more and more pluggable modules (the next-gen js-modules are a great example of this), compatibility with Blue Ocean still feels a bit immature.

Failing pipeline

A more complex Pipeline execution


Conclusion

Especially the Pipeline plugin offers a very flexible and maintainable way of setting up your build configurations and deployments. The declarative Jenkinsfiles with Docker agents are a go for us. In fact, we have already migrated most of our projects to Pipelines!

Blue Ocean definitely takes away the pain of an unfriendly GUI. However, plugin compatibility is still a bit immature, which means it isn't a drop-in replacement for the classic view yet. Given the variety of modules, we think it's just a matter of time before the commonly used plugins work properly, so this is not a showstopper for us.

Our next blog post on this topic will be a more in-depth one with some reusable deployment setups we use in production. So stay tuned!

