Cross Team Collaboration

The Cross Team Collaboration feature allows a Pipeline to create a notification event that is consumed by other Pipelines waiting on it.

Figure 1. The cross team collaboration diagram

The Cross Team Collaboration feature requires the installation of the following plugins on each CloudBees Core instance:

  • notification-api (required): responsible for sending messages across teams and jobs.

  • pipeline-event-step (required): provides a Pipeline step to add a new event publisher or trigger.

  • operations-center-notification (optional): provides the router that transfers messages across different teams, and is required if you want events to cross team boundaries. It is still possible to use the Cross Team Collaboration feature without this plugin by using the local-only mode, which triggers events across different jobs inside the same team.

Configuring

The notification-api plugin provides each Jenkins instance with a notification configuration page (accessible via "<JENKINS_URL>/notifications") where it is possible to enable or disable notifications as well as select the router type.

Note that the Operations Center Messaging option will only show up if the operations-center-notification plugin is installed on the instance.

Notifications must be enabled both in Operations Center and in all masters that will receive notifications.
Figure 2. The cross team collaboration configuration page

Event Types

An event is the information sent over the wire to notify a team or a job. It can currently be one of two types:

  • A Simple Event: The simplest event, which carries only a text payload to its destination and is always wrapped in an event JSON key, e.g. {"event":"helloWorld"}

  • A JSON Event: Can carry any information, as long as the string is a valid JSON object representation, e.g. {"eventName":"helloWorld", "eventType":"jar"}

    Jenkins automatically adds some additional fields to your event under the source attribute. These details will vary depending on the source that created the event (for example, an external webhook event or a job inside a Team organization) and can be used to create more complex queries. The source attribute name is reserved by Jenkins, and it will be overwritten if you include it at the root level of your own schema.

Publishing events

A Pipeline can be configured with a publishEvent step so that each time it is executed, a new event is sent by the internal messaging API to all destinations (meaning all jobs configured with a matching event trigger, inside or outside the same team).

An example of a Pipeline publishing a HelloWorld Simple Event would look like:

// Declarative //
pipeline {
    agent any

    stages {
        stage('Example') {
            steps {
                echo 'sending helloWorld'
                publishEvent simpleEvent('helloWorld')
            }
        }
    }
}

// Script //
node {
    stage("Example")  {
        echo 'sending helloWorld'
        publishEvent event: jsonEvent('{"event":"helloWorld"}')
    }
}

Each simple event is effectively a JSON event whose outer key is event and whose value is the string passed to the simpleEvent step. Thus the above example generates this event:

{"event":"helloWorld"}

An example of a Pipeline publishing a JSON event would look like (any valid JSON string is a valid input):

// Declarative //
pipeline {
    agent any

    stages {
        stage('Example') {
            steps {
                echo 'sending helloWorld'
                publishEvent jsonEvent('{"eventName":"helloWorld"}')
            }
        }
    }
}

// Script //
node {
    stage("Example")  {
        echo 'sending helloWorld'
        publishEvent jsonEvent('{"eventName":"helloWorld"}')
    }
}

Unlike simple events, the JSON passed as the parameter to the jsonEvent step is the final JSON sent as the notification:

{"eventName":"helloWorld"}

The examples above show both the Declarative and the Scripted Pipeline syntax; the publishEvent step works the same way in either.

The Pipeline snippet generator will provide the correct syntax when in doubt:

Figure 3. Event Publisher snippet generator

When an event is published, that is, when you run a Pipeline which contains the publishEvent step, all the Pipelines listening for events (via event triggers) are queried, but only the ones whose trigger condition matches the published event are executed.

Note that when an event is published, additional information is added to its JSON object by default, including:

  • the source Pipeline job name

  • the source Pipeline build number

  • the source Jenkins URL

  • the source Jenkins ID

The above information can also be used by the consumer (for example, for security checks) and can be explored by using the verbose option when publishing an event. See The verbose option section on this matter.

The verbose option

When looking at the snippet generator for the publish event step, you will notice that there is an Advanced section which allows you to enable verbose logging.

Figure 4. Verbose Option

When selected, it generates the following publishEvent step:
publishEvent event: jsonEvent('{"eventName":"helloWorld"}'), verbose: true

The verbose parameter defaults to false, so you need to specify it in your step only if you want to enable it. When enabled, verbose logging prints the full generated event (including the fields added by default) in the console output of the publisher’s builds:

[Pipeline] {
[Pipeline] stage
[Pipeline] { (Example)
[Pipeline] echo
sending helloWorld
[Pipeline] publishEvent
Publishing event notification
Event JSON:
 {
    "eventName": "helloWorld",
    "source":     {
        "type": "JenkinsTeamBuild",
        "buildInfo":         {
            "build": 4,
            "job": "team-name/job-name",
            "instanceId": "b5d5e0e9de1f45d9e2d6815265e069d5",
            "organization": "team-name"
        }
    }
}
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS

The verbose option can be very helpful for testing and troubleshooting event triggers. You will see the entire JSON for your event, which can be copied and pasted into a tool such as JMESPath and used to refine and test your query.
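
For example, a trigger query can match on the reserved source attribute shown in the verbose output above. The following is a hypothetical sketch using the sample values from that output; the eventTrigger step itself is covered in The Trigger condition section below:

// Script //
properties([pipelineTriggers([eventTrigger(jmespathQuery(
    // match our own field and the source job added by Jenkins
    "eventName=='helloWorld' && source.buildInfo.job=='team-name/job-name'"))])])

node {
    stage('Example')  {
        echo 'received helloWorld from team-name/job-name'
    }
}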


The Trigger condition

The trigger step can either be based on a specific Simple Event or on more generic query patterns, which use JMESPath syntax to match a single event or multiple events. Unlike publishers, triggers can also be configured directly through the UI form on the Pipeline job configuration page, in addition to being configured with an ad-hoc Pipeline step.

For instance, the configuration below:

Figure 5. Event Trigger

is equivalent to writing the following Pipeline:

// Declarative //
pipeline {
    agent any

    triggers {
        eventTrigger simpleMatch("helloWorld")
    }
    stages {
        stage('Example') {
            steps {
                echo 'received helloWorld'
            }
        }
    }
}

// Script //
properties([pipelineTriggers([eventTrigger(simpleMatch("helloWorld"))])])

node {
    stage('Example')  {
        echo 'received helloWorld'
    }
}

Specifying the trigger in a Pipeline script will not register the trigger until the Pipeline is executed at least once. When the Pipeline is executed, the trigger is registered and saved in the job configuration and thus will also appear in the job’s configuration UI.

Note that the Pipeline script takes priority over the UI configuration, so make sure you follow one approach or the other, but do not mix them. If a trigger is added in the Pipeline and a different one is added via the UI, the one in the UI will be deleted in favor of the scripted one.

A Pipeline job containing a trigger must be built manually once in order to properly register the trigger. Subsequent changes to the trigger value do not require a manual build.

To match a JSON event, use the jmespathQuery option.

Figure 6. Event Trigger

The UI configuration above is equivalent to writing:

// Declarative //
pipeline {
    agent any

    triggers {
        eventTrigger jmespathQuery("eventName=='helloWorld'")
    }

    stages {
        stage('Example') {
            steps {
                echo 'received helloWorld'
            }
        }
    }
}

// Script //
properties([pipelineTriggers([eventTrigger(jmespathQuery("eventName=='helloWorld'"))])])

node {
    stage('Example')  {
        echo 'received helloWorld'
    }
}

String values inside a JMESPath query must be enclosed in single quotes, for example eventName=='helloWorld'.

This trigger will look for the eventName key in incoming notifications and will start the job if and only if that property exists and is equal to helloWorld, as in the following JSON event:

 {
    "eventName":"helloWorld",
    "type":"jar",
    "artifacts": [...]
 }

The snippet generator will help in finding the right Pipeline syntax and will provide basic validation of the JMESPath query.

Use cases for Cross Team Collaboration notification events

Maven Artifacts

Imagine you have a job that builds and generates a jar, producing a Maven artifact. Another team is waiting for a new or updated version of that artifact to be generated before it can do some work.

Using this feature, you could configure the upstream job (the one that builds and generates the artifact) to publish an event notifying the other teams that the artifact has been published.

To do so, you could use a Simple Event and configure the upstream Pipeline in this way:

// Declarative //
pipeline {
    agent any

    stages {
        stage('Example') {
            steps {
                echo 'new maven artifact got published'
                publishEvent simpleEvent('com.example:my-jar:0.5-SNAPSHOT:jar')
            }
        }
    }
}

// Script //
node {
    stage("Example")  {
        echo 'new maven artifact got published'
        publishEvent simpleEvent('com.example:my-jar:0.5-SNAPSHOT:jar')
    }
}

To extract the specific Maven version the Pipeline is building into a variable, rather than hardcoding it as in the example above, you can use the readMavenPom step; see the Pipeline Utility Steps Plugin.
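
A minimal Scripted sketch of that approach, assuming a Pipeline job defined from SCM (so checkout scm is available), a pom.xml at the project root that declares its own groupId and version, and the Pipeline Utility Steps plugin installed:

// Script //
node {
    stage('Publish')  {
        checkout scm
        // read the project coordinates from the POM (Pipeline Utility Steps plugin)
        def pom = readMavenPom file: 'pom.xml'
        // build the same groupId:artifactId:version:packaging identifier as above
        publishEvent simpleEvent("${pom.groupId}:${pom.artifactId}:${pom.version}:${pom.packaging}")
    }
}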

On the team’s job waiting for the above artifact to be published, you will need to add a trigger that looks like:

// Declarative //
pipeline {
    agent any

    triggers {
        eventTrigger simpleMatch('com.example:my-jar:0.5-SNAPSHOT:jar')
    }

    stages {
        stage('Maven Example') {
            steps {
                echo 'a new maven artifact triggered this Pipeline'
            }
        }
    }
}

// Script //
properties([pipelineTriggers([eventTrigger(simpleMatch('com.example:my-jar:0.5-SNAPSHOT:jar'))])])

node {
    stage('Maven Example')  {
        echo 'a new maven artifact triggered this Pipeline'
    }
}

Reminder: you will need to build the downstream Pipeline once for the trigger to be registered.

The above Pipeline will be triggered each time a new event with identifier
com.example:my-jar:0.5-SNAPSHOT:jar is published.

You can also be more flexible: instead of waiting on a specific artifact version, you could wait for any new artifact of the Maven project. To accomplish that, you could use JMESPath queries and configure your downstream job in this way:

// Declarative //
pipeline {
    agent any

    triggers {
        eventTrigger jmespathQuery("contains(event, 'com.example:my-jar')")
    }
    stages {
        stage('Maven Example') {
            steps {
                echo 'a new maven artifact triggered this Pipeline'
            }
        }
    }
}

// Script //
properties([pipelineTriggers([eventTrigger(jmespathQuery("contains(event, 'com.example:my-jar')"))])])

node {
    stage('Maven Example')  {
        echo 'a new maven artifact triggered this Pipeline'
    }
}

The above Pipeline will be triggered each time a new event is published whose identifier contains the string com.example:my-jar.

Docker images

Similarly, you may want to trigger a job when a new version of a Docker image is available. You can configure the upstream Pipeline to publish the event:

// Declarative //
pipeline {
    agent any

    stages {
        stage('Docker Example') {
            steps {
                echo 'new docker image got published'
                publishEvent simpleEvent('cloudbees/java-build-tools:LATEST')
            }
        }
    }
}

// Script //
node {
    stage("Docker Example")  {
        echo 'new docker image got published'
        publishEvent simpleEvent('cloudbees/java-build-tools:LATEST')
    }
}

Finally, you would configure the downstream job to listen over events with the above identifier:

// Declarative //
pipeline {
    agent any

    triggers {
        eventTrigger simpleMatch('cloudbees/java-build-tools:LATEST')
    }

    stages {
        stage('Docker Example') {
            steps {
                echo 'a new docker image triggered this Pipeline'
            }
        }
    }
}

// Script //
properties([pipelineTriggers([eventTrigger(simpleMatch('cloudbees/java-build-tools:LATEST'))])])

node {
    stage('Docker Example')  {
        echo 'a new docker image triggered this Pipeline'
    }
}

Npm packages

At this point it should be clear that you can couple any kind of publisher and trigger, as long as they share the same identifier. As a last example, you may want to trigger a Pipeline any time an upstream job has published a new npm package version.

Your upstream will look like:

// Declarative //
pipeline {
    agent any

    stages {
        stage('Npm Example') {
            steps {
                echo 'new npm package got published'
                publishEvent simpleEvent('cloudbees-js@1.0.0')
            }
        }
    }
}

// Script //
node {
    stage("Npm Example")  {
        echo 'new npm package got published'
        publishEvent simpleEvent('cloudbees-js@1.0.0')
    }
}

Your downstream will look like:

// Declarative //
pipeline {
    agent any

    triggers {
        eventTrigger simpleMatch('cloudbees-js@1.0.0')
    }

    stages {
        stage('Npm Example') {
            steps {
                echo 'a new npm package triggered this Pipeline'
            }
        }
    }
}

// Script //
properties([pipelineTriggers([eventTrigger(simpleMatch('cloudbees-js@1.0.0'))])])

node {
    stage('Npm Example')  {
        echo 'a new npm package triggered this Pipeline'
    }
}

How do I configure my Pipeline downstream job with multiple triggers so that the Pipeline gets built when any of those events are published?

It is not possible to configure your downstream job with multiple event triggers. If you need your Pipeline to depend on multiple events, use a JMESPath query trigger. For more flexibility, you can use a JSON event type in your publisher (rather than a Simple Event) and include a specific JSON property to match your query against. For instance, as shown in the Maven section above, you can use a trigger query like jmespathQuery("contains(event, 'com.example:my-jar')") to match any event whose event value contains the com.example:my-jar groupId:artifactId.

Let’s say you want to trigger on all events containing a name JSON field matching a specific string: your JMESPath query could look like name=='CustomEvent', matching published events such as publishEvent jsonEvent('{"name":"CustomEvent", "type":"jar"}').
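
To have a single Pipeline fire on any of several events, you can combine conditions in one query with the JMESPath || operator. A hypothetical sketch (the event names are illustrative):

// Declarative //
pipeline {
    agent any

    triggers {
        // fires when either event name matches
        eventTrigger jmespathQuery("name=='CustomEvent' || name=='OtherEvent'")
    }
    stages {
        stage('Example') {
            steps {
                echo 'one of the expected events triggered this Pipeline'
            }
        }
    }
}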

Using event triggers with external webhook events

The Notification Webhook HTTP Endpoint feature provides the ability to define HTTP endpoints that can receive webhook events from external services. These events are then broadcasted to listening Pipelines where they can be used to trigger builds.

See External HTTP endpoints for more information about integrating with external webhook events.

See Triggering jobs with a simple webhook for a tutorial on setting up a simple webhook to trigger jobs.
