How To Authenticate to AWS with the Pipeline AWS Plugin

Article ID: 360027893492

Issue

I would like to interact with AWS in a Pipeline, but I don’t know how to manage the credentials. For instance, I would like to upload a file to an S3 bucket from a Jenkins Pipeline.

Resolution

We will cover two ways of authenticating: using an IAM user or using an IAM role.

Using an IAM user

Prerequisites

As a prerequisite, you will need an IAM user with programmatic access and the correct permissions on the AWS resources you wish to interact with. You can refer to Amazon’s Creating an IAM User in Your AWS Account page to create this IAM user. The next sections assume that you have access to an Access key ID and a Secret access key.

Setting up the credentials

Once this is done, add new credentials of type AWS Credentials (provided by the CloudBees AWS Credentials plugin), specifying your Access key ID and Secret access key. Please refer to Using Credentials if you are unsure how to add credentials.

Using the credentials in a Pipeline

You are now ready to use those credentials in a Pipeline. You can inject them with the withAWS step and then call whatever AWS-related steps you need. Here is an example that uploads and then downloads a file from an S3 bucket called kb-bucket, using the credentials aws-credentials:

pipeline {
    agent any
    stages {
        stage('hello AWS') {
            steps {
                withAWS(credentials: 'aws-credentials', region: 'us-east-1') {
                    sh 'echo "hello KB">hello.txt'
                    s3Upload acl: 'Private', bucket: 'kb-bucket', file: 'hello.txt'
                    s3Download bucket: 'kb-bucket', file: 'downloadedHello.txt', path: 'hello.txt'
                    sh 'cat downloadedHello.txt'
                }
            }
        }
    }
}

Note that you are not restricted to AWS Pipeline steps in the closure passed to withAWS; you could, for instance, use the aws CLI (provided it is accessible to your build):

withAWS(credentials: 'aws-credentials', region: 'us-east-1') {
    sh 'aws iam get-user'
}
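
This works because withAWS exports the standard AWS environment variables (such as AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and, for temporary credentials, AWS_SESSION_TOKEN) inside the wrapped block, so any tool that honors them will authenticate. As a quick sanity check, the following sketch (assuming the aws CLI is installed on the agent) prints the identity actually in effect:

```groovy
withAWS(credentials: 'aws-credentials', region: 'us-east-1') {
    // Prints the ARN of the effective IAM user or assumed role,
    // confirming which identity the block is using.
    sh 'aws sts get-caller-identity'
}
```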

Using an IAM role

There are two cases to consider.

The step runs on an EC2 instance attached to the role you need

If this is the role you want to use in your Pipeline, then you can simply call AWS steps or the aws CLI: the credentials are resolved from the instance profile, so there is no need to inject any.
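
For illustration, here is a minimal sketch of this case (the agent label is hypothetical; the AWS SDK’s default credential provider chain resolves the instance profile on its own):

```groovy
pipeline {
    // Hypothetical label for an EC2 agent launched with an instance profile
    agent { label 'ec2-linux' }
    stages {
        stage('instance profile') {
            steps {
                // No withAWS wrapper: credentials come from the instance profile.
                sh 'aws sts get-caller-identity'
                sh 'echo "hello KB">hello.txt'
                s3Upload acl: 'Private', bucket: 'kb-bucket', file: 'hello.txt'
            }
        }
    }
}
```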

You need to assume a role

If you are not in the first case, then you need to assume a role. This is done by providing the withAWS step with a role, a roleAccount and, when required, an externalId. You should have this information after creating the role. You can find more details in Delegate Access Across AWS Accounts Using IAM Roles.

Once you have those parameters, you can use them in the withAWS step:

pipeline {
    agent any
    stages {
        stage('hello AWS') {
            steps {
                withAWS(role: 'role-name', roleAccount: 'role-account-id', externalId: 'role-external-id', duration: 900, roleSessionName: 'jenkins-session') {
                    sh 'echo "hello KB">hello.txt'
                    s3Upload acl: 'Private', bucket: 'kb-bucket', file: 'hello.txt'
                    s3Download bucket: 'kb-bucket', file: 'downloadedHello.txt', path: 'hello.txt'
                    sh 'cat downloadedHello.txt'
                }
            }
        }
    }
}

Two remarks:

  1. The externalId might not be needed, depending on how your AWS team issues the roles. See How to Use an External ID When Granting Access to Your AWS Resources to a Third Party for more details.

  2. There is a known limitation in the Pipeline: AWS Steps plugin where the withAWS step uses the controller’s instance profile security token to assume the role, rather than that of the agent where the step is running. See WithAWS step always use controller’s instance profile security token even running on agent for more details.
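
A possible workaround for the second remark (a sketch, not verified against every plugin version): withAWS also accepts explicit credentials together with a role, in which case the role is assumed with those credentials instead of the controller’s instance profile:

```groovy
// Assume the role using the stored IAM user credentials, not the
// controller's instance profile. The role name and account are placeholders.
withAWS(credentials: 'aws-credentials',
        role: 'role-name', roleAccount: 'role-account-id',
        roleSessionName: 'jenkins-session') {
    s3Upload acl: 'Private', bucket: 'kb-bucket', file: 'hello.txt'
}
```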

Tested product/plugin versions