Integration with AWS S3 as a bucket for your Jenkins artifacts

Article ID:360041265712

Issue

  • I want to configure the Artifact Manager on S3 plugin

  • I would like to store artifacts in AWS S3

Resolution

As stated in CloudBees Core Performance Best Practices: do not use the standard Archive the Artifacts post-build step for large artifacts (> 100 KB); they should be sent to your favorite artifact repository manager (e.g. Artifactory, Nexus, S3, etc.).

The following is a step-by-step guide to storing Jenkins artifacts in AWS S3 by using Artifact Manager on S3.

Prerequisites in AWS

A. A user (e.g. s3-artifacts) with the right policies for S3. For this example we are using arn:aws:iam::aws:policy/AmazonS3FullAccess, but you can be more restrictive, as explained in the plugin page.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": "*"
    }
  ]
}
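If you prefer not to grant full S3 access, a policy scoped to the example bucket could look like the following. This is a sketch only: the exact set of actions the plugin requires is documented on the plugin page, and the bucket name is the example one used throughout this article.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::artifact-bucket-example",
        "arn:aws:s3:::artifact-bucket-example/*"
      ]
    }
  ]
}
```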

B. An S3 bucket (e.g. artifact-bucket-example). Ideally, a separate folder to store your artifacts (e.g. acme-artifacts).
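If you manage AWS from the command line, the bucket can be created with the AWS CLI. This is a sketch: it assumes a configured profile with permission to create buckets, and the region and names are the examples used in this article.

```shell
# Create the example bucket
aws s3 mb s3://artifact-bucket-example --region us-east-1

# The "folder" prefix is created implicitly on first upload, but it can be
# pre-created as an empty object if you want it to appear in the console
aws s3api put-object --bucket artifact-bucket-example --key acme-artifacts/
```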

Steps in CloudBees Core

1. Create an AWS Credentials type credential in Jenkins using the id of the user (s3-artifacts) plus its access_key_id and its aws_secret_access_key.

2. Configure the controller to store all the artifacts in S3 via Manage Jenkins > Configure System > Artifact Management for Builds > Cloud Artifact Storage > Cloud Provider > Amazon S3 and Save the configuration.

3. Configure the plugin via Manage Jenkins > AWS

  • S3 Bucket Name: artifact-bucket-example

  • Base Prefix: acme-artifacts/

  • Amazon Credentials (step 1)

Then, click Validate S3 Bucket configuration. The expected result is success.
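If you manage the controller with Configuration as Code, the same settings can be expressed in YAML. The field names below are assumptions based on the plugin's JCasC support, so verify them against your own export (Manage Jenkins > Configuration as Code > View Configuration) before relying on them.

```yaml
unclassified:
  artifactManager:
    artifactManagerFactories:
      - jclouds:
          provider: "s3"            # assumption: provider id for Artifact Manager on S3
aws:
  s3:
    container: "artifact-bucket-example"   # S3 Bucket Name
    prefix: "acme-artifacts/"              # Base Prefix
```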

4. Run a demo Pipeline like the following example:

pipeline {
  agent {
    kubernetes {
      containerTemplate {
        name 'agent'
        image 'jenkins/agent:alpine'
        ttyEnabled true
      }
    }
  }
  stages {
    stage('archiving artifacts into AWS s3') {
      steps {
        script {
          sh "dd if=/dev/urandom of=artifact.txt bs=5MB count=1"
        }
        archiveArtifacts artifacts: "*.*", fingerprint: true
      }
    }
  }
}

Expected console output:

[Pipeline] { (archiving artifacts into AWS s3)
[Pipeline] script
[Pipeline] {
[Pipeline] sh
+ dd 'if=/dev/urandom' 'of=artifact.txt' 'bs=5MB' 'count=1'
1+0 records in
1+0 records out
[Pipeline] }
[Pipeline] // script
[Pipeline] archiveArtifacts
Archiving artifacts
Uploaded 1 artifact(s) to https://artifact-bucket-example.s3.amazonaws.com/acme-artifacts/pipeline-demo/1/artifacts/
Recording fingerprints
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
Finished: SUCCESS
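To double-check from a workstation that the artifact actually landed in the bucket, you can list the build's prefix with the AWS CLI. This assumes a local profile with read access to the bucket; the profile name and paths below are the examples used in this article.

```shell
# List the artifacts uploaded by build #1 of the demo Pipeline
aws --profile s3-artifacts s3 ls \
  s3://artifact-bucket-example/acme-artifacts/pipeline-demo/1/artifacts/
```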

Troubleshooting

1. Ensure that Amazon S3 has been defined correctly under Manage Jenkins > Configure System > Artifact Management for Builds > Cloud Artifact Storage > Cloud Provider > Amazon S3

2. Create a custom logger for Artifact Manager on S3 (important: delete this log once it is no longer needed)

  • io.jenkins.plugins.artifact_manager_jclouds - ALL

  • org.jclouds.rest.internal.InvokeHttpMethod - FINE

3. Use the AWS CLI locally

  • Create the file

$> dd if=/dev/urandom of=artifact.txt bs=5MB count=1
1+0 records in
1+0 records out
5000000 bytes (5.0 MB, 4.8 MiB) copied, 0.0481165 s, 104 MB/s
  • Include ~/.aws/config

[profile s3-artifacts]
aws_access_key_id = AKI*************
aws_secret_access_key = Uce**************************
  • Copy the file to the bucket

$> aws --profile s3-artifacts s3 cp artifact.txt s3://artifact-bucket-example/my-artifacts/artifact-yeah.txt
upload: ./artifact.txt to s3://artifact-bucket-example/my-artifacts/artifact-yeah.txt

Tested products/plugins version

The latest update of this article was tested with: