Amazon Simple Storage Service (S3)


Amazon Simple Storage Service (S3) is storage for the Internet. It is designed to make web-scale computing easier for developers.

Amazon S3 has a simple web-services interface where you can store and retrieve any amount of data, at any time, from anywhere on the web. It gives developers access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of websites. The goal is to maximize benefits of scale and pass those benefits on to developers.

For more information about Amazon S3, go to the Amazon Web Services website.

The EC-S3 plugin uses the Amazon S3 application programming interface (API), which lets developers choose where their objects are stored in the Amazon cloud. Using this secure API, your application scales up and down automatically as needed. This integration allows ElectricFlow to manage S3 buckets and objects.

The plugin interacts with S3 data by using the AWS Java SDK to perform the following tasks:

  • Create configurations with connection information.

  • Create buckets and folders.

  • Store objects in existing buckets and folders.

  • Download objects or entire folders.

  • List all the buckets and folders.

Plugin Version 1.1.2.2020102201 Revised on December 19, 2018

This plugin was developed and tested against Amazon Simple Storage Service (Amazon S3).

For all parameter descriptions in this document, required parameters are shown in bold italics.

Setting up the plugin configuration

Plugin configurations are sets of parameters that apply across some or all of the plugin procedures. They reduce repetition of common values, create predefined sets of parameters for end users, and store credentials securely. Each configuration has a unique name that is automatically entered in designated parameters in the procedures.

Input

  1. Go to Administration > Plugins to open the Plugin Manager.

  2. Find the EC-S3 row.

  3. Click Configure to open the EC-S3 Configurations page.

  4. Click Create Configuration.

  5. To create an S3 configuration, enter the following information and click OK.

Remember that you may need to create additional configurations later.


Configuration Name

Name of the S3 configuration. The default is S3 integration.

Description

A description for this configuration.

Service URL

The service URL for the S3 service. For the Amazon public S3, this should be https://s3.amazonaws.com. (Required)

Resource Pool

The name of the pool of resources on which the integration steps can run. (Required)

Workspace

The workspace to use for resources dynamically created by this configuration. (Required)

Access IDs (Credential Parameters)

The two access IDs that are required for communicating with S3 (Access ID and Secret Access ID). The configuration stores these as a credential, putting the Access ID in the user field of the credential and the Secret Access ID in the password field of the credential. (Required)

Attempt Connection?

If the check box is selected, the system attempts a connection to verify the credentials. (Required)

Debug Level

Provide the debug level for the output: 0=errors only, 1=normal headers and responses, 2+=debugging information included. (Required)

Output

The EC-S3 Configurations page now shows the new configuration.

You can also manage your S3 configurations on this page. Click Edit to modify an existing configuration or Delete to remove one.

Plugin procedures

CreateBucket

A bucket is a container for objects stored in Amazon S3.

To ensure a single, consistent naming approach for Amazon S3 buckets across regions and to ensure bucket names conform to DNS naming conventions, bucket names must comply with the following requirements:

  • Can contain lowercase letters, numbers, periods (.), and hyphens (-).

  • Must start with a number or letter.

  • Must be between 3 and 63 characters long.

  • Must not be formatted as an IP address (e.g., 192.168.5.4).

  • Must not contain underscores (_).

  • Must not end with a hyphen.

  • Cannot contain two adjacent periods.

  • Cannot contain dashes next to periods (e.g., my-.bucket.com and my.-bucket are invalid).
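The rules above can be checked mechanically before calling CreateBucket. The sketch below is a hypothetical validation helper (not part of the plugin) that encodes each rule:

```python
import re

def is_valid_bucket_name(name: str) -> bool:
    """Check a proposed S3 bucket name against the naming rules above."""
    if not 3 <= len(name) <= 63:
        return False
    # Only lowercase letters, digits, periods, and hyphens are allowed;
    # this also rules out underscores.
    if not re.fullmatch(r"[a-z0-9.-]+", name):
        return False
    # Must start with a letter or number; must not end with a hyphen.
    if not name[0].isalnum() or name.endswith("-"):
        return False
    # No adjacent periods, and no hyphens next to periods.
    if ".." in name or "-." in name or ".-" in name:
        return False
    # Must not be formatted as an IP address (e.g., 192.168.5.4).
    if re.fullmatch(r"\d{1,3}(\.\d{1,3}){3}", name):
        return False
    return True
```

For example, `is_valid_bucket_name("my-bucket.example.com")` passes, while `"my-.bucket.com"` and `"192.168.5.4"` are rejected.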

Input

  1. Go to the CreateBucket procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration.

Bucket Name

Name of the bucket to create.

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the CreateBucket step, click the Log button to see the diagnostic information.

CreateFolder

This procedure creates nested folders within the specified bucket. Folders help organize S3 objects.

Input

  1. Go to the CreateFolder procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket in which to create the folder. (Required)

Folder Name

Name of the Folder to create. (Required)

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the CreateFolder step, click the Log button to see the diagnostic information.

DeleteBucketContents

This procedure deletes the contents of the specified bucket.

Input

  1. Go to the DeleteBucketContents procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket whose contents to delete. (Required)

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the DeleteBucketContents step, click the Log button to see the diagnostic information.

DeleteObject

This procedure deletes the S3 object in the specified bucket or folder.

Input

  1. Go to the DeleteObject procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket where the object is. (Required)

Key

Key of the object to delete. (Required)

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the DeleteObject step, click the Log button to see the diagnostic information.

DownloadFolder

This procedure downloads the contents of the specified folder to the local file system.

Input

  1. Go to the DownloadFolder procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket where the folder is. (Required)

Key Prefix - Folder

Key prefix of the folder to download.

Download Location

Path of the download location. For example, '/path/to/downloadLocation' or 'C:\path\to\downloadLocation'. (Required)

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the DownloadFolder step, click the Log button to see the diagnostic information.

After the folder is successfully downloaded, CloudBees CD stores the key names and download paths of the objects in the property sheet. (The default location is /myJob/S3Output.)
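When a folder is downloaded, each object key is mapped to a path under the download location. The following sketch (a hypothetical helper, not plugin code) illustrates that mapping, assuming keys use '/' as the separator:

```python
import os

def local_path_for_key(key: str, key_prefix: str, download_location: str) -> str:
    """Map an S3 object key to a path under the download location,
    preserving the folder structure below the key prefix."""
    relative = key[len(key_prefix):] if key.startswith(key_prefix) else key
    # S3 keys use '/' regardless of platform; convert to the local separator.
    return os.path.join(download_location, *relative.split("/"))
```

For example, with key prefix 'site/' and download location '/tmp/dl', the object 'site/css/main.css' lands at '/tmp/dl/css/main.css' on a POSIX system.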

DownloadObject

This procedure downloads the S3 object specified by the key to the local file system.

Input

  1. Go to the DownloadObject procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket where the object is. (Required)

Key

Key of the object to download.

Download Location

Path of the download location. For example, '/path/to/downloadLocation' or 'C:\path\to\downloadLocation'. (Required)

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the DownloadObject step, click the Log button to see the diagnostic information.

CloudBees CD stores the key names and the download locations of the objects in the property sheet. (The default location is /myJob/S3Output.)

ListBucket

This procedure lists all the buckets.

Input

  1. Go to the ListBucket procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the ListBucket step, click the Log button to see the diagnostic information.

CloudBees CD stores the list of buckets in the property sheet. (The default location is /myJob/S3Output.)

ListFolder

This procedure lists the contents of a folder, either recursively or non-recursively.

Input

  1. Go to the ListFolder procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket whose folders to list. (Required)

Folder Name

Name of the folder or prefix to include in the list.

List Objects in this folder or Include all sub folders?

If selected, the list includes all objects in this folder and in all subfolders.

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the ListFolder step, click the Log button to see the diagnostic information.

CloudBees CD stores the list of all the objects in the folder in the property sheet. (The default location is /myJob/S3Output.)
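The difference between the two listing modes mirrors S3's prefix/delimiter semantics, where a '/' delimiter collapses keys in subfolders into folder entries. The sketch below is a hypothetical illustration of that behavior over an in-memory key list, not plugin code:

```python
def list_folder(keys, prefix, recursive):
    """Return object keys under `prefix`. In non-recursive mode, keys in
    subfolders are collapsed to their first-level folder name, mirroring
    how S3 groups keys when a '/' delimiter is supplied."""
    results = []
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        if recursive or "/" not in rest:
            # Either recursive mode, or a direct child of the prefix.
            results.append(key)
        else:
            # Collapse deeper keys into their first-level folder entry.
            folder = prefix + rest.split("/", 1)[0] + "/"
            if folder not in results:
                results.append(folder)
    return results
```

With keys ['site/index.html', 'site/css/main.css', 'site/css/print.css'], a recursive listing of 'site/' returns all three keys, while a non-recursive listing returns 'site/index.html' and the folder entry 'site/css/'.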

UploadFolder

This procedure uploads the specified local filesystem folder to the Amazon S3 service.

Input

  1. Go to the UploadFolder procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket to which the folder is uploaded. (Required)

Key

The key prefix of the virtual directory to which the folder is uploaded. Keep this field empty to upload files to the root of the bucket.

Folder to Upload

Name of the folder to upload. For example, '/opt/folderToUpload' or 'C:\path\to\folderToUpload'. (Required)

Make the object public

If selected, the uploaded object will be publicly accessible.

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the UploadFolder step, click the Log button to see the diagnostic information.

After a folder is successfully uploaded, CloudBees CD stores the key names and AWS access URLs for the objects in this folder in the property sheet. (The default location is /myJob/S3Output.)
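When a folder is uploaded, each file's path relative to the folder becomes part of its object key under the key prefix. The following sketch (a hypothetical helper, not plugin code) illustrates that key construction:

```python
import os

def key_for_file(local_root, file_path, key_prefix):
    """Build the S3 object key for a file inside the folder being uploaded.
    An empty key prefix places files at the root of the bucket."""
    relative = os.path.relpath(file_path, local_root)
    # S3 keys always use '/', even when the local separator is '\'.
    key = relative.replace(os.sep, "/")
    return f"{key_prefix.rstrip('/')}/{key}" if key_prefix else key
```

For example, uploading '/opt/folderToUpload' with key prefix 'assets' maps '/opt/folderToUpload/css/main.css' to the key 'assets/css/main.css'; with an empty prefix, '/opt/folderToUpload/index.html' maps to 'index.html'.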

UploadObject

This procedure uploads the specified local file to the Amazon S3 service.

Input

  1. Go to the UploadObject procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration which holds all the connection information. This must reference a valid existing configuration. (Required)

Bucket Name

Name of the bucket to which to upload the object. (Required)

Key

Key of the object to upload. This value will be used as the key for the object that is uploaded. (Required)

File to Upload

Path for file to upload. For example, '/path/to/fileToUpload.txt' or 'C:\mydir\fileToUpload.txt'. (Required)

Make the object public

If selected, the uploaded object will be publicly accessible.

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the UploadObject step, click the Log button to see the diagnostic information.

After an object is successfully uploaded, CloudBees CD stores the key name and the AWS link to the object in the property sheet. (The default location is /myJob/S3Output.)

WebsiteHosting

You can use Amazon Simple Storage Service (S3) to host a website that uses client-side technologies (such as HTML, CSS, and JavaScript) and does not require server-side technologies (such as PHP and ASP.NET). This is called a static website and is used to display content that does not change frequently.

To host your static website, use this procedure to configure an Amazon S3 bucket for website hosting. It is then available at the region-specific website endpoint of the bucket: <bucket-name>.s3-website-<AWS-region>.amazonaws.com
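The endpoint pattern can be expressed as a small helper. This is a hypothetical illustration of the pattern above; the exact endpoint format can vary by region, so treat it as a sketch rather than a definitive rule:

```python
def website_endpoint(bucket_name: str, aws_region: str) -> str:
    """Build the region-specific S3 static-website endpoint for a bucket,
    following the <bucket-name>.s3-website-<AWS-region>.amazonaws.com pattern."""
    return f"{bucket_name}.s3-website-{aws_region}.amazonaws.com"
```

For example, a bucket named 'ecwebsitehosting' in 'us-east-1' would be served from 'ecwebsitehosting.s3-website-us-east-1.amazonaws.com'.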

Input

  1. Go to the WebsiteHosting procedure.

  2. Enter the following parameters:


Configuration

The name of the configuration that has all the connection information. This must refer to a valid existing configuration.

Bucket Name

Name of the bucket to configure for website hosting.

Enable website hosting

If selected, the bucket is enabled for static website hosting, making all your content accessible to web browsers through the Amazon S3 endpoint for your bucket.

Index Document

Name of the index document.

Error Document

Name of the error document.

Output

After the job runs, you can view the results on the Job Details page in CloudBees CD, where you can verify that every job step completed successfully.

In the WebsiteHosting step, click the Log button to see the diagnostic information.

After the bucket is successfully configured for static website hosting, CloudBees CD stores the bucket name as a key and the Amazon S3 website endpoint for your bucket as a value in the property sheet. (The default location is /myJob/S3Output.)

Examples and use cases

Use case 1: static website hosting

One of the common use cases for this plugin is to host a publicly accessible website. To achieve this, create a bucket on S3, upload the website contents to that bucket, and configure the bucket for hosting. To do this, you must:

  1. Create a plugin configuration.

  2. Create a bucket on S3.

  3. Upload the contents of the folder to the bucket.

  4. Configure bucket for website hosting.

Create a plugin configuration

In CloudBees CD, go to Administration > Plugins to open the Plugin Manager. Then click Configure and enter the values for the parameters in the S3 Configuration page.

After the configuration is created, you can see it in "S3 Configurations".

Create a bucket on S3

Go to the CreateBucket procedure and enter the values in the parameter fields.

This calls the CreateBucket procedure to create a new bucket, 'ecwebsitehosting'.

Upload the contents to the S3 bucket

Go to the UploadFolder procedure and enter the values in the parameter fields.

This calls the UploadFolder procedure to upload the contents of the 'C:\Electric Cloud\electricCloud\Website' directory to the 'ecwebsitehosting' bucket.

Configure bucket for website hosting

Go to the WebsiteHosting procedure and enter the values in the parameter fields.

This calls the WebsiteHosting procedure to configure the 'ecwebsitehosting' bucket for website hosting.

View the results and output

The following output appears during the procedures:

CreateBucket

UploadFolder

WebsiteHosting

Release notes

EC-S3 1.1.2

  • The documentation has been migrated to the main documentation site.

EC-S3 1.1.1

  • The plugin icon has been updated.

EC-S3 1.1.0

  • AWS SDK Version has been changed to 1.11.10.

EC-S3 1.0.0

  • Added support to create new buckets and folders in buckets.

  • Added support to clean the bucket contents.

  • Added support to delete specific objects in a bucket or folder.

  • Added support to upload or download objects or the entire content of the bucket or folder.

  • Added support to list buckets and folders.