Amazon Simple Storage Service (S3) is storage for the internet. It is designed to make web-scale computing easier for developers.
Amazon S3 has a simple web-services interface where you can store and retrieve any amount of data, at any time, from anywhere on the web. It gives developers access to the same highly scalable, reliable, fast, inexpensive data storage infrastructure that Amazon uses to run its own global network of websites. The goal is to maximize benefits of scale and pass those benefits on to developers. For more information about Amazon S3, go to the Amazon Web Services website.
The S3 plugin uses the Amazon S3 application programming interface (API), which lets developers choose where their objects are stored in the Amazon cloud. Through this secure API, storage scales up and down automatically as your application needs it. This integration allows ElectricFlow to manage S3 buckets and objects.
The S3 plugin interacts with S3 data by using AWS Java SDK to perform the following tasks:
- Create configurations with connection information.
- Create buckets and folders.
- Store objects in existing buckets and folders.
- Download objects or entire folders.
- List all the buckets and folders.
Plugin Version 1.2.0.2023112867
Revised on November 16, 2023
This plugin was developed and tested against Amazon Simple Storage Service (Amazon S3).
For all parameter descriptions in this document, required parameters are shown in bold italics.
Setting up the plugin configuration
Plugin configurations are sets of parameters that apply across some or all of the plugin procedures. They reduce repetition of common values, create predefined sets of parameters for end users, and store credentials securely. Each configuration has a unique name that is automatically entered in designated parameters in the procedures.
Input
- Open the Plugin Manager.
- Find the EC-S3 row.
- Select Configure to open the EC-S3 Configurations page.
- Select Create Configuration.
- To create an S3 configuration, enter the following information and select OK.
Remember that you may need to create additional configurations later.
Parameter | Description
---|---
Configuration Name | Name of the S3 configuration. The default is S3 integration.
Description | A description for this configuration.
Service URL | The service URL for the S3 service. For the Amazon public S3, this is https://s3.amazonaws.com. (Required)
Resource Pool | The name of the pool of resources on which the integration steps can run. (Required)
Workspace | The workspace to use for resources dynamically created by this configuration. (Required)
Access IDs (Credential Parameters) | The two access IDs required for communicating with S3 (Access ID and Secret Access ID). The configuration stores these as a credential, with the Access ID in the user field and the Secret Access ID in the password field. (Required)
Attempt Connection? | If selected, the system attempts a connection to check the credentials. (Required)
Debug Level | The debug level for the output: 0=errors only, 1=normal headers and responses, 2+=debugging information included. (Required)
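Behind a configuration like this is essentially an S3 client handle. The following is a minimal, hypothetical sketch of how such a configuration could map onto a client using the AWS SDK for Java v1; `buildClient` and its arguments are illustrative names for the parameters above, the signing region is an assumption, and the plugin's actual wiring may differ. The later procedure sketches assume an `AmazonS3` client built this way.

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class S3ClientFactory {
    // accessId and secretAccessId correspond to the Access ID and
    // Secret Access ID stored in the configuration's credential.
    public static AmazonS3 buildClient(String serviceUrl, String accessId, String secretAccessId) {
        BasicAWSCredentials credentials = new BasicAWSCredentials(accessId, secretAccessId);
        return AmazonS3ClientBuilder.standard()
                // serviceUrl is the Service URL parameter, e.g. https://s3.amazonaws.com;
                // "us-east-1" is an assumed signing region for the sketch.
                .withEndpointConfiguration(new EndpointConfiguration(serviceUrl, "us-east-1"))
                .withCredentials(new AWSStaticCredentialsProvider(credentials))
                .build();
    }
}
```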
Plugin procedures
CreateBucket
A bucket is a container for objects stored in Amazon S3.
To ensure a single, consistent naming approach for Amazon S3 buckets across regions and to ensure bucket names conform to DNS naming conventions, bucket names must comply with the following requirements:
- Can contain lowercase letters, numbers, periods (.), and hyphens (-).
- Must start with a number or letter.
- Must be between 3 and 63 characters long.
- Must not be formatted as an IP address (e.g., 192.168.5.4).
- Must not contain underscores (_).
- Must not end with a hyphen.
- Cannot contain two adjacent periods.
- Cannot contain dashes next to periods (e.g., my-.bucket.com and my.-bucket are invalid).
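These rules can be checked before calling the service. A hypothetical sketch using the AWS SDK for Java v1 follows; `isValid` is an illustrative helper, and the regular expression additionally rejects a trailing period, in line with the DNS conventions mentioned above.

```java
import com.amazonaws.services.s3.AmazonS3;
import java.util.regex.Pattern;

public class CreateBucketExample {
    // Lowercase letters, digits, periods, and hyphens; starts with a letter
    // or digit; 3-63 characters; must not end with a hyphen (or, per DNS
    // conventions, a period).
    private static final Pattern VALID =
            Pattern.compile("^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$");
    // No adjacent periods and no dashes next to periods.
    private static final Pattern BAD_SEQUENCE = Pattern.compile("\\.\\.|\\.-|-\\.");
    // Four dot-separated groups of digits, i.e. an IPv4-style name.
    private static final Pattern IP_LIKE = Pattern.compile("^\\d{1,3}(\\.\\d{1,3}){3}$");

    static boolean isValid(String name) {
        return VALID.matcher(name).matches()
                && !BAD_SEQUENCE.matcher(name).find()
                && !IP_LIKE.matcher(name).matches();
    }

    // Validate the name locally, then create the bucket with the SDK.
    static void createBucket(AmazonS3 s3, String name) {
        if (!isValid(name)) {
            throw new IllegalArgumentException("Invalid bucket name: " + name);
        }
        s3.createBucket(name);
    }
}
```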
CreateFolder
This procedure creates nested folders within the specified bucket. Folders help organize S3 objects.
Input
- Go to the CreateFolder procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket in which to create the folder. (Required)
Folder Name | Name of the folder to create. (Required)
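S3 itself has no real directories; a folder is conventionally represented as a zero-byte object whose key ends in a slash. A hypothetical sketch of that convention with the AWS SDK for Java v1, assuming an `AmazonS3` client built as in the configuration sketch above:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectMetadata;
import java.io.ByteArrayInputStream;

public class CreateFolderExample {
    // Creates an empty marker object so that, e.g., "reports/2023/"
    // shows up as a nested folder inside the bucket.
    static void createFolder(AmazonS3 s3, String bucket, String folderName) {
        String key = folderName.endsWith("/") ? folderName : folderName + "/";
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentLength(0);
        s3.putObject(bucket, key, new ByteArrayInputStream(new byte[0]), metadata);
    }
}
```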
DeleteBucketContents
This procedure deletes the contents of the specified bucket.
Input
- Go to the DeleteBucketContents procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket whose contents to clear. (Required)
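Conceptually this is a list-and-delete loop over the bucket. A hypothetical sketch with the AWS SDK for Java v1; the plugin's actual implementation may differ:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class DeleteBucketContentsExample {
    // Deletes every object in the bucket, page by page; the bucket
    // itself is left in place.
    static void deleteContents(AmazonS3 s3, String bucket) {
        ObjectListing listing = s3.listObjects(bucket);
        while (true) {
            for (S3ObjectSummary summary : listing.getObjectSummaries()) {
                s3.deleteObject(bucket, summary.getKey());
            }
            if (!listing.isTruncated()) {
                break;
            }
            listing = s3.listNextBatchOfObjects(listing);
        }
    }
}
```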
DeleteObject
This procedure deletes the S3 object in the specified bucket or folder.
Input
- Go to the DeleteObject procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket where the object is located. (Required)
Key | Key of the object to delete. (Required)
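In SDK terms this is a single call. Note that for an object inside a folder, the key includes the folder prefix. A hypothetical sketch with the AWS SDK for Java v1:

```java
import com.amazonaws.services.s3.AmazonS3;

public class DeleteObjectExample {
    // Deletes a single object; for objects inside a folder, the key
    // includes the folder prefix, e.g. "reports/2023/summary.csv".
    static void deleteObject(AmazonS3 s3, String bucket, String key) {
        s3.deleteObject(bucket, key);
    }
}
```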
DownloadFolder
This procedure downloads the contents of the specified folder to the local file system.
Input
- Go to the DownloadFolder procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket where the folder is located. (Required)
Key Prefix - Folder | Key prefix of the folder to download.
Download Location | Path of the download location.
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO and verify that every job step completed successfully.
In the DownloadFolder step, select the Log button to see the diagnostic information.
After the folder is successfully downloaded, CloudBees CD/RO stores the key names and download paths of the objects in the property sheet (the default location is /myJob/S3Output).
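Conceptually, the procedure lists every key under the prefix and fetches each object to a matching local path. A hypothetical sketch with the AWS SDK for Java v1; the plugin's property-sheet bookkeeping is omitted:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;
import java.io.File;

public class DownloadFolderExample {
    // Downloads every object under keyPrefix (e.g. "reports/") into
    // downloadDir, mirroring the key layout on the local file system.
    static void downloadFolder(AmazonS3 s3, String bucket, String keyPrefix, File downloadDir) {
        ObjectListing listing = s3.listObjects(bucket, keyPrefix);
        while (true) {
            for (S3ObjectSummary summary : listing.getObjectSummaries()) {
                if (summary.getKey().endsWith("/")) {
                    continue; // skip folder marker objects
                }
                File target = new File(downloadDir, summary.getKey());
                target.getParentFile().mkdirs();
                s3.getObject(new GetObjectRequest(bucket, summary.getKey()), target);
            }
            if (!listing.isTruncated()) {
                break;
            }
            listing = s3.listNextBatchOfObjects(listing);
        }
    }
}
```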
DownloadObject
This procedure downloads the S3 object specified by the key to the local file system.
Input
- Go to the DownloadObject procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket where the object is located. (Required)
Key | Key of the object to download.
Download Location | Path of the download location.
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO and verify that every job step completed successfully.
In the DownloadObject step, select the Log button to see the diagnostic information.
CloudBees CD/RO stores the key names and the download locations of the objects in the property sheet (the default location is /myJob/S3Output).
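In SDK terms, the download is a single call that streams the object to a file. A hypothetical sketch with the AWS SDK for Java v1:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import java.io.File;

public class DownloadObjectExample {
    // Fetches one object identified by its key and writes it to the
    // given local path.
    static void downloadObject(AmazonS3 s3, String bucket, String key, File target) {
        s3.getObject(new GetObjectRequest(bucket, key), target);
    }
}
```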
ListBucket
This procedure lists all the buckets.
Input
- Go to the ListBucket procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO. Every job step was completed successfully.
In the ListBucket step, select the Log button to see the diagnostic information.
CloudBees CD/RO stores the list of buckets in the property sheet (the default location is /myJob/S3Output).
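A hypothetical sketch of the underlying call with the AWS SDK for Java v1; the plugin additionally writes the result to the property sheet, which is omitted here:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.Bucket;

public class ListBucketsExample {
    // Prints the name of every bucket the configured credentials can see.
    static void listBuckets(AmazonS3 s3) {
        for (Bucket bucket : s3.listBuckets()) {
            System.out.println(bucket.getName());
        }
    }
}
```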
ListFolder
This procedure lists the contents of a folder, either recursively or non-recursively.
Input
- Go to the ListFolder procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket whose folders to list. (Required)
Folder Name | Name of the folder or prefix to include in the list.
List Objects in this folder or Include all sub folders? | If selected, all objects in this folder and all subfolders are included in the list.
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO and verify that every job step completed successfully.
In the ListFolder step, select the Log button to see the diagnostic information.
CloudBees CD/RO stores the list of all the objects in the folder in the property sheet (the default location is /myJob/S3Output).
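The recursive/non-recursive switch maps naturally onto the S3 list delimiter: with a "/" delimiter, objects in subfolders are rolled up into common prefixes instead of being listed individually. A hypothetical sketch with the AWS SDK for Java v1 (pagination omitted for brevity):

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.ListObjectsRequest;
import com.amazonaws.services.s3.model.ObjectListing;
import com.amazonaws.services.s3.model.S3ObjectSummary;

public class ListFolderExample {
    // Lists keys under prefix. With recursive=false, a "/" delimiter is
    // set, so subfolders come back as common prefixes rather than as
    // individual objects.
    static void listFolder(AmazonS3 s3, String bucket, String prefix, boolean recursive) {
        ListObjectsRequest request = new ListObjectsRequest()
                .withBucketName(bucket)
                .withPrefix(prefix);
        if (!recursive) {
            request.setDelimiter("/");
        }
        ObjectListing listing = s3.listObjects(request);
        for (S3ObjectSummary summary : listing.getObjectSummaries()) {
            System.out.println(summary.getKey());
        }
        for (String subfolder : listing.getCommonPrefixes()) {
            System.out.println(subfolder + " (subfolder)");
        }
    }
}
```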
UploadFolder
This procedure uploads the specified local filesystem folder to the Amazon S3 service.
Input
- Go to the UploadFolder procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket to which the folder is uploaded. (Required)
Key | The key prefix of the virtual directory to which the folder is uploaded. Keep this field empty to upload files to the root of the bucket.
Folder to Upload | Name of the folder to upload.
Make the object public | If selected, the uploaded objects are publicly accessible.
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO and verify that every job step completed successfully.
In the UploadFolder step, select the Log button to see the diagnostic information.
After a folder is successfully uploaded, CloudBees CD/RO stores the key names and AWS access URLs for the objects in this folder in the property sheet (the default location is /myJob/S3Output).
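A whole-directory upload like this is commonly done with the SDK's TransferManager. A hypothetical sketch with the AWS SDK for Java v1; applying the "Make the object public" option is omitted here (see the UploadObject sketch below for a public ACL):

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.transfer.MultipleFileUpload;
import com.amazonaws.services.s3.transfer.TransferManager;
import com.amazonaws.services.s3.transfer.TransferManagerBuilder;
import java.io.File;

public class UploadFolderExample {
    // Uploads a local directory tree; keyPrefix may be empty to upload
    // to the root of the bucket.
    static void uploadFolder(AmazonS3 s3, String bucket, String keyPrefix, File folder)
            throws InterruptedException {
        TransferManager tm = TransferManagerBuilder.standard().withS3Client(s3).build();
        try {
            MultipleFileUpload upload =
                    tm.uploadDirectory(bucket, keyPrefix, folder, true /* include subdirectories */);
            upload.waitForCompletion();
        } finally {
            tm.shutdownNow(false); // keep the underlying client open
        }
    }
}
```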
UploadObject
This procedure uploads the specified local file to the Amazon S3 service.
Input
- Go to the UploadObject procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration. (Required)
Bucket Name | Name of the bucket to which to upload the object. (Required)
Key | Key of the object to upload. This value is used as the key for the uploaded object. (Required)
File to Upload | Path of the file to upload.
Make the object public | If selected, the uploaded object is publicly accessible.
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO and verify that every job step completed successfully.
In the UploadObject step, select the Log button to see the diagnostic information.
After an object is successfully uploaded, CloudBees CD/RO stores the key name and AWS link to the object in the property sheet (the default location is /myJob/S3Output).
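A hypothetical sketch of the upload with the AWS SDK for Java v1; the `makePublic` flag mirrors the "Make the object public" checkbox by applying a public-read canned ACL:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.CannedAccessControlList;
import com.amazonaws.services.s3.model.PutObjectRequest;
import java.io.File;

public class UploadObjectExample {
    // Uploads one file under the given key; when makePublic is set,
    // a public-read canned ACL is applied to the object.
    static void uploadObject(AmazonS3 s3, String bucket, String key, File file, boolean makePublic) {
        PutObjectRequest request = new PutObjectRequest(bucket, key, file);
        if (makePublic) {
            request.setCannedAcl(CannedAccessControlList.PublicRead);
        }
        s3.putObject(request);
    }
}
```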
WebsiteHosting
You can use Amazon Simple Storage Service (S3) to host a website that uses client-side technologies (such as HTML, CSS, and JavaScript) and does not require server-side technologies (such as PHP and ASP.NET). This is called a static website and is used to display content that does not change frequently.
To host your static website, use this procedure to configure an Amazon S3 bucket for website hosting. It is then available at the region-specific website endpoint of the bucket: _<bucket-name>.s3-website-<AWS-region>.amazonaws.com_
Input
- Go to the WebsiteHosting procedure.
- Enter the following parameters:
Parameter | Description
---|---
Configuration | The name of the configuration that holds all the connection information. This must reference a valid existing configuration.
Bucket Name | Name of the bucket to create.
Enable website hosting | If selected, the bucket is enabled for static website hosting, and all your content becomes accessible to web browsers through the Amazon S3 website endpoint for your bucket.
Index Document | Name of the index document.
Error Document | Name of the error document.
Output
After the job runs, you can view the results on the Job Details page in CloudBees CD/RO and verify that every job step completed successfully.
In the WebsiteHosting step, select the Log button to see the diagnostic information.
After the bucket is successfully configured for static website hosting, CloudBees CD/RO stores the bucket name as a key and the Amazon S3 website endpoint for your bucket as a value in the property sheet (the default location is /myJob/S3Output).
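In SDK terms, enabling website hosting is a single bucket-configuration call. A hypothetical sketch with the AWS SDK for Java v1:

```java
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.BucketWebsiteConfiguration;

public class WebsiteHostingExample {
    // Enables static website hosting on the bucket with the given index
    // and error documents, e.g. "index.html" and "error.html".
    static void enableWebsiteHosting(AmazonS3 s3, String bucket,
                                     String indexDocument, String errorDocument) {
        s3.setBucketWebsiteConfiguration(bucket,
                new BucketWebsiteConfiguration(indexDocument, errorDocument));
    }
}
```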
Examples and use cases
Use case 1: static website hosting
One common use case for this plugin is hosting a publicly accessible website. To achieve this, create a bucket on S3 and then upload the website contents to it. To do this, you must:
- Create a plugin configuration.
- Create a bucket on S3.
- Upload the contents of the folder to the bucket.
- Configure the bucket for website hosting.
Create a plugin configuration
In CloudBees CD/RO, open the Plugin Manager. Then select Configure and enter the values for the parameters on the S3 Configuration page.
After the configuration is created, you can see it in "S3 Configurations".
Create a bucket on S3
Go to the CreateBucket procedure and enter the values in the parameter fields.

This procedure calls the CreateBucket procedure to create a new bucket named ecwebsitehosting.
Upload the contents to the S3 bucket
Go to the UploadFolder procedure and enter the values in the parameter fields.

This procedure calls the UploadFolder procedure to upload the contents of the C:\Electric Cloud\electricCloud\Website directory to the ecwebsitehosting bucket.
Release notes
EC-S3 1.2.0
- Added support for a new plugin configuration.
- Upgraded from Perl 5.8 to Perl 5.32.
- Starting with EC-S3 1.2.0, CloudBees CD/RO agents running v10.3 and later are required to run plugin procedures.
- Removed CGI scripts.
EC-S3 1.1.4
- Fixed the following Java error: java.lang.NoClassDefFoundError: javax/xml/bind/DatatypeConverter.
EC-S3 1.0.0
- Added support to create new buckets and folders in buckets.
- Added support to clean the bucket contents.
- Added support to delete specific objects in a bucket or folder.
- Added support to upload or download objects or the entire content of a bucket or folder.
- Added support to list buckets and folders.