Summary
Creates a new archiveConnector object.

| Argument | Type | Description |
| --- | --- | --- |
| archiveConnectorName | String (required) | Unique name of the archive connector. |
| archiveDataFormat | String (optional) | The data format in which the connector consumes the archived data. Possible values: "JSON", "XML". |
| archiveScript | String (optional) | Script registered to connect to the archive system and store the data being archived. |
| description | String (optional) | Comment text describing this object; it is not interpreted by CloudBees CD/RO. |
| enabled | Boolean (optional) | Whether the connector is enabled. If true, any previously enabled archive connector is disabled. |
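The same object can also be created through DSL. A minimal sketch, assuming the standard `archiveConnector` DSL entity; the connector name and description text are placeholders:

```groovy
// Hypothetical example: create an archive connector via DSL, setting
// the optional fields from the table above.
archiveConnector 'MyArchiveConnector', {
    archiveDataFormat = 'JSON'       // "JSON" or "XML"
    description = 'Example archive connector'
    enabled = false                  // setting true disables any other enabled connector
}
```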
Usage
Perl
$cmdr->createArchiveConnector(
    "test-archiveConnectorName"   # archiveConnectorName
    # optionals
);
ectool
ectool createArchiveConnector \
    "test-archiveConnectorName" `# archiveConnectorName` \
    # optionals
Examples
ectool
ectool createArchiveConnector "MyArchiveConnector" --archiveScript 'July2019Archive'
Archive scripts
CloudBees CD/RO provides DSL for two archive connectors. Use these as starting points and customize them to your own requirements. When you are ready to implement your connector, save the DSL script to a file (MyArchiveConnector.dsl is used below) and run the following from the command line:
ectool evalDsl --dslFile MyArchiveConnector.dsl
File archive connector
This connector writes data to an absolute archive directory in your file system. You can use this script or customize it with your own logic. For example, you can customize it to store data in subdirectories by month or year.
If you customize the logic, update the example DSL and apply it to the CloudBees CD/RO server with the following command, where fileConnector.dsl is the name of your customized DSL script:
ectool evalDsl --dslFile fileConnector.dsl
Now, enable it with the following command:
ectool modifyArchiveConnector "File Archive Connector" --actualParameter archiveDirectory="C:/archive" --enabled true
This is the archive connector source:
archiveConnector 'File Archive Connector', {
    enabled = true
    archiveDataFormat = 'JSON'

    // Arguments available to the archive script:
    // 1. args.entityName: Entity being archived, e.g., release, job, flowRuntime
    // 2. args.archiveObjectType: Object type defined in the data retention policy,
    //    e.g., release, job, deployment, pipelineRun
    // 3. args.entityUUID: UUID of the entity being archived
    // 4. args.serializedEntity: Serialized form of the entity data to be archived,
    //    based on the configured archiveDataFormat
    // 5. args.archiveDataFormat: Data format for the serialized data to be archived
    //
    // The archive script must return a boolean value:
    // true  - the data was archived
    // false - the data was not archived
    archiveScript = '''
        def archiveDirectory = 'SET_ABSOLUTE_PATH_TO_ARCHIVE_DIRECTORY_LOCATION_HERE'
        def dir = new File(archiveDirectory, args.entityName)
        dir.mkdirs()
        File file = new File(dir, "${args.entityName}-${args.entityUUID}.json")
        // Connectors can choose to handle duplicates if they need to.
        // This connector implementation does not process a record if the
        // corresponding file already exists.
        if (file.exists()) {
            return false
        } else {
            file << args.serializedEntity
            return true
        }'''
}
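As an example of the customization mentioned above, the directory computation inside archiveScript could be extended to group archives into year/month subdirectories. A sketch of the changed lines only, reusing the names from the connector above; the `yyyy/MM` date layout is an assumption:

```groovy
// Sketch: store archives under <archiveDirectory>/<entityName>/<yyyy>/<MM>/
// instead of a single flat directory per entity.
def archiveDirectory = 'SET_ABSOLUTE_PATH_TO_ARCHIVE_DIRECTORY_LOCATION_HERE'
def monthPath = new Date().format('yyyy/MM')
def dir = new File(archiveDirectory, "${args.entityName}/${monthPath}")
dir.mkdirs()
File file = new File(dir, "${args.entityName}-${args.entityUUID}.json")
```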
DevOps Insight archive connector
This connector configures archiving to the CloudBees Analytics server.
If you customize the logic, update the example DSL and apply it to the CloudBees CD/RO server with the following command, where fileConnector.dsl is the name of your customized DSL script:
ectool evalDsl --dslFile fileConnector.dsl
Enable it with the following command:
ectool modifyArchiveConnector "DevOps Insight Server Connector" --enabled true
Apply the DSL script below to create a report object type for each object that can be archived.
// Create the report objects for the archived data before creating the
// archive connector for the CloudBees Analytics server connector.
reportObjectType 'archived-release', displayName: 'Archived Release'
reportObjectType 'archived-job', displayName: 'Archived Job'
reportObjectType 'archived-deployment', displayName: 'Archived Deployment'
reportObjectType 'archived-pipelinerun', displayName: 'Archived Pipeline Run'
This DSL script creates the following report object types:
- archived-release
- archived-job
- archived-deployment
- archived-pipelinerun
archiveConnector 'DevOps Insight Server Connector', {
    // The archive connector is disabled out-of-the-box.
    enabled = true
    archiveDataFormat = 'JSON'

    // Arguments available to the archive script:
    // 1. args.entityName: Entity being archived, e.g., release, job, flowRuntime
    // 2. args.archiveObjectType: Object type defined in the data retention policy,
    //    e.g., release, job, deployment, pipelineRun
    // 3. args.entityUUID: UUID of the entity being archived
    // 4. args.serializedEntity: Serialized form of the entity data to be archived,
    //    based on the configured archiveDataFormat
    // 5. args.archiveDataFormat: Data format for the serialized data to be archived
    //
    // The archive script must return a boolean value:
    // true  - the data was archived
    // false - the data was not archived
    archiveScript = '''
        def reportObjectName = "archived-${args.archiveObjectType.toLowerCase()}"
        def payload = args.serializedEntity
        // If de-duplication is needed, add a documentId to the payload
        // (args.entityUUID -> documentId). This connector implementation does
        // not de-duplicate. Documents in DOIS may be resolved upon retrieval
        // based on archival date or other custom logic.
        sendReportingData reportObjectTypeName: reportObjectName, payload: payload
        return true
    '''
}
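If you do want the de-duplication described in the script comments, one approach is to inject the entity UUID into the payload before sending it. A sketch of the archive script body only, assuming archiveDataFormat is JSON and that the Analytics server resolves repeated sends on a documentId field, as the comments suggest:

```groovy
// Sketch: add args.entityUUID as documentId so repeated sends of the same
// entity resolve to a single document. Assumes the payload is a JSON object.
import groovy.json.JsonOutput
import groovy.json.JsonSlurper

def reportObjectName = "archived-${args.archiveObjectType.toLowerCase()}"
def doc = new JsonSlurper().parseText(args.serializedEntity)
doc.documentId = args.entityUUID
def payload = JsonOutput.toJson(doc)
sendReportingData reportObjectTypeName: reportObjectName, payload: payload
return true
```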