This solution is command-line based; the same result can be achieved using Oracle Enterprise Manager.
Complete the following steps before downtime is scheduled so that the target database is ready for import.
Make sure the schema and user are the same between the source and target databases (the password can be different).
Make sure the tablespaces in the target database are defined the same way as in the source database. Otherwise, manual mapping will be needed during import.
The following command lists the tablespaces accessible to the user you are logged in as in your current instance. Run the same command in the new instance to confirm the tablespaces are alike.
SQL> SELECT TABLESPACE_NAME, STATUS, CONTENTS FROM USER_TABLESPACES;
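Once you have the listing from each instance, any tablespace present on the source but missing on the target must be created (or mapped) before import. A minimal sketch of that comparison, assuming you have spooled each query result to a list of names (the tablespace names below are illustrative only):

```python
# Sketch: diff the tablespace listings from the source and target instances.
# The two lists stand in for the spooled output of the USER_TABLESPACES
# query run on each instance; the names are placeholders.

def missing_tablespaces(source, target):
    """Return tablespaces present on the source but absent on the target."""
    return sorted(set(source) - set(target))

source_ts = ["ECDATA", "ECINDEX", "USERS"]  # from the source instance
target_ts = ["ECDATA", "USERS"]             # from the target instance

# Any name printed here must be created on the target before import.
print(missing_tablespaces(source_ts, target_ts))  # -> ['ECINDEX']
```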
You may need to follow the instructions in this link to create your tablespaces.
To check free space per tablespace, the following command gives an approximate result.
SQL> SELECT TABLESPACE_NAME, SUM(BYTES)/1024/1024 "FREE SPACE(MB)" FROM DBA_FREE_SPACE GROUP BY TABLESPACE_NAME;
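The query above sums free-extent bytes per tablespace and converts the totals to megabytes. The same aggregation, sketched in Python with illustrative rows, shows what the `SUM ... GROUP BY` is doing:

```python
# Sketch: replicate the DBA_FREE_SPACE aggregation above.
# rows: (tablespace_name, bytes) pairs, as DBA_FREE_SPACE would return
# one row per free extent; the values below are placeholders.
from collections import defaultdict

def free_space_mb(rows):
    """Sum bytes per tablespace and convert to MB (the SUM/GROUP BY step)."""
    totals = defaultdict(int)
    for name, nbytes in rows:
        totals[name] += nbytes
    return {name: total / 1024 / 1024 for name, total in totals.items()}

rows = [("ECDATA", 52428800), ("ECDATA", 10485760), ("ECINDEX", 20971520)]
print(free_space_mb(rows))  # -> {'ECDATA': 60.0, 'ECINDEX': 20.0}
```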
Once these have been confirmed, you will have to schedule some downtime for your Production Commander server.
The following steps should be taken only after the target database has been set up as described in the pre-export steps. The Commander server must be stopped before you begin.
Creating database directories
Oracle introduced a default directory in 10g Release 2, called DATA_PUMP_DIR, that can be used:
SQL> SELECT directory_path FROM dba_directories WHERE directory_name = 'DATA_PUMP_DIR'; > (returns the directory path on the database server)
Otherwise, follow the steps below to create the dump directory.
Execute the following command to create a database directory where the database dump will be written. This directory must point to a valid directory on the same server as the database.
SQL> CREATE DIRECTORY dmpdir AS '/opt/oracle'; > Directory created.
Grant read and write permissions on this directory to the schema user (replace <username> with your schema user):
SQL> GRANT read, write ON DIRECTORY dmpdir TO <username>; > Grant succeeded.
Export and Import of the Oracle schema
Run the SELECT statement below to capture the current system change number (SCN), which marks a snapshot of the database at that point. Data written to the database after this snapshot will not be included in the dump file.
SQL> select dbms_flashback.get_system_change_number from dual;
If the above command does not work, try the following command to get the SCN:
SQL> select current_scn from v$database;
Record the SCN; you will need it for the export command.
Template of the export command (expdp is run from the operating-system shell, not from SQL*Plus):
$ expdp userid/pwd schemas=dbschema DIRECTORY=DATA_PUMP_DIR dumpfile=filename.dmp VERSION=11.2 logfile=file.log FLASHBACK_SCN=SCNNumber compression=all
Assuming the right tablespaces exist (ecdata, ecindex), you can import the dump with:
$ impdp userid/pwd schemas=dbschema DIRECTORY=DATA_PUMP_DIR dumpfile=filename.dmp VERSION=11.2 LOGFILE=file.log
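The export and import templates share several values (schema, dump file name, VERSION), and a typo in any of them will break the round trip. A small sketch, with entirely hypothetical credentials and names, that builds both command lines from a single parameter set so those shared values stay consistent:

```python
# Sketch: build the expdp/impdp command lines from one parameter set so the
# dump file name and VERSION are guaranteed to match between export and
# import. All values passed in below are placeholders, not real credentials.

def build_commands(user, pwd, schema, dumpfile, scn, version="11.2",
                   directory="DATA_PUMP_DIR"):
    common = (f"{user}/{pwd} schemas={schema} "
              f"DIRECTORY={directory} dumpfile={dumpfile}")
    expdp = (f"expdp {common} VERSION={version} logfile=export.log "
             f"FLASHBACK_SCN={scn} compression=all")
    impdp = f"impdp {common} VERSION={version} LOGFILE=import.log"
    return expdp, impdp

exp_cmd, imp_cmd = build_commands("commander", "secret", "commander",
                                  "commander.dmp", scn=1234567)
print(exp_cmd)
print(imp_cmd)
```

Generating both strings together is a simple guard against the easiest mistake in this procedure: exporting with one VERSION or dump file name and importing with another.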
Once this process is complete, you have migrated the data to the new instance. To point your Commander server to the newly created Oracle database, follow the instructions in "Restore a CloudBees CD/RO server: Keep the same CloudBees CD/RO server but switch the database".