Hello,
In this blog post, I want to describe the necessary steps to import data into Autonomous Database using Oracle Data Pump. To migrate your existing Oracle Database schemas to Autonomous Database, you export them with Oracle Data Pump Export and then load them with Oracle Data Pump Import. Oracle Data Pump Export provides several export modes; Oracle recommends using schema mode for migrating to Autonomous Database. For a faster migration, export your schemas into multiple Data Pump files and use parallelism. You can specify the dump file name format with the dumpfile parameter, and set the parallel parameter to at least the number of CPUs in your Autonomous Database.
With encryption_pwd_prompt=yes, Oracle Data Pump Export prompts for an encryption password to encrypt the dump files. The following example exports the HR schema from a source Oracle Database for migration to an Autonomous Database with 4 CPUs:
expdp HR/HR@MYORCL exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link data_options=group_partition_table_data parallel=4 schemas=HR dumpfile=export%u.dmp encryption_pwd_prompt=yes
If you use the encryption_pwd_prompt=yes parameter during the export with expdp, also use encryption_pwd_prompt=yes with your import and enter the same password at the impdp prompt to decrypt the dump files.
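After the export completes, upload the dump files to a bucket in your Cloud Object Storage so your Autonomous Database can read them. As a minimal sketch, assuming the OCI CLI is installed and configured and the dump files are in the current directory (my_dumpfiles is the bucket used later in this post):
oci os object bulk-upload --bucket-name my_dumpfiles --src-dir . --include "export*.dmp"
You can also upload the files through the OCI Console or any other Object Storage client.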
1- Store your Cloud Object Storage credential using DBMS_CLOUD.CREATE_CREDENTIAL.
BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ADMIN',
    username        => 'talip…..',
    password        => 'Zgkz:2lE)zw:duc}K1>8'
  );
END;
/
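Before moving on, you can optionally confirm that the credential exists and can reach the bucket. The queries below use the USER_CREDENTIALS view and DBMS_CLOUD.LIST_OBJECTS with the bucket URL from this example:
SELECT credential_name, username, enabled FROM user_credentials;
SELECT object_name, bytes
FROM DBMS_CLOUD.LIST_OBJECTS('ADMIN', 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/');
The second query should list the export%u.dmp files you uploaded; if it fails, check the credential and the bucket URL before attempting the import.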
2- Connected as the ADMIN user, set the credential as the default credential for your Autonomous Database.
ALTER DATABASE PROPERTY SET DEFAULT_CREDENTIAL = 'ADMIN.ADMIN';
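To verify the setting, you can query the DATABASE_PROPERTIES view (an optional sanity check):
SELECT property_value FROM database_properties WHERE property_name = 'DEFAULT_CREDENTIAL';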
3- Run Data Pump Import with the dumpfile parameter set to the list of file URLs on your Cloud Object Storage, prefixed with the default_credential keyword.
impdp admin/********@dbmasteratp_high directory=data_pump_dir dumpfile=default_credential:https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/export%u.dmp parallel=4 encryption_pwd_prompt=yes partition_options=merge transform=segment_attributes:n exclude=index,cluster,indextype,materialized_view,materialized_view_log,materialized_zonemap,db_link logfile=import.log tables=TALIP
In this example:
my bucket name: my_dumpfiles
my namespace: frskjd0nbatp
my region: eu-frankfurt-1
my credential_name: ADMIN
For the best import performance, use the HIGH database service for your import connection and set the PARALLEL parameter to the number of OCPUs in your Autonomous Database.
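While the import is running, you can monitor its progress from another session. As a minimal sketch using the standard Data Pump dictionary view:
SELECT owner_name, job_name, operation, state FROM dba_datapump_jobs;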
The log files for Data Pump Import operations are stored in the directory you specify with the Data Pump impdp directory parameter. To view the log file, you need to move it to your Cloud Object Storage using the procedure DBMS_CLOUD.PUT_OBJECT:
BEGIN
  DBMS_CLOUD.PUT_OBJECT(
    credential_name => 'ADMIN',
    object_uri      => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/import.log',
    directory_name  => 'DATA_PUMP_DIR',
    file_name       => 'import.log');
END;
/
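If you are not sure of the exact log file name, you can list the contents of DATA_PUMP_DIR with DBMS_CLOUD.LIST_FILES (a quick check before copying):
SELECT object_name, bytes FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');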
If required, download the import log file from Cloud Object Storage.
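As a minimal sketch, assuming the OCI CLI is configured, the log file can then be downloaded from the bucket like this:
oci os object get --bucket-name my_dumpfiles --name import.log --file import.log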