
Data Insight

Hello,

In this blog post, I will describe the steps to export data from Autonomous Database to other Oracle databases. Oracle recommends using the latest Oracle Data Pump version for this.

First, we need to create a directory in Autonomous Database.

CREATE DIRECTORY creates the database directory object and also creates the file system directory if it does not already exist. If the file system directory already exists, CREATE DIRECTORY only creates the database directory object. For example, the following command creates the database directory object called mydir and also creates the file system directory mydir:

CREATE DIRECTORY mydir AS 'mydir';

You can also specify subdirectories:

CREATE DIRECTORY mydir AS 'veridata/mydir';

To add a directory, you must have the CREATE ANY DIRECTORY system privilege. The ADMIN user is granted the CREATE ANY DIRECTORY system privilege. After you create the MYDIR directory, use the following command to list all files:

SELECT * FROM DBMS_CLOUD.LIST_FILES('MYDIR');

After creating the directory, follow these steps to export your data:

1- Download the client connection credentials, as I explained in my previous blog post.

2- Add your Autonomous Database TNS alias to your tnsnames.ora file. For example:

myatp_high = (description= (retry_count=20)(retry_delay=3)(address=(protocol=tcps)(port=1522)(host=adb.eu-frankfurt-1.oraclecloud.com))(connect_data=(service_name=vjdijq7k4au1w9x_myatp_high.atp.oraclecloud.com))(security=(ssl_server_cert_dn="CN=adwc.eucom-central-1.oraclecloud.com,OU=Oracle BMCS FRANKFURT,O=Oracle Corporation,L=Redwood City,ST=California,C=US")))

3- Run Data Pump Export, setting the directory parameter, the dumpfile parameter, and a filesize of less than 5 GB:

expdp ADMIN/*******@myatp_high directory=data_pump_dir dumpfile=exp%U.dmp parallel=4 encryption_pwd_prompt=yes filesize=1G logfile=export.log

If you use the encryption_pwd_prompt=yes parameter during the export with expdp, then also use encryption_pwd_prompt=yes during the import and enter the same password at the impdp prompt to decrypt the dump files.

4- After the export finishes, you can see the generated dump files by running the following query:

SELECT * FROM DBMS_CLOUD.LIST_FILES('DATA_PUMP_DIR');

5- Now move the dump file set to your Cloud Object Store: upload the files from the Autonomous Database directory to your Object Store bucket.

6- Connect to your Autonomous Database.

7- Store your object store credentials using the procedure DBMS_CLOUD.CREATE_CREDENTIAL:

BEGIN
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ADMIN',
    username        => 'talip…..',
    password        => 'Zgkz:2lE)zw:duc}K1>8'
  );
END;
/

This operation stores the credentials in the database in an encrypted format. You can use any name for the credential. Note that this step is required only once, unless your object store credentials change; once stored, you can reuse the same credential name in later calls.
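If your object store credentials do change (for example, after rotating an auth token), one approach is to drop and recreate the credential under the same name with the documented DBMS_CLOUD procedures. The username and password below are placeholders, not values from this example:

BEGIN
  -- Remove the old credential, then recreate it with the new secret.
  DBMS_CLOUD.DROP_CREDENTIAL(credential_name => 'ADMIN');
  DBMS_CLOUD.CREATE_CREDENTIAL(
    credential_name => 'ADMIN',
    username        => 'your_cloud_username',   -- placeholder
    password        => 'your_new_auth_token'    -- placeholder
  );
END;
/

Because the credential name stays the same, any code that references it (such as the PUT_OBJECT calls below) keeps working unchanged.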

8- Move the dump files from the Autonomous Database to your Cloud Object Store by calling DBMS_CLOUD.PUT_OBJECT.
In my previous blog post, I explained how to set up a bucket to store Autonomous Database dump files.

BEGIN
  DBMS_CLOUD.PUT_OBJECT(credential_name => 'ADMIN',
    object_uri     => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp01.dmp',
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'exp01.dmp');
  DBMS_CLOUD.PUT_OBJECT(credential_name => 'ADMIN',
    object_uri     => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp02.dmp',
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'exp02.dmp');
  DBMS_CLOUD.PUT_OBJECT(credential_name => 'ADMIN',
    object_uri     => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp03.dmp',
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'exp03.dmp');
  DBMS_CLOUD.PUT_OBJECT(credential_name => 'ADMIN',
    object_uri     => 'https://objectstorage.eu-frankfurt-1.oraclecloud.com/n/frskjd0nbatp/b/my_dumpfiles/o/exp04.dmp',
    directory_name => 'DATA_PUMP_DIR',
    file_name      => 'exp04.dmp');
END;
/

In this example:
my bucket name: my_dumpfiles
my namespace: frskjd0nbatp
my region: eu-frankfurt-1
my credential_name: ADMIN (specified in step 7)
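The object_uri values in step 8 follow a fixed pattern built from exactly these pieces. A small sketch, using the values from this example, to show how the URI is composed:

```shell
# Compose a PUT_OBJECT object_uri from region, namespace, bucket, and file.
REGION="eu-frankfurt-1"
NAMESPACE="frskjd0nbatp"
BUCKET="my_dumpfiles"
FILE="exp01.dmp"

OBJECT_URI="https://objectstorage.${REGION}.oraclecloud.com/n/${NAMESPACE}/b/${BUCKET}/o/${FILE}"
echo "$OBJECT_URI"
```

Swapping in your own region, namespace, bucket, and file names gives the URIs for your environment.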

9- Now let's check the dump files in the bucket. Open the navigation menu. Under Core Infrastructure, click Object Storage.

10- Select a compartment from the Compartment list on the left side of the page. A list of existing buckets is displayed.

11- Click the bucket name to see the bucket details. Under Objects, you will see the uploaded dump files.

12- If required, download the dump files from Cloud Object Store and use Oracle Data Pump Import to import the dump file set to the target database.

Select the dump files and click Download.
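As an alternative to the console, the dump files can also be downloaded with the OCI command line interface. A sketch, assuming the CLI is installed and configured, with the bucket and file names from this example:

```shell
# Download one dump file from the bucket with the OCI CLI.
# Assumes `oci` has already been set up with valid credentials.
oci os object get \
    --bucket-name my_dumpfiles \
    --name exp01.dmp \
    --file exp01.dmp
```

Repeat the command (or script it over exp01.dmp through exp04.dmp) to fetch the whole dump file set.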

13- Run Data Pump Import to import the dump file set to the target database.
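For reference, the import invocation might look like the sketch below. The connect string, user, and password are placeholders, and the dump files are assumed to have been copied to the location mapped by data_pump_dir on the target database:

```shell
# Hypothetical import on the target database -- adjust the connect string,
# credentials, and directory to your environment. encryption_pwd_prompt=yes
# matches the export in step 3; enter the same password at the prompt.
impdp system/********@target_db directory=data_pump_dir \
      dumpfile=exp%U.dmp parallel=4 encryption_pwd_prompt=yes \
      logfile=import.log
```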
