Oracle Data Pump - expdp, impdp

Oracle Data Pump is a fast data movement utility provided by Oracle. It replaces the legacy export (exp) and import (imp) utilities, and was built from scratch with a completely different architecture.

  • Create Data Pump Directory

  • Table Export and Import

  • Schema Export and Import

  • Rows Export and Import

  • Full Database Export and Import

Advanced Topics

Create Data Pump Directory

The first step in using Oracle Data Pump is to create an OS-level directory that Oracle will use for exports and imports. Create the directory at the OS level

mkdir -p /u02/dp_exp_dir

Create directory inside the database

SQL> create directory datapump as '/u02/dp_exp_dir';

Grant permissions on directory

SQL> grant read,write on directory datapump to scott; 

View directory information

SQL> select * from dba_directories;
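To confirm which users have been granted access to the directory, you can also query dba_tab_privs; object grants on directories are recorded there under the directory name:

SQL> select grantee, privilege from dba_tab_privs where table_name = 'DATAPUMP';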

Get help on expdp or impdp utility

expdp help=y
impdp help=y

Table Export and Import

Take table level export

$ expdp directory=datapump dumpfile=emp_bkp.dmp logfile=emp_bkp.log tables='SCOTT.EMP'

Import a table where source and target schemas are the same

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP'
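By default, Data Pump skips a table that already exists on the target. The table_exists_action parameter controls this behavior; for example, to append the imported rows to the existing table (a sketch using the same dump file as above):

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP' table_exists_action=append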

Import table to another schema

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP' remap_schema='SCOTT:HR'

Import tables into another tablespace (remap_tablespace is available only in Data Pump)

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP' remap_schema='SCOTT:HR' remap_tablespace='USERS:MYTBS'

Import a table under a different name (remap_table)

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP' remap_table='SCOTT.EMP:HR.EMPLOYEE'

Import only the rows from an exported table, without loading any table definitions

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP' content=DATA_ONLY
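The converse also exists: content=METADATA_ONLY loads only the table definitions, without any rows. A sketch using the same dump file:

$ impdp directory=datapump dumpfile=emp_bkp.dmp logfile=imp_emp.log tables='EMP' content=METADATA_ONLY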

Schema Export and Import

Take schema level export

$ expdp directory=datapump dumpfile=scott_bkp.dmp logfile=scott_bkp.log schemas='SCOTT'

Import source schema objects into same schema on target

$ impdp directory=datapump dumpfile=scott_bkp.dmp logfile=imp_schema.log remap_schema='SCOTT:SCOTT'

Import source schema objects into a different schema on target

$ impdp directory=datapump dumpfile=scott_bkp.dmp logfile=imp_schema.log remap_schema='SCOTT:HR'

Rows Export and Import

Take row level export

$ expdp directory=datapump dumpfile=emprows_bkp.dmp logfile=emprows_bkp.log tables='SCOTT.EMP' query=\"where deptno=10\"
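Escaping the QUERY clause on the shell command line is error-prone; placing it in a PAR file avoids the backslashes. A minimal sketch (the file name exp_rows.par is just an example):

vi exp_rows.par

directory=datapump
dumpfile=emprows_bkp.dmp
logfile=emprows_bkp.log
tables=scott.emp
query=scott.emp:"WHERE deptno=10"

$ expdp scott/tiger parfile=exp_rows.par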

Import rows where source and target schemas are the same

$ impdp directory=datapump dumpfile=emprows_bkp.dmp logfile=imp_emprows.log tables='SCOTT.EMP'

Full Database Export and Import

Take database level export

$ expdp directory=datapump dumpfile=fullprod.dmp logfile=fullprod.log full=y
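Full exports can be very large. You can split the dump into multiple files of a fixed size with the filesize parameter together with the %U substitution variable (a sketch; 5g is an arbitrary choice):

$ expdp directory=datapump dumpfile=fullprod_%U.dmp filesize=5g logfile=fullprod.log full=y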

Import full database

-- On source
SQL> select name from v$tablespace;

-- On target
SQL> select name from v$tablespace;

-- Create missing tablespaces on target
-- Make sure target tablespaces have enough free space
-- Drop all non-Oracle schemas (done during refresh)
-- DROP USER <username> CASCADE;

$ impdp directory=datapump dumpfile=fullprod.dmp logfile=imp_fullprod.log full=y

Data Pump Import Over Network

When you move large tables or schemas between two Oracle databases, a Data Pump export can consume a lot of disk space for the dump files. The best workaround is to combine Data Pump with a database link, so data moves directly from one Oracle database to the other.

Note: when you use Data Pump with a database link, no dump files are created on the source. The data is transferred from one database to the other over the network.

Add source database TNS entry into tnsnames.ora of the target database

devdb =
  (DESCRIPTION =
    (ADDRESS_LIST =
      (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.0)(PORT = 1521))
    )
    (CONNECT_DATA =
      (SERVICE_NAME = devdb)
    )
  )

On target, we need to create a database link using the TNS entry created above

create database link SOURCE_DB connect to scott identified by tiger using 'devdb';
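Before starting the import, it is worth verifying that the link actually works:

SQL> select sysdate from dual@SOURCE_DB;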

Now import the source schema into the target database via the database link. Run the command below on the target database to start the import

SQL> create or replace directory MY_DUMP_DIR as '/u01/dump_files';

$ impdp directory=MY_DUMP_DIR logfile=dblink_transfer.log network_link=SOURCE_DB remap_schema=scott:hr

To import multiple schemas, make sure to use a DBA user (such as SYS) to perform this action

impdp sys directory=MY_DUMP_DIR LOGFILE=dblink_transfer.log network_link=SOURCE_DB schemas=IJS,scott,hr
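If the source database is busy while the transfer runs, you may want a read-consistent copy. The flashback_time parameter can be combined with network_link for this; a sketch, assuming a Data Pump version that accepts the SYSTIMESTAMP shorthand:

impdp sys directory=MY_DUMP_DIR logfile=dblink_transfer.log network_link=SOURCE_DB schemas=scott flashback_time=systimestamp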

Data Pump Performance Tuning

Data Pump uses direct path loads and unloads automatically whenever possible; the DIRECT parameter belongs to the legacy exp/imp utilities and has no effect on expdp/impdp. Use the PARALLEL parameter to start multiple worker processes for faster exports and imports.

Make sure to use %U in the dump file name so multiple dump files can be read and written simultaneously

$ expdp directory=DUMP_DIR dumpfile=SCOTT_%U.dmp logfile=expdp_SCOTT.log schemas=SCOTT parallel=4
$ impdp directory=DUMP_DIR dumpfile=SCOTT_%U.dmp logfile=impdp_SCOTT.log schemas=SCOTT parallel=4
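Exporting optimizer statistics often adds noticeable time, and statistics can simply be regathered on the target; excluding them is a common tuning step:

$ expdp directory=DUMP_DIR dumpfile=SCOTT_%U.dmp logfile=expdp_SCOTT.log schemas=SCOTT parallel=4 exclude=statistics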

EXPDP PAR File Example

Data Pump jobs can be automated using a PAR file. You create one par file containing all the export or import parameters, then simply pass it to the expdp utility.

vi exp.par

tables=scott.emp
directory=EXP_DIR
dumpfile=QUERY_EXP_%U.dmp
logfile=QUERY_EXP.log
parallel=7

Calling the export with the PAR file is very simple; pass the credentials on the command line:

$ expdp scott/tiger parfile=exp.par

You can specify any extension but it is recommended to use .par
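An import PAR file works the same way. A minimal counterpart to the export above (imp.par and the log file name are just example names):

vi imp.par

tables=scott.emp
directory=EXP_DIR
dumpfile=QUERY_EXP_%U.dmp
logfile=QUERY_EXP_imp.log
parallel=7

$ impdp scott/tiger parfile=imp.par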

Schedule Data Pump Export in crontab

Create a file that contains the expdp script

vi daily_export.sh

export DATE=$(date +%m_%d_%y_%H_%M)
export ORACLE_SID=orcl
export ORACLE_HOME=/u01/app/oracle/product/11.2.0/db_1
$ORACLE_HOME/bin/expdp username/password@sid directory=export_dir dumpfile=backup_$DATE.dmp logfile=backup_$DATE.log full=y

Give execute permission on the above file

chmod 755 daily_export.sh

Schedule export under crontab

crontab -e -u oracle

00 20 * * * /home/oracle/daily_export.sh
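Dump files accumulate over time, so a retention cleanup can be added at the end of daily_export.sh. A sketch, assuming the export_dir directory object points to /u02/dp_exp_dir and a 7-day retention:

find /u02/dp_exp_dir -name "backup_*.dmp" -mtime +7 -delete
find /u02/dp_exp_dir -name "backup_*.log" -mtime +7 -delete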

Data Pump Export Progress %

When you run a Data Pump export in the background and want to know its progress, use the query below to get the percentage (%) completion of the export process

SELECT sid, serial#, username, context, sofar, totalwork,
       ROUND(sofar/totalwork*100, 2) "%_COMPLETE"
FROM   v$session_longops
WHERE  totalwork != 0
AND    sofar <> totalwork;
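The state of the job itself can also be checked from dba_datapump_jobs:

SQL> select owner_name, job_name, operation, state from dba_datapump_jobs;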

DBMS_DATAPUMP Export/Import From SQL*Plus

You can invoke a Data Pump export or import from inside SQL*Plus even if you do not have access to the OS-level expdp/impdp utilities. The DBMS_DATAPUMP PL/SQL API is the engine behind those utilities and provides an efficient way to move large amounts of data. Let's run an export and import using the DBMS_DATAPUMP PL/SQL API.

Export Schema Using DBMS_DATAPUMP

We will export the EMP table from the TESTUSER1 schema in the ORCL database and import it into TESTUSER1 in the PROD database. Create a directory for the dump and log files

mkdir -p /u01/dp

Create an Oracle directory object pointing to the physical location

create or replace directory test_dir as '/u01/dp';
grant read, write on directory test_dir to testuser1;

Let's export EMP table from testuser1 schema

declare
  l_dp_handle number;
begin
  -- Open a table export job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'EXPORT',
    job_mode    => 'TABLE',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EMP_EXPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1_EMP.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'expdpTESTUSER1_EMP.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Specify the table to be exported, filtering on schema and table name.
  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'SCHEMA_EXPR',
    value  => '= ''TESTUSER1''');

  dbms_datapump.metadata_filter(
    handle => l_dp_handle,
    name   => 'NAME_EXPR',
    value  => '= ''EMP''');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
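The detach call returns control immediately while the job continues in the background. If you would rather have the block wait until the export finishes, dbms_datapump.wait_for_job can be used instead of detach; a sketch of the two changed pieces:

-- in the declare section, add:
l_job_state varchar2(30);

-- and replace the dbms_datapump.detach call with:
dbms_datapump.wait_for_job(l_dp_handle, l_job_state);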

Let's review the log file to verify the export completed successfully.


Import Schema Using DBMS_DATAPUMP

Create an Oracle directory object pointing to the physical location on the target server

create or replace directory test_dir as '/u01/dp';
grant read, write on directory test_dir to testuser1;

Let's import the EMP table into the testuser1 schema, renaming it to EMP2

declare
  l_dp_handle number;
begin
  -- Open a table import job.
  l_dp_handle := dbms_datapump.open(
    operation   => 'IMPORT',
    job_mode    => 'TABLE',
    remote_link => NULL,
    job_name    => 'TESTUSER1_EMP_IMPORT',
    version     => 'LATEST');

  -- Specify the dump file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'TESTUSER1_EMP.dmp',
    directory => 'TEST_DIR');

  -- Specify the log file name and directory object name.
  dbms_datapump.add_file(
    handle    => l_dp_handle,
    filename  => 'impdpTESTUSER1_EMP_IMPORT.log',
    directory => 'TEST_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Perform a REMAP_TABLE from EMP to EMP2.
  dbms_datapump.metadata_remap(
    handle    => l_dp_handle,
    name      => 'REMAP_TABLE',
    old_value => 'EMP',
    value     => 'EMP2');

  dbms_datapump.start_job(l_dp_handle);

  dbms_datapump.detach(l_dp_handle);
end;
/
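If EMP2 already exists on the target, the behavior can be controlled by adding a set_parameter call before start_job; a sketch, where REPLACE drops and recreates the existing table:

-- Control what happens when the target table already exists.
dbms_datapump.set_parameter(
  handle => l_dp_handle,
  name   => 'TABLE_EXISTS_ACTION',
  value  => 'REPLACE');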

Note: As it is a table import, make sure the schema is already created.
