Data Pump In Oracle 10g

Dump files are created on the server side, so we need to use the DIRECTORY parameter to specify the location of the dump file.


SQL> Create directory test_dir as '/opt/app/oracle/dump_files';

Directory created.

The directory /opt/app/oracle/dump_files must exist on the server. In order for a user to use the directory, you need to grant the user privileges on it:

SQL> grant read, write on directory test_dir to scott;

Data Pump Export Parameters
Following is a brief list of parameters that can be passed to a Data Pump export job. For the complete list, run expdp help=y.

DIRECTORY – Used to specify the location of the dump file, log file or SQL file.

DUMPFILE – Used to specify the name of the dump file. And yes, it replaces the FILE parameter of the traditional export utility.

FILESIZE – Used to specify the maximum size of each dump file. If the export exceeds this size, Data Pump continues writing into another file, provided you have given multiple filenames in the DUMPFILE parameter or used the %U wildcard in the dump file name.
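
For example, something like the following (reusing the SCOTT schema and TEST_DIR directory from the examples below; the file size and dump file name are just illustrative) splits the export into 100 MB pieces, with %U generating SCOTT_01.dmp, SCOTT_02.dmp and so on:

expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT_%U.dmp filesize=100M logfile=expdpSCOTT.log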

PARFILE – Used to provide the name and location of a parameter file which holds all the parameters required for the export job to run.
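
For instance, you could keep the export options in a plain text file, say exp_scott.par (the file name is just an example), containing:

schemas=SCOTT
directory=TEST_DIR
dumpfile=SCOTT.dmp
logfile=expdpSCOTT.log

and then run the job with:

expdp scott/tiger@db10g parfile=exp_scott.par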

LOGFILE and NOLOGFILE – You can specify the LOGFILE parameter to create a log file for the export job. If you don't specify LOGFILE, Oracle will still create a log file named export.log; subsequent export jobs will overwrite it as this is the default name. You can specify the NOLOGFILE parameter if you want to avoid creating a log file for your export job altogether.

CONTENT – Using the CONTENT parameter, you can filter what goes into the dump file. The CONTENT parameter can take 3 values:
ALL, DATA_ONLY, METADATA_ONLY
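
For example, a metadata-only export of the SCOTT schema (the dump and log file names here are just illustrative) could look like:

expdp scott/tiger@db10g schemas=SCOTT content=METADATA_ONLY directory=TEST_DIR dumpfile=SCOTT_META.dmp logfile=expdpSCOTT_META.log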

EXCLUDE and INCLUDE – Used for fine-grained filtering of the objects that go into the export dump file.

QUERY – Used to control which rows of a table go into the export dump file.
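
As a sketch, the following exports only the department 10 rows of the EMP table; depending on your shell you may need to escape the quotes or put the QUERY clause in a parameter file instead:

expdp scott/tiger@db10g tables=EMP query=EMP:"WHERE deptno = 10" directory=TEST_DIR dumpfile=EMP_DEPT10.dmp logfile=expdpEMP.log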

ESTIMATE – Used to estimate the disk space required for the export dump file (using BLOCKS or STATISTICS) in addition to doing the export.

ESTIMATE_ONLY – Used to only estimate the space requirement for the export job. This parameter will not export any data into a dump file.
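
For example, something like the following only reports how much space a SCOTT schema export would need, without writing a dump file (the log file name is arbitrary):

expdp scott/tiger@db10g schemas=SCOTT estimate_only=y estimate=blocks directory=TEST_DIR logfile=estimate.log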

NETWORK_LINK – Used to get the export dump from some other database onto the current database server. This parameter expects a database link. Example: expdp hr/hr DIRECTORY=test_dir NETWORK_LINK=finance@prod1 … Here finance is the remote database on the prod1 server from which you are exporting the data.

ATTACH – Used to connect to the currently running job.

JOB_NAME – Used to provide a custom name for the export job.

STATUS – Used to display the status of the job at a given interval. This parameter takes a numeric value in seconds.
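
As a rough sketch (the job name SCOTT_EXP is arbitrary), you can name a job and have its status printed every 60 seconds, then later re-attach to the running job from another session and check it interactively:

expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp job_name=SCOTT_EXP status=60

expdp scott/tiger@db10g attach=SCOTT_EXP
Export> status
Export> continue_client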

Data Pump Import Parameters

Some of the import job parameters are the same as in export, like PARFILE, DIRECTORY, DUMPFILE, LOGFILE, NOLOGFILE etc.

Some of the other parameters specific to import are:

SQLFILE – Used to extract the DDL from the export dump file. If you specify the SQLFILE parameter, only the DDL is extracted; no import takes place.
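
For example, the following writes the DDL contained in SCOTT.dmp to a script (the file name scott_ddl.sql is just an example) without importing anything:

impdp scott/tiger@db10g directory=TEST_DIR dumpfile=SCOTT.dmp sqlfile=scott_ddl.sql logfile=impdpSCOTT_ddl.log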

REUSE_DATAFILES – This parameter tells Data Pump whether it should reuse existing datafiles when creating tablespaces during the import. If REUSE_DATAFILES is set to Y, the import utility will write over your existing datafiles.

CONTENT – Using the CONTENT parameter, you can filter what gets imported from the dump file. The CONTENT parameter can take 3 values: ALL, DATA_ONLY, METADATA_ONLY

TABLE_EXISTS_ACTION – In case a table which the import job is trying to import from the export dump file already exists in the database, you can use this parameter to tell the import job what to do. The valid values are SKIP, APPEND, TRUNCATE and REPLACE.
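
For example, to reload the EMP and DEPT tables by truncating the existing rows first (a sketch reusing the SCOTT dump from the examples below):

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=SCOTT.dmp table_exists_action=TRUNCATE logfile=impdpSCOTT.log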

TRANSPORT_TABLESPACES – Used when you want to transport tablespaces from one database to another.

REMAP_SCHEMA – Used to move objects from one schema (user) to another. Replacement of the old FROMUSER/TOUSER parameters.
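
For example, to load the objects exported from SCOTT into another user (SCOTT_COPY is a hypothetical target schema, which may need to be created first):

impdp system/password@db10g directory=TEST_DIR dumpfile=SCOTT.dmp remap_schema=SCOTT:SCOTT_COPY logfile=impdpSCOTT_COPY.log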

REMAP_DATAFILE – Used to remap the datafile names, especially when transferring data from one platform to another, for example from Windows to UNIX.

REMAP_TABLESPACE – Sometimes you want the tablespace into which you are importing the data to be different from the source database tablespace. This parameter comes in handy for doing that.
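
As a sketch (USERS and SCOTT_TS are just illustrative source and target tablespace names), objects that lived in USERS at export time can be created in SCOTT_TS on import:

impdp scott/tiger@db10g directory=TEST_DIR dumpfile=SCOTT.dmp remap_tablespace=USERS:SCOTT_TS logfile=impdpSCOTT.log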

NETWORK_LINK – This parameter is used to import the data directly into the database from some other database without having to create a dump file.
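
For example, assuming prod1_link is a database link in the target database pointing at the source database, the EMP table can be pulled over the network with no dump file involved (the DIRECTORY is still needed for the log file):

impdp scott/tiger@db10g tables=EMP directory=TEST_DIR network_link=prod1_link logfile=impdpEMP_net.log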

TRANSFORM – In case you don't want to import the objects' storage attributes but just the content, you can use this parameter. The TRANSFORM parameter instructs the Data Pump import job to modify the DDL that creates the objects during the import. You can modify 2 basic attributes: SEGMENT_ATTRIBUTES and STORAGE.
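
For example, reusing the SCOTT dump, the following strips the segment attributes from the generated DDL so the objects pick up the target tablespace and storage defaults:

impdp scott/tiger@db10g directory=TEST_DIR dumpfile=SCOTT.dmp transform=SEGMENT_ATTRIBUTES:n logfile=impdpSCOTT.log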

FLASHBACK_TIME – Enables you to import data consistent as of the flashback time you specify in the import job (applicable to network-mode imports over a NETWORK_LINK). Replacement of the old CONSISTENT parameter of the traditional export utility.

Examples

Table exports/imports

expdp ds_repos/dsrepos@db10g tables=ACCOUNTS,DEPT directory=TEST_DIR dumpfile=DS_REPOS_TBS.dmp logfile=expdpDS_REPOS_TBS.log
impdp ds_repos/dsrepos@db10g tables=ACCOUNTS,DEPT directory=TEST_DIR dumpfile=DS_REPOS_TBS.dmp logfile=impdpDS_REPOS_TBS.log

The TABLE_EXISTS_ACTION=APPEND parameter allows data to be imported into existing tables.

Schema exports/imports

expdp ds_repos/dsrepos@db10g schemas=DS_REPOS directory=TEST_DIR dumpfile=DS_REPOS.dmp logfile=expdpDS_REPOS.log
impdp ds_repos/dsrepos@db10g schemas=DS_REPOS directory=TEST_DIR dumpfile=DS_REPOS.dmp logfile=impdpDS_REPOS.log

Database exports/imports

expdp system/password@db10g full=Y directory=TEST_DIR dumpfile=NESPETS.dmp logfile=expdpNESPETS.log
impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=NESPETS.dmp logfile=impNESPETS.log

Monitoring Data Pump Jobs


SQL> select owner_name , job_name , operation , job_mode, state , degree, attached_sessions from dba_datapump_jobs;

OWNER_NAME JOB_NAME                       OPERATION  JOB_MODE                       STATE                      DEGREE ATTACHED_SESSIONS
---------- ------------------------------ ---------- ------------------------------ ------------------------------ ---------- -----------------
SYSTEM     SYS_EXPORT_FULL_01             EXPORT     FULL                           EXECUTING                       1                 1

SQL> select owner_name, job_name, saddr from dba_datapump_sessions;

OWNER_NAME JOB_NAME                       SADDR
---------- ------------------------------ ----------------
SYSTEM     SYS_EXPORT_FULL_01             000000007F496CB8
SYSTEM     SYS_EXPORT_FULL_01             000000007F49ECE0
SYSTEM     SYS_EXPORT_FULL_01             000000007F48EC90

SQL> select username, opname, target_desc, sofar, totalwork, message from v$session_longops where target_desc='EXPORT';

USERNAME                       OPNAME               TARGET_DES      SOFAR  TOTALWORK MESSAGE
------------------------------ -------------------- ---------- ---------- ---------- --------------------------------------------------
SYSTEM                         SYS_EXPORT_FULL_01   EXPORT             34        150 SYS_EXPORT_FULL_01: EXPORT : 34 out of 150 MB done

Data Pump performance can be improved by using the PARALLEL parameter. This should be used in conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dump files to be created or read.

expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log

The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects.
When the INCLUDE parameter is used, only those objects specified by it will be included in the export.
When the EXCLUDE parameter is used all objects except those specified by it will be included in the export.

expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

expdp scott/tiger@db10g schemas=SCOTT exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
