EXPDP / IMPDP (Data Pump) Utility Help

The expdp and impdp utilities are Oracle's Data Pump replacements for the classic exp and imp tools. Here is an easy way to list all the options for Data Pump Export or Import: open a command window and type one of the following.

C:\>expdp help=y
or
C:\>impdp help=y

You will get the following information.

C:\>expdp help=y

Export: Release 11.1.0.6.0 - Production on Wednesday, 11 March, 2009 12:54:28

Copyright (c) 2003, 2007, Oracle.  All rights reserved.

The Data Pump export utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

Example: expdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Export runs by entering the 'expdp' command followed
by various parameters. To specify parameters, you use keywords:

Format:  expdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: expdp scott/tiger DUMPFILE=scott.dmp DIRECTORY=dmpdir SCHEMAS=scott
             or TABLES=(T1:P1,T1:P2), if T1 is partitioned table

USERID must be the first parameter on the command line.

Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
COMPRESSION           Reduce size of dumpfile contents where valid keyword
                      values are: ALL, (METADATA_ONLY), DATA_ONLY and NONE.
CONTENT               Specifies data to unload where the valid keyword
                      values are: (ALL), DATA_ONLY, and METADATA_ONLY.
DATA_OPTIONS          Data layer flags where the only valid value is:
                      XML_CLOBS-write XML datatype in CLOB format.
DIRECTORY             Directory object to be used for dumpfiles and logfiles.
DUMPFILE              List of destination dump files (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION            Encrypt part or all of the dump file where valid keyword
                      values are: ALL, DATA_ONLY, METADATA_ONLY,
                      ENCRYPTED_COLUMNS_ONLY, or NONE.
ENCRYPTION_ALGORITHM  Specify how encryption should be done where valid
                      keyword values are: (AES128), AES192, and AES256.
ENCRYPTION_MODE       Method of generating encryption key where valid keyword
                      values are: DUAL, PASSWORD, and (TRANSPARENT).
ENCRYPTION_PASSWORD   Password key for creating encrypted column data.
ESTIMATE              Calculate job estimates where the valid keyword
                      values are: (BLOCKS) and STATISTICS.
ESTIMATE_ONLY         Calculate job estimates without performing the export.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FILESIZE              Specify the size of each dumpfile in units of bytes.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Export entire database (N).
HELP                  Display Help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of export job to create.
LOGFILE               Log file name (export.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile (N).
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to export a subset of a table.
REMAP_DATA            Specify a data conversion function,
                      e.g. REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REUSE_DUMPFILES       Overwrite destination dump file if it exists (N).
SAMPLE                Percentage of data to be exported.
SCHEMAS               List of schemas to export (login schema).
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
TABLES                Identifies a list of tables to export - one schema only.
TABLESPACES           Identifies a list of tablespaces to export.
TRANSPORTABLE         Specify whether transportable method can be used where
                      valid keyword values are: ALWAYS, (NEVER).
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be unloaded.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.
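
To put a few of these keywords together, here is a sketch of a schema export (the schema, password, directory path, and file names below are only placeholders). Note that DIRECTORY must name an Oracle directory object, not an OS path, so a DBA would create and grant one first:

SQL> CREATE OR REPLACE DIRECTORY dmpdir AS 'C:\oracle\dumps';
SQL> GRANT READ, WRITE ON DIRECTORY dmpdir TO scott;

C:\>expdp scott/tiger DIRECTORY=dmpdir SCHEMAS=scott DUMPFILE=scott_%U.dmp LOGFILE=scott_exp.log PARALLEL=2 FILESIZE=2G EXCLUDE=STATISTICS

The %U in DUMPFILE is a substitution variable that generates numbered file names, which is what lets PARALLEL=2 write two files at once, while FILESIZE caps each piece at 2 GB.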

The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description
------------------------------------------------------------------------------
ADD_FILE              Add dumpfile to dumpfile set.
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
FILESIZE              Default filesize (bytes) for subsequent ADD_FILE commands.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
REUSE_DUMPFILES       Overwrite destination dump file if it exists (N).
START_JOB             Start/resume current job.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS[=interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
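
You reach this interactive mode by pressing Ctrl+C while the export is writing its log to the screen, or from another window with the ATTACH keyword. A hypothetical session (SYS_EXPORT_SCHEMA_01 is simply whatever name Data Pump gave the job) might look like:

C:\>expdp scott/tiger ATTACH=SYS_EXPORT_SCHEMA_01

Export> STATUS=60
Export> PARALLEL=4
Export> EXIT_CLIENT

Here STATUS=60 reports job progress every 60 seconds, PARALLEL=4 adds workers, and EXIT_CLIENT detaches while the job keeps running in the database; STOP_JOB and START_JOB would pause and resume it instead.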

------------------------------------------------------------------------------

C:\>impdp help=y

Import: Release 11.1.0.6.0 - Production on Wednesday, 11 March, 2009 12:57:46

Copyright (c) 2003, 2007, Oracle.  All rights reserved.

The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:

Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

USERID must be the first parameter on the command line.

Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
CONTENT               Specifies data to load where the valid keywords are:
                      (ALL), DATA_ONLY, and METADATA_ONLY.
DATA_OPTIONS          Data layer flags where the only valid value is:
                      SKIP_CONSTRAINT_ERRORS-constraint errors are not fatal.
DIRECTORY             Directory object to be used for dump, log, and sql files.
DUMPFILE              List of dumpfiles to import from (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ENCRYPTION_PASSWORD   Password key for accessing encrypted column data.
                      This parameter is not valid for network import jobs.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Import everything from source (Y).
HELP                  Display help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of import job to create.
LOGFILE               Log file name (import.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile.
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
PARTITION_OPTIONS     Specify how partitions should be transformed where the
                      valid keywords are: DEPARTITION, MERGE and (NONE).
QUERY                 Predicate clause used to import a subset of a table.
REMAP_DATA            Specify a data conversion function,
                      e.g. REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_DATAFILE        Redefine datafile references in all DDL statements.
REMAP_SCHEMA          Objects from one schema are loaded into another schema.
REMAP_TABLE           Table names are remapped to another table,
                      e.g. REMAP_TABLE=EMP.EMPNO:REMAPPKG.EMPNO.
REMAP_TABLESPACE      Tablespace objects are remapped to another tablespace.
REUSE_DATAFILES       Tablespace will be initialized if it already exists (N).
SCHEMAS               List of schemas to import.
SKIP_UNUSABLE_INDEXES Skip indexes that were set to the Index Unusable state.
SQLFILE               Write all the SQL DDL to a specified file.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
STREAMS_CONFIGURATION Enable the loading of Streams metadata.
TABLE_EXISTS_ACTION   Action to take if imported object already exists.
                      Valid keywords: (SKIP), APPEND, REPLACE and TRUNCATE.
TABLES                Identifies a list of tables to import.
TABLESPACES           Identifies a list of tablespaces to import.
TRANSFORM             Metadata transform to apply to applicable objects.
                      Valid transform keywords: SEGMENT_ATTRIBUTES, STORAGE,
                      OID, and PCTSPACE.
TRANSPORTABLE         Options for choosing transportable data movement.
                      Valid keywords: ALWAYS and (NEVER).
                      Only valid in NETWORK_LINK mode import operations.
TRANSPORT_DATAFILES   List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be loaded.
                      Only valid in NETWORK_LINK mode import operations.
VERSION               Version of objects to import where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.
                      Only valid for NETWORK_LINK and SQLFILE.
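
As with export, a few of these keywords cover the common cases. The sketch below (all names are again placeholders) loads the earlier dump into a different schema, replacing any tables that already exist, and then uses SQLFILE to write the DDL to a script instead of executing it:

C:\>impdp system/manager DIRECTORY=dmpdir DUMPFILE=scott_%U.dmp LOGFILE=scott_imp.log REMAP_SCHEMA=scott:scott_copy TABLE_EXISTS_ACTION=REPLACE

C:\>impdp system/manager DIRECTORY=dmpdir DUMPFILE=scott_%U.dmp SQLFILE=scott_ddl.sql

With SQLFILE nothing is loaded at all; the second command only extracts the CREATE statements into scott_ddl.sql for review.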

The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description (Default)
------------------------------------------------------------------------------
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
                      START_JOB=SKIP_CURRENT will start the job after skipping
                      any action which was in progress when job was stopped.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS[=interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
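
Finally, both utilities accept PARFILE, which keeps long command lines readable and avoids operating-system quoting problems with parameters like QUERY. A sketch, with made-up contents in a file called scott_exp.par:

DIRECTORY=dmpdir
DUMPFILE=scott.dmp
LOGFILE=scott_exp.log
SCHEMAS=scott
QUERY=emp:"WHERE deptno = 10"

C:\>expdp scott/tiger PARFILE=scott_exp.par

The QUERY line exports only the EMP rows in department 10; putting it in a parameter file sidesteps the shell escaping that the quotes would otherwise need.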
