Question:
I want to know how to use all of the impdp options. I am
familiar with all of the older imp directives, but I need to
understand how to use impdp commands.
Answer:
The impdp (Data Pump import) utility has superseded the
older imp utility. The base functionality remains much the
same, but impdp uses different syntax and adds several enhancements.
Here are some of the most common impdp functions:
How do I read and write files in import?
You need to create a directory object so that both expdp and impdp
can read and write files on the OS:

create directory dmp_dir as 'c:\Users\Don\Wm';

grant read, write on directory dmp_dir to wm;
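Once the directory object exists and the grant is in place, you reference it with the DIRECTORY parameter on the command line. Here is a minimal sketch, assuming a wm schema and a dump file named mydata.dmp that already sits in that directory:

impdp wm/password DIRECTORY=dmp_dir DUMPFILE=mydata.dmp LOGFILE=my_import.log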
How do I use a parameter file with impdp?

impdp ... PARFILE=my_imp.par
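The parameter file itself is just a plain-text file of keyword=value pairs, one per line. A hypothetical my_imp.par, reusing the dmp_dir directory and wm schema from the example above, might contain:

DIRECTORY=dmp_dir
DUMPFILE=mydata.dmp
LOGFILE=my_import.log
SCHEMAS=wm

A parameter file keeps long option lists readable and reusable across runs.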
How do I extract the table and index definitions from a database?

expdp ... CONTENT=metadata_only DUMPFILE=my_metadata

How do I display the table and index definitions from an expdp .dmp file?

impdp ... SQLFILE=myddl.sql DUMPFILE=my_metadata
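Putting the two steps together: the expdp command writes only the DDL metadata to a dump file, and the impdp command with SQLFILE spools that DDL into a readable script without loading anything into the database. A sketch, assuming the dmp_dir directory object from above:

expdp wm/password DIRECTORY=dmp_dir DUMPFILE=my_metadata.dmp CONTENT=metadata_only

impdp wm/password DIRECTORY=dmp_dir DUMPFILE=my_metadata.dmp SQLFILE=myddl.sql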
What is the equivalent of the "ignore=y" imp utility
command? I want to pre-create the tables and indexes.

impdp ... CONTENT=data_only
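When the tables are pre-created, you may also want to tell impdp what to do when it finds an existing table; the TABLE_EXISTS_ACTION parameter shown in the help output below controls this. A sketch, again assuming the dmp_dir directory and dump file used earlier:

impdp wm/password DIRECTORY=dmp_dir DUMPFILE=mydata.dmp CONTENT=data_only TABLE_EXISTS_ACTION=append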
How do I see all of the impdp commands?
impdp help=y
Import: Release 10.1.0.2.0 - Production on Saturday, 11 September, 2004 17:22

Copyright (c) 2003, Oracle. All rights reserved.

The Data Pump Import utility provides a mechanism for transferring data objects
between Oracle databases. The utility is invoked with the following command:

Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

You can control how Import runs by entering the 'impdp' command followed
by various parameters. To specify parameters, you use keywords:

Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp

USERID must be the first parameter on the command line.
Keyword               Description (Default)
------------------------------------------------------------------------------
ATTACH                Attach to existing job, e.g. ATTACH [=job name].
CONTENT               Specifies data to load where the valid keywords are:
                      (ALL), DATA_ONLY, and METADATA_ONLY.
DIRECTORY             Directory object to be used for dump, log, and sql files.
DUMPFILE              List of dumpfiles to import from (expdat.dmp),
                      e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
ESTIMATE              Calculate job estimates where the valid keywords are:
                      (BLOCKS) and STATISTICS.
EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
FLASHBACK_SCN         SCN used to set session snapshot back to.
FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
FULL                  Import everything from source (Y).
HELP                  Display help messages (N).
INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
JOB_NAME              Name of import job to create.
LOGFILE               Log file name (import.log).
NETWORK_LINK          Name of remote database link to the source system.
NOLOGFILE             Do not write logfile.
PARALLEL              Change the number of active workers for current job.
PARFILE               Specify parameter file.
QUERY                 Predicate clause used to import a subset of a table.
REMAP_DATAFILE        Redefine datafile references in all DDL statements.
REMAP_SCHEMA          Objects from one schema are loaded into another schema.
REMAP_TABLESPACE      Tablespace object are remapped to another tablespace.
REUSE_DATAFILES       Tablespace will be initialized if it already exists (N).
SCHEMAS               List of schemas to import.
SKIP_UNUSABLE_INDEXES Skip indexes that were set to the Index Unusable state.
SQLFILE               Write all the SQL DDL to a specified file.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
STREAMS_CONFIGURATION Enable the loading of Streams metadata.
TABLE_EXISTS_ACTION   Action to take if imported object already exists.
                      Valid keywords: (SKIP), APPEND, REPLACE and TRUNCATE.
TABLES                Identifies a list of tables to import.
TABLESPACES           Identifies a list of tablespaces to import.
TRANSFORM             Metadata transform to apply (Y/N) to specific objects.
                      Valid transform keywords: SEGMENT_ATTRIBUTES and STORAGE.
                      ex. TRANSFORM=SEGMENT_ATTRIBUTES:N:TABLE.
TRANSPORT_DATAFILES   List of datafiles to be imported by transportable mode.
TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
TRANSPORT_TABLESPACES List of tablespaces from which metadata will be loaded.
                      Only valid in NETWORK_LINK mode import operations.
VERSION               Version of objects to export where valid keywords are:
                      (COMPATIBLE), LATEST, or any valid database version.
                      Only valid for NETWORK_LINK and SQLFILE.
The following commands are valid while in interactive mode.
Note: abbreviations are allowed

Command               Description (Default)
------------------------------------------------------------------------------
CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
EXIT_CLIENT           Quit client session and leave job running.
HELP                  Summarize interactive commands.
KILL_JOB              Detach and delete job.
PARALLEL              Change the number of active workers for current job.
                      PARALLEL=<number of workers>.
START_JOB             Start/resume current job.
                      START_JOB=SKIP_CURRENT will start the job after skipping
                      any action which was in progress when job was stopped.
STATUS                Frequency (secs) job status is to be monitored where
                      the default (0) will show new status when available.
                      STATUS=[interval]
STOP_JOB              Orderly shutdown of job execution and exits the client.
                      STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                      Data Pump job.
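These interactive commands are entered at the Import> prompt after attaching to a running job. A short sketch, assuming an import that was started earlier with JOB_NAME=my_import_job:

impdp wm/password ATTACH=my_import_job

Import> STATUS=60
Import> PARALLEL=4
Import> STOP_JOB=IMMEDIATE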