11g Parameters for Data Pump Import

Oracle Tips by Burleson Consulting

 

The following are descriptions of some of the new parameters in Data Pump Import:

* flashback_scn: This specifies the system change number (SCN) that import uses to enable the Flashback utility. The import operation is performed with data that is consistent as of this SCN. This parameter is only relevant when importing through a database link.

* flashback_time: This uses the SCN that most closely matches the specified time to enable the Flashback utility. The import operation is performed with data that is consistent as of this SCN. This parameter is only relevant when importing through a database link.
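
A minimal sketch of both parameters, assuming a database link named source_db and a directory object named dpump_dir (both hypothetical names); because the import runs in network mode, no dump file is needed:

   impdp hr DIRECTORY=dpump_dir NETWORK_LINK=source_db SCHEMAS=hr FLASHBACK_SCN=5383921

The flashback_time value contains quotes, so it is usually easiest to supply it in a parameter file, for example:

   FLASHBACK_TIME="TO_TIMESTAMP('25-08-2017 14:35:00','DD-MM-YYYY HH24:MI:SS')"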

* query: This filters the data that is imported by applying a query clause (typically a WHERE condition) to the SELECT statements executed on the source system.
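
As a hedged sketch, the filter is easiest to supply in a parameter file because of the quoting. A file such as emp_query.par (a hypothetical name) might contain:

   DIRECTORY=dpump_dir
   DUMPFILE=hr.dmp
   TABLES=hr.employees
   QUERY=hr.employees:"WHERE department_id > 50"

and is then run with:

   impdp hr PARFILE=emp_query.par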

* network_link: This specifies a database link to the source database and enables import's network mode (an example follows the list below). This parameter is required if any of the following parameters are specified:

   * flashback_scn
   * flashback_time
   * estimate
   * transport_tablespace
   * query
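
As a minimal sketch of network mode, the following pulls one table directly from a remote database over a database link named source_db, with dpump_dir used only for the log file (both are hypothetical names); no dump file is involved:

   impdp hr DIRECTORY=dpump_dir NETWORK_LINK=source_db TABLES=hr.employees LOGFILE=emp_net.log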

* remap_datafile: This changes the name of the source datafile to the target datafile name in all DDL statements in which the source datafile is referenced.  This is useful when performing database migration to another system with different file naming conventions.
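
A sketch of the syntax, assuming a migration from a Windows file name to a UNIX file name (both paths are hypothetical); because the value mixes single and double quotes, it is usually placed in a parameter file:

   REMAP_DATAFILE="'C:\DB1\HRDATA\PAYROLL\tbs6.dbf':'/db1/hrdata/payroll/tbs6.dbf'"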

* remap_schema: This replaces the FROMUSER and TOUSER parameters in the original import utility, and loads all objects from the source schema into the destination schema (the default for this parameter is NONE).
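
For example, to load scott's objects into the jdoe schema from a full export dump file (scott, jdoe, expfull.dmp, and dpump_dir are hypothetical names):

   impdp system DIRECTORY=dpump_dir DUMPFILE=expfull.dmp REMAP_SCHEMA=scott:jdoe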

* remap_tablespace: This remaps all objects selected for import with persistent data in the source tablespace so that they are created in the destination tablespace on the target system. This is very useful when the DBA wants to change the default tablespace for a user.
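
For example, to re-create objects that lived in the USERS tablespace on the source in the EXAMPLE tablespace on the target (the dump file and directory names are hypothetical):

   impdp hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp REMAP_TABLESPACE=users:example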

* reuse_datafiles: If a datafile specified in a CREATE TABLESPACE statement already exists and this parameter is set to N (the default), an error message is issued, but the import job continues. If the parameter is set to Y, a warning message is issued and the existing datafile is reinitialized, which may result in a loss of data.
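
A hedged example of a full import that reinitializes any datafiles that already exist (expfull.dmp and dpump_dir are hypothetical names); use it with care, since data in those files is lost:

   impdp system DIRECTORY=dpump_dir DUMPFILE=expfull.dmp FULL=y REUSE_DATAFILES=y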

* schemas: This specifies a list of schemas to import. The schemas themselves are first created, including system and object grants, password history, and so on, and then all objects contained within the schemas are imported. Non-privileged users can specify only their own schemas, and in that case no information about the schema definition is imported.
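
For example, a privileged user could import just the hr and oe schemas from a full export dump file (the file and directory names are hypothetical):

   impdp system DIRECTORY=dpump_dir DUMPFILE=expfull.dmp SCHEMAS=hr,oe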

* sqlfile: This specifies a file that captures all of the SQL DDL that the Import utility would execute, based on the other parameters. The SQL statements are not executed, and the target database remains unchanged. The resulting SQL file can be edited and then run against the target database. This is similar to the method used with the original export and import utilities, in which export is run with ROWS=N and import is then run with SHOW=Y.
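
As a sketch, the following writes the DDL for a full import to expfull.sql in the dpump_dir directory object instead of executing it (the file and directory names are hypothetical):

   impdp system DIRECTORY=dpump_dir DUMPFILE=expfull.dmp FULL=y SQLFILE=expfull.sql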

* table_exists_action: This tells the Import utility what to do if the table it is trying to import already exists. The parameter is similar to the IGNORE parameter in the original import utility, but it offers more options, as follows (an example command appears after the list):

   * SKIP: This leaves the table unchanged and moves on to the next table. SKIP is the default value.
   * APPEND: This appends new rows from the source to the existing table.
   * TRUNCATE: This deletes the existing rows and then loads the rows from the source.
   * REPLACE: This drops the existing table, creates a new table according to the source table definition, and then loads the source data.

When using APPEND or TRUNCATE, checks are made to ensure that rows from the source are compatible with the existing table prior to performing any action.
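
As a minimal sketch, the following appends the exported rows to any tables that already exist on the target (the file, directory, and schema names are hypothetical):

   impdp hr DIRECTORY=dpump_dir DUMPFILE=hr.dmp SCHEMAS=hr TABLE_EXISTS_ACTION=APPEND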

 
 
 
Get the Complete
Oracle Utility Information 

The landmark book "Advanced Oracle Utilities: The Definitive Reference" contains over 600 pages filled with valuable information on Oracle's secret utilities. This book includes scripts and tools to hypercharge Oracle 11g performance, and you can buy it for 30% off directly from the publisher.
 


 

 



                    








