expdp remote db_link tables

Expert Oracle Database Tips by Donald Burleson, March 25, 2015

Exporting Remote Database Schema Objects with Data Pump

To export schema objects from a remote database, use the network_link parameter.

This requires a database link. Create a database link from the first database to the second, then add the network_link=db_link_name parameter to your parameter file.

Data Pump then exports objects from the second database into a dump file created on the first server.
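
As a sketch of the link setup, assuming a hypothetical TNS alias remote_db and the remote hr account:

CREATE DATABASE LINK source_database_link
   CONNECT TO hr IDENTIFIED BY hr
   USING 'remote_db';

A quick select * from dual@source_database_link confirms that the link works before you run the export.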

NETWORK_LINK - You can initiate an export job from your server and have Data Pump export data from a remote database to dump files located on the server from which you initiated the job.

expdp hr/hr DIRECTORY=dpump_dir1
NETWORK_LINK=source_database_link
DUMPFILE=network_export.dmp



Export and Import Over the Network (by Kim Floss)

Typically, Oracle Data Pump Export results in a binary format dumpfile that is then loaded (using Oracle Data Pump Import) into another database. However, in addition to this file-based Oracle Data Pump Export and Import, Oracle Data Pump provides a network-based approach for direct loading to or unloading from one database to another.

Most DBAs are probably familiar with using the legacy Oracle Export and Import utilities over a network, using named pipes as the mechanism through which to pass the data. One of the problems with that approach can be that named pipes aren't supported on every platform.

Instead of using named pipes, Oracle Data Pump uses database links for exporting and importing data between source and target over the network. And, as with the named pipe mechanism, the network_link feature entirely bypasses dumpfile creation, transmitting data from one database to another across the network.
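
As a sketch of what that looks like in practice, an import driven entirely over the link needs no DUMPFILE parameter at all; a directory object is still needed for the log file. The link, schema, and table names here are hypothetical:

impdp kfloss/password DIRECTORY=dpump_dir1
NETWORK_LINK=source_database_link
TABLES=batch_jobs.employees
LOGFILE=network_import.log

No dump file is written on either side; rows travel straight from the source tables into the target database.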

This feature should be of keen interest to any DBAs spending lots of time doing production-to-test refreshes for development or extracting data from a read-only data warehouse. (For a read-only database, you must use Oracle Data Pump Export's network_link mechanism, initiating the process on the target database, because Oracle Data Pump won't be able to create the necessary master table on the read-only database.)

Because network bandwidth is the limiting factor, the network_link capability is best used in conjunction with the filtering capability, as a means of siphoning off a subset of data (rather than an entire multigigabyte or -terabyte database) from one database to another.
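
For example, the filter can be expressed in a parameter file, which keeps the QUERY clause clear of shell quoting problems. A minimal sketch, with a hypothetical table and predicate:

DIRECTORY=dpump_dir1
NETWORK_LINK=source_database_link
TABLES=hr.employees
QUERY=hr.employees:"WHERE department_id = 50"
DUMPFILE=subset_export.dmp

Running expdp hr/hr PARFILE=subset.par then pulls only the matching rows across the link. In the same spirit, the steps below build a recurring filtered import: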

  • Set up a remote link from a satellite database (database 2) to the master database (database 1).
  • Create the import as a SQL script to run weekly from the satellite database to import selected tables from the source system, filtering and transforming as needed (see Listing 2).

Code Listing 2: Sample script using DBMS_DATAPUMP API for import

DECLARE
   ind NUMBER;              -- Loop index number
   jobhandle NUMBER;        -- Data Pump job handle
   percent_done NUMBER;     -- Percentage of job complete
   job_state VARCHAR2(30);  -- Keeps track of job state
   le ku$_LogEntry;         -- Work-in-progress and error messages
   js ku$_JobStatus;        -- Job status from get_status
   jd ku$_JobDesc;          -- Job description from get_status
   sts ku$_Status;          -- Status object returned by get_status

BEGIN

   -- Create a (user-named) Data Pump job to do a "table-level" import
   -- using a network link.
   jobhandle := DBMS_DATAPUMP.OPEN('IMPORT', 'TABLE', 'RM_TECHLAB', 'DEV_REFRESH');

   -- Set parallelism to 1. A network import/export always uses 1 anyway,
   -- so it may not be necessary to set this explicitly.
   DBMS_DATAPUMP.SET_PARALLEL(jobhandle, 1);

   -- Remap schema objects from source schema BATCH_JOBS to target KFLOSS.
   DBMS_DATAPUMP.METADATA_REMAP(jobhandle, 'REMAP_SCHEMA', 'BATCH_JOBS', 'KFLOSS');

   -- If a table already exists in the destination schema, don't overwrite it
   -- (this is the default behavior).
   DBMS_DATAPUMP.SET_PARAMETER(jobhandle, 'TABLE_EXISTS_ACTION', 'SKIP');

   -- Use statistics (rather than blocks) to estimate the amount of data.
   DBMS_DATAPUMP.SET_PARAMETER(jobhandle, 'ESTIMATE', 'STATISTICS');

   -- Start the job. An exception is raised if something is not set up properly.
   DBMS_DATAPUMP.START_JOB(jobhandle);

   -- The import job should now be running. This loop monitors the job until
   -- it completes, displaying progress information if there are problems.
   percent_done := 0;
   job_state := 'UNDEFINED';

   while (job_state != 'COMPLETED') and (job_state != 'STOPPED') loop
      DBMS_DATAPUMP.GET_STATUS(jobhandle,
         DBMS_DATAPUMP.ku$_status_job_error +
         DBMS_DATAPUMP.ku$_status_job_status +
         DBMS_DATAPUMP.ku$_status_wip, -1, job_state, sts);
      js := sts.job_status;

      -- As the percentage-complete changes in this loop, display the new value.
      if js.percent_done != percent_done then
         DBMS_OUTPUT.PUT_LINE('*** Job percent done = ' || to_char(js.percent_done));
         percent_done := js.percent_done;
      end if;

      -- Display any work-in-progress or error messages received for the job.
      if (bitand(sts.mask, DBMS_DATAPUMP.ku$_status_wip) != 0) then
         le := sts.wip;
      elsif (bitand(sts.mask, DBMS_DATAPUMP.ku$_status_job_error) != 0) then
         le := sts.error;
      else
         le := null;
      end if;

      if le is not null then
         ind := le.FIRST;
         while ind is not null loop
            DBMS_OUTPUT.PUT_LINE(le(ind).LogText);
            ind := le.NEXT(ind);
         end loop;
      end if;
   end loop;

   -- When the job finishes, display its status before detaching from the job.
   DBMS_OUTPUT.PUT_LINE('Job has completed');
   DBMS_OUTPUT.PUT_LINE('Final job state = ' || job_state);
   DBMS_DATAPUMP.DETACH(jobhandle);
END;
/

When you run the script on the satellite system, it connects to the source system and brings the data over directly to the target. The master table is created in the schema of the user on the initiating system, not in the source database. This approach works well for read-only databases and also as a means of refreshing development databases with extracts from a production system.
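
Because the job and its master table live on the initiating system, you can watch the job from there with the standard DBA_DATAPUMP_JOBS view:

SELECT owner_name, job_name, operation, job_mode, state
FROM   dba_datapump_jobs;

The job name (DEV_REFRESH in Listing 2) appears while the job runs, and its master table, created with the same name, is dropped automatically when the job completes successfully.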

 

 
 
 