Oracle Backup and Recovery: Remote Export and Import with expdp/impdp (Data Pump)
Oracle introduced a very useful export tool, expdp (Data Pump), in 10g.
However, useful as it is, it has a limitation: the Data Pump job runs on the database server itself, so expdp can normally only export data from the local database.
But suppose the business requirement is to export all of a user's tables except t1, t2, and t3, and the database is not local but sits on another host.
The obvious idea is the classic exp tool: exp user/passwd@orcl file=xxxx.dmp log=xxxx.log can export from a remote database, but it takes a lot of time, and there is a hard requirement that certain tables be excluded, so expdp has to do the job.
One expdp parameter, EXCLUDE, lets you list the indexes or tables you do not want to export.
But think again: expdp must be used, yet the database is remote and we cannot log in to its host. The two requirements seem to contradict each other.
After thinking it over for a long time, I finally came up with a solution:
use a dblink as an intermediate link and export through it with expdp.
With the plan settled, let's start.
Create a local user
Create a user, a directory, and the necessary grants locally:
$ mkdir /home/oracle/dmpfile

SQL> CREATE USER test IDENTIFIED BY test DEFAULT TABLESPACE xxx;
SQL> CREATE DIRECTORY dmpfile AS '/home/oracle/dmpfile';
SQL> GRANT resource, connect, imp_full_database TO test;
SQL> GRANT read, write ON DIRECTORY dmpfile TO test;
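Before moving on, it can be worth a quick sanity check that the directory object and grants exist. This is a sketch using the standard dictionary views, assuming the object names created above:

```sql
-- Confirm the directory object points at the right OS path
SELECT directory_name, directory_path
  FROM dba_directories
 WHERE directory_name = 'DMPFILE';

-- Confirm the privileges granted to the new local user
SELECT privilege
  FROM dba_sys_privs
 WHERE grantee = 'TEST';
```

If the directory path does not match the directory you created with mkdir, the export will fail later with a file-system error, so it is cheaper to catch it here.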
There is one more piece of preparatory work:
to export through a dblink, the remote user needs the EXP_FULL_DATABASE privilege.
GRANT EXP_FULL_DATABASE TO USERXX;
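On the remote database, a quick check (assuming the grantee name USERXX from above) confirms the privilege took effect:

```sql
SELECT grantee, privilege
  FROM dba_sys_privs
 WHERE grantee = 'USERXX'
   AND privilege = 'EXP_FULL_DATABASE';
```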
Note that the dblink you create must be public;
otherwise the Data Pump job may not be able to find the dblink after it is created.
create public database link db_local
  connect to userxx identified by passwd
  using '(DESCRIPTION =
           (ADDRESS_LIST =
             (ADDRESS = (PROTOCOL = TCP)(HOST = 192.168.1.100)(PORT = 1521))
           )
           (CONNECT_DATA =
             (SERVER = DEDICATED)
             (SERVICE_NAME = orcl)
           )
         )';
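If the remote service is already defined in the local tnsnames.ora, the link can also be created against that alias instead of an inline descriptor. ORCL_REMOTE below is a hypothetical alias name, not something from the setup above:

```sql
-- ORCL_REMOTE is a placeholder for an alias defined in tnsnames.ora
CREATE PUBLIC DATABASE LINK db_local
  CONNECT TO userxx IDENTIFIED BY passwd
  USING 'ORCL_REMOTE';
```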
When the creation is complete, check whether the data can be reached over the link:
select * from userxx.test@db_local;
If this returns the rows of userxx's test table, the dblink was created successfully.
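If the query fails instead, the link definition can be inspected through the standard dictionary view. A diagnostic sketch, assuming the dblink name created above:

```sql
-- PUBLIC should appear as the owner for a public dblink
SELECT owner, db_link, username, host
  FROM dba_db_links
 WHERE db_link LIKE 'DB_LOCAL%';
```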
Export data with expdp
This is the most critical step. Because there are many parameters, use a parfile so the command stays readable.
$ vim test.par

userid=test/test                      # the exporting user -- the LOCAL user we created
directory=dmpfile                     # directory object created in Oracle, with read/write granted to the user
dumpfile=userxx.dmp                   # name of the dump file; its location on disk is controlled by the directory object
logfile=userxx.log                    # if this parameter is omitted, the log defaults to export.log in the same directory
schemas=userxx                        # with a dblink the exported user is not local, so name the schema here (similar to owner in exp, with some differences)
exclude=TABLE:"IN('T1','T2','T3')"    # objects not to export -- tables, indexes, and so on; here the three unwanted tables
network_link=db_local                 # export remotely through the dblink; this is the dblink name
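Two common variations on this parfile may be useful; the filters shown are illustrative examples, not part of the original setup:

```
# DUMPFILE may carry its own directory-object prefix,
# overriding DIRECTORY for the dump file only
dumpfile=dmpfile:userxx.dmp

# EXCLUDE can also skip whole object classes,
# e.g. all indexes or all optimizer statistics
exclude=INDEX
exclude=STATISTICS
```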
Save and exit when done.
Then run the export with the parfile:

$ expdp parfile=test.par

and you can watch the final export complete.
The import side needs the same preparation: a directory object in Oracle with read and write granted, plus the IMP_FULL_DATABASE privilege.
$ impdp testuser/testuser dumpfile=userxx.dmp logfile=imp_userxx.log directory=dmpfile remap_schema=userxx:testuser
The last parameter, remap_schema=old:new, is what maps the objects from one user into another; add it whenever you import into a different schema.
With that, the import succeeds.
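After the import, a quick check (assuming the schema names used above) confirms the tables landed in the new schema:

```sql
-- List the tables now owned by the import target user
SELECT table_name
  FROM dba_tables
 WHERE owner = 'TESTUSER';
```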