Thursday, July 23, 2009

Data Pump: A New Feature in Oracle 10g

Data Pump:
Data Pump is a server-side utility, so the dump files are generated on the database server. It is about 5-6 times faster than the original export/import (exp/imp) utilities.

Steps for configuration of Data Pump:

1. Log in as SYSDBA.

2. Create a directory at both the OS and database levels with the following commands:
mkdir d:\datapump
SQL> CREATE DIRECTORY test_dir AS 'd:\datapump';

3. Grant read and write access on the directory to the user:
SQL> GRANT READ, WRITE ON DIRECTORY test_dir TO scott;

4. Ensure that the user has the EXP_FULL_DATABASE and IMP_FULL_DATABASE roles (these are required for full database exports and imports). To see a list of all roles assigned to you within your security domain, connect as the user and issue the following command:
SQL> SELECT * FROM SESSION_ROLES;
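
If the roles are missing, a user with administrative privileges (for example, SYS) can grant them. A minimal sketch, assuming the example user scott used in the sections below:
SQL> GRANT EXP_FULL_DATABASE, IMP_FULL_DATABASE TO scott;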

With this, the required setup for Data Pump is complete.

The following are some of the options that can be used with Data Pump.

Table Exports/Imports:

The TABLES parameter is used to specify the tables that are to be exported. The following is an example of the table export and import syntax:

expdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=expdpEMP_DEPT.log

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp logfile=impdpEMP_DEPT.log

The TABLE_EXISTS_ACTION=APPEND parameter allows data to be imported into existing tables.
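
For example, assuming the dump file created by the export command above, the following sketch appends the exported rows to the existing tables instead of failing on import:

impdp scott/tiger@db10g tables=EMP,DEPT directory=TEST_DIR dumpfile=EMP_DEPT.dmp table_exists_action=APPEND logfile=impdpEMP_DEPT.log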

Schema Exports/Imports:

The OWNER parameter of exp has been replaced by the SCHEMAS parameter, which is used to specify the schemas to be exported. The following is an example of the schema export and import syntax:

expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp logfile=impdpSCOTT.log

Database Exports/Imports:

The FULL parameter indicates that a complete database export is required. The following is an example of the full database export and import syntax:

expdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=expdpDB10G.log

impdp system/password@db10g full=Y directory=TEST_DIR dumpfile=DB10G.dmp logfile=impdpDB10G.log

Data Pump performance can be improved by using the PARALLEL parameter. This should be used in conjunction with the "%U" wildcard in the DUMPFILE parameter to allow multiple dump files to be created or read:

expdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=expdpSCOTT.log

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR parallel=4 dumpfile=SCOTT_%U.dmp logfile=impdpSCOTT.log

The INCLUDE and EXCLUDE parameters can be used to limit the export/import to specific objects. When the INCLUDE parameter is used, only the objects it specifies are included in the export. When the EXCLUDE parameter is used, all objects except those it specifies are included:

expdp scott/tiger@db10g schemas=SCOTT include=TABLE:"IN ('EMP', 'DEPT')" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log

expdp scott/tiger@db10g schemas=SCOTT exclude=TABLE:"= 'BONUS'" directory=TEST_DIR dumpfile=SCOTT.dmp logfile=expdpSCOTT.log
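
Depending on the operating system, the quotes in the INCLUDE and EXCLUDE clauses may need to be escaped on the command line. One way to avoid this is to put the parameters in a parameter file and reference it with PARFILE. A sketch, using a hypothetical file name scott_exp.par containing:

schemas=SCOTT
include=TABLE:"IN ('EMP', 'DEPT')"
directory=TEST_DIR
dumpfile=SCOTT.dmp
logfile=expdpSCOTT.log

which can then be used as:

expdp scott/tiger@db10g parfile=scott_exp.par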

Suspending and Resuming Export Jobs (Attaching and Re-Attaching to the Jobs):

You can suspend running export jobs and later resume them, or kill them, using Data Pump Export. You can start a job from one client machine, suspend it if other work comes up, and once that work is finished continue the job from the same client or restart it from a different client machine.
For example, suppose a DBA starts a full database export by typing the following command at client machine CLNT1:

$expdp scott/tiger@mydb FULL=y DIRECTORY=data_pump_dir DUMPFILE=full.dmp LOGFILE=myfullexp.log JOB_NAME=myfullJob

After some time, the DBA wants to stop this job temporarily, so he presses CTRL+C to enter interactive mode. At the Export> prompt he can type interactive commands. To stop the export job, he types the following command:
Export> STOP_JOB=IMMEDIATE
Are you sure you wish to stop this job ([y]/n): y
The job is placed in a stopped state and the client exits.

After finishing his other work, the DBA wants to resume the export job, but the client machine from which he originally started the job is unavailable because the user has locked his cabin. So the DBA goes to another client machine and reattaches to the job by typing the following command:
$expdp scott/tiger@mydb ATTACH=myfulljob

After the job status is displayed, he can issue the CONTINUE_CLIENT command to resume logging mode and restart the myfulljob job.
Export> CONTINUE_CLIENT
A message is displayed that the job has been reopened, and processing status is output to the client.

Note: After reattaching to the job, the DBA can also kill it by typing KILL_JOB if he does not want to continue with the export.
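
Running and stopped Data Pump jobs, along with the job names needed for ATTACH, can also be listed from the data dictionary:
SQL> SELECT owner_name, job_name, state FROM DBA_DATAPUMP_JOBS;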

There is an impdp parameter called SQLFILE. If you use it, impdp does not import anything into the database; instead, it writes the DDL statements that the import would have executed to a SQL file.
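
For example, the following sketch (scott_ddl.sql is a hypothetical output file name) writes the DDL contained in the dump file to TEST_DIR without importing anything:

impdp scott/tiger@db10g schemas=SCOTT directory=TEST_DIR dumpfile=SCOTT.dmp sqlfile=scott_ddl.sql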
