Thursday, May 22, 2008

ORACLE DATA PUMP Part #2

Export of an entire database to a dump file with all GRANTS, INDEXES, and data

> exp username/password FULL=y FILE=dba.dmp GRANTS=y INDEXES=y ROWS=y

> expdp username/password FULL=y INCLUDE=GRANT INCLUDE=INDEX DIRECTORY=dpump_dir1 DUMPFILE=dba.dmp CONTENT=ALL

Data Pump offers much greater metadata filtering than original Export and Import. The INCLUDE parameter allows you to specify which objects (and their dependent objects) you want to keep in the export job. The EXCLUDE parameter allows you to specify which objects (and their dependent objects) you want to keep out of the export job. You cannot mix the two parameters in one job. Both parameters work with Data Pump Import as well, and you can use different INCLUDE and EXCLUDE options for different operations on the same dump file.
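
For example, a schema-level export could keep only tables and views, or leave indexes and statistics out of the job. The commands below are only a sketch; the hr schema, the file names, and the dpump_dir1 directory object are illustrative and assumed to already exist.

> expdp username/password SCHEMAS=hr INCLUDE=TABLE INCLUDE=VIEW DIRECTORY=dpump_dir1 DUMPFILE=hr_tab_views.dmp

> expdp username/password SCHEMAS=hr EXCLUDE=INDEX EXCLUDE=STATISTICS DIRECTORY=dpump_dir1 DUMPFILE=hr_no_indexes.dmp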

Tuning Parameters

Unlike original Export and Import, which used the BUFFER, COMMIT, COMPRESS, CONSISTENT, DIRECT, and RECORDLENGTH parameters for tuning, Data Pump needs no tuning to achieve maximum performance. Data Pump chooses the best method to ensure that data and metadata are exported and imported in the most efficient manner. The default initialization parameter settings you get at installation are generally sufficient.

Moving data between versions

The Data Pump method for moving data between different database versions is different from the method used by original Export and Import. With original Export, you had to run an older version of Export to produce a dump file that was compatible with an older database version. With Data Pump, you use the current Export version and simply use the VERSION parameter to specify the target database version. You cannot specify versions earlier than Oracle Database 10g (since Data Pump did not exist before 10g).

> expdp username/password TABLES=hr.employees VERSION=10.1 DIRECTORY=dpump_dir1 DUMPFILE=emp.dmp

Data Pump Import can always read dump file sets created by older versions of Data Pump Export. Note that Data Pump Import cannot read dump files produced by original Export.
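
On the Oracle Database 10.1 target, importing the dump file created above is then an ordinary impdp run. This is just a sketch, assuming a dpump_dir1 directory object also exists on that database:

> impdp username/password TABLES=hr.employees DIRECTORY=dpump_dir1 DUMPFILE=emp.dmp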


Maximizing the Power of Oracle Data Pump

Data Pump works great with default parameters, but once you are comfortable with Data Pump, there are new capabilities that you will want to explore.

Parallelism

Data Pump Export and Import operations are processed in the database as a Data Pump job, which is much more efficient than the client-side execution of original Export and Import. Now Data Pump operations can take advantage of the server’s parallel processes to read or write multiple data streams simultaneously (PARALLEL is only available in the Enterprise Edition of Oracle Database). The number of parallel processes can be changed on the fly using Data Pump’s interactive command-line mode. You may have a certain number of processes running during the day and decide to change that number if more system resources become available at night (or vice versa).
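
For example, you could attach to a running job (such as the hr job started in the command at the end of the list below) and change the degree of parallelism from the interactive prompt. The job name and values here are purely illustrative:

> expdp username/password ATTACH=hr
Export> PARALLEL=8
Export> STATUS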

For best performance, you should do the following:
  1. Make sure your system is well balanced across CPU, memory, and I/O.
  2. Have at least one dump file for each degree of parallelism. If there aren’t enough dump files, performance will not be optimal because multiple threads of execution will be trying to access the same dump file.
  3. Put files that are members of a dump file set on separate disks so that they will be written and read in parallel.
  4. For export operations, use the %U variable in the DUMPFILE parameter so multiple dump files can be automatically generated.
> expdp username/password DIRECTORY=dpump_dir1 JOB_NAME=hr DUMPFILE=par_exp%u.dmp PARALLEL=4

REMAP

REMAP_TABLESPACE – This allows you to easily import a table into a tablespace different from the one it was originally exported from. Both the source and target databases must be Oracle Database 10.1 or later.

> impdp username/password REMAP_TABLESPACE=tbs_1:tbs_6 DIRECTORY=dpump_dir1 DUMPFILE=employees.dmp

REMAP_DATAFILE – This is a very useful feature when you move databases between platforms that have different file naming conventions. This parameter changes the source datafile name to the target datafile name in all SQL statements where the source datafile is referenced. Because the REMAP_DATAFILE value uses quotation marks, it’s best to specify the parameter within a parameter file.

The parameter file, payroll.par, has the following content:

DIRECTORY=dpump_dir1
FULL=Y
DUMPFILE=db_full.dmp
REMAP_DATAFILE="'C:\DB1\HRDATA\PAYROLL\tbs6.dbf':'/db1/hrdata/payroll/tbs6.dbf'"

You can then issue the following command:

> impdp username/password PARFILE=payroll.par

Continued in ORACLE DATA PUMP Part #3
