Below are some of the Data Pump enhancements introduced in Oracle 11g:
-Compression: the compression parameter can now compress both data and metadata when set to the value ALL.
expdp quest/quest DIRECTORY= quest_pump_dir DUMPFILE=quest_comp.dmp compression=ALL
-Data Pump Encryption Enhancements:
Different encryption modes are available during the export: Dual, Password, and Transparent. The mode to use is set with the encryption_mode parameter.
Here is an example of using the new encryption parameters for a data pump export:
expdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest_comp.dmp encryption=all encryption_password=not_quest encryption_algorithm=AES128 encryption_mode=PASSWORD
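For comparison, here is a sketch of a wallet-based export using transparent mode. This assumes the Oracle Encryption Wallet is open on the source database; no encryption_password is supplied because the wallet provides the master key. The directory object is carried over from the example above, and the dump file name is made up for illustration:

```shell
# Sketch: transparent-mode encrypted export (assumes an open
# Oracle Encryption Wallet; the wallet supplies the master key,
# so no encryption_password is given)
expdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest_trans.dmp \
  encryption=all encryption_mode=TRANSPARENT encryption_algorithm=AES128
```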
-Data Pump Rename Table (or remap_table):
Sometimes it's desirable to rename a table during the import process. In Oracle 11g you can now do this with Oracle Data Pump. To rename a table during import, use the remap_table parameter: supply the original schema and table name, then the new name for the table. Here is an example where the table QUEST.NAMES is renamed to COPY_NAMES:
impdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest.dmp tables=QUEST.NAMES remap_table=QUEST.NAMES:COPY_NAMES
-Data Pump and Partitioned Tables:
If you want some control over the partitioning of tables during an Oracle Data Pump import, you can use the partition_options parameter, a new impdp feature in Oracle 11g. The following options are available when using the partition_options parameter:
None - Tables will be imported such that they will look like those on the system on which the export was created.
Departition - Partitions will be created as individual tables rather than partitions of a partitioned table.
Merge - Causes all partitions to be merged into one unpartitioned table.
In this example we use the partition_options parameter to take all partitions of the NAMES table and combine them into one unpartitioned table on the destination side:
impdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest.dmp tables=QUEST.NAMES partition_options=merge
-Overwrite Dump Files
When using expdp in Oracle 11g you can now overwrite a dump file that already exists by using the new reuse_dumpfiles parameter, as seen in this example:
expdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest.dmp tables=QUEST.NAMES reuse_dumpfiles=y
-Data Pump Data_Options Parameter:
Have you ever loaded records into a table with Oracle Data Pump Import, only to have the entire load fail because of a few duplicate records in the import file? Assume that your EMP table has three rows in it. EMPNO is the primary key, and here is what a query of the table looks like:
SQL> select empno from emp;
EMPNO
----------
7369
8000
8001
3 rows selected.
When you try to load your Data Pump Export file, the load fails because the dump file also contains these three rows, and they violate the primary key of the table. The load and the resulting error messages look like this:
C:\oracle\product\11.1.0\db_1\NETWORK\ADMIN>impdp scott/robert dumpfile=new_emp.dmp directory=data_pump_dir tables=emp table_exists_action=append
Import: Release 11.1.0.6.0 - Production on Tuesday, 01 January, 2008 23:51:32
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SCOTT"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SCOTT"."SYS_IMPORT_TABLE_01": scott/******** dumpfile=new_emp.dmp directory=data_pump_dir tables=emp table_exists_action=append
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39152: Table "SCOTT"."EMP" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
ORA-31693: Table data object "SCOTT"."EMP" failed to load/unload and is being skipped due to error:
ORA-00001: unique constraint (SCOTT.PK_EMP) violated
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/USER_PREF_STATISTICS
Job "SCOTT"."SYS_IMPORT_TABLE_01" completed with 2 error(s) at 23:51:35
And the resulting table looks just the same:
SQL> select empno from emp;
EMPNO
----------
7369
8000
8001
3 rows selected.
The load of the EMP table has completely failed. Data Pump does not come with a reject limit clause like SQL*Loader does, so the whole load of the table just fails, even if it's just one record causing the problem.
Oracle 11g solves this problem with a new parameter for Data Pump Import: data_options=skip_constraint_errors. With this parameter, Data Pump Import skips the rows that generate a constraint error (without loading them, of course) and continues to load the rows that do not generate an error. Here is an example of an import using this new parameter:
SQL> select empno from emp;
EMPNO
----------
7369
8000
8001
3 rows selected.
C:\oracle\product\11.1.0\db_1\NETWORK\ADMIN>impdp scott/robert dumpfile=new_emp.dmp directory=data_pump_dir tables=emp table_exists_action=append data_options=skip_constraint_errors
Import: Release 11.1.0.6.0 - Production on Tuesday, 01 January, 2008 23:55:33
Copyright (c) 2003, 2007, Oracle. All rights reserved.
Connected to: Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
Master table "SCOTT"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "SCOTT"."SYS_IMPORT_TABLE_01": scott/******** dumpfile=new_emp.dmp directory=data_pump_dir tables=emp table_exists_action=append data_options=skip_constraint_errors
Processing object type TABLE_EXPORT/TABLE/TABLE
ORA-39152: Table "SCOTT"."EMP" exists. Data will be appended to existing table but all dependent metadata will be skipped due to table_exists_action of append
Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
. . imported "SCOTT"."EMP" 8.187 KB 13 out of 16 rows
3 row(s) were rejected with the following error:
ORA-00001: unique constraint (SCOTT.PK_EMP) violated
Rejected rows with the primary keys are:
Rejected row #1:
column EMPNO: 8000
Rejected row #2:
column EMPNO: 8001
Rejected row #3:
column EMPNO: 7369
Processing object type TABLE_EXPORT/TABLE/INDEX/INDEX
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/INDEX/STATISTICS/INDEX_STATISTICS
Processing object type TABLE_EXPORT/TABLE/CONSTRAINT/REF_CONSTRAINT
Processing object type TABLE_EXPORT/TABLE/STATISTICS/TABLE_STATISTICS
Processing object type TABLE_EXPORT/TABLE/STATISTICS/USER_PREF_STATISTICS
Job "SCOTT"."SYS_IMPORT_TABLE_01" completed with 1 error(s) at 23:55:36
Note in this case that 3 rows were rejected, but the remaining 13 rows were successfully loaded. Here is a select query against the table. This looks much better:
SQL> select empno from emp;
EMPNO
----------
7369
7499
7521
7566
7654
7698
7782
7788
7839
7844
7876
7900
7902
7934
8000
8001
16 rows selected.
There is one main restriction with this feature: the Data Pump import must use the external-table method when loading data. Data Pump can use two methods of importing (or exporting) data, direct path and external tables. When you import using data_options=skip_constraint_errors, Data Pump must use external tables. Normally this happens by default, so you don't have to worry about it. However, if you get an error because Oracle Data Pump is not using the external-table method, you can use the undocumented parameter access_method=external_table to try to force Data Pump to use external tables.
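In that situation, the workaround might look like the following sketch, which simply adds the undocumented access_method parameter to the earlier EMP import (since the parameter is undocumented, treat it as a last resort):

```shell
# Sketch: force the external-table access method so that
# data_options=skip_constraint_errors can take effect
# (access_method is undocumented; use with care)
impdp scott/robert dumpfile=new_emp.dmp directory=data_pump_dir \
  tables=emp table_exists_action=append \
  data_options=skip_constraint_errors access_method=external_table
```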
The new Oracle Data Pump parameter data_options also provides for special handling of other types of data-related issues. Use the XML_CLOBS setting of the data_options parameter of the Data Pump Export utility to override the default behavior of Oracle Data Pump, which is to compress the format of the XMLType CLOB. If you use this option be aware that the XML schemas at the source and destination must be the same.
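As a sketch, the XML_CLOBS setting is applied on export like this. The table name QUEST.XML_DOCS is hypothetical, used only to stand in for a table with an XMLType column:

```shell
# Sketch: export XMLType data as CLOBs rather than in the default
# compressed format (source and destination XML schemas must match;
# QUEST.XML_DOCS is a hypothetical table for illustration)
expdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest_xml.dmp \
  tables=QUEST.XML_DOCS data_options=xml_clobs
```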
-The Transportable Parameter:
When the transportable parameter is used with impdp or expdp, only the metadata associated with specific tables, partitions, or subpartitions is extracted, rather than all metadata. You can then transport the associated data files as you normally would. You export the metadata with expdp:
expdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest.dmp tables=QUEST.NAMES transportable=always
You then copy the data files and the dump file set to the destination and plug them into the database. You use impdp as part of this process to import the metadata into the database, as seen in this example:
impdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest.dmp tables=QUEST.NAMES remap_schema=Robert:new_Robert
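If the copied datafile also has to be registered at import time, the plug-in step would typically include the transport_datafiles parameter, roughly as sketched below. The datafile path here is an assumption for illustration; use the actual path you copied the file to:

```shell
# Sketch: plug in the copied datafile while importing the metadata
# (the datafile path below is a hypothetical example)
impdp quest/quest DIRECTORY=quest_pump_dir DUMPFILE=quest.dmp \
  transport_datafiles='/u01/app/oracle/oradata/dest/names01.dbf' \
  remap_schema=Robert:new_Robert
```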
Enjoy:-)