Data pump - "ORA-39786: Number of columns does not match between export and import databases"

Today I was asked if it was possible to import data for a specific table from a data pump export file that was taken when the table had a different structure to the one it has now, e.g. when the export was taken there were 10 columns; now there are 11.

I thought that was simple, and I was sure I'd heard of this being done before.  Simply import using the "CONTENT=DATA_ONLY" parameter.  Right?

Apparently not:
Centos-/home/oracle/demo> impdp DEMO_USER/yyyyyyy directory=INBOUND dumpfile=demo_data.dmp logfile=impdpdemo.log content=data_only tables=xxxxxxxx
Import: Release 10.2.0.4.0 - 64bit Production on Tuesday, 08 July, 2014 14:32:20
Copyright (c) 2003, 2007, Oracle.  All rights reserved.
Connected to: Oracle Database 10g Release 10.2.0.4.0 - 64bit Production
Master table "DEMO_USER"."SYS_IMPORT_TABLE_01" successfully loaded/unloaded
Starting "DEMO_USER"."SYS_IMPORT_TABLE_01":  xxxxxxxx/******** directory=INBOUND dumpfile=demo_data.dmp logfile=impdpdemo_log content=data_only tables=xxxxxxxxx
Processing object type SCHEMA_EXPORT/TABLE/TABLE_DATA
ORA-31693: Table data object "DEMO_USER"."xxxxxxxxxxx" failed to load/unload and is being skipped due to error:
ORA-29913: error in executing ODCIEXTTABLEOPEN callout
ORA-29400: data cartridge error
ORA-39786: Number of columns does not match between export and import databases
Job "DEMO_USER"."SYS_IMPORT_TABLE_01" completed with 1 error(s) at 14:32:22
A spot of googling and checking My Oracle Support led me to this note: 435193.1.  Turns out that if data pump chooses to use the "External Tables" method of importing, this won't work: the table it's importing from has a different number of columns to the target, so it's not doing a straight insert.
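To see why, it helps to picture what the external tables path actually does. Conceptually it's something like the sketch below - made-up names, heavily simplified, and definitely not Data Pump's real internal SQL:

-- Simplified sketch only: made-up names, not Data Pump's actual internal SQL.
-- The external table carries the column list as it was at export time.
CREATE TABLE demo_table_ext (
  col1  NUMBER,
  col2  VARCHAR2(50)
  -- ...and so on for the 10 exported columns
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_DATAPUMP
  DEFAULT DIRECTORY inbound
  LOCATION ('demo_data.dmp')
);

-- The load is then effectively an INSERT ... SELECT.  With 10 columns on
-- one side and 11 on the other, the lists don't line up - hence ORA-39786.
INSERT /*+ APPEND */ INTO demo_table
SELECT * FROM demo_table_ext;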

The reasons why data pump will choose the "External Tables" method of importing are documented here (these are the 11.2 docs - I know I'm using 10.2 here - http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_overview.htm#SUTIL805), but I'll list them below for reference:

If any of the following conditions exist for a table, then Data Pump uses external tables rather than direct path to load the data for that table:
  • A global index on multipartition tables exists during a single-partition load. This includes object tables that are partitioned.
  • A domain index exists for a LOB column.
  • A table is in a cluster.
  • There is an active trigger on a preexisting table.
  • Fine-grained access control is enabled in insert mode on a preexisting table.
  • A table contains BFILE columns or columns of opaque types.
  • A referential integrity constraint is present on a preexisting table.
  • A table contains VARRAY columns with an embedded opaque type.
  • The table has encrypted columns.
  • The table into which data is being imported is a preexisting table and at least one of the following conditions exists:
    • There is an active trigger
    • The table is partitioned
    • Fine-grained access control is in insert mode
    • A referential integrity constraint exists
    • A unique index exists
  • Supplemental logging is enabled and the table has at least one LOB column.
  • The Data Pump command for the specified table used the QUERY, SAMPLE, or REMAP_DATA parameter.
  • A table contains a column (including a VARRAY column) with a TIMESTAMP WITH TIME ZONE datatype and the version of the time zone data file is different between the export and import systems.

The ones relevant to my problem were the referential integrity constraint and the unique index on the preexisting table.
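It's easy enough to confirm which of these conditions apply to your own table by poking at the data dictionary. A quick sketch, with a made-up table name, run as the importing schema:

-- Made-up table name; adjust for your own table.
-- Any enabled triggers?
SELECT trigger_name FROM user_triggers
 WHERE table_name = 'DEMO_TABLE' AND status = 'ENABLED';

-- Any referential integrity (foreign key) constraints?
SELECT constraint_name FROM user_constraints
 WHERE table_name = 'DEMO_TABLE' AND constraint_type = 'R';

-- Any unique indexes?
SELECT index_name FROM user_indexes
 WHERE table_name = 'DEMO_TABLE' AND uniqueness = 'UNIQUE';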

The options to work round this limitation are:
  1. Rename the existing table, import the data into a new table, then select into the renamed table (see the sketch below).
  2. Drop the unique indexes and disable the referential integrity constraints, import with data pump, then recreate the indexes and re-enable the constraints.
We chose option 1.  Seemed simpler.  And we're all sorted.
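For reference, here's roughly what option 1 looks like. This is just a sketch with made-up table and column names, and it assumes the new 11th column can happily be NULL for the imported rows:

-- Made-up names throughout; adjust for your own table.
-- 1. Move the current 11-column table out of the way.
ALTER TABLE demo_table RENAME TO demo_table_current;

-- 2. Re-import the table *without* content=data_only, so Data Pump
--    recreates demo_table with the 10-column structure from the export.
--    From the shell:
--      impdp DEMO_USER/yyyyyyy directory=INBOUND dumpfile=demo_data.dmp tables=demo_table

-- 3. Copy the rows across, supplying NULL for the new column.
INSERT INTO demo_table_current (col1, col2, col3, col4, col5,
                                col6, col7, col8, col9, col10, col11)
SELECT col1, col2, col3, col4, col5,
       col6, col7, col8, col9, col10, NULL
  FROM demo_table;
COMMIT;

-- 4. Tidy up: drop the imported copy and put the name back.
DROP TABLE demo_table;
ALTER TABLE demo_table_current RENAME TO demo_table;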
