In this Document
Goal
Solution
References
Applies to:
Oracle Database - Standard Edition - Version 10.1.0.2 to 12.1.0.2 [Release 10.1 to 12.1]
Enterprise Manager for Oracle Database - Version 10.1.0.2 to 12.1.0.6.0 [Release 10.1 to 12.1]
Oracle Database - Personal Edition - Version 10.1.0.2 to 12.1.0.2 [Release 10.1 to 12.1]
Oracle Database - Enterprise Edition - Version 10.1.0.2 to 12.1.0.2 [Release 10.1 to 12.1]
Information in this document applies to any platform.
Goal
This document explains how to resolve the following errors during an Export DataPump (expdp)
or Import DataPump job (impdp):
Connected to: Oracle Database 11g Enterprise Edition Release 11.2.0.1.0 - Production
With the Partitioning, OLAP, Data Mining and Real Application Testing options
-- or: --
There are several possible reasons why a Data Pump job cannot be started. Each root cause has its
own solution.
Solution
1. First check the value for the STREAMS_POOL_SIZE in the database:
connect / as sysdba
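The check itself can be run in SQL*Plus; `show parameter streams_pool_size` displays the current setting, and the additional query against V$SGA_DYNAMIC_COMPONENTS (a sketch, not from the original note) shows the pool's current and minimum size:

```
show parameter streams_pool_size

-- Optional sketch: current and minimum size of the streams pool
select component, current_size, min_size
from v$sga_dynamic_components
where component = 'streams pool';
```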
If the STREAMS_POOL_SIZE is too small, then a Data Pump job will fail. This can also
happen when using Automatic Shared Memory Management (ASMM) or Automatic Memory
Management (AMM) and there is not sufficient memory to increase the
STREAMS_POOL_SIZE.
Manual settings for the STREAMS_POOL_SIZE of 64M, 128M, or even 256M have proven
successful.
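As a sketch of the manual fix (the 128M value is only an example; choose a size appropriate for your system, and note that SCOPE=BOTH requires an spfile):

```
connect / as sysdba
-- Example value only; requires an spfile (with a pfile, edit the file and restart)
alter system set streams_pool_size=128m scope=both;
```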
2. Check the status of the Data Pump queue table objects:
connect / as sysdba
show parameter aq
col owner for a10
col object_name for a30
analyze table kupc$datapump_quetab validate structure cascade;
analyze table kupc$datapump_quetab_1 validate structure cascade;
select object_id, owner, object_name, status from dba_objects
where object_name like 'KUPC$DATAPUMP_QUETAB%';
set lines 100
col status for a9
col object_type for a20
col "OWNER.OBJECT" for a50
select status, object_id, object_type, owner||'.'||object_name "OWNER.OBJECT"
from dba_objects
where object_name like '%DATAPUMP_QUETAB%' order by 3,4;
If there are any invalid queue objects, then a Data Pump job will fail. This usually also results in
the following error in the alert.log file:
ORA-00600: internal error code, arguments: [kwqbgqc: bad state], [1], [1], [], [], [], [], []
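If the queue table objects are invalid, one approach referenced in NOTE:754401.1 is to drop the corrupt Data Pump queue table so it can be recreated. The sketch below is an assumption based on that note; verify it against the note (and take a backup) before running it:

```
connect / as sysdba
-- Sketch: force-drop the invalid Data Pump queue table (see NOTE:754401.1)
begin
  dbms_aqadm.drop_queue_table(
    queue_table => 'SYS.KUPC$DATAPUMP_QUETAB',
    force       => TRUE);
end;
/
```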
3. Check for any invalid registry components (CATALOG, CATPROC and JAVAVM), and
invalid sys owned objects:
connect / as sysdba
set lines 90
col version for a12
col comp_id for a8
col schema like version
col comp_name format a35
col status for a12
select comp_id,schema,status,version,comp_name from dba_registry order by 1;
If the registry components CATALOG, CATPROC and/or JAVAVM, and/or objects like
SYS.KUPW$WORKER or SYS.KUPP$PROC are invalid, then a Data Pump job will likely fail.
To resolve this problem, reload Data Pump in the database:
connect / as sysdba
shutdown immediate
-- for 9.2, use: startup migrate
startup upgrade
@?/rdbms/admin/catalog.sql
@?/rdbms/admin/catproc.sql
@?/rdbms/admin/utlrp.sql
spool off
spool registry.out
-- Registry status:
set lines 90
col version for a12
col comp_id for a8
col schema like version
col comp_name format a35
col status for a12
select comp_id,schema,status,version,comp_name from dba_registry order by 1;
-- Invalid objects:
set lines 120
col status for a9
col object_type for a20
col "OWNER.OBJECT" for a50
select status, object_id, object_type, owner||'.'||object_name "OWNER.OBJECT"
from dba_objects
where status != 'VALID' and owner='SYS' and object_name not like 'BIN$%'
order by 4,2;
shutdown immediate
startup
spool off
4. Check whether the hidden parameter _FIX_CONTROL has been set (for example to
'6167716:OFF'; see NOTE:1150733.1):
connect / as sysdba
If this hidden parameter is set, then a Data Pump job will fail.
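A sketch of how the fix-control setting can be inspected: V$SYSTEM_FIX_CONTROL lists the state of each bug fix, and bug number 6167716 is taken from NOTE:1150733.1 in the references below:

```
connect / as sysdba
select bugno, value, description
from v$system_fix_control
where bugno = 6167716;

-- A value of 0 means the fix is disabled; re-enable it, for example:
-- alter system set "_fix_control"='6167716:ON';
```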
5. If the Data Pump job is started through a package, check whether the package was created with
invoker's rights (AUTHID CURRENT_USER clause):
connect / as sysdba
If the package was created with invoker's rights, then a Data Pump job will fail when started
through this package.
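To illustrate the difference: a wrapper package declared with AUTHID CURRENT_USER (invoker's rights) hits this problem, while the default definer's-rights form does not. The package and procedure names below are purely illustrative:

```
-- Problematic: invoker's rights
create or replace package dp_wrapper authid current_user as
  procedure run_export;
end dp_wrapper;
/

-- Works: definer's rights (the default, i.e. AUTHID DEFINER)
create or replace package dp_wrapper as
  procedure run_export;
end dp_wrapper;
/
```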
6. If the Data Pump job is started in DBConsole / OEM, and the job is selected to be re-run (or
you want to edit the job), then the Data Pump job will fail and the following errors will be reported:
-- or --
Edit is not supported for this job type, only general information
7. If the LOGTIME parameter is used, a Data Pump export or import crashes when the
environment variable NLS_DATE_FORMAT is set.
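A minimal workaround sketch, assuming a Unix-like shell: unset NLS_DATE_FORMAT in the session that launches the Data Pump client (the expdp command line shown in the comment is illustrative only):

```shell
# Clear NLS_DATE_FORMAT for this session only
unset NLS_DATE_FORMAT

# Then start the Data Pump job with LOGTIME (illustrative parameters):
# expdp system directory=DATA_PUMP_DIR dumpfile=exp.dmp logfile=exp.log logtime=all
echo "NLS_DATE_FORMAT=${NLS_DATE_FORMAT:-<unset>}"
```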
References
NOTE:754401.1 - Errors ORA-31623 And ORA-600 [kwqbgqc: bad state] During DataPump Export Or
Import
NOTE:1080775.1 - UDE-31623 Error With DataPump Export
NOTE:308388.1 - Error ORA-31623 When Submitting A DataPump Job
NOTE:1150733.1 - DataPump Export (EXPDP) Fails With Errors ORA-31623 ORA-6512 If Parameter
_FIX_CONTROL='6167716:OFF' Has Been Set
NOTE:1579091.1 - DataPump Job Fails With Error ORA-31623 A Job Is Not Attached To This Session Via
The Specified Handle
NOTE:788301.1 - Error ORA-31623 On DataPump Export Via DBScheduler After First Run Was Successful
NOTE:461307.1 - How To Export Database Using DBConsole/OEM In 10G
NOTE:863312.1 - Best Practices for running catalog, catproc and utlrp script
NOTE:430221.1 - How To Reload Datapump Utility EXPDP/IMPDP
NOTE:1936319.1 - Data Pump Export Or Import Throws ORA-31623 When Using LOGTIME Parameter