Question

I'll summarize the problem first, then give the details of the SQL I used to get where I am.

I'm exporting a schema from a production AWS RDS Oracle instance, using a database link to download the dump file to my local development database, then running an import locally into an empty database on a freshly installed Oracle in a Docker container. The export and import use Data Pump. I get a very ambiguous error message, "invalid operation", with equally ambiguous details suggesting I call "DBMS_DATAPUMP.GET_STATUS" to "further describe the error". When I do, I get exactly the same ambiguous "invalid operation" with the same suggestion to call "GET_STATUS" to further describe the error.
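For context, the extra detail has to be pulled from GET_STATUS while the job handle is still open; once the anonymous block has failed and exited, the handle is gone. Below is a hedged sketch of the import call with that error handling added. The loop over the error log follows the pattern in Oracle's DBMS_DATAPUMP documentation; file and schema names match the ones used in this question, and it should be run with SET SERVEROUTPUT ON.

```sql
-- Sketch: catch the START_JOB failure and print the detailed error
-- stack via GET_STATUS while the job handle is still open.
DECLARE
  hdnl      NUMBER;
  job_state VARCHAR2(30);
  sts       ku$_Status;    -- composite status object
  le        ku$_LogEntry;  -- collection of error lines
  ind       NUMBER;
BEGIN
  hdnl := DBMS_DATAPUMP.OPEN(operation => 'IMPORT', job_mode => 'SCHEMA', job_name => NULL);
  DBMS_DATAPUMP.ADD_FILE(handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR');
  DBMS_DATAPUMP.METADATA_FILTER(hdnl, 'SCHEMA_EXPR', 'IN (''MY_SCHEMA'')');
  BEGIN
    DBMS_DATAPUMP.START_JOB(hdnl);
  EXCEPTION
    WHEN OTHERS THEN
      -- Ask only for the error messages attached to the job.
      DBMS_DATAPUMP.GET_STATUS(hdnl,
                               DBMS_DATAPUMP.KU$_STATUS_JOB_ERROR,
                               0, job_state, sts);
      le := sts.error;
      IF le IS NOT NULL THEN
        ind := le.FIRST;
        WHILE ind IS NOT NULL LOOP
          DBMS_OUTPUT.PUT_LINE(le(ind).LogText);
          ind := le.NEXT(ind);
        END LOOP;
      END IF;
      DBMS_DATAPUMP.DETACH(hdnl);
      RAISE;
  END;
END;
/
```

With luck this prints the underlying ORA- messages (for example a time zone or version mismatch) instead of only the generic ORA-39002.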

I'm at a loss as to where to even begin in diagnosing and solving this problem.

Here are the detailed steps I took. I have substituted "MY_SCHEMA" for our schema name to protect the identity of our client... and if there's any mismatch in that text, I assure you it is correct in my console and the mismatch is just a mistake in the substitution for this question. I used SQL Developer to run these commands.

  1. On the AWS RDS Oracle instance running 19c
    DECLARE
    hdnl NUMBER;
    BEGIN
    hdnl := DBMS_DATAPUMP.OPEN( operation => 'EXPORT', job_mode => 'SCHEMA', job_name=>null, version=> '18.4.0.0.0');
    DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_dump_file);
    DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.log', directory => 'DATA_PUMP_DIR', filetype => dbms_datapump.ku$_file_type_log_file);
    DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''MY_SCHEMA'')');
    DBMS_DATAPUMP.START_JOB(hdnl);
    END;
    /
  2. Connect from my local dev database (18c) to the AWS RDS instance and download the dmp file. And yes, here I connect as the schema owner and not "master": connecting as the schema owner works for downloading the file and connecting as "master" does not, whereas in step 1 exporting as the schema owner doesn't work; unless you can instruct me how to do it the other way around, and tell me whether that would solve my problem.
    create database link to_rds connect to my_schema identified by password using '(DESCRIPTION=(ADDRESS=(PROTOCOL=TCP)(HOST=my_schema.aljfjske.us-west-1.rds.amazonaws.com)(PORT=1521))(CONNECT_DATA=(SID=ORCL)))';
    
    BEGIN
    DBMS_FILE_TRANSFER.GET_FILE(
    source_directory_object       => 'DATA_PUMP_DIR',
    source_file_name              => 'my_schema.dmp',
    source_database               => 'to_rds',
    destination_directory_object  => 'DATA_PUMP_DIR',
    destination_file_name         => 'my_schema.dmp'
    );
    END;
    /
  3. Start the import while logged in as "sys" with the "sysdba" role on my local database (connected to the pluggable database called "my_schema").
    DECLARE
    hdnl NUMBER;
    BEGIN
    hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
    DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR');
    DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''MY_SCHEMA'')');
    DBMS_DATAPUMP.START_JOB(hdnl);
    end;
    /

And I get the following error:

DECLARE
hdnl NUMBER;
BEGIN
hdnl := DBMS_DATAPUMP.OPEN( operation => 'IMPORT', job_mode => 'SCHEMA', job_name=>null);
DBMS_DATAPUMP.ADD_FILE( handle => hdnl, filename => 'my_schema.dmp', directory => 'DATA_PUMP_DIR');
DBMS_DATAPUMP.METADATA_FILTER(hdnl,'SCHEMA_EXPR','IN (''MY_SCHEMA'')');
DBMS_DATAPUMP.START_JOB(hdnl);
end;
Error report -
ORA-39002: invalid operation
ORA-06512: at "SYS.DBMS_DATAPUMP", line 7297
ORA-06512: at "SYS.DBMS_SYS_ERROR", line 79
ORA-06512: at "SYS.DBMS_DATAPUMP", line 4932
ORA-06512: at "SYS.DBMS_DATAPUMP", line 7291
ORA-06512: at line 7
39002. 00000 -  "invalid operation"
*Cause:    The current API cannot be executed because of inconsistencies
           between the API and the current definition of the job.
           Subsequent messages supplied by DBMS_DATAPUMP.GET_STATUS
           will further describe the error.
*Action:   Modify the API call to be consistent with the current job or
           redefine the job in a manner that will support the specified API.

I've spent 6+ hours working on this already: reading Oracle docs and guides, trying things, printing more information to the console, and nothing. I get exactly the same error message with no more information. The dump file is on the system and I'm pretty sure it's being read properly, because I can call utl_file.fgetattr to get its size. I've also tried exporting and importing with different users. Nothing. I'm totally in the dark here. Even suggestions on what to try to diagnose this would be much appreciated.

This is a fresh install of Oracle Database 18c Express Edition using Oracle's Docker container files on their GitHub account (which is pretty slick, BTW). The production system on RDS has been up for several years, and I've exported with Data Pump dozens of times during those years and successfully imported into my local 11g Express Edition installation on Fedora Linux. (That no longer works since the production database was recently upgraded from 12c to 19c, which is what started me on this whole path.)
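For completeness, the file-size check mentioned above looks roughly like this; a hedged sketch using UTL_FILE.FGETATTR with the same directory object and file name as in the steps, run with SET SERVEROUTPUT ON:

```sql
-- Sketch: verify the transferred dump file exists in the directory
-- object and report its size, via UTL_FILE.FGETATTR.
DECLARE
  l_exists    BOOLEAN;
  l_length    NUMBER;
  l_blocksize BINARY_INTEGER;
BEGIN
  UTL_FILE.FGETATTR('DATA_PUMP_DIR', 'my_schema.dmp',
                    l_exists, l_length, l_blocksize);
  IF l_exists THEN
    DBMS_OUTPUT.PUT_LINE('my_schema.dmp size: ' || l_length || ' bytes');
  ELSE
    DBMS_OUTPUT.PUT_LINE('my_schema.dmp not found in DATA_PUMP_DIR');
  END IF;
END;
/
```

A plausible size here only shows the file arrived intact; it says nothing about whether the importing database can actually consume it.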


The solution

Public information:

https://mikedietrichde.com/2019/05/14/data-pump-the-time-zone-pitfalls/

https://oracle-base.com/blog/2020/02/19/data-pump-between-database-versions-its-not-just-about-the-version-parameter/

Further information on Oracle support site:

Impdp Fails With ORA-39002: Invalid Operation (Doc ID 2482971.1)

Updated DST Transitions and New Time Zones in Oracle RDBMS and OJVM Time Zone File Patches (Doc ID 412160.1)

Oracle 19c by default uses timezone version 32, but maybe Amazon patched it even further (currently highest available version is 35).

Oracle 18c uses timezone version 31. Patches are available for download.
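To confirm that this mismatch is the cause, the time zone file version can be checked on both databases (run the same queries on the RDS source and the local XE target and compare):

```sql
-- Time zone file in use by this database instance.
SELECT * FROM v$timezone_file;

-- The DST version recorded in the database properties.
SELECT property_value
FROM   database_properties
WHERE  property_name = 'DST_PRIMARY_TT_VERSION';
```

If the source reports a higher version than the target, the import will fail until the target's time zone file is patched to at least that version (or a newer release is used).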

To be honest, I have never used XE and cannot remember ever trying to apply any patch to any version of it. I would not bother: it is your local personal development database, so just download and use 19c.

Licensed under: CC-BY-SA with attribution
Not affiliated with dba.stackexchange