Loading data of one table into another residing on different databases - Netezza

StackOverflow https://stackoverflow.com//questions/22031614

  •  21-12-2019

Question

I have a big file which I have loaded into a table in a Netezza database using an ETL tool; let's call this database Staging_DB. Now, after some verifications, the content of this table needs to be inserted into a similarly structured table residing in another Netezza database; let's call this one PROD_DB. What is the fastest way to transfer the data from Staging_DB to PROD_DB?

  1. Should I use the ETL tool to load the data into PROD_DB? Or,
  2. Should the transfer be done using the external tables concept?

Solution

If no transformation needs to be done, the better approach is a cross-database data transfer. As described in the Netezza documentation, Netezza supports cross-database access as long as the user has object-level permissions on both databases.

You can check permissions with the following command:

dbname.schemaname(logged_in_username)=> \dpu username

Please find below a working example (run while connected to the destination database):

INSERT INTO PROD_DB..TBL1 SELECT * FROM Staging_DB..TBL1;

If you need to apply transformations before inserting into the other database, you can write UDT procedures (also called result-set procedures).
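As a rough illustration, an NZPLSQL procedure on the target database could perform the transformed insert in one step (a minimal sketch only — the procedure name, columns, and transformation below are hypothetical, not from the original post):

```sql
-- Hypothetical sketch: LOAD_TBL1, col1, col2 are placeholder names.
CREATE OR REPLACE PROCEDURE LOAD_TBL1()
RETURNS INT4
LANGUAGE NZPLSQL
AS
BEGIN_PROC
BEGIN
    -- While connected to PROD_DB: read cross-database from staging,
    -- apply a transformation, and insert into the local table.
    INSERT INTO PROD_DB..TBL1
    SELECT col1, UPPER(col2)    -- example transformation
    FROM Staging_DB..TBL1;
    RETURN 0;
END;
END_PROC;
```

Call it with CALL LOAD_TBL1(); while connected to PROD_DB.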

Hope this will help.

OTHER TIPS

One way you could move the data is by using Transient External Tables. Start by creating a flat file from your source table/database. Because you are moving from Netezza to Netezza, you can save time and space by turning on compression and using internal formatting.

CREATE EXTERNAL TABLE 'C:\FileName.dat'
USING (
delim 167
datestyle 'MDY'
datedelim '/'
maxerrors 2
encoding 'internal'
Compress True
REMOTESOURCE 'ODBC'
logDir 'c:\' )  AS
SELECT * FROM source_table;

Then create the table in your target database using the same DDL as the source table, and just load it up.

INSERT INTO target SELECT * FROM EXTERNAL 'C:\FileName.dat'
USING (
delim 167
datestyle 'MDY'
datedelim '/'
maxerrors 2
encoding 'internal'
Compress True
REMOTESOURCE 'ODBC'
logDir 'c:\' );

I would write a stored procedure on the production database and do a CTAS (CREATE TABLE ... AS SELECT) from the stage to the production database. The beauty of a stored procedure is that you can add transformations as well. Another option is the nz_migrate utility provided by Netezza, which I believe is the fastest route.
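For reference, an nz_migrate invocation along these lines would copy the table directly between the two databases (a sketch only — host names, users, and passwords are placeholders, and flag spellings may vary by NPS version, so verify with nz_migrate -h):

```shell
# Sketch only: hosts and credentials below are placeholders.
# Flag names may differ between NPS versions; verify with nz_migrate -h.
nz_migrate \
  -shost nps_host -sdb STAGING_DB -suser admin -spassword '****' \
  -thost nps_host -tdb PROD_DB    -tuser admin -tpassword '****' \
  -t TBL1 \
  -cksum fast   # quick row-count/checksum comparison after the copy
```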

A simple SQL query like

INSERT INTO PROD_DB..TBL1 SELECT * FROM Staging_DB..TBL1;

works great if you just need to do that.

Just be aware that you must be connected to the destination database when executing the query; otherwise you will get the error

HY0000: "Cross Database Access not supported for this type of command"

even if you have read/write access to both databases and tables.

In most cases you can simply change the catalog using a "SET CATALOG" command:

https://www-304.ibm.com/support/knowledgecenter/SSULQD_7.0.3/com.ibm.nz.dbu.doc/r_dbuser_set_catalog.html

set catalog='database_name';
insert into target_db.target_schema.target_table select * from source_db.source_schema.source_table;
Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow