Question

I have a table with more than 10 million records, and I have to update multiple columns of this table from data I get from a few other tables, which in turn also have a large number of records. All the tables are indexed.

I tried doing it in batches and updating in concurrent sessions, but it is taking a very long time. Is there an alternative approach to this problem?


Solution

The solution is to build a brand-new table (with NOLOGGING) and to parallelize.

1) First create your dummy hold table (with no rows):

    create table xyz_HOLD as select * from xyz where rownum < 1;

Avoid logging all these operations*:

    alter table xyz_HOLD nologging;

*With NOLOGGING, if the system aborts, you simply re-run the 'update' again, as you still have the original data in the main table.
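One detail worth checking before step 2 (not spelled out in the original answer): Oracle only runs the insert itself as parallel DML if the session has it enabled; otherwise the PARALLEL hint only parallelizes the query side. Enabling it is a standard session setting:

    -- Needed for the PARALLEL hint on the INSERT below to take effect as parallel DML
    alter session enable parallel dml;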

2)

    insert /*+ append parallel (xyzhold,12) */
      into xyz_hold xyzhold (field1, field2, field3)
    select /*+ parallel (x,12) */ x.field1, my_new_value_for_field2, x.field3
      from xyz x
     where blah blah blah
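Since the question pulls the new values from other large tables, here is one way the select part might look when the replacement value comes from a join; the table abc, its column new_field2, and the join key id are hypothetical names standing in for your own schema:

    -- Hypothetical variant of step 2: field2 sourced from another large table
    insert /*+ append parallel (xyzhold,12) */
      into xyz_hold xyzhold (field1, field2, field3)
    select /*+ parallel (x,12) */ x.field1, a.new_field2, x.field3
      from xyz x
      join abc a on a.id = x.id;  -- abc and id are placeholders for your lookup table and key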

3) When done, either rename the table, or swap the partition if your original table is partitioned and you only updated one partition (as we do).
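Either way this is a near-instant data-dictionary operation rather than another data copy. A sketch of both options, with illustrative table and partition names:

    -- Option A: swap the tables by renaming
    alter table xyz rename to xyz_old;
    alter table xyz_hold rename to xyz;

    -- Option B: if xyz is partitioned and only partition p1 was rebuilt
    alter table xyz exchange partition p1 with table xyz_hold;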

+) Obviously you also need to rebuild indexes, etc. as required afterwards.
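The rebuilds can use the same parallel/NOLOGGING tricks. The index name and column below are placeholders, and switching everything back to LOGGING at the end is my assumption about returning to normal operation (and taking a fresh backup afterwards is advisable, since NOLOGGING work is not recoverable from redo):

    -- Rebuild an index in parallel without redo, then restore normal settings
    create index xyz_ix1 on xyz (field1) parallel 12 nologging;
    alter index xyz_ix1 noparallel;
    alter index xyz_ix1 logging;
    alter table xyz logging;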
