Deleting records in batches can be done in a PL/SQL loop, but that is generally considered bad practice because the entire delete should normally be treated as a single transaction; and it can't be done from within the SQL*Loader control file anyway. Your DBA should size the UNDO space to accommodate the work you need to do.
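If you do end up deleting in batches despite that, a minimal PL/SQL sketch might look like the following; the table name matches the load target from the question, while the batch size is a hypothetical value you'd tune to your UNDO capacity:

```sql
BEGIN
  LOOP
    -- delete in chunks so each transaction uses a bounded amount of UNDO
    DELETE FROM import_abc
    WHERE ROWNUM <= 10000;  -- hypothetical batch size
    EXIT WHEN SQL%ROWCOUNT = 0;
    -- committing mid-delete is exactly why this is considered bad practice:
    -- the overall delete is no longer atomic
    COMMIT;
  END LOOP;
  COMMIT;
END;
/
```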
If you're deleting the entire table you'll almost certainly be better off truncating anyway, either in the control file:
options(skip=1,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760)
load data
infile 'G:1.csv' "str '^_^'"
truncate
into table IMPORT_ABC
...
Or as a separate truncate statement in SQL*Plus/SQL Developer/some other client before you start the load:
truncate table import_abc;
The disadvantage is that your table will appear empty to other users while the new rows are being loaded, but if it's a dedicated import area (guessing from the name) that may not matter anyway.
If your UNDO is really that small then you may have to run multiple loads, in which case - probably obviously - you need to make sure you only have the truncate in the control file for the first one (or use the separate truncate statement), and have append instead in subsequent control files, as you noted in comments.
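As a sketch, a subsequent control file would look like the first one but with append in place of truncate (the infile name here is just the original example path; your later files would presumably differ):

```sql
options(skip=1,load=250000,errors=0,ROWS=30000,BINDSIZE=10485760)
load data
infile 'G:1.csv' "str '^_^'"
append
into table IMPORT_ABC
...
```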
You might also want to consider external tables if you're using this data as a base to populate something else, as there is no UNDO overhead on replacing the external data source. You'll probably need to talk to your DBA about setting that up and giving you the necessary directory permissions.
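A minimal external-table sketch, assuming your DBA has created a directory object (the name import_dir, the column list, and the field delimiter here are all illustrative assumptions, not taken from the question):

```sql
-- the directory object must already exist and you need READ (and usually
-- WRITE, for log/bad files) granted on it
CREATE TABLE import_abc_ext (
  col1 VARCHAR2(100),
  col2 NUMBER
  -- ... columns matching the CSV layout
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY import_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY '^_^'
    SKIP 1
    FIELDS TERMINATED BY ','
  )
  LOCATION ('1.csv')
);
```

Once that exists, refreshing the data is just a matter of replacing the file on disk; since nothing is written to a real table until you query or insert from it, there's no UNDO generated by the swap itself.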