Question

I have a requirement to bulk insert data from a CSV file into an Oracle database. The table's columns match those of the CSV file's header, except for three additional fields in the database:

  1. A Primary Key field (for which a simple SEQUENCE.NEXTVAL is called)
  2. A field for the name of the CSV file
  3. A field for the last modified date+time of the file
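
To illustrate, the target table looks roughly like this (table and column names are simplified placeholders, with col_foo/col_bar standing in for the real CSV columns):

CREATE TABLE csv_buffer (
  id          NUMBER        NOT NULL,  -- primary key, populated from a sequence
  col_foo     VARCHAR2(100),           -- columns matching the CSV header
  col_bar     VARCHAR2(100),
  filename    VARCHAR2(255),           -- name of the source CSV file
  file_mtime  DATE                     -- last modified date+time of the file
);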

The following Stack Overflow question addresses an extra-column issue, but its solution is straightforward because it uses Oracle's SYSDATE, which is available inside the database. In my case I need to pass a parameter in from a batch script or shell script.

Insert actual date time in a row with SQL*loader

Can PARFILE help here somehow?

My other alternative would be to do the whole task in two steps by writing a small Java program:

  1. Use SQL*Loader for the bulk upload, leaving out data for the filename and modified time
  2. And then run a separate UPDATE statement to populate the newly created rows (a rough sketch of that UPDATE follows)
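
Something like this for step 2, assuming the two extra columns are simply left NULL by the load (table, column, and bind names are placeholders):

UPDATE csv_buffer
   SET filename   = :p_filename,
       file_mtime = :p_file_mtime
 WHERE filename IS NULL;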

But I'm looking for something that will get the job done in one shot. Any advice?


Solution

I'm afraid it's not possible with sqlldr alone; sqlldr has no built-in facility for this. You'd need some sort of script or program to dynamically create a .ctl file for each load. Here is a bash script to help you get started:

#!/bin/bash -xv
# Usage: ./load.sh <csv_file> <staging_table>
readonly MY_FILENAME=$1
readonly DB_BUF_TABLE=$2

# Build the control file on the fly so the CSV file name can be embedded
# as a constant value for every row of this load.
readonly SQLLDR_CTL="LOAD DATA
CHARACTERSET UTF8
APPEND INTO TABLE $DB_BUF_TABLE
FIELDS TERMINATED BY ';' (
  filename CONSTANT '$MY_FILENAME',
  col_foo,
  col_bar
)"

echo "$SQLLDR_CTL" > "loader.ctl"

# Credentials and other fixed options come from loader.par
sqlldr control=loader.ctl parfile=loader.par data="$MY_FILENAME"
sqlldrReturnValue=$?
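
Note that loader.par is just a parameter file holding ordinary sqlldr command-line options (credentials, error limits, log locations); it cannot set per-row column values, which is why the generated control file is needed. A minimal example, with a made-up connect string:

userid=scott/tiger@ORCL
errors=50
log=loader.log
bad=loader.bad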

You'd need some locking with this, or path separation for concurrent loads, to be sure sqlldr starts with the proper .ctl file.
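
To cover the file's last-modified timestamp as well (the third extra field in the question), the same trick applies: capture the value in the shell and embed it in the generated control file. A minimal sketch, assuming GNU date and a DATE column named file_mtime in the target table:

# Last-modified time of the CSV, formatted to match the TO_DATE mask below
readonly MY_MTIME=$(date -r "$MY_FILENAME" '+%Y-%m-%d %H:%M:%S')

# Add this line to the SQLLDR_CTL string, next to the filename column
# (EXPRESSION columns are computed, not read from the data file):
#   file_mtime EXPRESSION \"TO_DATE('$MY_MTIME', 'YYYY-MM-DD HH24:MI:SS')\",

# One way to get the path separation mentioned above: write the control
# file into a private directory per load
readonly WORKDIR=$(mktemp -d)
echo "$SQLLDR_CTL" > "$WORKDIR/loader.ctl"
sqlldr control="$WORKDIR/loader.ctl" parfile=loader.par data="$MY_FILENAME"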

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow