Question

I am trying to insert a large number of records from a CSV file into a table in an Oracle DB, in a Linux environment. The insert is simple - the CSV has IDs that need to be inserted into a single-column table, for further processing.

Following was my line of thought: 1. Read the file into an array 2. Loop through each item in the array and insert it into the table

I cannot use SQL*Loader or DBI; access to the DB server is restricted and I am not allowed to install any extra packages on the host that I use to connect to the DB. So I need to use the simplest things available.

Here is the piece that works:

#!/usr/bin/perl
use warnings;
use strict;

my $returnVar;
my $inputfile = $ARGV[0];

# Read file into an array
open(FILE, "<", $inputfile) or die("Unable to open file: $inputfile");
my @idlist = <FILE>;
close(FILE);

#insert array into table
foreach my $id(@idlist)
{
   chomp($id);
   $returnVar = `sqlplus -s cvo/cvo123\@CVOSTAGE\ <<END
        set heading off
        set echo off
        set feedback off
        set pagesize 0
        insert into tmp_tx_id values('$id');
        exit`;
}

This is faulty in the sense that you open and close a connection for every single record you insert. I need to insert a large number of records (> 1 million) and this, I agree, is a horrible way to do it. I tried to modify it in the following way, but was not successful:

.
.
<File has been read into the array>

$returnVar1 = `sqlplus -s cvo/cvo123\@CVOSTAGE\ <<END
                      set heading off
                      set echo off
                      set feedback off
                      set pagesize 0`;

foreach my $id(@idlist)
{
    chomp($id);
    $returnVar2 = `insert into tmp_tx_id values(\'$id\');`;
}
system("exit");

Apparently, the insert statement is not executed at the SQL*Plus prompt - it goes to the command shell instead. This is the error I get:

sh: -c: line 0: syntax error near unexpected token `('
sh: -c: line 0: `insert into tmp_tx_id values('664436565');'

Is there any way I can execute multiple insert statements without having to open and close the connection each time? Any other suggestions (even outside Perl) are really welcome - I am a programmer in learning.

Thanks


Solution

You could shoehorn the shell's heredoc notation into a Perl backticks expression, but consider the alternative of writing your sql commands to a file:

open my $fh, '>', '/tmp/my-inserts.sql'
    or die "Cannot write /tmp/my-inserts.sql: $!";
print $fh "set heading off\n", "set echo off\n", ... ;
foreach my $id (@idlist)
{
    chomp($id);
    print $fh "insert into tmp_tx_id values('$id');\n";
}
close $fh;

... and then ...

$returnVar1 = `sqlplus -s cvo/cvo123\@CVOSTAGE  < /tmp/my-inserts.sql`;

This has the bonus of making debugging easier -- when sqlplus tells you there was an error with line 4683, you have a record of what you were trying to insert on line 4683.
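For comparison, the single-heredoc route mentioned above would build the whole script in memory and feed it to one sqlplus session. A minimal sketch (the ID list is illustrative, and the actual sqlplus invocation is shown commented out since it needs a live database; note the quoted heredoc delimiter so the shell does not expand anything inside the script):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @idlist = ('664436565', '664436566');   # stand-in for the CSV IDs

# Build the entire SQL script once, so sqlplus is started only once.
my $sql = join "\n",
    'set heading off',
    'set echo off',
    'set feedback off',
    'set pagesize 0',
    (map { "insert into tmp_tx_id values('$_');" } @idlist),
    'commit;',
    'exit';

# With a reachable database you would run something like:
# my $out = `sqlplus -s cvo/cvo123\@CVOSTAGE <<'END'
# $sql
# END`;
print "$sql\n";
```

The file-based version in the answer is still preferable for a million rows, since it avoids holding the whole script in a shell command line.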

You should also consider that installing modules on your system is not as impossible as it seems, and then use DBI with placeholders like every other Perl programmer who works with databases.
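A sketch of the DBI-with-placeholders approach follows. It is shown here against an in-memory SQLite database (DBD::SQLite) so it runs without an Oracle client; against Oracle you would install DBD::Oracle and use a DSN such as 'dbi:Oracle:CVOSTAGE' with the real credentials. The table name matches the question; the IDs are illustrative:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# One connection for the whole load, AutoCommit off so we can
# commit once at the end instead of per row.
my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1, AutoCommit => 0 });
$dbh->do('CREATE TABLE tmp_tx_id (id VARCHAR(20))');

my @idlist = ('664436565', '664436566', '664436567');

# One prepared statement, reused for every row; the placeholder
# also removes any quoting worries about the data.
my $sth = $dbh->prepare('INSERT INTO tmp_tx_id VALUES (?)');
$sth->execute($_) for @idlist;
$dbh->commit;

my ($count) = $dbh->selectrow_array('SELECT COUNT(*) FROM tmp_tx_id');
print "$count rows inserted\n";
$dbh->disconnect;
```

With DBD::Oracle you could also look at the `execute_array` bulk interface, which pushes many rows per round trip.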

OTHER TIPS

You can prepare a script with the SQL inserts (perhaps not one huge file - you could split it up into chunks of 100K inserts each) and execute them directly from the command line with sqlplus, like so:

sqlplus <user@connect> @inserts.sql

At the beginning of the script file you can specify a log file

spool '<logfile>'

plus any options you want (set trimspool on is useful for trimming the spaces at the end of lines), and at the end:

commit ;
spool off

It will be easy to prepare the script(s) in perl.
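A minimal sketch of that preparation step, assuming illustrative file names, chunk size, and ID values (the spool/commit framing follows the tip above):

```perl
#!/usr/bin/perl
use strict;
use warnings;

my @idlist     = (1 .. 250);   # stand-in for the IDs read from the CSV
my $chunk_size = 100;          # 100K in practice; small here for illustration
my $n          = 0;

# Write one .sql file per chunk, each with its own spool log
# and a commit at the end.
while (@idlist) {
    my @chunk = splice @idlist, 0, $chunk_size;
    my $file  = sprintf 'inserts_%03d.sql', ++$n;
    open my $fh, '>', $file or die "Cannot write $file: $!";
    print $fh "spool $file.log\n";
    print $fh "set trimspool on\n";
    print $fh "insert into tmp_tx_id values('$_');\n" for @chunk;
    print $fh "commit;\n";
    print $fh "spool off\n";
    close $fh;
}
# Then run each chunk with: sqlplus <user@connect> @inserts_001.sql
```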

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow