Question

I have an existing MySQL database which I'm trying to migrate to PostgreSQL using the following steps. The database is fairly simple: it has a few foreign keys and other constraints, but no triggers, stored procedures, etc.

  1. Use DBIx::Class::Schema::Loader to produce a set of Result classes from the existing MySQL database.
  2. Use the Result classes to produce a set of CREATE TABLE statements for PostgreSQL.
  3. Run the CREATE TABLE statements using psql to set up the tables (I haven't got as far as data importing yet).

The script I am using is as follows (with credentials and DB name removed):

#!/usr/bin/perl

use Modern::Perl;
use DBIx::Class::Schema::Loader qw/ make_schema_at /;

my $dsn = 'dbi:mysql:dbname=database';
my $user = '';
my $pass = '';

make_schema_at(
  'MyDB::Schema',
  { debug => 1, dump_directory => './lib' },
  [ $dsn, $user, $pass ],
);

my $schema = MyDB::Schema->connect($dsn, $user, $pass);
$schema->create_ddl_dir(['PostgreSQL'], '0.1', './', undef, { add_drop_table => 0 });

The script runs successfully, and both the Result classes and the .sql file (containing all the CREATE TABLE statements) look as I would expect.

However, several tables have a slug column which is marked as UNIQUE in the original MySQL schema, and this produces the following lines in each generated CREATE TABLE statement:

"slug" character varying(50) NOT NULL,
CONSTRAINT "slug" UNIQUE ("slug")

When I attempt to create the tables (using psql < tables.sql), I get the following error on every table with a unique slug column after the first:

NOTICE:  CREATE TABLE / UNIQUE will create implicit index "slug" for table "mytable"
ERROR:  relation "slug" already exists

My understanding is that in PostgreSQL an index is a relation, so index names must be unique within a given schema. I don't have this problem with MySQL because I just declare slug VARCHAR(50) NOT NULL UNIQUE without specifying an index name.
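The clash is easy to reproduce by hand. Because the implicit index takes the constraint's name, two tables in the same schema cannot both use the constraint name "slug" (the table names below are made up for illustration):

```sql
CREATE TABLE articles (
  "slug" character varying(50) NOT NULL,
  CONSTRAINT "slug" UNIQUE ("slug")
);

-- Fails: the implicit index "slug" already exists in this schema
CREATE TABLE pages (
  "slug" character varying(50) NOT NULL,
  CONSTRAINT "slug" UNIQUE ("slug")
);
```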

Is there a way to make DBIx::Class (or SQL::Translator, which the create_ddl_dir function uses) generate a unique index name in the DDL it outputs? I don't particularly care what the indexes are called, though something based on the table name would be sensible. I've looked through the documentation but I can't see any parameters which allow this.

I could just edit every constraint manually before importing the .sql file, but there are over 250 tables and a lot of clashes - plus I would have to do this every time I tweaked the migration process and had to regenerate the classes and SQL.


Solution

In the end, the only automated way to fix this was a Perl script that parses the output from SQL::Translator and forces every constraint name (and thus the name of the implicit index) to be unique by appending _$i to it, where $i is incremented each time a new constraint is seen.

The code is as follows; it assumes the schema will be piped in on STDIN:

#!/usr/bin/perl

use Modern::Perl;

my $i = 1;

while (my $line = <>)
{
  # \w also matches digits, so identifiers like "slug2" are handled;
  # the match and the rename happen in a single substitution
  if ($line =~ s/CONSTRAINT\s+"(\w+)"\s+UNIQUE\s+\("(\w+)"\)/CONSTRAINT "${1}_$i" UNIQUE ("$2")/)
  {
    $i++;
  }

  print $line;
}
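To illustrate the effect, here is the same substitution applied to two clashing sample lines (the input is made up; in real SQL::Translator output the lines would sit in different CREATE TABLE statements):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Two constraint lines that would clash in PostgreSQL
my @lines = (
  'CONSTRAINT "slug" UNIQUE ("slug")',
  'CONSTRAINT "slug" UNIQUE ("slug")',
);

my $i = 1;
for my $line (@lines) {
  # Append _$i to the constraint name and bump the counter on each match
  $line =~ s/CONSTRAINT\s+"(\w+)"\s+UNIQUE\s+\("(\w+)"\)/CONSTRAINT "${1}_$i" UNIQUE ("$2")/
    and $i++;
  print "$line\n";
}
# Prints:
# CONSTRAINT "slug_1" UNIQUE ("slug")
# CONSTRAINT "slug_2" UNIQUE ("slug")
```

In practice you would save the filter script under some name (say fix_constraints.pl; the filename is hypothetical) and run it as perl fix_constraints.pl < tables.sql > tables_fixed.sql before feeding the result to psql.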
Licensed under: CC-BY-SA with attribution