Question

I'm using Perl with an externally supplied SQLite database to pre-generate about 9k pages for a statically served website.

I have some algorithmic improvements to the original data, but for consistency's sake I think it makes more sense for my scripts to leave the original database alone and modify the data only in memory (if needed, even the whole database should fit into RAM without any issues) before generating the web pages.

How do I do this?

I suspect I could benefit from some kind of SQL transaction feature here (to be abandoned after my scripts are done); is that correct? Does it sound like a good idea, given the constraint of keeping the original database file unmodified? (E.g., where and how is the data from ongoing transactions stored when using SQLite from Perl?) Is there any other way to accomplish what I need? (I'm a Perl and SQL novice, so I'm looking for annotated code samples.)

I guess as a last resort I could always make a copy of the database file within the filesystem, but that would seem like an ugly and inefficient solution.


Solution

I haven't tested this, but you should be able to use the SQLite backup API (exposed by DBD::SQLite) to load your database into memory. You'd open :memory: (or "" if you want a temporary, file-backed DB), and then call $dbh->sqlite_backup_from_file to load the database in one shot.

e.g. (untested)

use DBI;  # DBD::SQLite provides the SQLite driver and the backup methods

my $db = DBI->connect("dbi:SQLite:dbname=:memory:", '', '', { RaiseError => 1 });
$db->sqlite_backup_from_file($ENV{DATABASE});
# The database is now fully loaded into the in-memory handle;
# the original file on disk is left untouched.
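Since the question asked for annotated samples, here is a slightly fuller, self-contained sketch of the same idea. It is likewise untested against your data; the filename original.db and the pages table are placeholders invented for the demo (your real schema will differ), and it assumes a reasonably recent DBD::SQLite that provides sqlite_backup_from_file.

```perl
use strict;
use warnings;
use DBI;

# Demo setup: create a small on-disk "original" database to copy from.
# In the real script this file already exists and is never modified.
my $source = 'original.db';    # placeholder filename
unlink $source;                # make sure a stale demo file doesn't linger
{
    my $disk = DBI->connect("dbi:SQLite:dbname=$source", '', '',
                            { RaiseError => 1 });
    $disk->do('CREATE TABLE pages (id INTEGER PRIMARY KEY, title TEXT)');
    $disk->do(q{INSERT INTO pages (title) VALUES ('Home')});
    $disk->disconnect;
}

# Connect to a purely in-memory database ...
my $mem = DBI->connect('dbi:SQLite:dbname=:memory:', '', '',
                       { RaiseError => 1 });

# ... and pull the on-disk file into it with the backup API.
$mem->sqlite_backup_from_file($source);

# From here on, all changes touch only the RAM copy.
$mem->do(q{UPDATE pages SET title = 'Home (improved)' WHERE id = 1});

my ($title) = $mem->selectrow_array('SELECT title FROM pages WHERE id = 1');
print "in memory: $title\n";

# Reopening the file shows the data on disk is unchanged.
my $check = DBI->connect("dbi:SQLite:dbname=$source", '', '',
                         { RaiseError => 1 });
my ($orig) = $check->selectrow_array('SELECT title FROM pages WHERE id = 1');
print "on disk:   $orig\n";

$check->disconnect;
$mem->disconnect;
unlink $source;   # demo cleanup only
```

When the script exits (or the handle is disconnected), the in-memory database simply evaporates, which is exactly the "abandon everything at the end" behaviour you were hoping a transaction would give you.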

OTHER TIPS

You could create a RAM disk and put the database path on that. The details will depend on your OS.
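On Linux, for instance, /dev/shm is normally a tmpfs (RAM-backed) mount that needs no root, so a copy placed there never touches the disk. A minimal sketch, with original.db as a placeholder for the real file:

```shell
# Assumes Linux with /dev/shm mounted as tmpfs (the common default).
touch original.db                  # stands in for the existing database file
cp original.db /dev/shm/work.db    # this copy lives in RAM
# ... point the page-generating scripts at /dev/shm/work.db ...
rm /dev/shm/work.db                # the RAM copy simply disappears
rm original.db                     # demo cleanup only
```

This is effectively the "copy the file" fallback from the question, but since the copy lives in RAM it avoids most of the I/O cost, and the original file is only ever read.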

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow