Question

I access several tables remotely via a DB link. They are highly normalized and the data in each is effective-dated. Of the millions of records in each table, only a subset of roughly 50k are current records.

The tables are internally managed by a commercial product that will throw a huge fit if I add indexes or make alterations to its tables in any way.

What are my options for speeding up access to these tables?


Solution

You could try creating a materialized view of the subset of each table you need over the DB link, and then query against those local copies.
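
For example, a complete-refresh materialized view that pulls only the current rows across the link and refreshes itself periodically might look something like the sketch below. The table name remote_table, the link name remote_db, and the effective_end_date predicate are placeholders for whatever identifies a "current" record in your schema:

    CREATE MATERIALIZED VIEW current_rows_mv
      BUILD IMMEDIATE
      REFRESH COMPLETE
      START WITH SYSDATE NEXT SYSDATE + 1/24   -- refresh every hour
    AS
    SELECT *
      FROM remote_table@remote_db
     WHERE effective_end_date > SYSDATE;       -- placeholder "current record" filter

Queries then hit the local copy instead of crossing the network every time.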

OTHER TIPS

I think you're stuck between a rock and a hard place here, but in the past the following has worked for me:

You can pull down a snapshot of the current data at specified intervals, every hour or nightly or whatever works, and add your indexes to your own tables as needed. If you need real-time access to the data, then you can try pulling all the current records into a temp table and indexing it as needed.
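
A minimal sketch of that approach, again with hypothetical names for the remote table, the link, and the "current" filter:

    -- one-time (or interval) snapshot of just the current rows
    CREATE TABLE current_snapshot AS
    SELECT *
      FROM remote_table@remote_db
     WHERE effective_end_date > SYSDATE;    -- placeholder "current record" filter

    -- index the local copy however your queries need
    CREATE INDEX current_snapshot_ix1 ON current_snapshot (business_key);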

The extra overhead of copying from one database into your own may dwarf the actual benefit, but it's worth a shot.

You will need to look at the execution plans. You may be able to change the join order, add criteria, or provide hints to make the query faster, but without the explain plan you do not know why it is slow, so you don't even know if you can make it faster.
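
To get a plan for a query that crosses the link, something like the following works. The DRIVING_SITE hint in the last statement is often worth trying for DB-link queries because it asks Oracle to execute the join at the remote database instead of dragging millions of rows across the network (the table and column names here are placeholders):

    EXPLAIN PLAN FOR
    SELECT l.*, r.*
      FROM local_table l
      JOIN remote_table@remote_db r ON r.id = l.remote_id;

    SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);

    -- push the join to the remote site
    SELECT /*+ DRIVING_SITE(r) */ l.*, r.*
      FROM local_table l
      JOIN remote_table@remote_db r ON r.id = l.remote_id;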

Could you take a daily dump of the records you need into your own database / tables?
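
If a nightly copy is good enough, a scheduler job can refresh the snapshot table from the earlier sketch automatically. The object names and the 2 a.m. schedule are just placeholders:

    BEGIN
      DBMS_SCHEDULER.CREATE_JOB(
        job_name        => 'REFRESH_CURRENT_SNAPSHOT',
        job_type        => 'PLSQL_BLOCK',
        job_action      => 'BEGIN
                              EXECUTE IMMEDIATE ''TRUNCATE TABLE current_snapshot'';
                              INSERT INTO current_snapshot
                                SELECT * FROM remote_table@remote_db
                                 WHERE effective_end_date > SYSDATE;
                              COMMIT;
                            END;',
        start_date      => SYSTIMESTAMP,
        repeat_interval => 'FREQ=DAILY; BYHOUR=2',   -- nightly at 2 a.m.
        enabled         => TRUE);
    END;
    /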

Archive data that's no longer current. (Or if that's not acceptable, data that exceeds some staleness threshold suitable for your requirements.)

What about creating a materialized/indexed view? That might help a bit.
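
One advantage of the materialized view route is that the view lives in your own schema, so you can index it freely without touching the vendor's tables. For example, assuming the hypothetical current_rows_mv from the first answer:

    CREATE INDEX current_rows_mv_ix1 ON current_rows_mv (business_key);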
