Question

I have a large database with hundreds of functions and tables. It also has a lot of sequences, custom types, and some views and triggers.

Is there a point at which the sheer quantity of objects becomes a problem, affecting, for example, parsing?

Have you ever noticed any impact for these reasons?

Solution

It's a gradual process, and the effect is comparatively small. But of course, looking up entries in the system catalogs gets slower with lots of rows. Those are just regular tables, highly optimized but regular, and there are indexes to keep the effect small.
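As a quick illustration (a minimal sketch assuming PostgreSQL, since those are PostgreSQL catalog names), you can count the catalog entries and check their on-disk size yourself:

    -- Row counts in a few system catalogs; these grow with the number
    -- of objects in the database:
    SELECT (SELECT count(*) FROM pg_class)   AS relations,  -- tables, indexes, views, sequences
           (SELECT count(*) FROM pg_proc)    AS functions,
           (SELECT count(*) FROM pg_type)    AS types,
           (SELECT count(*) FROM pg_trigger) AS triggers;

    -- The catalogs are ordinary tables in the pg_catalog schema:
    SELECT c.relname, pg_size_pretty(pg_total_relation_size(c.oid)) AS total_size
    FROM   pg_class c
    JOIN   pg_namespace n ON n.oid = c.relnamespace
    WHERE  n.nspname = 'pg_catalog'
    AND    c.relkind = 'r'  -- ordinary tables only
    ORDER  BY pg_total_relation_size(c.oid) DESC
    LIMIT  5;

Since catalog lookups go through B-tree indexes, the cost grows roughly logarithmically with the number of objects, which is why the effect stays small even at hundreds of tables.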

I have never actually noticed an effect in my biggest database, with a couple of hundred tables and close to 1,000 functions, plus sequences, views, triggers ... everything. Your DB seems to have similar numbers. I'd guess you cannot report any observations yourself either, or you would surely have added them to your question.

Obviously, the cost of other operations like backup and VACUUM rises, too.
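To make that concrete (again a sketch assuming PostgreSQL): pg_dump has to read catalog metadata for every single object, so dump time grows with the object count, and autovacuum has to maintain the catalogs like any other table:

    -- The catalogs need VACUUM like any regular table; autovacuum covers
    -- them automatically, but a superuser can also run it manually:
    VACUUM (VERBOSE) pg_catalog.pg_attribute;

    -- pg_attribute is usually the catalog that grows the most, since it
    -- holds one row per column of every relation in the database.

pg_attribute is just a representative example here; any catalog can be vacuumed the same way.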
