Question

I have a relatively wide table (50 columns). Of those, the first 5 represent the scope of a certain subset of data and the last 45 represent properties (mapped at deployment time, so static). The users have the possibility (through a query builder) to query against such table and any combination of its property columns.

I am now wondering what would be the best strategy to build indexes for this table. Performance-wise, would it be better to create a large number of "small" indexes or a small number of indexes spanning many columns?


Solution

If the table is not heavily updated (few inserts, updates, and deletes), the best strategy is to create a bitmap index on each property column. Bitmap indexes are small, and the optimizer can combine several of them efficiently to satisfy arbitrary combinations of WHERE conditions.
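A minimal sketch of this approach, assuming an Oracle database and hypothetical table/column names (`wide_table`, `prop_01`, `prop_02`):

```sql
-- One small bitmap index per property column (hypothetical names).
CREATE BITMAP INDEX idx_prop_01 ON wide_table (prop_01);
CREATE BITMAP INDEX idx_prop_02 ON wide_table (prop_02);

-- The optimizer can merge the bitmaps (BITMAP AND / BITMAP OR)
-- for any combination of conditions the query builder produces:
SELECT *
  FROM wide_table
 WHERE prop_01 = 'A'
   AND prop_02 = 'B';
```

Because each index covers a single column, you do not need to predict which column combinations users will filter on.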

If the table is updated frequently, bitmap indexes become expensive to maintain and can cause locking problems, so you should instead create normal (B-tree) indexes on the columns that are queried most often (after analyzing the actual query workload).
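A sketch of the B-tree alternative, again with hypothetical names; which columns to index should come from your workload analysis:

```sql
-- Single-column B-tree index on a frequently filtered column.
CREATE INDEX idx_prop_03 ON wide_table (prop_03);

-- A composite index can help when two columns are usually
-- filtered together; put the more selective column first.
CREATE INDEX idx_prop_04_05 ON wide_table (prop_04, prop_05);
```

Unlike bitmap indexes, B-tree indexes handle concurrent DML well, but each one only serves queries that filter on its leading column(s), so you trade flexibility for update performance.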

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow