Question

We have a table that we denormalized because there is a big risk that joins would be too slow for the amount of data our users have. So we created 10 info columns (INFO0, INFO1, ... INFO9). Most of the time only the first 2-3 columns are used; the others are NULL.

But now we need to add two more types of info with 10 columns each (20 new columns in total). The tricky part is that our design makes it impossible for users to use all 30 denormalized columns: at any time they can use at most 10 on each row. Moreover, we may need to add even more denormalized columns later, but we will never be able to use more than 10 on each row. For illustration, the layout would look roughly like this (the table name, the names of the new column groups, and the data types are placeholders, not our final design):
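    CREATE TABLE UserData (
        ID      INT          PRIMARY KEY,
        -- existing group; mostly NULL past the first 2-3 columns
        INFO0   VARCHAR(100) NULL,
        INFO1   VARCHAR(100) NULL,
        -- ... INFO2 through INFO8 ...
        INFO9   VARCHAR(100) NULL,
        -- first new group of 10 (hypothetical names)
        INFOA0  VARCHAR(100) NULL,
        -- ... INFOA1 through INFOA9 ...
        -- second new group of 10 (hypothetical names)
        INFOB0  VARCHAR(100) NULL
        -- ... INFOB1 through INFOB9; at most 10 of the 30
        -- columns are ever populated on any given row
    );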

I know it is not a good design, but we don't really have a choice. So my question is: can this design become inefficient? Can having a lot of columns with NULL values slow down my queries? If so, can it become a big deal?


Solution

Yes, it could. You don't say what database you're using or what data types the extra columns are, but adding more columns increases the 'width' of your table, which means more logical reads are needed to retrieve the same number of records, and more reads mean slower queries. So what you gain by denormalisation may eventually be lost by adding too many columns; the extent of this will depend on your database design.
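You can measure this directly rather than guess. A minimal sketch, assuming SQL Server (since you don't say which database) and the hypothetical UserData table from your question: run the same query before and after widening the table and compare the logical reads.

    -- "logical reads" in the Messages output counts 8 KB pages
    -- touched; a wider table means more pages for the same rows.
    SET STATISTICS IO ON;

    SELECT ID, INFO0, INFO1
    FROM UserData;

    SET STATISTICS IO OFF;

Other databases expose the same information differently (for example, EXPLAIN output), but the principle is identical: fewer pages per row read means faster scans.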

If it does affect performance, an intermediate solution could be to vertically split the table, placing infrequently referenced columns in a second table.
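A minimal sketch of that vertical split, reusing the hypothetical names from above in place of the single wide table: the frequently used columns stay in a narrow main table, and the rarely used ones move to a one-to-one side table that is joined only when needed.

    -- Hot columns stay in a narrow main table
    CREATE TABLE UserData (
        ID     INT          PRIMARY KEY,
        INFO0  VARCHAR(100) NULL,
        INFO1  VARCHAR(100) NULL,
        INFO2  VARCHAR(100) NULL
    );

    -- Infrequently referenced columns move to a one-to-one side table
    CREATE TABLE UserDataExtra (
        ID     INT          PRIMARY KEY REFERENCES UserData (ID),
        INFO3  VARCHAR(100) NULL,
        -- ... INFO4 through INFO8 ...
        INFO9  VARCHAR(100) NULL
    );

    -- Only the queries that need the extra columns pay for the join
    SELECT d.ID, d.INFO0, e.INFO9
    FROM UserData AS d
    LEFT JOIN UserDataExtra AS e ON e.ID = d.ID;

The common case (reading only the first few columns) then scans a much narrower table, while the occasional query that needs the rest pays for one extra join.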
