Subject Re: [firebird-support] Normalisation + performance
Author Ann W. Harrison
vishy anand wrote:
>
> I know that one should go for normalization.

Pragmatically, I wouldn't bother with degrees of normalization over 4.

> But how does it affect performance?

Unfortunately, that's very much application specific. If you have a
data warehousing application in which data is stored (sometimes) and
selected (often) but never updated, then you can manage a level of data
duplication that would be a problem in an update intensive application.

As a general thing, selecting a lot of fields from a single table is
more efficient than selecting a few fields from each of a lot of
tables. On the other hand, updating a few fields in a small table is
much more efficient than updating a few fields in each of a lot of
large records.
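
To make that concrete, here is a rough sketch with invented table and
column names. The flat version answers the select from one table but
rewrites every order row when the customer's address changes; the
normalized version needs a join to read but updates one small row:

   /* denormalized: one wide read, but a wide update */
   SELECT order_id, order_date, cust_name, cust_address
     FROM orders_flat
     WHERE cust_name = 'Smith';

   UPDATE orders_flat SET cust_address = '12 New Street'
     WHERE cust_name = 'Smith';  /* touches every order row */

   /* normalized: a join to read, one small row to update */
   SELECT o.order_id, o.order_date, c.cust_name, c.cust_address
     FROM orders o JOIN customers c ON c.cust_id = o.cust_id
     WHERE c.cust_name = 'Smith';

   UPDATE customers SET cust_address = '12 New Street'
     WHERE cust_name = 'Smith';  /* one row changes */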

> How can I test performance of a normalized table vs
> denormalised table?

Probably by simulation...
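
One low-tech way, assuming you build both variants (like the
orders_flat and orders/customers sketch above) and load them with
comparable data, is to run the same statements against each in isql
with statistics enabled and compare the reported reads, writes,
fetches and elapsed times:

   SET STATS ON;
   SET PLAN ON;

   SELECT COUNT(*) FROM orders_flat WHERE cust_name = 'Smith';

   SELECT COUNT(*)
     FROM orders o JOIN customers c ON c.cust_id = o.cust_id
     WHERE c.cust_name = 'Smith';

   UPDATE orders_flat SET cust_address = '12 New Street'
     WHERE cust_name = 'Smith';
   ROLLBACK;

   UPDATE customers SET cust_address = '12 New Street'
     WHERE cust_name = 'Smith';
   ROLLBACK;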

> How is a database affected... physically?

Some kinds of unnormalized data will lead to long duplicate chains in
indexes, which have a bad effect on garbage collection. Again, that's
a problem for update intensive applications but much less of a problem
for those that only insert and select.
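
For instance (still with invented names), if the customer's city is
repeated on every order row and indexed, the index holds one long
chain of identical keys per city, and garbage collection has to walk
that chain. Indexing the nearly unique value in a lookup table avoids
the long chains:

   /* denormalized: one long duplicate chain per city */
   CREATE INDEX idx_flat_city ON orders_flat (cust_city);

   /* normalized: the descriptive value lives once in a lookup
      table, indexed on an (almost) unique key */
   CREATE TABLE cities (
     city_id   INTEGER NOT NULL PRIMARY KEY,
     city_name VARCHAR(60) NOT NULL
   );
   CREATE UNIQUE INDEX idx_city_name ON cities (city_name);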

> Why having columns which can have "null" in many rows should be avoided?
> thanks

As above... Lots of duplicates in an index. And a general feeling of
not having bathed in a long time...
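
A common case (names invented again): an optional fax number that is
null on most rows. Indexed in place, the index is mostly one long
chain of NULL keys; moved to its own table, the value is stored only
where it actually exists:

   /* mostly-null column: the index is largely one chain of NULLs */
   CREATE INDEX idx_cust_fax ON customers (fax_number);

   /* normalized alternative: one row per customer that has a fax */
   CREATE TABLE customer_fax (
     cust_id    INTEGER NOT NULL PRIMARY KEY,
     fax_number VARCHAR(30) NOT NULL
   );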

Regards,


Ann