Subject: Changes on live database
Author: Christian Kaufmann

I have a database whose biggest table holds about 3 million records,
growing by 10'000 records a week.

The database is queried live on a webserver. Currently I always work
on a copy where I make the changes. Then I rebuild part of a redundant
information table, copy the database file to the webserver and change
the alias to point to the updated database.

During the rebuild of the redundant table I deactivate the 5 indexes on
that table for better speed.
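In Firebird, deactivating and later reactivating an index is done with
ALTER INDEX; a minimal sketch (the index name IDX_REDUNDANT_KEY is a
placeholder, not from my actual schema):

```sql
-- Deactivate the index before the bulk rebuild (placeholder name)
ALTER INDEX IDX_REDUNDANT_KEY INACTIVE;

-- ... run the delete/insert rebuild here ...

-- Reactivate the index; Firebird rebuilds it from scratch
ALTER INDEX IDX_REDUNDANT_KEY ACTIVE;
```

Note this only works for indexes created explicitly with CREATE INDEX,
not for the indexes that back primary or foreign key constraints.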

In the future I want to make the changes on the live database on the
webserver. This means I cannot deactivate the indexes during the
rebuild of the redundant table, because this would slow down certain web
queries.

When I run the rebuild in a background task while keeping the indexes
active, it is dramatically slower (about 5 - 10 times). Of course,
currently I do everything locally on my development computer, but my
question is how this task will scale in the future.

I checked the indexes for duplicates: about 66% duplicates; the maximum
duplicate count per index ranges from 25 to 500, and one index has 15'000.

The rebuild normally consists of running a task about 50 - 100 times;
each run deletes 10'000 records from the table and then inserts the
same number again.
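One such batch could look roughly like this (table and column names are
placeholders I made up for illustration, not from my actual schema):

```sql
-- One rebuild batch: replace the redundant rows for one group
DELETE FROM REDUNDANT_INFO
WHERE GROUP_ID = :group_id;

INSERT INTO REDUNDANT_INFO (GROUP_ID, ITEM_ID, SOME_VALUE)
SELECT GROUP_ID, ITEM_ID, SOME_VALUE
FROM SOURCE_TABLE
WHERE GROUP_ID = :group_id;
```

With the 5 indexes active, every deleted and inserted row also has to
update each index, which presumably explains much of the slowdown.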

Probably the index with a lot of duplicates causes the problem, and this
should be fixed in FB 2.0? Or is there something else that I should
change? Would having the database server and the webserver on different
machines help to improve speed? Or is it not worth it, because it would
no longer be local access, but go through the network?

cu Christian