Subject: Problem with deleting many records in a big database
Author: Josef Gschwendtner
We have one specific problem: we have to insert a huge amount of data
(500,000 records a day) into one table of a Firebird 1.03 database. The
whole database has about 60 other tables with significantly less data.
The data in this special table is needed for only 30 days, so we try to
delete older data periodically; otherwise the database grows too large.
The system is essentially a 24/7 system.
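A common workaround for this kind of huge one-shot delete (a hedged sketch, not Firebird-specific advice) is to delete the old rows in small batches, committing after each batch, so the cleanup work per transaction stays bounded instead of piling up into one multi-hour garbage-collection pass. The sketch below uses Python's built-in sqlite3 module only so it is self-contained and runnable; the table name `log_data`, the column `created_at`, and the batch size are invented for illustration.

```python
import sqlite3
import datetime

# Hypothetical demo: delete rows older than 30 days in small batches,
# committing after each batch. Table/column names are made up.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE log_data (id INTEGER PRIMARY KEY, created_at TEXT)")

# Seed sample rows: 5 old (40 days back), 5 recent.
now = datetime.datetime(2004, 6, 1)
old = (now - datetime.timedelta(days=40)).isoformat()
recent = now.isoformat()
conn.executemany("INSERT INTO log_data (created_at) VALUES (?)",
                 [(old,)] * 5 + [(recent,)] * 5)
conn.commit()

cutoff = (now - datetime.timedelta(days=30)).isoformat()
BATCH = 2  # tiny for the demo; a real system might use e.g. 10000

while True:
    cur = conn.execute(
        "DELETE FROM log_data WHERE rowid IN "
        "(SELECT rowid FROM log_data WHERE created_at < ? LIMIT ?)",
        (cutoff, BATCH))
    conn.commit()  # frequent commits keep each cleanup step small
    if cur.rowcount == 0:
        break  # nothing old left to delete

remaining = conn.execute("SELECT COUNT(*) FROM log_data").fetchone()[0]
print(remaining)  # the 5 recent rows survive
```

Whether batching actually helps depends on how the engine reclaims old record versions; the point of the pattern is simply to avoid a single transaction that deletes hundreds of thousands of rows at once.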

Our problem is that the garbage collection (GC) of the deleted records
takes too long (several hours), and new users can't connect to the
system while the GC is running.

We found out that the time the GC needs grows more than proportionally
with the size of the database (or of the table?).

We have to use Firebird 1.03 because the system has to run on RedHat.
At the moment we use Firebird SuperServer, because Classic didn't seem
to improve the situation (with Classic the delete itself took much
longer, although Classic did not block the login of new users).

Does anybody have any ideas on how to improve the situation?

Is it "normal" that the GC takes more than 10 hours after deleting
400,000 records (out of 10 million) on a P4 3 GHz system with no other
job to do? (It only takes 5 minutes if 400,000 out of 900,000 records
are deleted.)

Any help is much appreciated,
Josef Gschwendtner