Subject | Re: [firebird-support] Problem with deleting many records in a big database
---|---
Author | Ann W. Harrison
Post date | 2005-04-09T15:51:24Z
Josef Gschwendtner wrote:
> Our problem is that the garbage collection (GC) deleting the records
> takes too long (several hours), and new users can't connect to the
> system for as long as the GC runs.

Check that table for non-selective indexes. Specifically, use gstat -a
to get the index distribution information, and search the output for
indexes with a max_dup value greater than ten thousand. Make such
indexes more selective by appending the record's primary key to the key
for the index. Removing entries from long duplicate chains is a serious
bottleneck - one that Firebird2 addresses.
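The scan Ann describes can be automated. Below is a minimal sketch that filters gstat -a output for indexes whose longest duplicate chain ("max dup") exceeds ten thousand. The sample text and index names are assumptions for illustration; the exact layout of gstat output varies by Firebird version, so the regular expressions may need adjusting against real output.

```python
import re

# Assumed (hypothetical) shape of "gstat -a" index statistics output.
SAMPLE_GSTAT_OUTPUT = """\
ORDERS (128)
    Index IDX_CUSTOMER (1)
        Depth: 2, leaf buckets: 300, nodes: 500000
        Total dup: 480000, max dup: 65000
    Index PK_ORDERS (0)
        Depth: 2, leaf buckets: 310, nodes: 500000
        Total dup: 0, max dup: 0
"""

def poorly_selective_indexes(gstat_text, threshold=10000):
    """Return (index_name, max_dup) pairs whose max dup exceeds threshold."""
    results = []
    current_index = None
    for line in gstat_text.splitlines():
        name = re.match(r"\s*Index (\S+)", line)
        if name:
            current_index = name.group(1)
        dup = re.search(r"max dup: (\d+)", line)
        if dup and current_index and int(dup.group(1)) > threshold:
            results.append((current_index, int(dup.group(1))))
    return results

print(poorly_selective_indexes(SAMPLE_GSTAT_OUTPUT))
# [('IDX_CUSTOMER', 65000)]
```

For an index flagged this way, the fix Ann suggests is to append the table's primary key to the index key so every entry becomes unique, e.g. something like `CREATE INDEX IDX_CUSTOMER2 ON ORDERS (CUST_ID, ORDER_ID)` (names hypothetical), then drop the old index.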
Regards,
Ann