Subject: Re: Massive INSERT test
Author: Aage Johansen
I set the database with forced writes off and a local connection; the
page size of 4096 matches my NTFS cluster size.

ForcedWrites=OFF is OK if you can simply run the complete batch again from
the beginning (starting with a fresh database). This may be fine with a new,
empty database, as any corruption can be mended by starting again from square
one. As your db will get BIG, maybe a bigger page size is appropriate.
How many buffer pages do you use? (Maybe not very important when just
inserting.)
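For reference, these settings are normally adjusted with the standard Firebird command-line tools; the paths, credentials, and sizes below are only placeholders (a sketch, assuming the usual gfix/gbak switches):

```
rem Turn forced writes off (async) or back on (sync):
gfix -write async C:\data\test.fdb -user SYSDBA -password masterkey

rem Set the number of buffer pages (page cache) for this database:
gfix -buffers 2048 C:\data\test.fdb -user SYSDBA -password masterkey

rem The page size can only be changed via backup and restore:
gbak -b C:\data\test.fdb C:\data\test.fbk -user SYSDBA -password masterkey
gbak -c -p 8192 C:\data\test.fbk C:\data\test.fdb -user SYSDBA -password masterkey
```

With a page size of 8192 the on-disk database matches two NTFS clusters per page, which tends to suit big tables and wide indexes better than 4096.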

Adding 10,000 records took 34 seconds at first.
After emptying the table it took 50 seconds. I read somewhere that
it was more efficient for Firebird to reuse allocated disk space than to
allocate new space. I cannot confirm that with my setup.

Did you do garbage collection between the DELETE and the new INSERTs?
After the DELETE you should have committed; a SELECT COUNT(*) will then
clean up any debris.
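In isql terms, the sequence described above might look like this (a sketch; MYTABLE is a placeholder name):

```sql
DELETE FROM MYTABLE;
COMMIT;

-- A full scan after the commit lets the cooperative garbage
-- collector sweep out the deleted record versions before the
-- new inserts start:
SELECT COUNT(*) FROM MYTABLE;
```

Without this step, the first transactions of the new INSERT run may end up paying the cost of collecting the old record versions themselves.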

Also, do mind the comments (from others) on committing.
If you're using Delphi+IBO, use prepared DSQL.
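The point of prepared DSQL is that the statement is parsed and compiled once and then executed many times with fresh parameter values. The statement itself would look roughly like this (table and column names are placeholders, not from the original post):

```sql
INSERT INTO MYTABLE (ID, NAME, AMOUNT) VALUES (?, ?, ?)
```

Prepare it once before the loop, bind new values for each row, and commit every few thousand rows rather than once per row.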

You have 3 indexes. Inserts will be fast, but if the selectivity is low
you may be "killed" by long response times for updates and deletes. So,
with low selectivity you should consider adding the PK column to those
indexes (making them in fact unique).
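For example, assuming a low-selectivity STATUS column and an ID primary key (both names hypothetical), the suggestion amounts to replacing the single-column index with a composite one:

```sql
-- Instead of:  CREATE INDEX IDX_STATUS ON MYTABLE (STATUS);
CREATE UNIQUE INDEX IDX_STATUS_ID ON MYTABLE (STATUS, ID);
```

Appending the unique PK column keeps the duplicate chains in the index short, which is what makes updates and deletes cheap again while still letting the optimizer use the leading STATUS column.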

There may still be some speed problems with Fb 1.5 beta 2 (mentioned in the
firebird-devel newsgroup).

Aage J.