Subject: Poor query performance with large tables - Please help
Author: arhampshire
Hi,

I'm developing a database that has some relatively large tables of
around 9 GB. To start with, I have tried the default installation of
Firebird 1.5 Classic Server on Windows XP. The database file is
located on a separate disk.

The table in question consists of columns A, B, C and D and has
about 120,000,000 rows. The column types are:
A Numeric(18,0)
B Date
C Numeric(18,0)
D Numeric(9,5)
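
In DDL terms the table is essentially this (real column names
replaced with A-D, as above):

create table myTable (
    A numeric(18,0),
    B date,
    C numeric(18,0),
    D numeric(9,5)
);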

I created a composite index on A, B and C.
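The DDL for it is essentially this (the index name is invented for
this post):

create index ix_mytable_abc on myTable (A, B, C);

If I execute the following query: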

select A, min(B), max(B), count(*)
from myTable
where A = 10
group by A;

the query's execution time varies quite widely, from about 3 seconds
upwards.
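
I have been timing it in isql with plan and statistics output turned
on, in case the plan turns out to be relevant:

set plan on;
set stats on;

select A, min(B), max(B), count(*)
from myTable
where A = 10
group by A;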

Are there any tips or pointers for improving the performance of
queries like this?

Some thoughts I have had, in case anyone has comments on them:
1. Does the Numeric datatype have performance implications?
2. Is it better to have several separate single-column indexes for
queries like mine (e.g. an index on column A alone)? A sketch of
what I mean follows this list.
3. Are there engine settings or temp-file locations worth tuning?
(The firebird.conf entries I mean are also sketched after the list.)
4. Is SuperServer better than Classic Server for large databases?
5. Do I need to split the table into smaller chunks?
6. Is Firebird capable of handling these volumes with fast query
times on Windows XP?
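
For question 2, the kind of extra index I mean is simply this (name
again invented for the post):

create index ix_mytable_a on myTable (A);

For question 3, the firebird.conf entries I have been looking at are
along these lines (the values are only examples I am experimenting
with, not recommendations):

# pages cached per connection (Classic Server)
DefaultDbCachePages = 2048
# put sort/temp files on a fast disk
TempDirectories = D:\fbtemp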

Thanks very much in advance.