Subject: Firebird and huge data
Author: Christian Stengel
Hi *,

I have to build a reporting engine for a big company. I get data from 15
different sources and have to create reports on a daily and monthly basis
(plus some reports on the fly - but not on the huge tables).

The bigger input files (CSV data) are:

40.000.000 lines, 6 GB in size,
10.000.000 lines, 2 GB in size (and a few more).

The total is about 15 GB and 80.000.000 rows per day (but it may grow).

The data is delivered on a daily basis - so I get 40 million records per
day. I have to keep this data in the DB for at least 30 days for reporting,
and after that it can be aggregated into a history table.

So I will import each file into a single table :-)

The biggest thing I have done so far was 15 million rows per table in
Firebird, and that worked like a charm (less than 1 hour to import - and
this was not the only table) - even then I had to deactivate the indices
for the import, because they got out of bounds when loading so much data
:-) (and rebuilding them afterwards took about the same time :-) ).
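Roughly what I have in mind for toggling the indices this time (just a
sketch - the index name is a placeholder and error handling is minimal):

#include <stdio.h>
#include <ibase.h>

/* Switch one index on or off around the bulk import.
   "idx_name" is a placeholder - the real names come from the schema. */
static int set_index_active(isc_db_handle *db, const char *idx_name, int active)
{
    ISC_STATUS    status[20];
    isc_tr_handle trans = 0;
    char          sql[128];

    snprintf(sql, sizeof(sql), "ALTER INDEX %s %s",
             idx_name, active ? "ACTIVE" : "INACTIVE");

    if (isc_start_transaction(status, &trans, 1, db, 0, NULL)) {
        isc_print_status(status);
        return -1;
    }
    if (isc_dsql_execute_immediate(status, db, &trans, 0, sql,
                                   SQL_DIALECT_V6, NULL)) {
        isc_print_status(status);
        isc_rollback_transaction(status, &trans);
        return -1;
    }
    if (isc_commit_transaction(status, &trans)) {
        isc_print_status(status);
        return -1;
    }
    return 0;
}

The plan would be to call this with active = 0 for every index on the
target table before the load and with active = 1 afterwards, so each index
is rebuilt in one pass instead of being updated row by row. As far as I
know this only works for regular indices, not for the ones backing
primary/foreign key constraints.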

Has anyone done such a thing with Firebird yet? Would you recommend
Firebird 1.5, 2, or Vulcan for that?

Do you have any hints (suggestions) on what to avoid and where the traps
are? They want me to do this with SQL Server or something else (Firebird
was not mentioned, but ... :-) ).

I'm thinking of doing the import with Classic on Linux, using the C API
and a local connection.
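
Something like this for the local attach (again just a sketch - the
database path and credentials are placeholders):

#include <stdio.h>
#include <string.h>
#include <ibase.h>

int main(void)
{
    ISC_STATUS    status[20];
    isc_db_handle db = 0;
    char          dpb[256];
    short         dpb_len = 0;
    const char   *user = "SYSDBA";       /* placeholder credentials */
    const char   *pass = "masterkey";
    /* no hostname prefix -> local connect (what I want with Classic) */
    static char   db_name[] = "/data/reporting/reports.fdb";  /* placeholder */

    /* build the database parameter buffer with user name and password */
    dpb[dpb_len++] = isc_dpb_version1;
    dpb[dpb_len++] = isc_dpb_user_name;
    dpb[dpb_len++] = (char) strlen(user);
    memcpy(dpb + dpb_len, user, strlen(user));
    dpb_len += (short) strlen(user);
    dpb[dpb_len++] = isc_dpb_password;
    dpb[dpb_len++] = (char) strlen(pass);
    memcpy(dpb + dpb_len, pass, strlen(pass));
    dpb_len += (short) strlen(pass);

    if (isc_attach_database(status, 0, db_name, &db, dpb_len, dpb)) {
        isc_print_status(status);
        return 1;
    }

    /* ... deactivate the indices, run the bulk inserts in batches with
       periodic commits, reactivate the indices ... */

    if (isc_detach_database(status, &db)) {
        isc_print_status(status);
        return 1;
    }
    return 0;
}

As far as I understand, leaving the hostname off the connect string makes
the Classic client library attach through the local path instead of TCP,
which is what I mean by connecting locally.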

Thanks,

Chris