Subject: Re: [ib-support] FAST DATA PUMPING - BEST PRACTICE
Author: Marco Krause
Post date: 2003-04-03T20:24:18Z
> Is there a faster way? Would parameter substitution help?

IMHO the fastest way is to use external tables in Firebird for
very large datasets.
Example:
- create an external table in Firebird
- load all records from the AS/400 and store them directly in the
  ASCII file (without using Firebird!)
- pump all records with a single SQL statement from the external table
  into the final table
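A sketch of what the external table declaration could look like (file path and the `_EXT` name are just examples; in an external table all text columns must be fixed-length CHAR so that the record layout matches the flat file byte for byte, and depending on your server version you may need to allow the directory via the ExternalFileAccess setting in firebird.conf):

```sql
-- external table mapped onto a fixed-length ASCII file (example path)
CREATE TABLE MATCHCODE_EXT EXTERNAL FILE 'C:\pump\matchcode.dat' (
  HERKUNFT  CHAR(1)  NOT NULL,
  MATCHCODE CHAR(40) NOT NULL,  -- fixed-length CHAR, not VARCHAR
  LOOKUPKEY CHAR(22) NOT NULL,
  MATCHART  CHAR(2)  NOT NULL,
  CRLF      CHAR(2)             -- only if each record ends with CR/LF
);
```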
Running a quick test on my production server (W2K, FB 1.5b4, 2x 1 GHz P3, 1 GB RAM, RAID-1)
shows the following results:

- exporting the table "matchcode" (1 million records) into an external file -> 51 s
- re-importing the table from the external file -> 4 m 37 s
The metadata looks like:
CREATE TABLE MATCHCODE (
HERKUNFT CHAR(1) NOT NULL,
MATCHCODE VARCHAR(40) NOT NULL,
LOOKUPKEY VARCHAR(22) NOT NULL,
MATCHART CHAR(2) NOT NULL
);
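Assuming an external table MATCHCODE_EXT with the same columns declared as fixed-length CHAR, the export and re-import in the test above each come down to one statement:

```sql
-- export: write all rows into the flat file behind the external table
INSERT INTO MATCHCODE_EXT (HERKUNFT, MATCHCODE, LOOKUPKEY, MATCHART)
SELECT HERKUNFT, MATCHCODE, LOOKUPKEY, MATCHART
FROM MATCHCODE;

-- import: pump all rows from the external table into the final table
INSERT INTO MATCHCODE (HERKUNFT, MATCHCODE, LOOKUPKEY, MATCHART)
SELECT HERKUNFT, MATCHCODE, LOOKUPKEY, MATCHART
FROM MATCHCODE_EXT;
```

When the file is produced by another system (like the AS/400 here), you skip the export statement entirely and just write the fixed-length records directly.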
I don't know if you can use it, but this is a VERY FAST way to pump data
into tables. But remember: external files are not under transaction control,
BLOB fields are not supported, and there are a few other restrictions; look into
the Firebird docs for more information...
cu
--
marco krause