Subject Re: [ib-support] FAST DATA PUMPING - BEST PRACTICE
Author Lucas Franzen
Milan Babuskov wrote:
>
> gorepj wrote:
> > Is there a faster way? Would parameter substitution help?
>
> It definitely would. Create a statement with parameters and prepare it
> before entering the loop. Inside the loop, fill in the parameter values
> and execute the statement. Don't commit after each record; instead,
> commit for example every 10,000th record. (Don't forget to commit after
> the last record.)

I second that.
I import around 1.4 million records into approx. 10 tables that way
within 10 to 15 minutes, plus another 15 minutes for creating the indices
;-))
(locally, 1.2GHz, 256 MB RAM - enough for a work station - not enough
for a play station ;-))
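
To make that concrete, here's a rough sketch of the loop in Python with
the fdb Firebird driver (the same pattern applies to IBX/IBO in Delphi or
whatever access layer you use). Connection details, table and column
names are invented for illustration, and read_source_records() stands in
for whatever reads your source data:

    import fdb

    con = fdb.connect(dsn='localhost:/data/target.fdb',
                      user='SYSDBA', password='masterkey')
    cur = con.cursor()

    # Prepare once, before the loop; the '?' placeholders are positional,
    # so values are bound by index rather than by name.
    ins = cur.prep("INSERT INTO IMPORTED (ID, NAME, AMOUNT) VALUES (?, ?, ?)")

    for i, rec in enumerate(read_source_records(), start=1):
        cur.execute(ins, (rec.id, rec.name, rec.amount))
        if i % 10000 == 0:    # commit in batches, not per record
            con.commit()
    con.commit()              # commit whatever is left after the last record
    con.close()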

And in addition:

- if you use parameters, access them by index (params[0],
  params[1]...), not by name (as in the sketch above)

- try not to call the generator for every single insert.
  If you know in advance how many records will be imported,
  reserve the whole range in one call with
  GEN_ID (<GENERATOR>, <number_of_records>)
  and assign the IDs yourself; otherwise, if you commit every
  10,000 inserts (I do that, too), advance the generator in
  steps of 10,000 (see the generator sketch after this list)

- if possible, deactivate all indices before the import and
  re-activate them afterwards (sketch below)

- try to import the data without insert triggers
  (they can be deactivated too, see the sketch below)

- or use external tables, as Marco told you (sketch below).
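
On the generator point: the idea is to reserve the whole ID range with a
single GEN_ID call and hand out the numbers locally, instead of one round
trip per row. A sketch, reusing the cursor from above (the generator and
table names are invented):

    n = 1400000  # number of records, known before the import starts
    cur.execute("SELECT GEN_ID(GEN_IMPORTED_ID, %d) FROM RDB$DATABASE" % n)
    end_id = cur.fetchone()[0]
    first_id = end_id - n + 1
    # assign first_id, first_id + 1, ... yourself while inserting;
    # the generator already sits past the reserved block, so concurrent
    # connections cannot collide with your numbers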
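
Deactivating indices and insert triggers is plain DDL; again a sketch
with invented names. Note that indices backing primary key or unique
constraints cannot be switched off this way:

    cur.execute("ALTER INDEX IDX_IMPORTED_NAME INACTIVE")
    cur.execute("ALTER TRIGGER TRG_IMPORTED_BI INACTIVE")
    con.commit()
    # ... run the bulk insert here ...
    cur.execute("ALTER INDEX IDX_IMPORTED_NAME ACTIVE")  # rebuilds the index
    cur.execute("ALTER TRIGGER TRG_IMPORTED_BI ACTIVE")
    con.commit()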
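
And for the external-table route Marco mentioned: you map a fixed-length
text file to a table and pump it across with a single INSERT ... SELECT.
File path, record layout and names are again invented, and the server
must be configured to allow access to the file:

    cur.execute("""
        CREATE TABLE IMPORT_EXT EXTERNAL FILE '/data/import.txt' (
            ID   CHAR(10),
            NAME CHAR(40),
            EOL  CHAR(1)  /* absorbs the line terminator of a text file */
        )""")
    con.commit()
    cur.execute("INSERT INTO IMPORTED (ID, NAME) "
                "SELECT CAST(ID AS INTEGER), NAME FROM IMPORT_EXT")
    con.commit()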

This should help you to speed up your import.
Let us know.

Regards

Luc.