Subject Re: Re[2]: [firebird-support] Re: Handling large imports while system is in production
Author W O
Thank you Dmitry, but it seems to me that you think I am using Delphi or
something similar, and that is not the case.

But your advice is very good, thanks again.

Greetings.

Walter.

On Wed, Feb 6, 2013 at 4:40 PM, Dmitry Kuzmenko <kdv@...> wrote:

>
>
> Hello, W!
>
> Wednesday, February 6, 2013, 11:03:13 PM, you wrote:
>
> WO> THAT'S INCREDIBLE!!!!!!!
>
> WO> Importing 3.600.000 records row by row takes about 14 hours.
> WO> Importing 1.160.000 records using an external table takes seconds,
> WO> less than a minute.
>
> WO> Thanks very much to all for the advice.
>
> The most common ways to speed up an import:
>
> 1. Do not use cached datasets to read the source data. The more records
> you read, the more cache is allocated in the client program and the
> slower the import becomes. The source dataset should "cache" only one
> row at a time.
>
> 2. Do not commit on each insert. That's the rule: the more commits, the
> slower the performance. If you cannot control transactions during the
> import, you will fail.
>
> 3. A lot of indices on the target table slows down inserts and updates.
> Turn those indices off (inactive) for the import, or at least
> deactivate the less selective ones.
>
> 4. Do not use DataSets for the target. The insert must be done with a
> simple "INSERT ..." query and nothing else.
>
> 5. Use parameters to insert the data. A plain-text insert with literal
> values spends more time in repeated prepare/execute cycles than a
> single prepare followed by a change-parameters/execute loop.
>
> --
> Dmitry Kuzmenko, www.ib-aid.com
>
>
>
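
For reference, a minimal sketch of the external-table approach mentioned in
the quoted message, assuming a Firebird server with ExternalFileAccess
enabled in firebird.conf and the Python fdb driver; the file, table and
column names are hypothetical, and the record layout must match the data
file byte for byte.

# Bulk load through a Firebird external table (sketch, hypothetical names).
import fdb

con = fdb.connect(dsn='localhost:/data/mydb.fdb',
                  user='SYSDBA', password='masterkey')
cur = con.cursor()

# External tables read fixed-length records, so CHAR columns are used and a
# one-byte filler absorbs the line feed ending each record (use CHAR(2) for
# CR+LF files).
cur.execute("""
    CREATE TABLE IMPORT_EXT EXTERNAL FILE '/data/import/customers.dat' (
        CUST_ID   CHAR(10),
        CUST_NAME CHAR(60),
        EOL       CHAR(1)
    )
""")
con.commit()

# A single INSERT ... SELECT moves the whole file into the real table in one
# statement, which is why it finishes in seconds instead of hours.
cur.execute("""
    INSERT INTO CUSTOMERS (CUST_ID, CUST_NAME)
    SELECT CAST(TRIM(CUST_ID) AS INTEGER), TRIM(CUST_NAME)
    FROM IMPORT_EXT
""")
con.commit()
con.close()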
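
A minimal sketch of point 3 (switching indices off around the import), under
the same assumptions; the index names are hypothetical, and indices that
back primary key or unique constraints cannot be deactivated this way.

# Deactivate less selective indices before the import, rebuild them after.
import fdb

con = fdb.connect(dsn='localhost:/data/mydb.fdb',
                  user='SYSDBA', password='masterkey')
cur = con.cursor()

for idx in ('IDX_CUSTOMERS_NAME', 'IDX_CUSTOMERS_CITY'):
    cur.execute('ALTER INDEX %s INACTIVE' % idx)
con.commit()

# ... run the import here ...

# Reactivating an index rebuilds it from scratch, which is usually far
# cheaper than maintaining it row by row during the import.
for idx in ('IDX_CUSTOMERS_NAME', 'IDX_CUSTOMERS_CITY'):
    cur.execute('ALTER INDEX %s ACTIVE' % idx)
con.commit()
con.close()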
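
And a minimal sketch combining points 2, 4 and 5: one parameterized INSERT
reused for every row, with a commit only every few thousand rows. The batch
size and the read_source_rows() generator are assumptions for illustration.

# Batched, parameterized insert loop (sketch, hypothetical names).
import fdb

BATCH_SIZE = 5000

def read_source_rows():
    # Hypothetical generator yielding one (cust_id, cust_name) tuple at a
    # time, so nothing is cached on the client side (point 1).
    with open('/data/import/customers.csv') as f:
        for line in f:
            cust_id, cust_name = line.rstrip('\n').split(';')
            yield int(cust_id), cust_name

con = fdb.connect(dsn='localhost:/data/mydb.fdb',
                  user='SYSDBA', password='masterkey')
cur = con.cursor()

# fdb caches the prepared statement and reuses it when the same SQL string
# is executed again with new parameter values (point 5).
sql = 'INSERT INTO CUSTOMERS (CUST_ID, CUST_NAME) VALUES (?, ?)'

count = 0
for row in read_source_rows():
    cur.execute(sql, row)
    count += 1
    if count % BATCH_SIZE == 0:   # point 2: do not commit on every insert
        con.commit()
con.commit()                      # commit the final partial batch
con.close()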

