Subject: Re: [IBO] Archiving data
Author: Lucas Franzen
Post date: 2004-09-01T14:00:35Z
Riho-Rene Ellermaa wrote:
> Hi!
>
> This approach works OK for "normal" users, but now I encountered one who
> has LOTS of data in one table (structure is at the end of mail).

which doesn't show too much, since the used domains are of unknown type ;-)

> Each Post() added approx. 200 KB to the memory usage and my computer ran
> out of memory very fast.

The problem is that you don't do commits in between.
When doing a large data transfer (inserts, updates, deletes) it's
recommended to commit every <n> records; there's no fixed number for
this, it depends on the row size.
So if you commit every 100 or 1,000 or 10,000 inserts, everything will run
better.
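A rough sketch of that batched-commit pattern, using Python's sqlite3 module as a stand-in for the actual database (the table name, row count and batch size here are made up for illustration; tune the batch size to your row size):

```python
import sqlite3

BATCH_SIZE = 1000  # no universal value; depends on row size

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE archive (id INTEGER, payload TEXT)")

pending = 0
for i in range(5000):
    conn.execute("INSERT INTO archive VALUES (?, ?)", (i, "row %d" % i))
    pending += 1
    if pending >= BATCH_SIZE:
        conn.commit()   # release the resources held by the open transaction
        pending = 0
conn.commit()           # commit the final partial batch

count = conn.execute("SELECT COUNT(*) FROM archive").fetchone()[0]
print(count)  # 5000
```

Committing periodically keeps the uncommitted change set (and the memory the client and server hold for it) bounded, which is exactly what the 200 KB-per-Post() growth above points at.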
And I can see just one transaction control for the two databases
(I don't know which one the Transaction is bound to).
You should definitely use a two-phase-commit technique when handling two
databases, so that every database stays "sane".
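In two-phase commit, every database is first asked to *prepare* (make the changes durable but not yet final); only if all participants succeed does the coordinator tell each to commit, otherwise all roll back. A toy coordinator sketch in plain Python with hypothetical resource objects (IBO/InterBase expose this through their own transaction components, not this API):

```python
class Resource:
    """Toy stand-in for one database participating in the transaction."""
    def __init__(self, name, prepare_ok=True):
        self.name = name
        self.prepare_ok = prepare_ok
        self.state = "active"

    def prepare(self):
        # Phase 1: persist the changes so a later commit cannot fail.
        self.state = "prepared" if self.prepare_ok else "aborted"
        return self.prepare_ok

    def commit(self):
        self.state = "committed"

    def rollback(self):
        self.state = "rolled back"


def two_phase_commit(resources):
    # Phase 1: every participant must vote "yes".
    if all(r.prepare() for r in resources):
        for r in resources:
            r.commit()        # Phase 2: commit is now guaranteed to succeed
        return True
    for r in resources:
        r.rollback()          # any "no" vote aborts everybody
    return False


db_a, db_b = Resource("source"), Resource("target")
print(two_phase_commit([db_a, db_b]), db_a.state, db_b.state)
# True committed committed
```

The point of the prepare phase is that neither database can end up committed while the other is rolled back, which is what keeps both "sane" when archiving from one into the other.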
Luc.