Subject | Re: [IBO] slow inserts, no caching ... |
---|---|
Author | prom_prometheus |
Post date | 2004-01-10T16:04:44Z |
THANK YOU VERY MUCH Helen!
Now it works perfectly and fast!
I fixed some of my errors before, BUT
the real breakthrough was that
you showed me IB_DSQL and HOW to use it!!!!!
Thousand thanks !!!!
Please, if you have an ISBN number for your book,
let me know, so I can perhaps preorder it,
as you see I NEED IT! Smile
warm regards
Gerhard Knapp
--- In IBObjects@yahoogroups.com, Helen Borrie <helebor@t...> wrote:
> At 12:23 AM 10/01/2004 +0000, you wrote:
> >hi,
> >i have to insert a lot of records from a dynamic array ..
> >how can i cache 1000 records p.e. and flush them to the server?
> >i use Dataset-components from IBO ...
>
> You certainly don't need a dataset component for this operation. It's one
> part of the reason for this slowness. Use an ib_dsql.
>
> >my source is tooo slow:
>
> Yes. Your sequence and logic are all wrong. Prepare once, execute many.
>
> ...
> var
> execount: integer;
>
> if dmps.ebd.InTransaction then (* dmps.ebd.CloseTransactions; WRONG *)
>
> try
> dmps.ebd.Commit
> except
> dmps.ebd.Rollback;
> end;
>
> (* dmps.ebd.autocommit:= false; NOT NEEDED *)
>
> (* dmps.ebd.StartTransaction; NOT YET *)
>
> (* *******************.......................
> dmps.q_in ... is prepared before, q_in = TIBOQuery
> "insert into xxxx (a, b, c, d) Values (:a, :b, :c, :d)" ..
> dmps.prepare
> dmps.ebd.commit;
> **************....................... *) REMOVE ALL OF IT
>
> (* dmps.ebd.StartTransaction; NOT YET *)
>
> // initialise
> execount := 0; // overall statement count before first transaction loop
>
> with dmps do
> begin
> // q_in is an ib_dsql not an iboquery
> q_in.SQL.Clear;
> q_in.SQL.Add('insert into xxxx (a, b, c, d) Values (:a, :b, :c, :d)');
> repeat
> ii:= 0;
> ebd.StartTransaction;
> if not q_in.Prepared then q_in.Prepare;
> repeat
> inc(ii);
> (* for x:= 0 to flist.count - 1 do PARAMS WOULD BE SAFER *)
> for x:= 0 to q_in.ParamCount - 1 do
> begin
> q_in.params[x].Value:= xdata[ii][x];
> end; // MOVE THIS TO HERE
> (*q_in.ExecSQL;*)
> q_in.Execute; // AFTER ALL PARAMS HAVE NEW VALUES
> (* end; NOT HERE *)
> until (ii = 1000) or (size limit of array is reached??);
> try
> ebd.commit; //-> here it should be written to db, but ..
> if execount + 1000 < flist.count then
> inc(execount, 1000)
> else
> inc(execount, ii);
> except
> ...
> end;
> (* dmps.ebd.StartTransaction; NOT HERE *)
> until ....
> ................
>
> Rough, not tested, certain to have bugs, but gives you a picture of the
> sequence of events...you don't need execount, I put it in there initially
> when I thought this was looping through a TStrings; then decided to leave
> it as a kind of "dumb documenter" because I can't tell what this array is
> like or whether you have some code in there to avoid an AV when ii gets
> larger than the number of sets in the array.
>
> i.e. the assignment of the SQL should be a one-time action, the Prepare
> should be a one-time action. The only stuff that gets replaced is the
> values in the parameters.
>
> Helen
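[Editorial note] The prepare-once/execute-many, commit-per-batch pattern Helen sketches is not specific to IBO or InterBase. As a minimal stand-in illustration in Python, using sqlite3 purely because it is in the standard library (the table name `xxxx`, the sample `xdata` array, and the batch size of 1000 mirror the post's pseudocode; none of this is IBO API):

```python
import sqlite3

# sqlite3 stands in for InterBase/IBO here; only the pattern matters.
conn = sqlite3.connect(":memory:")
conn.execute("create table xxxx (a, b, c, d)")

# Sample rows standing in for the poster's dynamic array.
xdata = [(i, i + 1, i + 2, i + 3) for i in range(2500)]

BATCH = 1000  # commit every 1000 rows, as in Helen's sketch
sql = "insert into xxxx (a, b, c, d) values (?, ?, ?, ?)"

for start in range(0, len(xdata), BATCH):
    batch = xdata[start:start + BATCH]
    # One transaction per batch; the statement text never changes,
    # so the driver can reuse its prepared form for every row.
    with conn:  # commits on success, rolls back on exception
        conn.executemany(sql, batch)

count = conn.execute("select count(*) from xxxx").fetchone()[0]
print(count)  # 2500
```

The key points are the same as in the Delphi version: the SQL string is assigned once, preparation happens once, each loop iteration only supplies new parameter values, and commits happen per batch rather than per row.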