| Subject | Re: [IBO] Parameterized inserts? |
|---|---|
| Author | Helen Borrie |
| Post date | 2002-11-07T13:26:33Z |
At 04:25 AM 07-11-02 -0800, you wrote:
> >Open your source query and start to loop through it, after this style:
> >
> >begin
> >....
> >if not mydsql.prepared then mydsql.prepare;
> >
> >with myquery do
> >begin
> > First;
> > while not eof do
> > begin
> > mydsql.params[0].AsWhatever := FieldByName('whatever').AsWhatever;
> > mydsql.params[1].AsSomething := FieldByName('sthg').AsSomething;
> > // and so on until all the dsql params have their values
> > mydsql.execute;
> > Next;
> > end;
> >end;
> >ibotransaction1.commit;
>
>Ok, now if I have a very large source dataset, I want to commit every
>15,000 records or so, right? So, I would put a counter in the loop and do
>commits periodically in the loop as well.
>
>If I commit in the middle, do I have to re-start my transaction? Or, would
>this be a good place for CommitRetaining?
Hmm, I didn't think about the mid-stream commits: you will need them if the
batch is huge. You will need a commit branch at suitably counted
intervals. CommitRetaining isn't suitable because it blocks garbage
collection and would have a big impact on the hygiene of your database.
This makes the concurrency of the two datasets a bit problematical.
You will need to have the source query and the dsql in two separate
transactions, so you can keep a consistent view of the source data across
the multiple commits of the output. Just use the default transaction of
the ibodatabase for the source, isolation tiConcurrency. And yes, you do
need to restart the transaction each time after the commit. (Just make
sure you only commit the transaction for the updates/inserts, not the one
you are reading from.)
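Something along these lines, then, with a counter added (untested; component
names follow the snippet above, and the 15,000-row interval and the explicit
StartTransaction call are only for illustration):

const
  CommitInterval = 15000;  // rough batch size, per your question
var
  Counter: Integer;
begin
  if not mydsql.Prepared then mydsql.Prepare;

  Counter := 0;
  with myquery do  // source: ibodatabase default transaction, tiConcurrency
  begin
    First;
    while not Eof do
    begin
      // placeholder param assignments, as in the snippet above
      mydsql.Params[0].AsWhatever := FieldByName('whatever').AsWhatever;
      mydsql.Params[1].AsSomething := FieldByName('sthg').AsSomething;
      mydsql.Execute;

      Inc(Counter);
      if Counter mod CommitInterval = 0 then
      begin
        ibotransaction1.Commit;            // hard commit - insert transaction only
        ibotransaction1.StartTransaction;  // restart before the next execute
      end;

      Next;
    end;
  end;

  ibotransaction1.Commit;  // commit whatever is left of the final batch
end;

Only ibotransaction1 gets committed inside the loop; the transaction the
source query is reading in stays open for the whole run, so the read view
stays consistent.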
This is a very inefficient way to do a batch transfer, especially such a
huge one. I hope your users are patient.
Helen