Subject | Re: [IBO] Loop Faster
---|---
Author | Gordon Hamm
Post date | 2008-07-16T16:45:26Z
I guess I could write a stored procedure, and that would help, but I'm building a moving average of the data as I move through it, so it's quite complex.
I think the advice on using a cursor is a good one, as is eliminating FieldByName.
Thanks for the good advice.
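(As a side note, the running part of a moving average can usually be carried along in the loop itself with a small ring buffer, so no extra query per record is needed. A rough, purely illustrative sketch in Delphi - the window size of 10 and the Double values are just placeholders:

type
  // Fixed-size window for an incremental moving average (sketch only).
  TMovingAverage = record
    Window: array[0..9] of Double;  // window size 10 is arbitrary
    Sum: Double;
    Count, Head: Integer;
  end;

procedure InitMovingAverage(var MA: TMovingAverage);
begin
  FillChar(MA, SizeOf(MA), 0);  // empty window, zero sum
end;

function AddValue(var MA: TMovingAverage; NewValue: Double): Double;
begin
  if MA.Count = Length(MA.Window) then
    MA.Sum := MA.Sum - MA.Window[MA.Head]  // window full: drop the oldest value
  else
    Inc(MA.Count);
  MA.Window[MA.Head] := NewValue;          // store the newest value
  MA.Sum := MA.Sum + NewValue;
  MA.Head := (MA.Head + 1) mod Length(MA.Window);
  Result := MA.Sum / MA.Count;             // average over the last Count values
end;

Each record's value would be pushed through AddValue and the returned average posted back, so the moving part costs only a few additions per row.)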
----- Original Message -----
From: Svein Erling Tysvaer
To: ibobjects@yahoogroups.com
Sent: Wednesday, July 16, 2008 9:51 AM
Subject: Re: [IBO] Loop Faster
1400 records per second, hmm, well, not terrible, but unless the
calculation is complex or you have non-selective indexes, it should be
possible to improve.
A few general hints:
Use prepared statements and parameters rather than repreparing
statements for each iteration of a loop (though I doubt Firebird is
capable of 1400 records per second unless you already do this).
TIB_DSQL is the component with the least overhead and should be used
for UPDATE/INSERT/DELETE statements.
TIB_Cursor is simpler (quicker) than TIB_Query or TIBOQuery and could be
used for SELECT statements unless you have to be able to move backwards
in your dataset (using .Prior or similar).
FieldByName is generally slow. Using TIB_Field or Fields[] is
considerably quicker. My hunch is that this is where you could improve
your performance by doing things like:
var
  MyUpdateField: TIB_Field;
  MyNewValue: Integer;
...
  MyDSQL.Prepare;
  MyUpdateField := MyDSQL.ParamByName('Gordon');
  ...
  while not MyTIB_Cursor.Eof do
  begin
    ...
    MyUpdateField.AsInteger := MyNewValue; // This sets a MyDSQL parameter
    ...
    MyDSQL.Execute;
    ...
  end;
  ...
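Filled in, such a loop might look roughly like this - purely a sketch following the calls in the skeleton above, where the table and column names (MYTABLE, ID, VAL, RESULT), the transaction component MyTrn and the 1000-record commit interval are all assumptions, and the IBO units (IB_Components) are expected in the uses clause:

procedure UpdateAllRows(MyTIB_Cursor: TIB_Cursor; MyDSQL: TIB_DSQL;
  MyTrn: TIB_Transaction);
var
  SrcId, SrcVal, ParId, ParResult: TIB_Field;
  Processed: Integer;
begin
  // Prepare the update once; grab the parameter references once.
  MyDSQL.SQL.Text := 'UPDATE MYTABLE SET RESULT = :RESULT WHERE ID = :ID';
  MyDSQL.Prepare;
  ParResult := MyDSQL.ParamByName('RESULT');
  ParId := MyDSQL.ParamByName('ID');

  // Open the cursor and grab the field references once, outside the loop.
  MyTIB_Cursor.SQL.Text := 'SELECT ID, VAL FROM MYTABLE';
  MyTIB_Cursor.Open;
  SrcId := MyTIB_Cursor.Fields[0];
  SrcVal := MyTIB_Cursor.Fields[1];

  Processed := 0;
  while not MyTIB_Cursor.Eof do
  begin
    ParId.AsInteger := SrcId.AsInteger;
    ParResult.AsInteger := SrcVal.AsInteger; // the real calculation replaces this line
    MyDSQL.Execute;

    Inc(Processed);
    if Processed mod 1000 = 0 then
      MyTrn.CommitRetaining;                 // keep the 1000-record commit interval

    MyTIB_Cursor.Next;
  end;

  MyTIB_Cursor.Close;
  MyTrn.Commit;
end;

The point is simply that Prepare, ParamByName and the field lookups all happen once, and only the parameter assignments and Execute run ten million times.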
Though, as Jerry said, it will be quicker to do the update on the server
- either through an update statement that updates several records at
once (although in some cases that can make the server feel slow for
other users) or through a stored procedure.
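For the single-statement variant, the whole job can be handed to the server in one Execute - again just a sketch with placeholder table, column and calculation, and a moving average would more likely end up in a stored procedure that loops with FOR SELECT:

  // Set-based form: one statement, no client-side loop (sketch only).
  MyDSQL.SQL.Text := 'UPDATE MYTABLE SET RESULT = VAL * 2'; // placeholder calculation
  MyDSQL.Prepare;
  MyDSQL.Execute;  // the server updates every row in a single statement
  MyTrn.Commit;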
Another option could of course be to do the update daily rather than
weekly. That would mean less to update each day.
HTH,
Set
Gordon Hamm wrote:
> Hi,
> I need to loop through 10 million records, do some math on some values, and post the result to a field (same table).
>
> It works fine..
>
> I do this with 2 queries: one to loop, one to update, etc.
>
> I am
>
> 1. Committing the transaction every 1000 records for performance.
> 2. Turning on the UniDirectional property to use less memory.
>
>
> Is there any sort of buffering that I can set to increase performance?
> The whole process takes about 2 hours. I'd like to try to improve that, as I have to run it about once a week.