Subject: Re: [IBO] Performance
Author: Helen Borrie
At 10:06 AM 13/09/2004 +1200, you wrote:
>Hi All
>Sorry if this is off-topic; I'm not sure if there is a Firebird solution or
>if the IBO components can be tweaked.
>I have been struggling to improve the performance in a large application I
>am converting from BDE/ DBase to Firebird. We are using IBOTable and
>IBOQuery components to speed the conversion process. I have discovered
>some code that is run frequently that is a real bottleneck. The code
>recalcs tax rates for each line item in a transaction. Currently the
>code loops through each record like so...
> with InvoiceItemTable do begin //This is an IBOTable component
>   try
>     DisableControls;
>     Post;
>     First;
>     while (Eof = False) do begin
>       Edit;
>       UpdateLineItem;
>       Post;
>       Next;
>     end;
>   finally
>     EnableControls;
>   end;
> end;
>The UpdateLineItem code does the recalculation. It calls code in a
>class that is called from multiple places.
>I know I could write SQL to do the recalculation, but we want control
>over rounding etc. and have customized rounding routines. Is there a
>simple way to accelerate this code using the IBO components, or should we
>look at writing a UDF in Firebird to handle our rounding and do it with a
>query? I want to avoid rewriting the entire screen at this time.
>All suggestions welcome :)

Yup, you've certainly pinpointed the cause of the poor performance.

Definitely don't perform this operation on a table object. If you insist
on doing this as a client-side operation, you want the smallest possible
subset in the buffer -- replace the detail set with a parameterised query
("where TheDetailFK = :TheMasterPK").
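A minimal sketch of that substitution, assuming a detail table INVOICE_ITEM keyed to its master by INVOICE_ID -- all component and column names here are illustrative, not from the original post:

```pascal
// Replace the IBOTable with a parameterised IBOQuery so the client
// buffer holds only the current invoice's line items.
// InvoiceItemQuery, InvoiceTable, INVOICE_ITEM and INVOICE_ID are
// all hypothetical names.
InvoiceItemQuery.SQL.Text :=
  'SELECT * FROM INVOICE_ITEM WHERE INVOICE_ID = :INVOICE_ID';
InvoiceItemQuery.Prepare;
InvoiceItemQuery.ParamByName('INVOICE_ID').AsInteger :=
  InvoiceTable.FieldByName('INVOICE_ID').AsInteger;
InvoiceItemQuery.Open;  // fetches just this master's detail rows
```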

Don't call Post on the dataset *before* you perform the operation... was
this a typo?

If you really-really-really don't want to do this server-side (the proper
place for it, in a trigger) then use a prepared, parameterised update
statement that you fire off from a TIB_DSQL in the same transaction context
as your datasets. Apply your function to the relevant parameter[s] in the
BeforeExecute event of the IB_DSQL after you've applied and posted the new
rate to the master. IOW, don't touch the detail dataset at all.
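As a sketch, assuming a TIB_DSQL named dsqlUpdateItems wired to the same IB_Connection and IB_Transaction as the datasets -- the statement text, column names, event signature and the rounding helper are all assumptions, not from the post:

```pascal
// Prepared once, e.g. in FormCreate:
dsqlUpdateItems.SQL.Text :=
  'UPDATE INVOICE_ITEM SET TAX_RATE = :TAX_RATE ' +
  'WHERE INVOICE_ID = :INVOICE_ID';
dsqlUpdateItems.Prepare;

// BeforeExecute handler: apply the custom rounding routine to the
// parameter after the new rate has been posted to the master.
procedure TInvoiceForm.dsqlUpdateItemsBeforeExecute(Sender: TIB_Statement);
begin
  Sender.ParamByName('TAX_RATE').AsCurrency :=
    MyCustomRound(NewTaxRate);               // hypothetical rounding helper
  Sender.ParamByName('INVOICE_ID').AsInteger :=
    InvoiceTable.FieldByName('INVOICE_ID').AsInteger;
end;
```

The point is that the rounding runs once per statement, client-side, and a single searched UPDATE hits every detail row on the server -- no per-row positioning in the client buffers.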

Once the IB_SQL has finished, post any outstanding changes from the detail
dataset and refresh it. Done, quick as a wink.

The biggest break-through you will make when moving from desktop to
client/server is when you realise that everything you do on the client is
operating on a local snapshot of what came over from the server at the
beginning of the transaction (and erk!! a table component is one horribly
panoramic snapshot!!).

Thus, if you have an operation that has to search through local buffers to
find what it's looking for, it's going to slow things in a number of
ways... actually, in just about every way you can think of. Most
particularly, DML ops (searches, updates) that have to do the same thing to
a set of rows in the client buffers have to do the same costly stuff over
and over to get positioned for each update -- and that's before anything
even gets submitted to the server. If the server doesn't like it, it will
spit it out and all those processing cycles will have been in vain.

When you've got to do something iteratively like this, a direct searched
operation on the underlying server-side set bypasses all that noise and
risk and removes this legacy bottleneck that you inherited from the desktop
db app. It has a lot of benefits, not the least of which is that, if the
update can't happen for some reason, e.g. an update conflict or a CHECK
constraint, the application will know about it *before* it commits itself
to chewing up all those processing cycles. It all translates to a much
more responsive application for your users.