Subject: Re: [IBO] Dealing with complex queries
Author: Geoff Worboys
Hi Rob,

> We have an animation that is supposed to show when a complex
> query is running. However it doesn't. It shows but does not
> progress (its Timer fails to trigger) until after the query
> has run. We looked at using callbacks but in this case it's
> not massive amounts of data being returned, it's actually just
> a complex query.

When the API is called and a response is expected, the
application just sits there until the response is received or
the connection times out. The entire application is frozen at
that point: the message queue is not processed, and nothing
can be done other than killing the entire process.

As you indicate, a large volume query can be managed via the
callback mechanism. This works because a large volume query
is actually many API calls: prepare, fetch, fetch, fetch ...
and the callback function can be executed between fetch calls.
A complex, long-running query is more of a problem because
it is essentially one API call that takes a long time to return.

There are two ways to deal with this problem, that I know of:

1. Set up a connection/transaction/ib_dsql in a separate
thread and execute the query there. Use your timer/animation
code to monitor for when that thread returns/exits.

(As you probably realise, this can be quite difficult to
get exactly right, but it works very well once you do.)

2. Turn your complex query into a long-running query so
that you can use callbacks from an ib_cursor to give
visual feedback.

(This can be much simpler to retrofit into existing
applications because it avoids the threading synchronisation issues.)


Option two assumes that your complex query is complex because
it is a stored procedure that does lots of things before
finally exiting (or returning a single row of data). The idea
is that you turn it into a select procedure that calls SUSPEND
at various intervals while executing. Most commonly I return a
string indicating what the procedure is up to at that moment,
and display that to the user. By using an ib_cursor the client
does not store an excessive amount of returned data, just the
most recently returned (status) row.
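
To show the shape of it, here is a rough sketch of such a
select procedure. The procedure name, the status texts and the
"work" comments are made up for illustration; the real work
goes where the comments are, and on the client you select from
the procedure through an ib_cursor and fetch row by row:

  SET TERM ^ ;

  CREATE PROCEDURE DO_COMPLEX_WORK
  RETURNS (STATUS VARCHAR(80))
  AS
  BEGIN
    STATUS = 'step 1: collecting data ...';
    SUSPEND;   /* row goes to the client; execution pauses
                  here until the client fetches again      */

    /* ... first chunk of the real work ... */

    STATUS = 'step 2: summarising ...';
    SUSPEND;

    /* ... second chunk of the real work ... */

    STATUS = 'finished';
    SUSPEND;
  END ^

  SET TERM ; ^

On the client, something like "SELECT STATUS FROM
DO_COMPLEX_WORK" through an ib_cursor then gives you one fetch
per step, so your animation or status label can be updated
between fetches.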

If the procedure already returns one or more rows of data, you
can add a "flag" value to tell your application when it is
returning a dummy row (one that just gives feedback).
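
Again just a sketch - the procedure name and columns are
arbitrary, and I have used the system table RDB$RELATIONS as
the data source only so that the example will actually run.
The flag column lets the client tell feedback rows from data
rows:

  SET TERM ^ ;

  CREATE PROCEDURE COMPLEX_REPORT
  RETURNS (FEEDBACK SMALLINT, STATUS VARCHAR(60), REL_NAME CHAR(31))
  AS
  BEGIN
    /* feedback-only row: FEEDBACK = 1, the data column is a dummy */
    FEEDBACK = 1;
    STATUS   = 'scanning system tables ...';
    REL_NAME = '';
    SUSPEND;

    /* real rows: FEEDBACK = 0 tells the client these carry data */
    FEEDBACK = 0;
    STATUS   = '';
    FOR SELECT RDB$RELATION_NAME FROM RDB$RELATIONS
        INTO :REL_NAME
    DO
      SUSPEND;
  END ^

  SET TERM ; ^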

I have used, and still use, both of the above depending on the
situation.


Note that you can sometimes create a query that may eventually
return many rows of data but simply takes a very long time to
return its first row. This can be because the query is so
complex that the prepare phase takes a long time, or because
you are trying to order the results of a selectable stored
procedure; in the latter case FB must obtain all the rows
first (execute the entire stored procedure) so that it can
sort the result before it knows which row to return first.
[Similar situations can exist with non-procedure queries, but
they are less common and can sometimes be solved by creating
appropriate indexes or tuning the query.]

These situations can only really be resolved by a multi-threaded
solution (option 1 above), or you can try to refactor your code
to avoid the problem - if this is possible it is probably a
good idea - for example, by ordering the results inside the
procedure.
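
To make that last point concrete, here is a rough sketch
(procedure and column names invented; the inner query uses
system tables only so that it will run as written). The
ordering is done inside the procedure, so each row can be
suspended as soon as its per-row work is done; ordering the
procedure's output from the client instead forces the whole
procedure to execute before the first row appears:

  SET TERM ^ ;

  CREATE PROCEDURE LONG_REPORT
  RETURNS (REL_NAME CHAR(31), FIELD_COUNT INTEGER)
  AS
  BEGIN
    /* the ordering happens here, inside the procedure ... */
    FOR SELECT RDB$RELATION_NAME
        FROM RDB$RELATIONS
        ORDER BY RDB$RELATION_NAME
        INTO :REL_NAME
    DO
    BEGIN
      /* ... so the (imagined) expensive per-row work runs after
         the sort and each row is suspended as soon as it is ready */
      SELECT COUNT(*) FROM RDB$RELATION_FIELDS
       WHERE RDB$RELATION_NAME = :REL_NAME
        INTO :FIELD_COUNT;
      SUSPEND;
    END
  END ^

  SET TERM ; ^

  /* rows stream to the client one fetch at a time */
  SELECT * FROM LONG_REPORT;

  /* whereas this forces the entire procedure to execute before
     the first row can be returned, because the sort applies to
     the procedure's output                                      */
  SELECT * FROM LONG_REPORT ORDER BY FIELD_COUNT DESC;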


In Firebird version 2.1 it is possible to cancel a long-running
query... but this can only be done from a separate connection.
So if you think you would like (eventually) to provide the user
with a "cancel" button for long queries, then I suggest you
seriously consider running long queries from a separate thread.
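
For what it is worth, the 2.1 cancellation works through the
new monitoring tables. From a second connection you can do
something like the following - treat the details (column names,
state values) as my recollection and check the 2.1 release
notes; the worker connection's attachment id has to be captured
beforehand:

  /* on the worker connection, before starting the long query,
     note its attachment id                                     */
  SELECT CURRENT_CONNECTION FROM RDB$DATABASE;

  /* on a second connection, cancel whatever that attachment is
     currently executing (MON$STATE = 1 means "active")         */
  DELETE FROM MON$STATEMENTS
   WHERE MON$ATTACHMENT_ID = 123  /* the id captured above */
     AND MON$STATE = 1;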


Note also that your application can "freeze" as described above
due to network or server problems (it calls the API and waits).
The only solution here (other than living with it) is to use a
separate thread for all network interactions - but that is not
really practical with the VCL/IBO code as currently designed.


HTH

--
Geoff Worboys
Telesis Computing