Subject: RE: [IBO] Very slow execution if Transaction.AutoCommit is true
Author: Svein Erling Tysvær
>there are two versions of my application: the released one and a new one still in progress.
>The released one ships with Firebird 2.1.3 and IBO 4.9.9 on Delphi 2007.
>The new one will run on Firebird 2.5.2 and IBO 4.9.14 Build 52+ (or IBO 5) on Delphi XE.
>I'm importing the basic data via a script from time to time, whenever the data has been
>updated. Data in the column BIC is not complete: one record per BANKGROUP has the value,
>which applies to all other records of that BANKGROUP, so in that routine I'm creating an
>update script to fill in the missing data.
>It looks like
>update BANKCODES set BIC = 'DEUTDEBBXXX' where ID = 45;
>update BANKCODES set BIC = 'DEUTDEDBBER' where ID = 47;
>update BANKCODES set BIC = 'BEVODEBBXXX' where ID = 66;
>The table has about 25,000 records; the update script has about 13,000 statements.
>With the released version this took maybe 2 minutes. Doing the same with the new version
>takes much longer - over 25 minutes.
>Apparently the transaction settings play a role here. Although it worked fine before, it now
>seems necessary to use a transaction with isolation=tiConcurrency and AutoCommit=false;
>otherwise it takes much longer, and FastMM has occasionally reported memory leaks, though I
>can't reproduce them reliably.

First, sorry for not really answering your question...

Have you looked at database statistics before, during and after your import, particularly the size of the gap between oldest (active) transaction and next transaction? I think AutoCommit defaults to CommitRetaining (AutoCommitRetains is false by default) and until a hard commit is done, that could stop the OAT from advancing.
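If you want to watch that gap, the transaction counters are in the database header page, which gstat can print (the database path below is just a placeholder for yours):

```shell
# Inspect the database header page before, during and after the import.
# /data/bankcodes.fdb is a placeholder path.
gstat -h /data/bankcodes.fdb

# In the output, compare these counters:
#   Oldest transaction   (OIT)
#   Oldest active        (OAT)
#   Next transaction
# If "Next transaction" keeps growing while "Oldest active" stands still
# during the import, the OAT is stuck until a hard commit is done.
```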

I doubt it matters, but why do you use tiConcurrency and not tiCommitted? Is there any reason for having a static view of the data when you import (the static view that tiConcurrency provides is good for export or reports, but you are importing)?
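For context, here is how those IBO settings map to Firebird transaction modes, as far as I remember the mapping, sketched as a plain DSQL session in isql (credentials and path are placeholders):

```shell
# tiConcurrency -> SNAPSHOT       (stable view as of transaction start)
# tiCommitted   -> READ COMMITTED (each statement sees newly committed rows)
isql -user SYSDBA -password masterkey /data/bankcodes.fdb
# SQL> SET TRANSACTION SNAPSHOT;                      -- tiConcurrency
# SQL> COMMIT;
# SQL> SET TRANSACTION READ COMMITTED RECORD_VERSION; -- tiCommitted
```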

Committing all the time is time-consuming (as you've already experienced). But having a separate update statement for each row also makes things far slower than they need to be. You would see a dramatic speed improvement if (in addition to doing everything within the same transaction) you changed to a prepared statement with parameters. My guess is that you could update a couple of thousand rows per second, though that of course depends on there being only one or a few transactions, not 13,000:

MyBIC, MyID: TIB_Column;
with TIB_DSQL.Create(Application) do
begin
  SQL.Add('update BANKCODES set BIC = :BIC where ID = :ID');
  Prepare;
  MyBIC := ParamByName('BIC');
  MyID := ParamByName('ID');
  while not eof(<whatever file has the parameters>) do