Subject RE: [firebird-support] Best Practice Question - How many rows to update before commit?
Author Rick Debay
> What is the best number of rows to post in a transaction before a
commit should be done?

It should be based on your transaction requirements, not on performance
metrics. From what I've experienced and read, the performance
differences are negligible. Knowing nothing about the business process
you're running, I can only guess that the answer is to commit after
every insert, or after the entire process is done.

-----Original Message-----
From: [] On Behalf Of Myles Wakeham
Sent: Friday, November 09, 2007 11:08 AM
Subject: [firebird-support] Best Practice Question - How many rows to
update before commit?

I have a large data load application that moves about 500,000 rows of
data from an ODBC source to Firebird 1.5, which is stored in approx. 10
different tables. My load routine obtains the source data and then
loops through it, using an ODBC connection to FB to INSERT data into the
target tables.

It has been working very well for some time, however on implementing it
on a client's server to run as a scheduled batch process it is showing
signs of failure.

What I'm seeing is that after getting through about 75% of the data
load, it appears to go into some continual loop from which it never
recovers. I see activity on the FB server computer's hard drive, and
continuous activity on the data loading client, but it never gets
through the loop. It's very strange, because I would have expected to
see errors coming from FB that would interrupt the load, but I don't see
this. It's almost as if it gets stuck in a groove on one random record
and never recovers from it. Yet I do not see repetitive rows being
posted to FB.

In attempting to debug this behavior, I checked my code and noticed that
I was going through the data load for the different data source tables,
and incrementing a counter for each row posted. After I hit 5,000 rows,
I would force a commit to FB.
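For reference, the commit-every-N-rows pattern described above can be sketched as follows. This is a minimal illustration, not the poster's actual code: it uses Python's built-in sqlite3 module as a stand-in for the ODBC connection to Firebird, and the table name and columns are made up for the example.

```python
import sqlite3

def batch_load(conn, rows, batch_size=5000):
    """Insert rows, committing every batch_size inserts and once at the end."""
    cur = conn.cursor()
    pending = 0
    for row in rows:
        cur.execute("INSERT INTO target (id, value) VALUES (?, ?)", row)
        pending += 1
        if pending >= batch_size:
            conn.commit()  # end the current transaction, start a fresh one
            pending = 0
    if pending:
        conn.commit()      # commit the final partial batch

# Demo with an in-memory SQLite database standing in for Firebird.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, value TEXT)")
batch_load(conn, ((i, "row%d" % i) for i in range(12000)), batch_size=5000)
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])
```

The key point is only the batching logic: with batch_size=5000, the 12,000-row demo above commits three times (5000, 5000, then the remaining 2000).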

What is the best number of rows to post in a transaction before a commit
should be done? I suspect I need to commit more frequently, but I'm
trying to balance performance against reliability here. The data load
takes about 3 hours to finish, so frequent commits could have a dramatic
effect on how long it runs, and since it runs nightly in most cases, I
need to find a way to do this as quickly and reliably as possible.

Any thoughts are greatly appreciated.


Myles Wakeham
Director of Engineering
Tech Solutions USA, Inc.
Scottsdale, Arizona USA
Phone +1-480-451-7440

Try our new Outlook Utility 'Split Personality'


