Subject: Inserting hundreds of thousands of records from a SP
Author: buppcpp@yahoo.com
I have a stored procedure that inserts approx. 700,000 records into a single
table.
The data is gathered from about 300 tables in a database that is 2.8 GB.
This process takes about 18 hours to run.
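For reference, the procedure is essentially of this shape (a hypothetical sketch only; the procedure, table, and column names are invented, and the real one reads from ~300 source tables):

```sql
SET TERM ^ ;
CREATE PROCEDURE LOAD_SUMMARY
AS
BEGIN
  /* All of these inserts run inside the caller's single transaction;
     Firebird PSQL has no COMMIT statement, so nothing is visible to
     other transactions until the caller commits at the very end. */
  INSERT INTO SUMMARY_TABLE (ID, SRC, VAL)
    SELECT ID, 'T1', VAL FROM SOURCE_TABLE_1;
  INSERT INTO SUMMARY_TABLE (ID, SRC, VAL)
    SELECT ID, 'T2', VAL FROM SOURCE_TABLE_2;
  /* ... repeated for the remaining source tables ... */
END^
SET TERM ; ^
```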

Other processes will try to read from this table as records are being added
to it.

Since Firebird doesn't support COMMITs in stored procedures, what will the
negative impact be on the database and the clients?

I know that the client won't see any of the data (since none of it has been
committed), but my client gets slower and slower when it tries to access
this table.

This process has to happen within a stored procedure; there is no changing
that, because our application has to work with other databases as well as
Firebird, and this design is the best one for our application and web
interface.

I need to know what is causing the client to slow down so much, and how
(if possible) to correct it.

Configuration (test machine):

Firebird 1.5.2 Classic Server, WinXP, 800 MHz, 512 MB RAM.

2 clients:

1 - running the long insert process.
1 - trying to read from the table that is being updated.


Thanks
Bupp