Subject | RE: [firebird-support] Inserting 100's of thousands from a SP
---|---
Author | Alan McDonald |
Post date | 2005-07-08T09:33:44Z |
> I have a stored procedure that inserts approx. 700,000 records
> into a single
> table.
> The data is gathered from about 300 tables in a database that is 2.8 GB.
> This process takes about 18 hours to run.
>
> Other processes will try to read from this table as records are
> being added
> to it.
>
> Since Firebird doesn't support COMMITs in stored procedures, what will be
> the negative impact on the database and the clients?
>
> I know that the client won't see any data (since none of it has been
> committed), but my client is getting slower and slower when it tries to
> access this table.
>
> This process has to happen within a stored procedure, there is no changing
> that, because our application also has to work with other databases as well
> as Firebird, and this design is the best for our application and web
> interface.
>
> I need to know what is going on with the client slowing way down, and how
> (if possible) to correct it.
>
> Configuration (test machine):
>
> Firebird 1.5.2 Classic Server, WinXP, 800 MHz, 512 MB RAM.
>
> 2 clients:
>
> 1 - Is running the long insert process.
> 1 - Trying to access the table that is being updated.
>
>
> Thanks
> Bupp
You can still commit from the client, and I would highly recommend it during the insert of 300000 records.
Just use a loop and commit every 1000 or 10000.
Alan
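
A minimal sketch of that batched-commit loop, assuming the Python `fdb` driver and a hypothetical `INSERT_BATCH(start_row, batch_size)` procedure that inserts one slice of the source data per call (the procedure name, its parameters, and the connection details are placeholders, not from the original post):

```python
# Minimal sketch: the client drives the loop and commits every 10,000 rows
# instead of holding one transaction open for all 700,000 inserts.
# INSERT_BATCH and the connection details below are hypothetical.
import fdb

BATCH_SIZE = 10_000      # commit every 10,000 rows, per the suggestion
TOTAL_ROWS = 700_000

con = fdb.connect(dsn='localhost:C:/data/mydb.fdb',   # placeholder DSN
                  user='SYSDBA', password='masterkey')

for start in range(0, TOTAL_ROWS, BATCH_SIZE):
    cur = con.cursor()
    # Hypothetical procedure that inserts one slice inside this transaction.
    cur.execute("EXECUTE PROCEDURE INSERT_BATCH(?, ?)", (start, BATCH_SIZE))
    # Ending the transaction here keeps uncommitted record versions from
    # accumulating for the whole run, which is typically what slows the
    # reading client down more and more as the insert proceeds.
    con.commit()

con.close()
```

Committing in batches also means readers start seeing the new rows incrementally as each batch commits, rather than seeing nothing until the very end of the run.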