| Subject | Re: [IBO] help needed. browsing slows down after a days work |
|---|---|
| Author | Helen Borrie |
| Post date | 2001-12-12T00:58:41Z |
At 12:45 PM 11-12-01 +0000, DennisFantoni wrote:
>with an interbase6 server and abt. 20
>workstations.
>
>when a workstation is on, it shows an ib_grid that connects to a query,
>typically something like select * from table (returning lots of rows)
>
>The application works fine and browsing is quite fast, all considered.
>
>But after 5-6 hours work, browsing becomes really really slow. Perhaps
>20 times slower than at the beginning.
>
>Is there something in ibo or interbase that might go slower over time?

Yes: these are the exact symptoms of a database where no garbage collection is happening.
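If you want to confirm it, run gstat against the database header (mydatabase.gdb below is just a placeholder for your own file):

gstat -h mydatabase.gdb

In the header statistics, compare "Oldest active" with "Next transaction". If that gap keeps widening over the working day, a transaction is being held open somewhere and the server cannot garbage-collect the obsolete record versions behind it.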
>the query for the grid loads perhaps 6000 records.

This is a design flaw in an application designed for more than a single user. It is probably a legacy from an old BDE app which used a TTable over a desktop database such as Access or Paradox. If users need to "browse" 6000 records all day, then you need to do some redesigning.
As long as even one user is holding open that transaction, obsolete record versions will accumulate and clog up the database. If anyone is using this interface to initiate changes to records, without committing anything, it is also potentially creating big concurrency bottlenecks.
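As a rough sketch (trnBrowse and qryBrowse here stand in for whatever TIB_Transaction and TIB_Query actually sit behind your grid - adjust the names to your app), a timer that hard-commits the browse transaction every few minutes lets the garbage collector do its job:

procedure TfrmBrowse.tmrCommitTimer(Sender: TObject);
begin
  { A hard Commit releases this transaction's interest in old record
    versions, so the server can garbage-collect them. CommitRetaining
    would not help here: it keeps the same transaction context alive,
    so the Oldest Active Transaction stays stuck just the same. }
  trnBrowse.Commit;
  { Re-open the selection in a fresh transaction. }
  qryBrowse.Open;
end;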
In your environment, consider 200 records in an output set for browsing as an approximate maximum. Have users enter or select parameters to reduce output sets to those records they are interested in; and commit and refresh often. { Drum roll } Presenting a spreadsheet interface for huge datasets is NOT one of the tasks client/server databases are designed for { resounding chord on harp }
Use the client to *target* required data. That's what the smart SQL parsing and search capabilities of IBO are all about.
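For instance (table, column and control names invented for illustration), instead of select * from customer, give the user a search box and bind it to a parameter:

select cust_no, cust_name, phone
from customer
where cust_name starting with :search_prefix

qryBrowse.ParamByName('search_prefix').AsString := edtSearch.Text;
qryBrowse.Open;

A selective criterion like this keeps the output set down to the handful of rows the user actually wants, instead of dragging 6000 across the wire.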
>My own idea of attacking the problem right now, is to somehow try to
>get the slowing down to occur at my own computer, then to see if
>there's any local memory leaks.

It's always good practice to look for memory leaks, but this does look like a serious case of transaction constipation rather than memory diarrhoea...
regards,
Helen