Subject: The Cost of Versions on Performance
Author: Eduardo A. Salgado
We are running Firebird v1.5 as a service on one of our servers. It has
a database with a table whose "versions" statistics (from gstat) look
like this:

Primary pointer page: 177, Index root page: 178
Average record length: 64.04, total records: 3825
WoW >> Average version length: 9.01, total versions: 3795833, max
versions: 47376
Data pages: 13153, data page slots: 13153, average fill: 92%
Fill distribution:
0 - 19% = 0
20 - 39% = 0
40 - 59% = 0
60 - 79% = 1
80 - 99% = 13152
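For scale, a rough back-of-envelope from those gstat figures (assuming the average version length is reported in bytes) puts the overhead at roughly a thousand back versions per record and tens of megabytes of storage:

```python
# Figures copied from the gstat output above.
total_records = 3825
total_versions = 3795833
avg_version_length = 9.01  # bytes, per gstat

versions_per_record = total_versions / total_records
version_mb = total_versions * avg_version_length / 1024**2

print(f"~{versions_per_record:.0f} versions per record")  # ~992
print(f"~{version_mb:.1f} MB of back versions")           # ~32.6
```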

The database is accessed from a ColdFusion page. We have placed
explicit transactions and commits in that page to make sure we do not
leave transactions hanging.
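The transaction handling amounts to the standard begin/commit/rollback discipline. A minimal sketch of that pattern, using Python's stdlib sqlite3 purely for illustration (the real code is ColdFusion against Firebird, and the table/column names here are placeholders):

```python
import sqlite3

# Placeholder schema standing in for the real Firebird table.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE bla (id INTEGER PRIMARY KEY, field INTEGER)")
con.execute("INSERT INTO bla (field) VALUES (0)")
con.commit()

try:
    con.execute("UPDATE bla SET field = 1")  # do the work in one transaction
    con.commit()                             # commit promptly
except Exception:
    con.rollback()                           # never leave a transaction open
    raise

value = con.execute("SELECT field FROM bla").fetchone()[0]
print(value)  # 1
```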

Eventually, however, the database grows from about 15 MB to about 150
MB and system response slows. We then take the database out of service
and clean it with a sweep, validate, mend, backup, and restore. After
that, all is well again for a week or so.

We now have a cron job at 10 PM that runs a sweep on all the databases
on this server.
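For reference, the cron entry is essentially the following (the gfix path, database file name, and credentials here are placeholders, not our real ones):

```shell
# Nightly manual sweep at 10 PM; repeat the line per database.
0 22 * * * /opt/firebird/bin/gfix -sweep -user SYSDBA -password masterkey /data/app.fdb
```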

When we test from a desktop using IBExpert, the version count also
increases with any update, for example: "UPDATE BLA SET <field> = 1
WHERE <condition>".

We cannot figure out what is causing this problem. Should we worry
about the large number of versions? Is the slowness caused by the
versions, or by something else that we are missing?