Subject: Large row count in tables
Author: roar4711
I have a database with five different tables, each containing around 19 million records. Every five minutes, between 1500 and 2000 records are added to each table. Right now the database is 12 GB, but I expect it to grow to 80-90 GB for a whole year's worth of data.
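(Rough math: 1500-2000 rows every five minutes is about 430,000-575,000 rows per table per day, so on the order of 160-210 million rows per table over a full year, across all five tables.)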

A table looks like this:

integer ID
integer Time
float Data
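
In DDL terms that is roughly the following (real names differ; TIME is quoted because it is a reserved word):

CREATE TABLE XYZ (
    ID     INTEGER,
    "TIME" INTEGER,
    DATA   FLOAT
);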

I have an ascending and a descending index on ID/Time for time-ranged queries. I can't have a primary key since there is no unique column.
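
The index definitions look roughly like this (index names changed, and I'm treating ID/Time as compound indexes here):

CREATE ASCENDING INDEX IDX_XYZ_TIME_ASC ON XYZ (ID, "TIME");
CREATE DESCENDING INDEX IDX_XYZ_TIME_DESC ON XYZ (ID, "TIME");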

The page size is left at the default of 4096 bytes.
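
For completeness, that is the page size the database was created with, i.e. something like (path shortened):

CREATE DATABASE 'data.fdb' PAGE_SIZE 4096;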

I'm using Firebird 2.1 (Embedded) and the IBPP API.

The problem is that inserts are starting to take a long time, up to 4 minutes. Also, on startup a query is run to determine the last added record (time-wise: select MAX(Time) from XYZ); this is also a lengthy process and very memory consuming (just a hair short of using all of the 4 GB process address space).
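
Spelled out, that startup query is just:

SELECT MAX("TIME") FROM XYZ;

As far as I can tell, neither of the compound ID/Time indexes has Time as its first segment, so this query cannot use them and ends up reading the whole table, which would at least explain the runtime.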

To attack the problem above, I'm thinking of splitting the database into several smaller databases, limiting each one to around 1 GB, a size at which none of the above problems is visible.

Before I do that, I just wanted to throw out a question here: any good advice? I'd rather keep one database than split it up, but right now I'm out of ideas.