Subject: Re: Small Subsets of data
Author: Adam
--- In, Lee Jenkins <lee@...> wrote:
> Hi all,
>
> I'm currently working on a re-write of an existing application. In this
> application, I need to pull small subsets of data from a table that will
> have many records in it (2 million eventually?), but I may only need to
> grab, say, 3, 5, or maybe 15 records, and I need to get them fast.
>
> The existing app was originally written against an MS Access database.
> Because of this, the current day's records were held in temporary tables
> until end of day was run, at which point they were transferred to a
> historic table. This was done to avoid 1) pulling all those records over
> the network to query locally with the Access database engine, and 2)
> having to search through that many records to find just a few. When we
> ported the app over to Firebird, it was necessary to stick with this
> model.
>
> Now that I'm re-writing the app, I want to use just one table. While
> having to pull thousands upon thousands of records over the pipe to
> query them locally is no longer a concern, I'm concerned about the
> server having to query through that many records to find just a few.
>
> PKs in the database are based on a static local ID and a database-wide
> generator that produces Varchar(35) PKs of the form
> LocalID-GeneratorValue, i.e.:
>
> 555-1
> 555-2
> 555-3
> ....
> 555-2220303030
> etc.
>
> FKs are the same way.
>
> I'll mostly be using a compound index on an FK with a format like that
> shown above, along with a DateTime field. Can I expect reasonable speed
> here to pull these few records out of so many?


Provided the queries you anticipate are well served by indices on the
table, this will not be a problem. Firebird will read the index (or
indices) from disk (or, more likely, from a memory cache), build a
bitmap of where the particular records of interest are, then read them
in storage order.
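Purely for illustration, here is a minimal sketch of that lookup pattern using Python's built-in sqlite3 module (not Firebird, and the table and column names are made up): with a compound index on (fk, stamp), the query planner reports an index search rather than a full-table scan for this kind of query. Firebird's bitmap-based retrieval differs in mechanics, but the planner-level point is the same.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE orders (
        pk    VARCHAR(35) PRIMARY KEY,  -- e.g. '555-1' (LocalID-Generator)
        fk    VARCHAR(35),              -- same LocalID-Generator format
        stamp TIMESTAMP,
        note  TEXT
    )
""")
# Compound index matching the anticipated lookup: FK first, then timestamp.
con.execute("CREATE INDEX idx_orders_fk_stamp ON orders (fk, stamp)")

# The plan should show a SEARCH using idx_orders_fk_stamp, not a full SCAN.
plan = con.execute("""
    EXPLAIN QUERY PLAN
    SELECT * FROM orders
    WHERE fk = '555-17' AND stamp >= '2024-01-01'
""").fetchall()
for row in plan:
    print(row)
```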

But download FlameRobin and use the test data generator to fill a test
database with 2 million records, then see for yourself.
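And if setting up a full test database is overkill for a first sanity check, a quick back-of-the-envelope trial can be run with Python's built-in sqlite3 (again, not Firebird; the row counts and names here are arbitrary). The indexed lookup should come back in milliseconds even as the table grows:

```python
import random
import sqlite3
import time

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE history (fk VARCHAR(35), stamp INTEGER, note TEXT)")
con.execute("CREATE INDEX idx_history_fk_stamp ON history (fk, stamp)")

# 200,000 rows spread over 1,000 made-up FK values; raise this toward
# 2 million for a trial closer to the real table.
rows = ((f"555-{random.randrange(1000)}", i, "x" * 20) for i in range(200_000))
con.executemany("INSERT INTO history VALUES (?, ?, ?)", rows)
con.commit()

# Time a lookup for a few records out of the many, via the compound index.
t0 = time.perf_counter()
hits = con.execute(
    "SELECT * FROM history WHERE fk = '555-42' AND stamp BETWEEN 0 AND 50000"
).fetchall()
elapsed = time.perf_counter() - t0
print(f"{len(hits)} rows found in {elapsed * 1000:.2f} ms")
```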