Subject: Re: [firebird-support] Maximum speed of firebird database
Author: Richard Wesley
On Feb 20, 2007, at 07:18, wojciech_materna wrote:

> I have Firebird 1.5.4, a database of about 20GB, 2x Xeon 3GHz, 2GB RAM,
> and I need to run calculations via a stored procedure on a 100k-record
> table, with many necessary calculations on a few additional tables. It
> takes about an hour. How can I improve the speed of calculations on a
> large amount of data? A bigger server? More RAM? Something else? When I
> split the calculations from one procedure into e.g. four running at the
> same time, it gives me no positive effect.
> Can somebody who has big databases and many calculations help me?
> The procedure is maximally clean and optimized.

In my experience, FB and many other relational engines are slow at
this type of OLAP task, where you need to do some sort of calculation
on every row of the fact table (the table containing most of your
numbers): the disk cache is smaller than the database, so the task is
essentially IO bound.

If your fact table is wide and you are only using a small subset of
the columns, consider extracting a vertical slice of the data with
only those columns (and the dimension/foreign keys you are interested
in) and then performing the calculations on that subset. This is
basically what data warehouses do (take a snapshot and do the
calculations/aggregations offline/overnight). It is also the
reasoning behind some high-performance OLAP column-oriented
relational databases (fetch only the columns you need). You will pay
for the snapshot operation on the first calculation, but hopefully
the other computations will go faster.
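As a rough sketch of that snapshot idea (all table and column names
here are hypothetical, not from your schema): extract the slice once
into a work table, then run the calculations against it.

```sql
/* Hypothetical: SALES is the wide fact table; only three measure
   columns plus one dimension key are needed for the calculations. */
CREATE TABLE SALES_SLICE (
    PRODUCT_ID INTEGER,
    QTY        INTEGER,
    PRICE      NUMERIC(15,2),
    SOLD_AT    TIMESTAMP
);

/* Pay the full-table IO cost once when taking the snapshot... */
INSERT INTO SALES_SLICE (PRODUCT_ID, QTY, PRICE, SOLD_AT)
  SELECT PRODUCT_ID, QTY, PRICE, SOLD_AT FROM SALES;

/* ...then later calculations read far fewer data pages. */
SELECT PRODUCT_ID, SUM(QTY * PRICE)
  FROM SALES_SLICE
  GROUP BY PRODUCT_ID;
```

The win comes from row width: if the fact table has dozens of columns
and the slice keeps four, each data page holds many more of the rows
you actually need.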

I'm also assuming that your tables are well-normalised. If not, you
may have wider rows than you need for these operations, which hurts
performance because you are reading unnecessary data. If you have
(say) every salesperson's name repeated in the fact table, consider
factoring the names out into a separate table with a foreign key.
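For example (again with hypothetical names), the factoring-out might
look like this:

```sql
/* Hypothetical: ORDERS currently repeats SALESPERSON_NAME per row. */
CREATE TABLE SALESPERSON (
    SALESPERSON_ID INTEGER NOT NULL PRIMARY KEY,
    NAME           VARCHAR(60)
);

/* The fact table then carries only the narrow foreign key... */
ALTER TABLE ORDERS ADD SALESPERSON_ID INTEGER
    REFERENCES SALESPERSON (SALESPERSON_ID);

/* ...and the name is joined in only when a report needs it. */
SELECT o.ORDER_ID, s.NAME
  FROM ORDERS o
  JOIN SALESPERSON s ON s.SALESPERSON_ID = o.SALESPERSON_ID;
```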

You may also be able to gain some performance by bumping up the page
size to cut down on the number of data reads, but this may affect the
performance of your database if it is live. In general, the settings
for data collection and data analysis may be somewhat different,
which is another reason why large data warehouses usually work off
snapshots rather than live data.
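Note that in Firebird the page size is fixed when the database is
created, so changing it means a backup/restore cycle with gbak. A
sketch (file names and credentials are placeholders; check the gbak
options for your version and test on a copy, not the live database):

```shell
# Back up the existing database (placeholder file names)
gbak -b -user SYSDBA -password masterkey mydb.fdb mydb.fbk

# Restore with a larger page size (e.g. 8192 bytes) to cut page reads
gbak -c -p 8192 -user SYSDBA -password masterkey mydb.fbk mydb_new.fdb
```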

Richard Wesley Senior Software Developer Tableau