Subject: Re: [firebird-support] OLAP tuning in Firebird 2.1
Author: Richard Wesley
Hi Set -

Thanks for responding.

On 19 Jan 2009, at 23:58, Svein Erling Tysvær wrote:

> Processing 600000 records is always going to take a bit of time (not
> necessarily too long, but it will not be instant). I'd say the main
> problem with your query is that there is no WHERE clause to limit
> the number of records that needs to be examined.
> If there is lots of duplicates, I suppose you could try to add
> another table to hold the unique value and populate it through a
> trigger (insert the record only when it does not exist). Then you
> could try something like:

Yes, building a de-normalised table is the other option, but that
seems to run afoul of Firebird's join performance.
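For the archives, here is roughly what I understand the trigger approach to look like. This is only a sketch; the index and trigger names are made up, I have assumed a single VARCHAR column, and I have not run it:

```sql
-- Hypothetical one-column table holding the distinct values.
CREATE TABLE "TableauUnique" ("Object Source" VARCHAR(255));

-- Unique index so duplicates are rejected and lookups are fast.
CREATE UNIQUE INDEX unq_object_source ON "TableauUnique" ("Object Source");

SET TERM ^ ;
-- Insert into the unique table only when the value is not already there,
-- as suggested: fires on every insert into the fact table.
CREATE TRIGGER te_track_unique FOR "TableauExtract"
AFTER INSERT
AS
BEGIN
  IF (NOT EXISTS (SELECT 1 FROM "TableauUnique" u
                  WHERE u."Object Source" = NEW."Object Source")) THEN
    INSERT INTO "TableauUnique" ("Object Source")
    VALUES (NEW."Object Source");
END^
SET TERM ; ^
```

If I read the 2.1 release notes correctly, UPDATE OR INSERT ... MATCHING could replace the EXISTS test in the trigger body, but I have not tried that either.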

> SELECT DISTINCT tu."Object Source" AS "none:Object Source:nk"
> FROM "TableauUnique" tu
> WHERE EXISTS (SELECT * FROM "TableauExtract" te WHERE te."Object Source" = tu."Object Source")
> Though I know nothing about either OLAP

OLAP: Online Analytical Processing
OLTP: Online Transaction Processing

> or whether "TableauExtract"."Object Source" contains lots of
> duplicates, so my answer may or may not be useful.

Yes, it contains lots of duplicates, and this is generally true for
data warehouses. The tension is this: de-normalising reduces disk
reads on the fact table when there are lots of dimensions and allows
quick domain queries on the dimensions themselves, but it seems to be
slow when joining dimensions to the fact table for multi-dimensional
analysis (a common OLAP operation). We are trying to decide which way
to jump, but before we can do that, we need to know whether our use
of Firebird for these kinds of queries is optimal, which is what I am
asking about.

Richard Wesley
Senior Software Developer
Tableau