Subject | Optimizing FB 1.5 for Data Warehousing |
---|---|
Author | myles@techsol.org |
Post date | 2007-10-06T13:11:02Z |
We are using a Firebird 1.5 SQL database for a data warehouse on a number of
sites, where an external application does a nightly 'kill & fill' of the
database. The database contains about 20 tables of business data, and about
1/4 of the tables contain in excess of 20,000 rows.
What we have been noticing is a gradual deterioration in the speed of this process over time, yet the query performance of the database remains acceptable.
The nightly process first does a DELETE of all rows in all tables, leaving the database empty. It then parses the business data for the 20 core tables and constructs INSERT statements to load that data into them. This is where we are seeing the deterioration in speed. What started off as a one-hour process for one site has gradually stretched to a three-hour process, and so on. My concern is that if this continues, the nightly process will spill over into daily transaction processing on their master system, making it unusable.
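To make the shape of the job concrete, it boils down to something like the sketch below (the table and column names are only placeholders, not our actual schema):

    /* 1. "Kill": empty every table (Firebird 1.5 has no TRUNCATE, so plain DELETEs) */
    DELETE FROM customers;
    DELETE FROM orders;
    /* ... repeated for each of the ~20 tables ... */

    /* 2. "Fill": reload each table from the compiled business data,
       one parameterised INSERT per extracted row */
    INSERT INTO orders (order_id, customer_id, order_date, order_total)
    VALUES (?, ?, ?, ?);
    /* ... executed for every row, for each of the ~20 tables ... */

It is these DELETEs plus the repeated INSERTs that keep getting slower night after night.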
Is there anything that can be done to optimize a database for this type of use? I'm happy to give up a little query performance to gain on INSERT performance. There are some foreign key indices on this database for referential integrity. I took a sample of one table that is noticeably slower for inserting over time and noticed that it has 4 FKs on it, with NO ACTION as the rule on those constraints. There is also a generator for each table with a BEFORE INSERT trigger, but that simply assigns a unique ID to new rows. That's about it.
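For reference, the ID-assigning trigger on each table looks roughly like the following (again, the table, column, and generator names here are illustrative):

    SET TERM ^ ;
    CREATE TRIGGER orders_bi FOR orders
    ACTIVE BEFORE INSERT POSITION 0
    AS
    BEGIN
      /* Assign a unique ID from the table's generator if none was supplied */
      IF (NEW.order_id IS NULL) THEN
        NEW.order_id = GEN_ID(gen_orders_id, 1);
    END^
    SET TERM ; ^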
Any optimization suggestions would be greatly appreciated.
Regards,
Myles
============================
Myles Wakeham
Director of Engineering
Tech Solutions USA, Inc.
Scottsdale, Arizona USA
www.techsol.org
Phone +1-480-451-7440
Try our new Outlook Utility 'Split Personality'
http://splitpersonality.techsol.org