Subject | RE: [firebird-support] Database design 2.1 and Table Size
---|---
Author | Svein Erling Tysvær
Post date | 2008-05-26T13:14:21Z
Hi Paul!
I've no experience with tables this size, nor with Fb 2.1, so hopefully someone else will answer as well.
My largest table (Fb 1.5) contains 15m records (and 74 fields), though I don't really use that table directly; instead I use another table in the same database, into which a trigger extracts the 4m records I'm actually interested in.
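Roughly, such an extraction trigger looks something like this (a sketch only; BIG_TABLE, EXTRACT_TABLE and the columns are made-up names, not my actual schema):

```sql
/* Illustrative sketch with made-up names: copy the rows of
   interest into the smaller extract table as they are inserted. */
SET TERM ^ ;
CREATE TRIGGER BIG_TABLE_AI FOR BIG_TABLE
AFTER INSERT
AS
BEGIN
  IF (NEW.CATEGORY = 'INTERESTING') THEN
    INSERT INTO EXTRACT_TABLE (ID, PERSON_ID, CATEGORY)
    VALUES (NEW.ID, NEW.PERSON_ID, NEW.CATEGORY);
END^
SET TERM ; ^
```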
Joining these two tables seems reasonably quick. I ran a join with a WHERE clause that I knew no rows satisfied, and when the criteria focused on the biggest table the result was instant (0.062 seconds or so). After changing the criteria slightly, it counted some 8000 rows in 1.5 seconds. Going back to my first test, but with one criterion on each of the two tables, it took just over 10 seconds to return 13 rows (I didn't believe there would be any rows, but I forgot that persons can move). I suspect this is because about 17000 rows satisfy the criterion on one table, and if I remove the other criterion the count exceeds 700000 rows in total, so it may have taken a bit of checking to end up with those 13 rows.
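The tests were queries of roughly this shape (again with made-up table and column names):

```sql
/* Roughly the shape of the timing tests - names are made up.
   The selectivity of each criterion decides how many candidate
   rows the server has to check before returning the count. */
SELECT COUNT(*)
FROM BIG_TABLE b
JOIN EXTRACT_TABLE e ON e.ID = b.ID
WHERE b.SOME_DATE >= '2008-01-01'  /* criterion on the big table   */
  AND e.REGION = 'XX';             /* criterion on the small table */
```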
I hope others can say whether 125 million rows are feasible or not; I expect it to be easy for Firebird, but I do not know. What I would recommend you consider is doing the import every day rather than once a month: importing 70000 records per day ought to be simpler than importing 2m records once a month, and you probably don't have to care about disabling indexes if you do daily updates.
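If you do stick with monthly batches, the usual Firebird pattern is to deactivate the indexes, load the data, and reactivate them afterwards (reactivating rebuilds the index from scratch). Note this only works for ordinary indexes, not indexes backing primary or foreign key constraints. A sketch with a made-up index name:

```sql
/* ITEM_IDX is a made-up name; constraint-backed indexes
   cannot be deactivated this way. */
ALTER INDEX ITEM_IDX INACTIVE;

/* ... bulk INSERT of the monthly batch here ... */

ALTER INDEX ITEM_IDX ACTIVE;  /* reactivating rebuilds the index */
```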
Set
-----Original Message-----
From: firebird-support@yahoogroups.com [mailto:firebird-support@yahoogroups.com] On Behalf Of Paul Hope
Sent: 26 May 2008 14:07
To: firebird-support@yahoogroups.com
Subject: [firebird-support] Database design 2.1 and Table Size
Hi
I am looking for an indication that a database table will still perform
adequately if it grows to 125m rows. We will change from FB1.5 to FB2.1
(SuperServer on MS Windows).
Transfer of historic data will start it with 21m records. It will then grow
at 24m per year. Data archiving will then probably keep it to below 150m
records. The table is not very wide, having about 30 fields, one being
varchar(100) and the rest numeric(9,2) or integers. Interest in the data
declines with time and the most recent 24m records will be the most active.
Key requirements are -
Batch insert of 2m records each month - indexes can be disabled for this.
For reporting, joins to a 2m record table (<500ms response for one row?).
Is this requirement likely to be a challenge to FB2.1 or will it take it in
its stride?
Does it imply any extra hardware requirements - especially RAM (assuming
dual processors, RAID, etc.).
How much of a benefit might 64-bit Windows Server provide?
Regards
Paul