| Subject | FB Size Limitations/Performance Questions |
|---|---|
| Author | Lee |
| Post date | 2003-10-14T00:11:56Z |
Hello all,
I've recently switched to FB from IB6 and have been very pleased. Now I am
starting a new project that I would like to use FB for, but I wanted to get
some advice about the size of FB databases and performance.
For this project, I will be polling remote stores to get aggregated sales
information to put into a centralized database that will be accessible to
clients written using kbmMW middleware. I don't have any concrete idea of
the amount of data that will accumulate over time, but I imagine it will be
a lot.
For instance, sales line item information will be retrieved. That
information will be grouped so that, say, 1,000 raw line items would turn
into maybe 200 aggregated items sent up to the server (FB) database.
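To make the grouping concrete, here's a rough sketch of the kind of aggregation I mean (Python just for illustration; the column names and record layout are made up):

```python
from collections import defaultdict

# Hypothetical raw line item: (store_id, sale_date, sku, qty, amount).
# Grouping by (store_id, sale_date, sku) is what collapses ~1,000 raw
# rows into ~200 aggregated rows per store per day.
def aggregate(line_items):
    totals = defaultdict(lambda: [0, 0.0])  # key -> [total_qty, total_amount]
    for store_id, sale_date, sku, qty, amount in line_items:
        key = (store_id, sale_date, sku)
        totals[key][0] += qty
        totals[key][1] += amount
    # One aggregated row per key is what gets sent up to the central FB database.
    return [(sid, d, sku, q, amt) for (sid, d, sku), (q, amt) in totals.items()]
```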
An average of 200 items a day would be 73,000 rows added annually (guessing
here). Some of our clients have 200 stores, which could mean 14,600,000 rows
added annually to a single database. This seems to be an extreme example,
since better aggregation would result in fewer rows being added, but who
knows?
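In case my arithmetic isn't clear, the estimate works out like this (the constants are all guesses on my part):

```python
# Back-of-envelope estimate of annual row growth; every number is a guess.
ROWS_PER_STORE_PER_DAY = 200   # aggregated items sent up per store per day
DAYS_PER_YEAR = 365
STORES = 200                   # our largest client

per_store_annual = ROWS_PER_STORE_PER_DAY * DAYS_PER_YEAR  # 73,000
total_annual = per_store_annual * STORES                   # 14,600,000

print(f"Per store:  {per_store_annual:,} rows/year")
print(f"{STORES} stores: {total_annual:,} rows/year")
```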
Can FB handle this kind of size?
What is the largest number of rows in a table anyone has seen?
I know that there are way too many variables to give an accurate yes or no,
but just a general idea would be great...!
Thanks,
Lee