Subject RE: [firebird-support] Optimal algorithm for inventory
Author Svein Erling Tysvær
A few thoughts:

Would anyone actually care to know exactly how many people lived in London on 23 May 1930, or would an approximation be good enough? If you had, say, a table with the population for each year, you could make a decent guess with something like

SELECT ((cast('23.05.1930' as Date) - cast('31.12.1929' as Date)) / (cast('31.12.1930' as Date) - cast('31.12.1929' as Date)) * a1929.Population +
(cast('31.12.1930' as Date) - cast('23.05.1930' as Date)) / (cast('31.12.1930' as Date) - cast('31.12.1929' as Date)) * a1930.Population)/2
from Population a1929
join Population a1930 on a1929.Town = a1930.Town
and a1929.Year = a1930.Year - 1
where a1929.year = 1929

This assumes that the population grew or shrank linearly throughout the year; if that is too crude an approximation, you could of course store monthly or weekly totals rather than yearly ones. Also note that there are errors in what I wrote (for instance, subtracting one date from another returns an integer, and integer/integer always yields an integer, so the factor to be multiplied with the population will always be 0 unless you modify things a bit) - it was just an example.
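The integer-division pitfall goes away once the division is done in floating point. Here is a small sketch of the same linear interpolation outside SQL, with the weighting written out explicitly (the population figures are made up purely for illustration):

```python
from datetime import date

def interpolate_population(target, prev_end, next_end, pop_prev, pop_next):
    """Linearly interpolate a population between two year-end counts.

    `prev_end`/`next_end` are the dates the counts were taken;
    dividing day counts as floats avoids the integer/integer trap
    mentioned above.
    """
    year_len = (next_end - prev_end).days           # e.g. 365
    f = (target - prev_end).days / year_len         # fraction of the year elapsed
    # Weight the earlier count by (1 - f) and the later count by f.
    return (1 - f) * pop_prev + f * pop_next

# Hypothetical figures for 23 May 1930:
estimate = interpolate_population(
    date(1930, 5, 23),
    date(1929, 12, 31), date(1930, 12, 31),
    8_000_000, 8_100_000)
```

At the two year-end dates the function returns the stored counts exactly, and in between it moves linearly from one to the other.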

It shouldn't be too much of a problem breaking things down into regions. I sometimes use a table containing the population of Norway for each year (albeit only a fifty-year timespan); we have separate records for municipalities, counties and Norway as a whole, and columns for each age from 0 to 99 years. It is a small table with just over 64000 rows. Of course it would grow if I were to store the population for each day rather than each year, but I doubt it would cause Firebird much trouble looking it up. Let's say you have 10000 places you may want to know the population of, and that you're interested in the population for each day of the last century. That would require a table of 365.25 million rows. I would assume populating this table the first time would be time-consuming, but once that's done, things ought to perform pretty well with decent indexing, and adding 10000 records each day should be no problem.
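The row-count figure above is just arithmetic, and can be sanity-checked in a couple of lines (the numbers are the ones from the example, not measured data):

```python
# Back-of-the-envelope size of a daily-resolution population table.
places = 10_000          # places whose population we track
years = 100              # one century of history
days_per_year = 365.25   # average, counting leap years

rows = int(places * years * days_per_year)   # total rows after backfill
rows_added_per_day = places                  # ongoing daily insert volume
```

365.25 million rows is large but unremarkable for an indexed lookup table; the one-off backfill is the expensive part, while the steady-state load is only 10000 inserts a day.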