Subject: Re: time series and FB/IB
Author: Jan Pomahac <>
--- In, Svein Erling Tysvaer <svein.erling.tysvaer@k...> wrote:
> 10 000 000 rows at 1 row per half hour. That should be about 570
> years of

8 years of circa 70 series.

> data in your table. Still, it is not all that big - even we have
> tables of that size (albeit not using time series).
> Provided you use a generated field for your PK and have at least a
> descending index on your datetime field (assuming the latest data are
> queried), this should be possible to make work OK. Note though that I
> never worked with time series; if you need to go through 10 000 000
> rows to calculate row # 10 000 001 and repeat that every 30 minutes, I
> guess it could consume quite a bit of processing power.
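A minimal sketch of the descending index Set describes, assuming a data table named "measurement" with a TIMESTAMP column "ts" (both names are made up for illustration):

```sql
-- Descending index: the newest timestamps come first in index order.
CREATE DESCENDING INDEX ix_measurement_ts_desc ON measurement (ts);

-- With the index, fetching the latest row need not visit all 10 000 000 rows.
SELECT FIRST 1 * FROM measurement ORDER BY ts DESC;
```

SELECT FIRST is Firebird-specific syntax; on InterBase you would limit the result set client-side or via a stored procedure instead.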
> >1. I plan to create new table "series" containing
> >fields "series_id", "where" and "what" and use field "series_id" as
> >a part of the data tables' primary key.
> The series table could be a good idea, but don't use the series_id
> as part of the PK of another table. Use a generator to populate the
> PK; never use composite primary keys.
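Set's suggestion (a generator-populated surrogate key, with series_id as a plain foreign key rather than part of a composite PK) might look roughly like this in Firebird; all names are hypothetical, and "WHERE"/"WHAT" need quoting (dialect 3) because they clash with reserved words:

```sql
-- Lookup table for the ~70 series.
CREATE TABLE series (
  series_id INTEGER NOT NULL PRIMARY KEY,
  "WHERE"   VARCHAR(60),
  "WHAT"    VARCHAR(60)
);

CREATE GENERATOR gen_measurement_id;

-- Surrogate PK fed from the generator; series_id is only a foreign key,
-- not part of a composite primary key.
CREATE TABLE measurement (
  measurement_id INTEGER NOT NULL PRIMARY KEY,
  series_id      INTEGER NOT NULL REFERENCES series (series_id),
  ts             TIMESTAMP NOT NULL,
  val            DOUBLE PRECISION
);
```

New PK values would come from GEN_ID(gen_measurement_id, 1), either in the INSERT itself or from a BEFORE INSERT trigger.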

Why? I think that an index on "series_id" would have low selectivity.

> >2. Use timestamp instead of the fields "year" ... "halfhour".
> Good idea (I think).
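If the separate "year" ... "halfhour" columns are replaced by a single TIMESTAMP, the old pieces can still be recovered in queries with EXTRACT; a sketch, again using the hypothetical "measurement"/"ts" names:

```sql
-- Recover the old "year" and "halfhour" (0..47) values from the timestamp.
SELECT EXTRACT(YEAR FROM ts) AS yr,
       EXTRACT(HOUR FROM ts) * 2 + EXTRACT(MINUTE FROM ts) / 30 AS halfhour
FROM measurement;
```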
> >I hope these changes can improve database performance, but I am
> >still satisfied with the data part of the tables.
> >I thought about using array fields to store more data in a row
> >(say 1440 minutes in the case of the first type of series), but I
> >am not able to access such fields from Delphi.
> I think IBO lets you access array fields, but I've never used them.
> >Usually we insert data in batch mode once a week.
> This could be OK; if inserting them takes too long, you just change
> to once a day / once an hour, whatever.
> HTH,
> Set