Subject Re: [ib-support] Re: time series and FB/IB
Author Svein Erling Tysvaer
10,000,000 rows at 1 row per half hour - that should be about 570 years of
data in your table. Still, it is not all that big; even we have tables
that size (albeit not for time series).

Provided you use a generated field for your PK and have at least a
descending index on your datetime field (assuming the latest data are
queried most often), this should be possible to make work OK. Note though
that I've never worked with time series; if you need to go through
10,000,000 rows to calculate row #10,000,001 and repeat that every 30
minutes, I guess it could consume quite a bit of processing power.
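A minimal sketch of what I mean - table and field names (MEASUREMENT, SAMPLE_VALUE, etc.) are just placeholders, not anything from your schema:

```sql
/* Hypothetical time-series table: generated surrogate PK plus a
   descending index on the timestamp for "latest first" queries. */
CREATE TABLE MEASUREMENT (
  MEASUREMENT_ID INTEGER NOT NULL PRIMARY KEY,
  SERIES_ID      INTEGER NOT NULL,
  MEASURED_AT    TIMESTAMP NOT NULL,
  SAMPLE_VALUE   DOUBLE PRECISION
);

CREATE DESCENDING INDEX IDX_MEASUREMENT_TS
  ON MEASUREMENT (MEASURED_AT);
```

With the descending index, a query ordered by MEASURED_AT DESC can pick up the newest rows without scanning the whole table.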

>1. I plan to create new table "series" containing
>fields "series_id", "where" and "what" and use field "series_id" as
>a part of data tables primary key.

The series table could be a good idea, but don't use the series_id as part
of the PK of another table. Use a generator to populate the PK, never use
composite primary keys.
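The usual generator pattern looks something like this (all names are illustrative):

```sql
/* Generator plus a before-insert trigger to populate the PK
   when the client doesn't supply one. */
CREATE GENERATOR GEN_MEASUREMENT_ID;

SET TERM ^ ;
CREATE TRIGGER BI_MEASUREMENT FOR MEASUREMENT
ACTIVE BEFORE INSERT AS
BEGIN
  IF (NEW.MEASUREMENT_ID IS NULL) THEN
    NEW.MEASUREMENT_ID = GEN_ID(GEN_MEASUREMENT_ID, 1);
END ^
SET TERM ; ^
```

series_id then stays as an ordinary (indexed) column referencing the series table, rather than being part of the data table's PK.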

>2. Use timestamp instead fields "year" ... "halfhour".

Good idea (I think).
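One TIMESTAMP field replaces the whole "year" ... "halfhour" column set, and you can still filter on ranges just as easily (field names are again made up):

```sql
/* Range query on a single timestamp column instead of
   comparing year/month/day/halfhour columns separately. */
SELECT MEASURED_AT, SAMPLE_VALUE
  FROM MEASUREMENT
 WHERE SERIES_ID = 42
   AND MEASURED_AT BETWEEN '2003-01-01' AND '2003-01-08'
 ORDER BY MEASURED_AT DESC;
```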

>I hope these changes can improve database performance, but I am not
>still satisfied with the data part of tables.
>I thought about using array fields to store more data in a row
>(say 1440 minutes in case of first type of series), but I am not
>able to access such fields from Delphi.

I think IBO lets you access array fields, but I've never used them myself.

>Usualy we insert data in a batch mode once a week.

This could be OK; if inserting them takes too long, you just move to once a
day/once an hour, whatever.
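For the batch, the normal approach is one prepared, parameterized insert executed once per row inside a transaction, committing every so many thousand rows (names purely illustrative):

```sql
/* Prepare once, execute repeatedly with new parameter values,
   commit in chunks rather than one huge transaction. */
INSERT INTO MEASUREMENT (SERIES_ID, MEASURED_AT, SAMPLE_VALUE)
VALUES (:SERIES_ID, :MEASURED_AT, :SAMPLE_VALUE);
```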