Subject | RE: [ib-support] Re: time series and FB/IB |
---|---|
Author | Alan McDonald |
Post date | 2002-12-10T12:14:05Z |
Two questions:
1. Where is the data stored for the week until it is batched into the FB
database?
2. Once in the database, are you updating the data, or just reading,
summarising and otherwise analysing it?
Someone here made a good suggestion about something similar: if the data
is never updated, why not store it in an external file?
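(Firebird can map a file of fixed-length records onto a table via its EXTERNAL FILE feature, which is one way to realise this suggestion. As a hedged illustration only: the sketch below packs one half-hour block into a fixed-length binary record such a table could read. The field layout, function name, and record format here are my assumptions, not Jan's actual schema.)

```python
import struct

# Hypothetical fixed-length record: where_id, what_id, a unix-time stamp
# (all 32-bit ints), followed by 30 doubles -- one per minute of the
# half-hour block. Little-endian, no padding.
RECORD = struct.Struct("<iii30d")

def pack_row(where_id, what_id, stamp, values):
    """Pack one half-hour block as a fixed-length binary record."""
    if len(values) != 30:
        raise ValueError("expected exactly 30 minute values")
    return RECORD.pack(where_id, what_id, stamp, *values)

# Every record must have exactly the same length for an external table
# to be readable; here that is 3*4 + 30*8 = 252 bytes.
row = pack_row(1, 2, 1039521600, [0.0] * 30)
```

Appending records to the file with plain file I/O would then make them visible to the database without any INSERT traffic, at the cost of losing indexing and updates on that data.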
Alan
-----Original Message-----
From: honza_pom <honza@...> [mailto:honza@...]
Sent: Tuesday, 10 December 2002 10:44 PM
To: ib-support@yahoogroups.com
Subject: [ib-support] Re: time series and FB/IB
Hi all,
I am talking about approx. 350 000 000 values (1 per minute per
series), which I managed to store in approx. 10 000 000 rows (30
minutes per row). The main data table structure is currently:
where_id integer,
what_id integer,
year integer,
month integer,
day integer,
hour integer,
halfhour integer, // (0 or 30)
val00 double,
val01 double,
...
val29 double
where "where_id" and "what_id" are foreign keys to other tables.
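With this wide layout, reading a single minute means first computing which half-hour row it falls in and which valNN column holds it. A minimal sketch of that mapping (the function name is mine, but the bucket arithmetic follows directly from the layout above):

```python
def locate(minute_of_hour):
    """Map a minute within the hour (0-59) to the (halfhour, column)
    pair used by the wide half-hour table layout."""
    if not 0 <= minute_of_hour <= 59:
        raise ValueError("minute out of range")
    halfhour = 0 if minute_of_hour < 30 else 30  # row bucket: 0 or 30
    column = "val%02d" % (minute_of_hour % 30)   # val00 .. val29
    return halfhour, column
```

Client code would use this to build both the WHERE clause (year/month/day/hour/halfhour) and the select-list column for a point lookup.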
Another thing I should mention is that we store several types
of series:
Data measured every minute,
Data measured (or computed from the series above) every 30 minutes,
Data measured (or computed from series above) every day.
The tables for other series types have similar structure.
We have to add a new type of series (data measured or computed every
hour), so I am thinking about how to improve our database structure:
1. I plan to create a new table "series" containing the
fields "series_id", "where" and "what", and use "series_id" as
part of the data tables' primary key.
2. Use a timestamp instead of the fields "year" ... "halfhour".
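The two proposed changes can be sketched as follows. This is an illustrative in-memory SQLite mock-up, not Jan's actual Firebird DDL; the table and column names beyond those mentioned above are my assumptions, and the type names would need minor adjustment for Firebird.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE series (
    series_id INTEGER PRIMARY KEY,
    where_id  INTEGER NOT NULL,
    what_id   INTEGER NOT NULL
);
CREATE TABLE data_halfhour (
    series_id INTEGER NOT NULL REFERENCES series(series_id),
    stamp     TIMESTAMP NOT NULL,  -- replaces year/month/day/hour/halfhour
    val00 DOUBLE, val01 DOUBLE,    -- ... through val29, elided here
    PRIMARY KEY (series_id, stamp)
);
""")
con.execute("INSERT INTO series VALUES (1, 10, 20)")
con.execute("INSERT INTO data_halfhour (series_id, stamp, val00, val01) "
            "VALUES (1, '2002-12-10 10:30:00', 1.5, 2.5)")
row = con.execute("SELECT val00 FROM data_halfhour "
                  "WHERE series_id = 1 "
                  "AND stamp = '2002-12-10 10:30:00'").fetchone()
```

The compound key (series_id, stamp) replaces the seven-column key, so adding the new hourly series type only means adding rows to "series", not new key columns.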
I hope these changes will improve database performance, but I am
still not satisfied with the data part of the tables.
I thought about using array fields to store more data per row
(say, 1440 minutes in the case of the first series type), but I am
not able to access such fields from Delphi.
Usually we insert data in batch mode once a week.
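(For a weekly batch load like this, the usual advice is one prepared statement executed repeatedly inside a single transaction. A hedged sketch, again using SQLite as a stand-in for Firebird and a throwaway table name:)

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (series_id INTEGER, stamp TEXT, v DOUBLE)")

# A week's worth of illustrative rows, loaded as one batch: the "with"
# block opens a single transaction and commits once at the end, so the
# per-row overhead is one prepared-statement execution, not one commit.
rows = [(1, "2002-12-%02d 00:00:00" % d, float(d)) for d in range(1, 8)]
with con:
    con.executemany("INSERT INTO t VALUES (?, ?, ?)", rows)

count = con.execute("SELECT COUNT(*) FROM t").fetchone()[0]
```

With Firebird the same idea applies through a parameterised INSERT and a single explicit commit at the end of the batch.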
Jan Pomahac
--- In ib-support@yahoogroups.com, Svein Erling Tysvaer
<svein.erling.tysvaer@k...> wrote:
> Hi Jan!
>
> No, I do not have any such experience, but you would be more likely to get
> a reply if you specified huge. Approximately how many rows are we talking
> about, how many transactions/connections to insert them, and will it be an
> even load throughout the day?
>
> We all have different definitions of huge; what makes desktop databases
> cripple is not all that much in the world of c/s databases.
>
> Set
>
> At 08:40 10.12.2002 +0100, you wrote:
> >Hi all,
> >
> >Does anyone have experience storing huge time series data into FB/IB?
> >
> >Jan Pomahac
ib-support-unsubscribe@egroups.com
Your use of Yahoo! Groups is subject to http://docs.yahoo.com/info/terms/