| Subject | Re: time series and FB/IB |
|---|---|
| Author | Jan Pomahac <honza@ekc.cz> |
| Post date | 2002-12-10T13:14:46Z |
--- In ib-support@yahoogroups.com, "Alan McDonald" <alan@m...> wrote:
> two questions,
> 1. where is the data stored for a week until they are batched into
> the FB database?

Files on another PC (offline), floppy disks (as staff collects data
from different locations), ...

> 2. once in the database, are you updating the data or just reading,
> summarising and otherwise analysing?

Primarily reading, but we also update data (usually once or twice,
to correct or delete bad data).

> I think someone here made a good suggestion to something similar
> here, and that's: if it's not being updated, why not store it in an
> external file?

How do I ensure data integrity in an external file, when the data
are accessed from different clients and we plan to publish part of
our data on the Internet?

Jan Pomahac
> Alan
>
> -----Original Message-----
> From: honza_pom <honza@e...> [mailto:honza@e...]
> Sent: Tuesday, 10 December 2002 10:44 PM
> To: ib-support@yahoogroups.com
> Subject: [ib-support] Re: time series and FB/IB
>
>
> Hi all,
>
> I am talking about approx. 350 000 000 values (1 per minute per
> series), which I managed to store in approx. 10 000 000 rows (30
> minutes in 1 row). Now the main data table structure is:
>
> where_id integer,
> what_id integer,
> year integer,
> month integer,
> day integer,
> hour integer,
> halfhour integer, // (0 or 30)
> val00 double,
> val01 double,
> .
> .
> .
> val29 double
>
> where "where" and "what" have foreign key to another tables.
>
> Another thing I should mention is that we store several types
> of series:
>
> Data measured every minute,
> Data measured (or computed from the series above) every 30 minutes,
> Data measured (or computed from series above) every day.
>
> The tables for the other series types have a similar structure.
>
> We have to add a new type of series (data measured or computed every
> hour), so I am thinking about how to improve our database structure:
> 1. I plan to create a new table "series" containing the
> fields "series_id", "where" and "what", and to use "series_id" as
> part of the primary key of the data tables.
> 2. Use a timestamp instead of the fields "year" ... "halfhour"
> (both changes are sketched below).
> I hope these changes can improve database performance, but I am
> still not satisfied with the data part of the tables.
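>
> Roughly, the new structure would look like this (again only a
> sketch - table names are placeholders, val01 ... val28 left out):
>
> CREATE TABLE series (
>   series_id  INTEGER NOT NULL PRIMARY KEY,
>   where_id   INTEGER NOT NULL,
>   what_id    INTEGER NOT NULL
> );
>
> CREATE TABLE halfhour_data (
>   series_id   INTEGER NOT NULL REFERENCES series (series_id),
>   start_time  TIMESTAMP NOT NULL,  /* replaces year ... halfhour */
>   val00       DOUBLE PRECISION,
>   /* ... */
>   val29       DOUBLE PRECISION,
>   PRIMARY KEY (series_id, start_time)
> );
>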
> I thought about using array fields to store more data in a row
> (say 1440 minutes in the case of the first type of series), but I
> am not able to access such fields from Delphi.
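>
> (For completeness, the array variant I had in mind would be
> declared roughly like this - one row per series and day, names
> again made up:
>
> CREATE TABLE minute_data_arr (
>   series_id  INTEGER NOT NULL,
>   day_start  TIMESTAMP NOT NULL,
>   vals       DOUBLE PRECISION [1440],  /* 1440 minutes of one day */
>   PRIMARY KEY (series_id, day_start)
> );
>
> but without access from Delphi it does not help us.)
>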
> Usually we insert data in batch mode once a week.
>
> Jan Pomahac
>
> --- In ib-support@yahoogroups.com, Svein Erling Tysvaer
> <svein.erling.tysvaer@k...> wrote:
> > Hi Jan!
> >
> > No, I do not have any such experience, but you would be more
> > likely to get a reply if you specified "huge". Approximately how
> > many rows are we talking about, how many transactions/connections
> > to insert them, and will it be an even load throughout the day?
> >
> > We all have different definitions of huge; what cripples desktop
> > databases is not all that much in the world of c/s databases.
> >
> > Set
> >
> > At 08:40 10.12.2002 +0100, you wrote:
> > >Hi all,
> > >
> > >Does anyone have experience storing huge time series data into
> FB/IB?
> > >
> > >Jan Pomahac
>