Subject: RE: [firebird-support] Large volumes of data
Author: Edwin A. Epstein, III
The data I use to link the tables together is a 10-digit number. It
started out in InterBase, which did not have the BigInt datatype, so I needed
something that was 64-bit. Double precision has worked just fine for me. I
could probably convert it now to BigInt, but why fix something that "ain't
broke"?

For lookups I just pass the number as 1234567890; when exported it will look
like 1234567890.00.
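The reason this "ain't broke" is that an IEEE 754 double has a 53-bit significand, so every integer up to 2**53 (about 9.0e15) is stored exactly; the largest 10-digit ID is far below that limit. A minimal sketch (the specific ID values are illustrative, not from the thread):

```python
# Why a 10-digit ID survives a DOUBLE PRECISION column:
# doubles represent every integer up to 2**53 exactly.
EXACT_LIMIT = 2 ** 53  # 9_007_199_254_740_992

max_ten_digit_id = 9_999_999_999
as_double = float(max_ten_digit_id)

print(as_double == max_ten_digit_id)   # True: the round-trip is exact
print(max_ten_digit_id < EXACT_LIMIT)  # True: well within the exact range

# Beyond 2**53 exactness breaks down: adjacent integers collide.
print(float(EXACT_LIMIT) == float(EXACT_LIMIT + 1))  # True
```

So as long as the keys stay whole numbers below 2**53 end to end, equality joins on double-precision columns behave like integer joins.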

-----Original Message-----
From: Kjell Rilbe [mailto:kjell.rilbe@...]
Sent: Friday, February 18, 2005 8:42 AM
To: firebird-support@yahoogroups.com
Subject: Re: [firebird-support] Large volumes of data



Edwin A. Epstein, III wrote:

> I have a couple of databases with 20 million + rows in 2 or 3 tables
> linked by indexed double precision fields. I have not needed to link them
> with a
> foreign key constraint. I have PK's on double precision fields in each of
> the tables. As far as speed goes, lookups on 2 joined tables takes about
> 0.062 seconds. Fetching 10K-100K records at a time takes me about 8-30
> seconds.

Are you saying that you're using a floating point type for PK/FK? I
would have thought that floating point numbers would be inherently
problematic for such purposes, considering they will suffer
truncation/round-off errors as soon as you pass them through some API
that uses a different floating point number format, or to/from string
format.

Or am I missing something?
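Kjell's concern is real whenever a key leaves the 64-bit double domain. A sketch of two such failure modes (the conversions shown are hypothetical client-side steps, not anything Firebird itself does):

```python
import struct

key = 9_999_999_998  # a 10-digit surrogate key (illustrative value)

# Passing the key through a single-precision (32-bit) float corrupts it:
# float32 has only a 24-bit significand, far too small for 10 digits.
as_float32 = struct.unpack("f", struct.pack("f", float(key)))[0]
print(as_float32 == key)     # False: the key changed in transit

# A string conversion with too few significant digits also loses it:
print("%.6g" % float(key))   # '1e+10' -> the exact digits are gone
```

As long as every layer keeps the value in a 64-bit double (or a 10-digit string), the round-off Kjell worries about never occurs; it only bites if some intermediate format is narrower.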

Kjell
--
--------------------------------------
Kjell Rilbe
Adressmarknaden AM AB
E-post: kjell.rilbe@...
Telefon: 08-761 06 55
Mobil: 0733-44 24 64
