Subject: 32-bit vs 64-bit arithmetic
Author: Tim Ward
So I've got this stored procedure which takes DECIMAL(18,6) parameters
and returns DECIMAL(18,12) results, and internally copies stuff into
and out of DOUBLE PRECISION variables, which it uses to call a UDF that
does the actual calculations.
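
Schematically, the UDF end of it looks something like this (names
invented, real maths elided; as I understand it Firebird passes UDF
arguments by reference by default):

    #include <math.h>

    /* The matching declaration on the SQL side would be along the
       lines of:

           DECLARE EXTERNAL FUNCTION MY_CALC
               DOUBLE PRECISION
               RETURNS DOUBLE PRECISION BY VALUE
               ENTRY_POINT 'my_calc' MODULE_NAME 'my_lib';
    */
    double my_calc(double *x)
    {
        /* stand-in for the real calculation, which lives in a
           separate maths library */
        return exp(*x) * 0.5;
    }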

And the results are different (in the 8th decimal place or so) between
running on 32-bit and 64-bit systems.

I would expect that Firebird is going to do exactly the same thing on
both, down to the last bit, as the data types are defined to be what
they are and have nothing to do with the native word length of the
machine?
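
For what it's worth, Firebird stores a DECIMAL(18,6) as a 64-bit
integer scaled by 10^6, so any arithmetic it does on the value directly
is exact integer arithmetic, along these lines (a toy illustration, not
Firebird's code):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* a DECIMAL(18,6)-style value: 1.234567 held as 1234567 */
        int64_t a = 1234567;
        int64_t b = a * 3;   /* exact, bit-identical on any platform */
        printf("%lld.%06lld\n",   /* prints 3.703701 */
               (long long)(b / 1000000), (long long)(b % 1000000));
        return 0;
    }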

So I've got to look at the UDF for the difference?

(Which I'd not expect to find there either, actually, *if* the code is
the same, as IEEE arithmetic also shouldn't change just because the
length of a native integer changes. So I suspect that what I'm actually
going to be looking for is a difference between the code in the two
versions of the library called by the UDF.)
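
(Although, thinking about it, there is one way identical C source can
legitimately give different answers on the two platforms: 32-bit x86
builds traditionally use the x87 FPU, which holds intermediate results
in 80-bit extended precision, whereas x86-64 builds default to SSE2 and
round every intermediate to a 64-bit double. A toy example, nothing to
do with the actual UDF, which can print different answers depending on
whether it's built with gcc -m32 or gcc -m64:

    #include <stdio.h>

    int main(void)
    {
        /* The intermediate a + b needs more than the 53 significand
           bits of a double. Under x87 it can be held to 80 bits
           until the subtraction; under SSE2 it is rounded to a
           64-bit double first, so c comes out differently. */
        volatile double a = 1e16;
        volatile double b = 2.9999;
        double c = (a + b) - a;
        /* typically 2 with SSE2, roughly 2.9999 with x87; exact
           output depends on compiler and flags */
        printf("%.17g\n", c);
        return 0;
    }

So the same library source, simply rebuilt for the two targets, could
account for an 8th-decimal-place difference without any code change at
all.)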

--
Tim Ward