Subject Re: [ib-support] Re: 3 * 1/3 = 0 ???
Author Paul Schmidt
On 30 Aug 2002 at 19:01, rogervellacott wrote:

> The issue is not whether it is legitimate to have integer
> operations. It is whether literal values should be interpreted as
> integers by default.
> Division by literal values should default to the accurate result, not
> the inaccurate result. So 1/3 should default to 0.3333.. If I have
> declared a variable x as an integer, then it is reasonable that x/3
> (where x = 1) should return 0. But nowhere was I asked to declare
> that 1 was an integer, or that 3 was an integer. So this behaviour
> makes SQL arithmetic quite esoteric, and accident prone.

When you pass 1, that is an integer (no decimal point), and 3 is also an integer.
Since everything is an integer, it does integer math. Remember grade-school math (it's
getting tough, it's ~25 years ago for me): 3 goes into 1 exactly 0 times, with a
remainder of 1. Since we don't care about the remainder, it gets dropped, so we are
left with 0, and 0 times 3 is still 0. So 0 is the right answer.

If we use 1.0 and 3.0, that's fixed-point math, so 1.0 / 3.0 is 0.33 (mathematically
it repeats, ad nauseam), so how many times do we want to represent the repeat? Let's
try an experiment: 1.000 / 3.000 = .333333, so (1.000/3.000) * 3 should give us
.999999 (and using Dialect 3 on Firebird, it does). In fact, so does (1.000000 / 3) * 3.
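The experiment can be mimicked with Python's decimal module, assuming (as the Dialect 3 results suggest) that the result's scale is the sum of the operands' scales; fb_divide is a hypothetical helper written for this sketch, not a Firebird API:

```python
from decimal import Decimal

def fb_divide(a: str, b: str) -> Decimal:
    """Sketch of Dialect-3-style fixed-point division: the result's
    scale (digits behind the decimal point) is assumed to be the sum
    of the two operands' scales."""
    da, db = Decimal(a), Decimal(b)
    # as_tuple().exponent is negative for digits behind the point
    scale = -da.as_tuple().exponent + -db.as_tuple().exponent
    return (da / db).quantize(Decimal(1).scaleb(-scale))

print(fb_divide("1.000", "3.000"))       # 0.333333
print(fb_divide("1.000", "3.000") * 3)   # 0.999999
print(fb_divide("1.000000", "3") * 3)    # 0.999999, same total scale
```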

It's being consistent: it totals the number of digits behind the decimal point across
the operands, and uses that as the scale of the result. Fantastic. In fact the integer
case, with no digits behind the decimal point, is still consistent: neither operand
contributes anything behind the decimal point, so you're left with nothing behind the
decimal point in the result.

Paul Schmidt, President
Tricat Technologies