Subject Re: [ib-support] newbie question : NUMERIC and DECIMAL type difference
Author Marco Bommeljé
Hi there,

Strictly speaking, BigDecimal is not an Oracle-only invention: java.math.BigDecimal
is the standard Java class that JDBC drivers use for exact numerics. Oracle the
database, though, has lots of awkward, non-standard datatypes, among which only
one numeric type (NUMBER). It holds exact numerics with a precision of up to 38
digits.

This non-standard implementation is non-transparent for any
middle-ware or application code. So in the end, everyone has to adapt
to Oracle. Which is probably precisely their objective.
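For what it's worth, the class itself is plain JDK material and works the same against any database. A minimal sketch, with invented literal values and no database involved:

```java
import java.math.BigDecimal;

public class BigDecimalSketch {
    public static void main(String[] args) {
        // java.math.BigDecimal holds arbitrary-precision exact decimals,
        // so it can carry anything a NUMERIC/DECIMAL column stores.
        BigDecimal price = new BigDecimal("19.99"); // exact, scale 2
        BigDecimal total = price.multiply(new BigDecimal("3"));
        System.out.println(total); // prints 59.97, still exact
    }
}
```

In JDBC code, ResultSet.getBigDecimal() is the usual way NUMERIC/DECIMAL columns reach Java, regardless of which driver is underneath.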

Keep up the good work,

Helen Borrie wrote:
> At 02:05 PM 20/02/2003 -0500, you wrote:
> >Ann and/or Helen (I think Helen gives credit for
> >it to Ann) stated a rule of
> >thumb to use DECIMAL for things you "measure" and NUMERIC for things you
> >"count".
> Actually, the mantra is "Use floating types for things you measure and
> fixed types for things you count."
> It is documented that NUMERIC stores numbers of exactly the precision
> specified, while DECIMAL stores numbers of at least the precision
> specified. But, really, DECIMAL and NUMERIC seem the same to me. That
> is, both seem to overflow when given numbers larger than the specified
> precision. I think there might have been a difference when InterBase stored
> numbers as 32-bit integers.
> If I have a rule of thumb about scaled numerics, it is to use either
> numeric or decimal and make sure I declare a big enough scale to accommodate
> the results of any multiplications or divisions.
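Helen's point about leaving room for the results of multiplication and division shows up directly in java.math.BigDecimal, where multiply() adds the operands' scales and divide() must be told a scale explicitly. A sketch with invented numbers:

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class ScaleDemo {
    public static void main(String[] args) {
        BigDecimal a = new BigDecimal("1.25"); // scale 2
        BigDecimal b = new BigDecimal("0.15"); // scale 2
        // Multiplication adds the scales: 2 + 2 = 4.
        System.out.println(a.multiply(b)); // prints 0.1875
        // Division needs an explicit scale and rounding mode, or it can
        // throw ArithmeticException for non-terminating results like 1.25/3.
        System.out.println(a.divide(new BigDecimal("3"), 4, RoundingMode.HALF_UP)); // prints 0.4167
    }
}
```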
> Sorry I can't help with any enlightenment on BigDecimal, though. If you
> still don't have a satisfactory answer, I recommend asking directly on the
> Jaybird list.
> heLen

-- Marco Bommeljé
-- Bommeljé Crompvoets en partners bv
-- W:
-- E: mbommelj@...
-- T: +31 (0)30 2428369