Subject Re: decimal / numeric
Author csswa
The only difference I can see between the numeric and decimal types is
that decimal has a sliding precision: decimal(4,2) will accept a
value of 12345678, whereas numeric(4,2) will reject it. So even
though the decimal is nominally defined as four digits (a smallint,
two bytes), it actually defines the field as four bytes. It seems
like sleight of hand, but there must be a good reason for the
distinction between the two types.
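For instance, here is a quick sketch of what I mean (table and column
names are my own, and this assumes a dialect-3 InterBase/Firebird
database, where NUMERIC(4,2) is backed by a smallint and DECIMAL(4,2)
by a 32-bit integer):

```sql
CREATE TABLE t (
    n NUMERIC(4,2),  /* backed by SMALLINT: declared precision enforced */
    d DECIMAL(4,2)   /* backed by INTEGER: can hold more than 4 digits */
);

INSERT INTO t (n) VALUES (12345678);  /* rejected: overflows smallint */
INSERT INTO t (d) VALUES (12345678);  /* accepted despite decimal(4,2) */
```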

What would be a real-world example of where you would use decimal
rather than numeric?

Andrew Ferguson

--- In ib-support@y..., "Claudio Valderrama C." <cvalde@u...> wrote:
> ""csswa"" <csswa@y...> wrote in message
> news:a8u5f1+7o21@e...
> > The distinction is something to do with numerics storing the
> > precision/scale and decimals storing *at least* that
> > (I may have that back to front. No manual within reach right now.)
> 17) NUMERIC specifies the data type exact numeric, with the decimal
> precision and scale specified by the <precision> and <scale>.
> 18) DECIMAL specifies the data type exact numeric, with the decimal
> scale specified by the <scale> and the implementation-defined decimal
> precision equal to or greater than the value of the specified
> <precision>.
> MsSql has true numeric. We only have true decimal.
> C.
> --
> Claudio Valderrama C. - -
> Independent developer
> Owner of the Interbase® WebRing