Subject: Support for Numeric datatypes
Author: Geoff Worboys
Hi All,

I've been working on improving IBO's handling of numeric data types.
(Trying to earn my "Team IBO" badge ;-)

(Jason and Helen, this is essentially a continuation of our previously
private discussions regarding the tech sheet, however I am taking this
to the list for some general comment and suggestion.)

I have several improvements under test; you will be able to read
about them when they are ready for release.

My test results so far indicate that I need to make some changes to
avoid potential problems at the limits of large numerics with the
default native handling for controls (which currently uses the
extended data type). I had thought the limits of the range might not
be much of a problem, but comments from Burak OZLER indicate that even
the extreme limits will matter.

After reviewing what is happening with TDataset I am inclined to set
things up as follows (for BOTH TDataset and native)...

Scale = 0 - as now, direct integer implementation
Scale <= 4 - use currency unless NOBCD is given as a column attribute
(in which case native uses extended, TDataset double)
Scale > 4 - use extended in native, double in TDataset

This approach has the least impact on existing processing and simply
reuses the existing NOBCD attribute to support management of native as
well as TDataset. It also brings the native numeric processing into
line with the TDataset processing.
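
To make the intended mapping concrete, here is a minimal sketch of the
decision. It is not IBO's actual code - the type, function and
parameter names are purely illustrative:

  type
    TNumericHost = (nhInteger, nhCurrency, nhDouble, nhExtended);

  function ChooseHostType(Scale: Integer;
    HasNOBCD, ForTDataset: Boolean): TNumericHost;
  begin
    if Scale = 0 then
      Result := nhInteger          // as now: direct integer handling
    else if (Scale <= 4) and not HasNOBCD then
      Result := nhCurrency         // exact scaled-integer handling
    else if ForTDataset then
      Result := nhDouble           // TDataset falls back to double
    else
      Result := nhExtended;        // native falls back to extended
  end;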

In some respects NOBCD is misleading, since neither TDataset nor
native (nor Delphi itself) actually uses BCD anyway. It would be
better named NOCURRENCY, since that is the effect in either
environment. However, given the existing TDataset usage of the
attribute (and the presumption that it should rarely be required
anyway), it is probably acceptable, means the fewest changes, and
keeps everything consistent.

I should point out that the current IBODataset implementation only
uses BCD (currency) in Delphi/BCB v5 or later.

Theoretically, making this change to the native default processing
could introduce problems into existing code when the scale is < 4,
because theoretically it limits the number of digits to the left of
the decimal point to 14. I say theoretically because in my experiments
the last 2-3 digits of an 18-digit extended value do not seem to be
reliable anyway.
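
For reference, the 14-digit figure falls straight out of Currency's
representation; a quick check (just an illustration, nothing
IBO-specific):

  program CurrencyRange;
  {$APPTYPE CONSOLE}
  begin
    // Currency is an Int64 scaled by 10000, so its integer part tops
    // out at 922337203685477 - fifteen digits, of which only fourteen
    // are available across their full 0..9 range.
    Writeln(High(Int64) div 10000);
  end.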

In the (hopefully) rare instances where a problem is discovered, the
developer can add NOBCD to the column attributes as required. I think
it is better to update the default processing to currency, as that is
more consistent with the declared type of numeric. It is much more
relevant now, with more people on IB6/FB1, because large numerics are
now stored as scaled integers on the server.
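
To illustrate why currency is such a good fit for that scaled-integer
storage, here is a small sketch. It is not the actual IBO conversion
code, the value and scale are made up, and the pointer cast simply
relies on Currency itself being an Int64 scaled by 10000:

  program ScaledToCurrency;
  {$APPTYPE CONSOLE}
  uses SysUtils;
  type
    PInt64 = ^Int64;
  var
    Raw: Int64;
    Curr: Currency;
  begin
    Raw := 12345678901234567;     // a NUMERIC(18,2) as the server
                                  // sends it: 123456789012345.67
    // a scale <= 4 numeric converts to currency with a pure integer
    // shift of the scale - no floating point, no rounding at all
    PInt64(@Curr)^ := Raw * 100;  // scale 2 -> scale 4
    Writeln(CurrToStr(Curr));
  end.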


There are still a couple of items remaining...

1. Should the "Value" property of TIB_ColumnNumeric be changed to the
currency data type? Such a change would affect all scales. Note that
it does not impact default control processing, since that goes through
the AsString processing which I can alter according to the above.
However, it may potentially impact other user code. I am inclined to
leave it as extended and add notes to the tech sheet indicating
possible difficulties in using the property.

(Another possibility here is to instantiate different classes of
TIB_ColumnNumeric which define the Value property according to the
rules described above.)
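
For what it's worth, a rough sketch of that alternative - the class
names here are hypothetical, not IBO's actual declarations:

  type
    { hypothetical common ancestor, standing in for whatever
      IBO would actually use }
    TIB_CustomColumnNumeric = class
    end;

    TIB_ColumnNumericCurr = class(TIB_CustomColumnNumeric)
    private
      FValue: Currency;
    public
      property Value: Currency read FValue write FValue;  // scale <= 4
    end;

    TIB_ColumnNumericExt = class(TIB_CustomColumnNumeric)
    private
      FValue: Extended;
    public
      property Value: Extended read FValue write FValue;  // scale > 4 or NOBCD
    end;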


2. Variant processing. The current variant processing (which is used
more than most people probably realise - for example, when reverting
to old column values) uses AsExtended in most instances, regardless of
scale. If we are really concerned about supporting numerics properly
then this is not acceptable.

Given the apparent problems with variants in Delphi 6 (and Kylix?),
and in versions prior to 4, I suggest we adopt a proposal I made some
time ago: do variant assignment via AsString. This will avoid the
known problems without losing accuracy on large numeric values. The
downside (there is always a downside) is that the AsString processing
will impact performance where variants are used; however, I suspect
the impact would not be noticeable in most instances. It would also
mean that you cannot detect the true type of the data from the variant
type - but you can blame that on Borland and Microsoft for not doing
things properly the first time :-)
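
As a minimal sketch of what I mean - the routine names are made up for
illustration, and the real change would simply route through the
column's existing AsString accessors:

  function NumericAsVariant(const NumericAsString: string): Variant;
  begin
    // hand the exact digits to the variant as text, not as a float
    Result := NumericAsString;
  end;

  procedure VariantToNumeric(const V: Variant; var NumericAsString: string);
  begin
    // the string keeps every digit; the column's own AsString parsing
    // then turns it back into the scaled value
    NumericAsString := V;
  end;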


Does anyone see any problems with the above?
Comments, thoughts, suggestions?

--
Geoff Worboys
Telesis Computing