Subject Major problems when using multiple character sets
Author Jim Beesley
More posting issues ... hopefully this one will make it through ... sorry if it is a duplicate.

This is currently a major issue for us, and any help is greatly appreciated.

Anyway, is it possible to fetch (and send via input parameters) data from two character columns with different character sets without having it converted to the connection character set?

As an example, let's say a user has a table with two columns: one is ISO8859_1 and the other is BIG_5 (or whatever ...).
No matter what character set I connect with (*) (**), there will be some data that cannot be transcoded/copied to the sqlvar->sqldata buffer (i.e. a character in the column being fetched may not be representable in the connection character set).
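For concreteness, here's the kind of table I mean, created with a plain DSQL call (table and column names are just placeholders I'll reuse below):

#include <ibase.h>

/* Create a table mixing two character sets in one record.
   T_MIXED / C_LATIN / C_BIG5 are made-up names for illustration. */
static void create_mixed_table(isc_db_handle *db, isc_tr_handle *tr,
                               ISC_STATUS *status)
{
    static const char *ddl =
        "CREATE TABLE T_MIXED ("
        "  C_LATIN CHAR(3) CHARACTER SET ISO8859_1,"
        "  C_BIG5  CHAR(3) CHARACTER SET BIG_5)";

    isc_dsql_execute_immediate(status, db, tr, 0, ddl, 3, NULL);
}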

Ideally, I'd set sqlda->sqlvar->sqlsubtype to CS_BINARY and the data would be fetched "as-is", but it appears that the subtype field gets ignored during fetch.
I'm using dsql calls if that makes any difference.
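In case it helps, this is roughly what I'm doing between prepare and fetch (just a sketch - error handling omitted, it assumes the statement is already prepared and the output XSQLDA allocated with a large enough sqln, and the 1 I stuff into sqlsubtype is my understanding of the OCTETS/CS_BINARY charset id):

#include <ibase.h>
#include <stdlib.h>

static void fetch_as_binary(isc_stmt_handle *stmt, XSQLDA *out,
                            ISC_STATUS *status)
{
    int i;

    isc_dsql_describe(status, stmt, 1, out);

    for (i = 0; i < out->sqld; i++) {
        XSQLVAR *var = &out->sqlvar[i];

        if ((var->sqltype & ~1) == SQL_TEXT ||
            (var->sqltype & ~1) == SQL_VARYING)
            var->sqlsubtype = 1;    /* OCTETS/CS_BINARY - hoping for raw bytes */

        /* the extra short covers the length word of SQL_VARYING */
        var->sqldata = malloc(var->sqllen + sizeof(short));
        var->sqlind  = malloc(sizeof(short));
        var->sqltype |= 1;          /* report NULLs through sqlind */
    }

    /* ... but the rows still come back transliterated to the
       connection character set, as if sqlsubtype were never read. */
    while (isc_dsql_fetch(status, stmt, 1, out) == 0) {
        /* process the row */
    }
}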

Is there anything obvious that I'm missing?

(*) UNICODE_FSS won't work - try inserting 'xíx' into a CHAR(3) CHARACTER SET ISO8859_1 column, and then try to fetch it after connecting with UNICODE_FSS - you'll get a truncation error internally (no matter how large you make your UNICODE_FSS sqlvar fetch buffer) because Firebird only allocates buffers of the source column size.
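In case the repro isn't clear, this is roughly how I attach (the path is a placeholder, I'm assuming ISC_USER/ISC_PASSWORD are set in the environment, and error handling is omitted):

#include <ibase.h>
#include <string.h>

static isc_db_handle attach_as_unicode_fss(ISC_STATUS *status)
{
    isc_db_handle db = 0;
    char dpb[64], *p = dpb;
    const char *cs = "UNICODE_FSS";

    /* build a DPB that sets the connection character set */
    *p++ = isc_dpb_version1;
    *p++ = isc_dpb_lc_ctype;
    *p++ = (char) strlen(cs);
    memcpy(p, cs, strlen(cs));
    p += strlen(cs);

    isc_attach_database(status, 0, "server:/path/to/test.fdb",
                        &db, (short) (p - dpb), dpb);
    return db;
}

After preparing SELECT C_LATIN FROM T_MIXED and calling isc_dsql_describe, sqlvar[0].sqllen still comes back as 3 (the source column size), and fetching the row holding 'xíx' (4 bytes in UNICODE_FSS) fails with the truncation error even if I hang a much larger buffer off sqldata.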

(**) Connecting as NONE doesn't work either. Fetching works fine, since I just get the bytes "as-is", but if I try to use input parameters - like INSERT INTO TABLE VALUES (?) - I get a transcoding error with any NLS characters. For example, try setting sqlvar->sqldata for a parameter to 'xíx' for a CHAR(3) CHARACTER SET ISO8859_1 column ... it _ought_ to work, but the input parameter is apparently assumed to be in the NONE character set rather than the target column's character set or the declared subtype.
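And the parameter side, roughly (same placeholder names as above; 21 is the ISO8859_1 charset id as far as I can tell; error handling omitted):

#include <ibase.h>
#include <stdlib.h>

static void insert_latin1_param(isc_db_handle *db, isc_tr_handle *tr,
                                ISC_STATUS *status)
{
    isc_stmt_handle stmt = 0;
    XSQLDA *in = (XSQLDA *) calloc(1, XSQLDA_LENGTH(1));
    static char buf[] = "x\xEDx";       /* 'xíx' in ISO8859_1 */
    short ind = 0;

    in->version = SQLDA_VERSION1;
    in->sqln = 1;

    isc_dsql_allocate_statement(status, db, &stmt);
    isc_dsql_prepare(status, tr, &stmt, 0,
                     "INSERT INTO T_MIXED (C_LATIN) VALUES (?)", 3, NULL);
    isc_dsql_describe_bind(status, &stmt, 1, in);

    in->sqlvar[0].sqltype    = SQL_TEXT + 1;
    in->sqlvar[0].sqlsubtype = 21;      /* ISO8859_1 - apparently ignored */
    in->sqlvar[0].sqllen     = 3;
    in->sqlvar[0].sqldata    = buf;
    in->sqlvar[0].sqlind     = &ind;

    /* Connected as NONE, this fails with the transliteration error:
       the parameter is treated as NONE instead of the target
       column's ISO8859_1 (or the subtype I set). */
    isc_dsql_execute(status, tr, &stmt, 1, in);
}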

Thanks in advance,
-Jim Beesley