Subject | Is it safe to use UNICODE_FSS? |
---|---|
Author | Federico Tello Gentile |
Post date | 2005-09-05T14:25:55Z |
I'm using Firebird 1.5.1, with the latest JayBird JDBC driver (1.5.5). I
created the database with UNICODE_FSS as the default charset. My application is pure Java 1.5, running with Tomcat. So far I can insert and retrieve non-ASCII values from the DB and they seem to show up as I entered them.
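For context, this is roughly how the connection is set up on my side; the host, path, and credentials below are placeholders, and I'm assuming the JayBird `lc_ctype` / `charSet` connection properties are the right way to tell the driver about the encoding:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class ConnectionSketch {
    public static void main(String[] args) throws Exception {
        // Register the JayBird driver (still needed on Java 1.5 / JDBC 3).
        Class.forName("org.firebirdsql.jdbc.FBDriver");

        Properties props = new Properties();
        props.setProperty("user", "sysdba");        // placeholder credentials
        props.setProperty("password", "masterkey");
        // Connection character set used when talking to the server...
        props.setProperty("lc_ctype", "UNICODE_FSS");
        // ...and the Java charset JayBird uses to convert strings.
        props.setProperty("charSet", "UTF-8");

        // Placeholder host/port/path.
        String url = "jdbc:firebirdsql:localhost/3050:/data/mydb.fdb";
        Connection con = DriverManager.getConnection(url, props);
        try {
            // insert / select non-ASCII values here
        } finally {
            con.close();
        }
    }
}
```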
I know UNICODE_FSS is broken and has problems, but I'm not quite sure what those problems are.
On a previous project I ended up using ISO8859_1, but I remember some people recommending that I keep UNICODE_FSS.
I know the JDBC driver handles Unicode well, as Java is great at it, but should I keep using UNICODE_FSS in this database with FB 1.5.1? I don't plan to transliterate to ASCII or to any other charset. All my input is UTF-8 and my output should be UTF-8, too.
All I want when FB 2.0 comes out is to be able to create a script with all the insert statements in my DB and run them against FB 2.0 without losing the UTF-8 characters.
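Just to illustrate what I mean, something along these lines is what I'd like to be able to do when dumping the data (the table and column names are made up; the point is that the generated script stays UTF-8):

```java
import java.io.FileOutputStream;
import java.io.OutputStreamWriter;
import java.io.Writer;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;

public class DumpInserts {
    // Writes INSERT statements for a hypothetical CUSTOMERS table as UTF-8,
    // so the script could be replayed against an FB 2.0 database later.
    public static void dump(Connection con) throws Exception {
        Writer out = new OutputStreamWriter(
                new FileOutputStream("customers.sql"), "UTF-8");
        Statement st = con.createStatement();
        ResultSet rs = st.executeQuery("SELECT ID, NAME FROM CUSTOMERS");
        try {
            while (rs.next()) {
                // Escape single quotes so the value survives as an SQL literal.
                String name = rs.getString("NAME").replace("'", "''");
                out.write("INSERT INTO CUSTOMERS (ID, NAME) VALUES ("
                        + rs.getInt("ID") + ", '" + name + "');\n");
            }
        } finally {
            rs.close();
            st.close();
            out.close();
        }
    }
}
```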
Thanks.