| Subject | UTF8 and ANSI column in one table? |
| --- | --- |
| Author | |
| Post date | 2014-02-13T12:35:48Z |
Hi guys.
We are thinking (and thinking and thinking and thinking...) about converting our database to UTF8. The thing is that there are some columns we would like to keep as ANSI columns, for the following reasons (a sketch of the kind of definitions we have in mind follows the list):
1. We want to save space: we have a VARCHAR(300) column, and after switching to UTF8 it can take up to 4 times more space. This matters because the table has a lot of records; we do not want our database to grow from 20 GB to 70 GB (10 GB of which is other tables) because of that.
2. The column mentioned above is effectively used as storage for binary data: we put raw bytes into it. We do not want to replace it with a BLOB because (a) its size would be bigger, (b) BLOBs are slower, and (c) a BLOB for 300 bytes? Come on ;)
3. We have one VARCHAR(2000) column; after converting it to UTF8 it will occupy 8000 bytes, and then we will not be allowed to build an index on it...
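For illustration, here is roughly the table layout we have in mind (the table and column names are invented for this sketch; WIN1252 stands in for whichever single-byte charset we would actually pick, and OCTETS is the character set Firebird offers for raw bytes, which may suit the binary column better than a true ANSI charset):

```sql
CREATE TABLE mixed_example (
    id          INTEGER NOT NULL PRIMARY KEY,
    body_utf8   VARCHAR(2000) CHARACTER SET UTF8,    -- genuine text, needs Unicode
    raw_payload VARCHAR(300)  CHARACTER SET OCTETS,  -- raw bytes, never transliterated
    legacy_text VARCHAR(300)  CHARACTER SET WIN1252  -- single-byte "ANSI" column
);
```

Firebird allows the character set to be declared per column like this, so the question is really about how client tools cope with the mix.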
Please tell me, are my reasons for keeping some columns as ANSI valid? I have not tested it thoroughly, but I think the FIBPlus components have no problem reading and selecting data from such a table at once (I mean selecting an ANSI column and a UTF8 column in the same query). However, FlameRobin does not work properly in this case, and Firebird Maestro also has problems.
FlameRobin: it allows me to configure the connection to the database. When I set it up as an ANSI-based connection I can read the ANSI-based columns properly but not the UTF8 ones; if I set up a UTF8-based connection I see the UTF8 columns properly but not the ANSI ones. I can never see both types of columns properly in the same query.
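One workaround for this either/or behaviour might be to connect with character set NONE, which tells the server not to transliterate at all, so both kinds of columns arrive as the raw bytes stored on disk and nothing gets mangled in transit; the client then has to interpret each column itself. A minimal sketch in isql (the database path and credentials are placeholders, and mixed_example is the illustration table from above):

```sql
-- isql script; SET NAMES must be issued before CONNECT
SET NAMES NONE;
CONNECT 'localhost:C:\data\mydb.fdb' USER 'SYSDBA' PASSWORD 'masterkey';

-- Both columns come back untransliterated, exactly as stored
SELECT body_utf8, legacy_text FROM mixed_example;
```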
Firebird Maestro: here I cannot set the character set for my connection (a big drawback in my opinion). After executing a query I see both types of columns properly, but I cannot filter them properly in a WHERE clause.
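A possible workaround for the WHERE-clause problem (untested, just a sketch against the illustration table above): tag the literal with a character set introducer, or CAST the column, so the server knows how to compare the two sides:

```sql
-- Declare the literal's charset explicitly with an introducer
SELECT id FROM mixed_example
WHERE legacy_text = _WIN1252 'some value';

-- Or transliterate the column to UTF8 for the comparison
SELECT id FROM mixed_example
WHERE CAST(legacy_text AS VARCHAR(300) CHARACTER SET UTF8) = 'some value';
```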
What are your experiences with this problem? Do you know any tool that handles both types of columns properly?
Thanks in advance.