The product I work on currently supports English and Japanese via code pages. I’ve never really dealt with globalized applications before, so I’m doing quite a bit of research on the matter before I jump in. I’m reading Developing International Applications and finding it to be a fascinating read. I’d always wondered what the System.Text.Encoding object was really all about, and now I know. If you read Joel Spolsky, you may have seen his article on globalization, which is a nice intro. DIA goes a good bit deeper, weighing in at 529 pages of prose and around 520 pages of appendix. It includes code samples in both Win32 and .NET, which is fantastic, and covers ASP, ASP.NET, and SQL Server in some detail. It also has the simplest explanation I’ve seen of how to conditionally compile your application for Unicode or MBCS and what that really means. It’s just fantastic.
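To make the encoding idea concrete, here’s a small sketch in Python (standing in for System.Text.Encoding, since the concept is the same): an encoding is just a mapping from abstract Unicode characters to a particular byte sequence, and the same text comes out as different bytes under different encodings.

```python
# The same three characters, encoded three ways.
text = "日本語"  # "Japanese language"

utf8_bytes = text.encode("utf-8")       # 9 bytes: 3 per character
sjis_bytes = text.encode("shift_jis")   # 6 bytes: a legacy Japanese code page
jis_bytes = text.encode("iso2022_jp")   # framed by escape sequences

print(len(utf8_bytes), len(sjis_bytes), len(jis_bytes))

# Decoding bytes with the wrong encoding is where mojibake comes from:
try:
    sjis_bytes.decode("utf-8")
except UnicodeDecodeError:
    print("Shift_JIS bytes are not valid UTF-8")
```

The takeaway is that bytes on the wire mean nothing without knowing which encoding produced them, which is exactly why declaring the charset you actually used matters.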
From what I’m reading, I’m realizing that what we’re doing currently is dead wrong. We send down content encoded in old-style code pages instead of UTF-8-encoded Unicode. I’ll have to work on fixing that. The other thing I found out today is that we store our Japanese content in the ISO-2022-JP code page. I’d like to consolidate the content (it’s currently stored in two separate tables as text, not ntext) into one table with ntext, but there seem to be some limitations with collations and indexes that might be troublesome.
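The migration step itself is conceptually simple: decode the legacy ISO-2022-JP bytes into real Unicode text, then store that in a Unicode (ntext) column and serve it as UTF-8. A minimal sketch in Python, with hypothetical data standing in for a row of our content:

```python
# Hypothetical legacy row: Japanese text stored as ISO-2022-JP bytes.
legacy_bytes = "こんにちは、世界".encode("iso2022_jp")

# Step 1: decode the code-page bytes into Unicode text.
# This is the only point where the legacy encoding matters.
unicode_text = legacy_bytes.decode("iso2022_jp")

# Step 2: re-encode as UTF-8 for the response body (the ntext column
# itself would just take the Unicode string).
utf8_bytes = unicode_text.encode("utf-8")

# The round trip is lossless because ISO-2022-JP's repertoire is a
# subset of Unicode.
assert utf8_bytes.decode("utf-8") == "こんにちは、世界"
```

The one-way nature of this matters: going from a code page up to Unicode is always safe, but re-encoding Unicode back down to a code page can lose characters, which is another argument for consolidating on ntext and UTF-8 output.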
Has anyone else dealt with globalized web sites? Does UTF-8 really work as well as advertised? Some experience from the gallery would really be great.