Even Java, which was designed from the beginning to use Unicode characters on the theory that Unicode was going to be the character encoding of the future, has run into trouble: Java characters are defined as 16-bit quantities, but the Unicode 3.1 standard extended the range of the Unicode character set to require a 21-bit representation. Oops.
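The consequence, in modern Java, is that a single Unicode character above U+FFFF must be stored as a *pair* of 16-bit `char` values (a "surrogate pair"), so a `String`'s `length()` no longer matches its count of actual characters. A minimal sketch of the mismatch, using the standard `Character` and `String` APIs (the code point U+1F600 is just an arbitrary example above the 16-bit range):

```java
public class CodePointDemo {
    public static void main(String[] args) {
        // U+1F600 is above U+FFFF, so it cannot fit in one 16-bit char;
        // Character.toChars encodes it as a two-char surrogate pair.
        String s = new String(Character.toChars(0x1F600));

        // length() counts 16-bit char units, not characters.
        System.out.println(s.length());                       // prints 2

        // codePointCount counts actual Unicode characters.
        System.out.println(s.codePointCount(0, s.length()));  // prints 1
    }
}
```

So code that assumes one `char` per character quietly miscounts, splits, or corrupts text containing these supplementary characters.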
http://www.gigamonkeys.com/book/numbers-characters-and-strings.html