chicken (chicken_cem) wrote in mozilla,

Decimal Unicode or Hex Unicode?

Do any of you bright minds know WHY some programmers decide to use Decimal Unicode notation (e.g., Firefox), whereas others decide to use Hex Unicode notation (e.g., the Gnome/libxml2/xmllint folks)?

My wish is that:

1.) They had all gotten together and decided on just one,
2.) They had all chosen Hex, padded on the left with zeroes so that all characters are represented by a full four digits (e.g., &#x00C2; rather than &#xC2; for Â). The latter is NOT conformant to the Unicode specification, which requires four to six digits, not bloody darn one, two or three.

I know they are functionally equivalent, I know it's a simple mathematical calculation to go from one to the other, and Perl and PHP both probably have built-in functions I could use to convert from one to the other, but GRRRR. I long for consistency and, God forbid, standards compliance.
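(For the record, here is roughly what that conversion looks like in Perl. This is just a sketch for illustration; the function names and the four-digit zero-padding are my own choices, not anything built in.)

    #!/usr/bin/perl
    use strict;
    use warnings;

    # Rewrite decimal numeric character references (&#194;) as
    # zero-padded hex references (&#x00C2;).
    sub dec_to_hex_ncr {
        my ($text) = @_;
        $text =~ s/&#(\d+);/sprintf("&#x%04X;", $1)/ge;
        return $text;
    }

    # And the other direction: hex references back to decimal.
    sub hex_to_dec_ncr {
        my ($text) = @_;
        $text =~ s/&#[xX]([0-9A-Fa-f]+);/sprintf("&#%d;", hex($1))/ge;
        return $text;
    }

    print dec_to_hex_ncr("&#194; &#2325;"), "\n";   # &#x00C2; &#x0915;
    print hex_to_dec_ncr("&#x00C2;"), "\n";         # &#194;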

Also, why is Firefox smart enough to take UTF-8 input in an HTML form and convert it automatically to Decimal Unicode (which MySQL 4.0 can actually understand), but Safari is not (and of course, neither is I.E., duh)?
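The workaround I keep coming back to (again, just a sketch, and utf8_to_dec_ncr is my own made-up name, not a real library function) is to stop caring what the browser sends and normalize the raw form bytes to decimal references server-side, before they ever reach MySQL 4.0:

    use Encode qw(decode);

    # Take raw UTF-8 bytes from the form submission and turn every
    # non-ASCII character into a decimal numeric character reference,
    # so MySQL 4.0 only ever sees plain ASCII.
    sub utf8_to_dec_ncr {
        my ($bytes) = @_;
        my $text = decode('UTF-8', $bytes);
        $text =~ s/([^\x00-\x7F])/sprintf("&#%d;", ord($1))/ge;
        return $text;
    }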

--

Don't get me wrong, I don't hate Safari. Safari is FAR better than Firefox at rendering certain difficult Unicode glyphs, like Devanagari (Sanskrit) with internal HTML markup inside conjunct consonants (Opera 6.03 comes close on this). However, when it comes to Hebrew, Greek, and Romanian (the others I've had to deal with recently), Firefox is just hands-down better.