Unicode is universal, so how come that universality doesn’t apply to digits?

Richard Wordingham richard.wordingham at ntlworld.com
Tue Dec 29 13:58:05 CST 2020


On Tue, 29 Dec 2020 11:28:23 -0800
Markus Scherer via Unicode <unicode at unicode.org> wrote:

> What effort? Given basic Unicode support in many programming
> languages and libraries, it takes minutes to go from parsing ASCII
> digits to parsing any & all decimal digits.

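Granted, the mechanics can be that quick. In Python, for instance, the
built-in int() already accepts every decimal digit (a minimal sketch;
the sample strings are only illustrative):

    # Python 3: int() accepts any run of characters in Unicode
    # category Nd (decimal digit), not only ASCII 0-9.
    print(int("123"))    # ASCII digits         -> 123
    print(int("٣٤٥"))    # Arabic-Indic digits  -> 345
    print(int("१२३"))    # Devanagari digits    -> 123
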
I think you've overlooked the paperwork: the code change may take
minutes, but specifying, documenting and reviewing the new behaviour
does not.

There's probably code that relies on non-ASCII digits not being treated
the same way as ASCII digits.

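One hypothetical shape such code might take (is_account_number is
invented here purely for illustration; the regex behaviour shown is
standard Python 3):

    import re

    # A validator that deliberately accepts ASCII digits only,
    # e.g. for account numbers.
    def is_account_number(s: str) -> bool:
        return re.fullmatch(r"[0-9]+", s) is not None

    print(is_account_number("12345"))    # True
    print(is_account_number("١٢٣٤٥"))    # False: rejected by design

    # Relaxing the pattern to \d silently changes that contract,
    # because \d matches any Unicode decimal digit in Python 3:
    print(re.fullmatch(r"\d+", "١٢٣٤٥") is not None)   # True
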
Richard.

