Is there a difference between converting a string of ASCII digits to an integer versus a string of non-ASCII digits to an integer?
Richard Wordingham
richard.wordingham at ntlworld.com
Tue Dec 15 18:42:29 CST 2020
On Tue, 15 Dec 2020 15:45:11 -0800
Markus Scherer via Unicode <unicode at unicode.org> wrote:
> I suspect that Roger is just looking at decimal digits (property gc=Nd
> <https://util.unicode.org/UnicodeJsps/list-unicodeset.jsp?a=%5B%3Agc%3DNd%3A%5D%26%5B%3Anv%3D4%3A%5D&g=bc&i=>
> ).
> I believe that they can all be parsed like strings of ASCII digits
> (and you can call ICU or other libraries to get at the digit values
> and other properties).
> I suggest you double-check about the RTL digits (N'Ko & Adlam);
> please take a look at the relevant Unicode book chapters.
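[For illustration: a minimal Python sketch of the point above, using the standard unicodedata module; the sample digit strings are just my own examples. Any run of gc=Nd digits can be parsed exactly like ASCII digits once you look up each character's decimal value.]

import unicodedata

def parse_decimal(s: str) -> int:
    """Parse a run of Unicode decimal digits (gc=Nd), most significant digit first."""
    value = 0
    for ch in s:
        # unicodedata.decimal() raises ValueError if ch is not a decimal digit
        value = value * 10 + unicodedata.decimal(ch)
    return value

print(parse_decimal("42"))    # ASCII digits             -> 42
print(parse_decimal("٤٢"))    # Arabic-Indic digits      -> 42
print(parse_decimal("४२"))    # Devanagari digits        -> 42
print(int("٤٢"))              # Python's int() also accepts gc=Nd strings -> 42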
It looks as though the N'Ko section documents the significance of digit order
only by accident! I thought a policy was going to be documented (in 2012 or
slightly later) that decimal digits are stored most significant
digit first, but that doesn't seem to have happened.
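[If most-significant-digit-first storage does hold for the RTL scripts, as the N'Ko block description seems to imply, then the same logical-order parse works unchanged for N'Ko and Adlam; a small check, with sample strings that are my own illustrations:]

import unicodedata

nko_42   = "\u07C4\u07C2"          # N'KO DIGIT FOUR, N'KO DIGIT TWO
adlam_42 = "\U0001E954\U0001E952"  # ADLAM DIGIT FOUR, ADLAM DIGIT TWO

for s in (nko_42, adlam_42):
    value = 0
    for ch in s:                   # iterate in logical (storage) order
        value = value * 10 + unicodedata.decimal(ch)
    print(value)                   # prints 42 for both, assuming MSD-first storage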
Richard.