Is there a difference between converting a string of ASCII digits to an integer versus a string of non-ASCII digits to an integer?

Bill Poser billposer2 at gmail.com
Tue Dec 15 17:10:07 CST 2020


What do you mean by "non-ASCII digits"? Things like superscript and
subscript versions of the usual Western "Arabic" digits? Or are you
talking about other numeral systems, like Chinese numerals, Roman
numerals, Tamil, etc.? In the case of the former, once you map the digits
to their standard forms, the algorithm is the same. In the case of the
latter, the algorithm is not the same: in many cases very different
algorithms are required.
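To illustrate the first case, here is a minimal C sketch (not from the thread)
that maps a handful of non-ASCII decimal digits to their numeric values and then
applies the same positional accumulation that atoi() uses for ASCII. The
digit_value() helper and the hard-coded ranges it checks are assumptions made
for illustration only; a real implementation would look the values up in the
Unicode Character Database (the Numeric_Value property) rather than listing
blocks by hand.

#include <stdint.h>
#include <stdio.h>

/* Hypothetical helper: return the numeric value (0-9) of a code point that
 * is a decimal or superscript digit, or -1 otherwise.  Only a few blocks
 * are handled here, purely for illustration. */
static int digit_value(uint32_t cp)
{
    if (cp >= 0x0030 && cp <= 0x0039) return (int)(cp - 0x0030); /* ASCII 0-9 */
    if (cp >= 0x0966 && cp <= 0x096F) return (int)(cp - 0x0966); /* Devanagari */
    switch (cp) {                                                 /* superscripts */
    case 0x2070: return 0;
    case 0x00B9: return 1;
    case 0x00B2: return 2;
    case 0x00B3: return 3;
    case 0x2074: return 4;
    case 0x2075: return 5;
    case 0x2076: return 6;
    case 0x2077: return 7;
    case 0x2078: return 8;
    case 0x2079: return 9;
    }
    return -1;
}

/* Convert a zero-terminated array of code points to an int using the same
 * positional algorithm atoi() uses for ASCII: value = value * 10 + digit. */
static int codepoints_to_int(const uint32_t *s)
{
    int value = 0;
    for (; *s != 0; s++) {
        int d = digit_value(*s);
        if (d < 0) break;          /* stop at the first non-digit */
        value = value * 10 + d;
    }
    return value;
}

int main(void)
{
    /* superscript four, Devanagari two, ASCII seven -> 427 */
    const uint32_t mixed[] = { 0x2074, 0x0968, 0x0037, 0 };
    printf("%d\n", codepoints_to_int(mixed));
    return 0;
}

The point is that once each digit has been reduced to its numeric value, the
loop is identical to the ASCII case. For the second case (Chinese numerals,
Roman numerals, and so on) no such per-digit mapping exists, because those
systems are not purely positional base-10 notations.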

On Mon, Dec 14, 2020 at 12:28 PM Roger L Costello via Unicode <
unicode at unicode.org> wrote:

> Hi Folks,
>
> As I understand it, when the C programming language was created it just
> used ASCII. Programs written in C used ASCII digits.
>
> Nowadays C supports Unicode and Unicode contains more digits than just the
> ASCII digits. (I think) modern C programs can express numbers using strings
> of non-ASCII digits.
>
> Questions:
>
> 1. Is the algorithm for converting a string that contains non-ASCII digits
> different than the algorithm for converting a string containing ASCII
> digits?
>
> 2. The C function atoi() converts a string of digits to a number. I have
> seen the source code for atoi(). The source code that I saw was dated
> around the year 2000. Can you point me to the modern source code for atoi()?
>
> /Roger
>
>

