Scientific Notation question
Martin Green
greenmr at gmail.com
Fri Feb 26 15:13:27 CST 2016
The Unicode LDML reference says this about scientific notation patterns:
* When using scientific notation, the formatter controls the digit
counts using significant digits logic. The maximum number of
significant digits limits the total number of integer and fraction
digits that will be shown in the mantissa; it does not affect
parsing. For example, 12345 formatted with "##0.##E0" is "12.3E3".
See the section on significant digits for more details.
I don't understand how they arrive at this. From the result "12.3E3" I
assume they have determined that the pattern allows a maximum of three
significant digits, but I just don't see where that limit comes from. My
expectation was that "##0.##E0" would format 12345 as "123E2". The only
thing I can think of is that perhaps scientific notation MUST have a
fractional part, in which case my expected result would become "123.0E2",
which has FOUR significant digits. But since I don't see what made them
decide there can only be three significant digits, I don't understand why
"123.0E2" would be an invalid result.
What am I missing?
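
For reference, here is a minimal sketch of how I would check this against a
concrete implementation. I'm assuming java.text.DecimalFormat applies the
same significant-digit rule the LDML text describes (the choice of that
class, and the expected output noted in the comment, come from the spec
example rather than anything I have verified against CLDR itself):

    import java.text.DecimalFormat;

    public class SciNotationCheck {
        public static void main(String[] args) {
            // The pattern from the LDML example.
            DecimalFormat fmt = new DecimalFormat("##0.##E0");

            // Per the spec example this should print "12.3E3";
            // my own expectation was "123E2".
            System.out.println(fmt.format(12345));
        }
    }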