Short number patterns
Matthew Stuckwisch via CLDR-Users
cldr-users at unicode.org
Wed Apr 10 19:23:03 CDT 2019
I am writing a module that uses CLDR data and am somewhat confused about the correct way to interpret CLDR number data. Looking at English's short decimal formats, we have the following (trimmed for clarity):
<pattern type="1000" count="one">0K</pattern>
<pattern type="10000" count="one">00K</pattern>
<pattern type="100000" count="one">000K</pattern>
Given the number 12345, how does the result end up actually being 12K? The number pattern 00 is for two digits, of course, but how is the number reduced to 12? It can't be from significant digits, because those are defined by @ (and would result in trailing zeros), and it can't be from maximum integer digits (which is neither set in the number pattern nor trims the smaller units). Dividing by 10000 (from the type attribute) nets 1.2345, which for the pattern 00K would format as 01K.
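To make the discrepancy concrete, here is a minimal sketch of that naive reading (the function name and helper logic are my own, not from CLDR or any library): divide by the pattern's type attribute and zero-pad to the number of 0s in the pattern.

```python
def naive_compact(n: int, type_attr: int, pattern: str) -> str:
    """Naive reading: divide n by the type attribute, then format
    with as many integer digits as there are 0s in the pattern."""
    zeros = pattern.count("0")          # "00K" -> 2 integer digits
    scaled = n / type_attr              # 12345 / 10000 = 1.2345
    suffix = pattern.lstrip("0")        # "00K" -> "K"
    return f"{round(scaled):0{zeros}d}{suffix}"

print(naive_compact(12345, 10000, "00K"))  # -> "01K", not the expected "12K"
```

This reproduces exactly the 01K result described above, so some other rule must govern how 12345 becomes 12K.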
What have I been misinterpreting between the data and the standard, or do the short forms have a different pattern syntax in which 0 defines a sort of trimmed significant digits?