Negative/Negation Sign

Asmus Freytag asmusf at ix.netcom.com
Sat Oct 29 16:42:22 CDT 2022


On 10/29/2022 1:18 PM, Sławomir Osipiuk wrote:
> On Saturday, 29 October 2022, 15:43:03 (-04:00), Asmus Freytag via 
> Unicode wrote:
>
>     According to
>     https://en.wikipedia.org/wiki/TI_calculator_character_sets the
>     "negation" is mapped to U+207B SUPERSCRIPT MINUS in TI Character
>     sets. Unless that information is definitely incorrect, this should
>     be the end of discussion.
>
>     A./
>
> I tried to look through the sources for that page but found no 
> definitive mapping. The Unicode values seem to have simply been 
> matched by sight by the editor. The sources contain only bitmaps of 
> the characters and their TI-internal byte values. Just another 
> reminder that Wikipedia is not always reliable.

The Wikipedia article does show a mapping. And, no matter its origin, 
that mapping appears uncontested. (I haven't looked through the page 
history, but that's where you would find any disagreement on the issue; 
unless you can point us to something in there, I'll assume it's 
uncontested; let me know what you find).

Because that mapping is published and out there, there's now a documented choice for 
how to represent that character in Unicode. That fact alone changes the 
question from a completely open one to one where there's a de-facto 
"proposed solution". If you (or anyone) disagrees, you would have to 
demonstrate why that choice is incorrect or insufficient.

And, "matching by sight" isn't necessarily an incorrect approach. 
Unicode distinguishes between the identity of a character and the thing 
that it denotes in a certain context --- with very deliberate exceptions.

For '.', for example, the precedent is very strong: the identity is the 
"period" whether it is used as a full stop, a decimal point, a delimiter in 
internet addresses, or an abbreviation marker. For ':' we don't encode a 
different character for its use as an abbreviation marker in Swedish, and 
so on.

For letters, on the other hand, membership in a certain script, or 
having a particular case mapping can contribute to the defining 
characteristics of a character's identity, leading to disunification of 
otherwise identical shapes.

For dashes, Unicode considers that differences in length and in position 
relative to the baseline or centerline are characteristics that make two 
dashes distinct symbols. Conversely, that means that when two dashes have 
identical appearance, they should not be disunified based simply on how 
they are used. (The issue is a bit more complex than that, because ASCII 
unifies two of them into 002D, but that's a historical one-off, not a 
precedent.)

So, if you disagree with this mapping, you'll have to demonstrate that 
there's a consistent visual difference to the "actual" character, such 
that it would render SUPERSCRIPT MINUS distinct from the unary negation. 
Otherwise, the conclusion stands that there is one known convention (TI 
character set) that uses SUPERSCRIPT MINUS to indicate unary negation.

A./

PS: Interestingly enough, one of the sources cited for the Wikipedia 
article actually has a mapping to U+203E (spacing overline). You now 
have two choices of "de-facto" mappings; however, I think we can agree 
that U+203E is a much poorer match than U+207B for the glyph given for 
negation: the former sits at caps height, the latter between the centerline 
and caps height. The dot-matrix glyph image has the negation 1 pixel above 
center. The resolution severely limits the available positions; like 
the position of SUPERSCRIPT MINUS in Cambria Math, the TI negation sits 
just between the centerline of superscripted digits and their 
(raised) baseline. I think whoever came up with that mapping did a 
better job than whoever mapped this to U+203E.
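
For anyone who wants to compare the candidates themselves, here is a minimal sketch using Python's standard unicodedata module to list the names and general categories of the codepoints under discussion (the TI glyph itself is of course not part of this data):

```python
import unicodedata

# Candidate Unicode mappings for the TI "negation" sign,
# plus the two ASCII/math minus characters for comparison.
for cp in (0x207B, 0x203E, 0x002D, 0x2212):
    ch = chr(cp)
    print(f"U+{cp:04X}  {unicodedata.name(ch)}  "
          f"(general category: {unicodedata.category(ch)})")
```

Note that U+207B and U+2212 are both category Sm (math symbol), while U+203E is Po (other punctuation) — another hint, beyond the glyph position, that SUPERSCRIPT MINUS is the more plausible mapping for a negation operator.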