Difference between Klingon and Tengwar and emoji and indigenous languages

Tex textexin at xencraft.com
Wed Sep 15 17:52:33 CDT 2021

Some day, this thread and others like it will be compared with current discussions in the scientific and medical communities, which are now recognizing that only theories and rationales supported by insiders or considered “standard science” have been brought forward, while worthy research has been suppressed, delayed, and dismissed, only to be found true years or decades later.


1)      In today’s heavily technological world, demand and innovation can only be compared fairly when there is equivalent access to technology; both are suppressed when technology is unavailable. So Klingon is at an unfair disadvantage. (As are many indigenous languages.)

2)      The arguments treating emoji as a separate entity are quite unfair. Emoji could be subdivided into classes, just as most other scripts are treated as distinct even when they are derivative, and demand could then be examined separately. Arguing that the tail is always small, when there are entire categories of emoji that could be described as less useful than Klingon, is preposterous. (In fact, I would argue that the abundance of similar-looking emoji makes it hard to recognize their intended distinctive meanings, diluting their usefulness altogether. I need a magnifying glass and a dictionary to make sure the emoji I select is roughly what I intend to say.)

3)      Emoji are in a different class as well because there is a committee within Unicode that acts to administer (or regulate) them. Having such a committee spurs innovation and demand. Imagine if there were an “invented scripts” committee… We would then see much more activity, both legitimate and unwarranted.

4)      There are 50+ years of interest in Klingon. At this point, the lack of technological support has, if anything, acted to suppress demand and interest (unsuccessfully). It is hard to say how much activity there would be if Klingon had been supported for the past several years.


Perhaps the more important aspect of all of this is how much harm is done when a language with sufficient interest (I have indigenous languages in mind more than Klingon) is overly scrutinized, takes years to be incorporated into the standard, and then takes more years to be incorporated into major platforms and become usable. The risk to indigenous communities of losing their culture and history is significant.

I understand the difficulty of defining all the properties and the associated algorithm support for new scripts. However, I wonder if a process couldn’t be adopted where characters are allocated in Unicode and rudimentary glyphs defined, under a “preliminary” category. This would speed incorporation into technologies. The text could then be used in primitive or experimental ways by the interested communities, with further standardization of properties and so on coming with further study.
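Something like this experimental use already happens informally via Unicode's Private Use Area: the ConScript Unicode Registry, for instance, maps Klingon pIqaD to U+F8D0..U+F8FF as a community convention, not as part of the Unicode Standard. A minimal sketch, assuming that CSUR mapping, of how such text round-trips through ordinary Unicode machinery:

```python
# The ConScript Unicode Registry (a community convention, NOT part of the
# Unicode Standard) assigns Klingon pIqaD to the Private Use Area at
# U+F8D0..U+F8FF. PUA text flows through normal Unicode pipelines; only
# fonts and higher-level behavior need out-of-band agreement.
PIQAD_BASE = 0xF8D0  # CSUR convention; an assumption, not standardized


def piqad_char(index: int) -> str:
    """Return the PUA code point for the index-th pIqaD letter."""
    return chr(PIQAD_BASE + index)


# Build a (hypothetical) three-letter word and round-trip it through UTF-8.
word = "".join(piqad_char(i) for i in (0, 1, 2))
encoded = word.encode("utf-8")
assert encoded.decode("utf-8") == word  # PUA text survives encoding intact

print([hex(ord(c)) for c in word])  # ['0xf8d0', '0xf8d1', '0xf8d2']
```

The trade-off is exactly the one a "preliminary" category would formalize: the text is storable and exchangeable today, but properties, collation, and glyphs rest on private agreement rather than on the standard.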

I am not suggesting all proposals be adopted this way, just those with sufficient interest but perhaps insufficient documentation.


This would at the least help indigenous communities preserve text and even allow the communities to assist in the definition of properties for further standardization, as they work with and understand how the text needs to operate to meet their intent.

And the same is true for languages like Klingon.


(I tried really hard not to respond to this thread, but I finally couldn’t. And I will try to not debate the above. Ducking and running)


From: Unicode [mailto:unicode-bounces at corp.unicode.org] On Behalf Of Mark E. Shoulson via Unicode
Sent: Wednesday, September 15, 2021 2:00 PM
To: unicode at corp.unicode.org
Subject: Re: Difference between Klingon and Tengwar


On 9/15/21 4:47 PM, Asmus Freytag via Unicode wrote: 

It's a writing system that has global reach (even if not "high-brow") and is actively, you could even say enthusiastically, supported by systems/font vendors (and users). 

I was telling someone once about Unicode: it's the standard for representing letters of all alphabets, etc, they're the ones who officially encode emoji, etc.  The response was surprise: "Why encode emoji?  Who uses those?"  "Um... millions of people, every day, in tweets and stuff?"  "Yeah, but apart from that?"  Well, yeah, apart from the people who use them, nobody uses them.  But that's true of English letters too.  Just that emoji usage wasn't "high-brow" enough for this listener, apparently. 

It's more like encoding a brand-new character in the IPA that hasn't seen use yet, but we know people use the IPA and so this letter will be used.  (I know, the parallel isn't perfect: an IPA character would have been approved by the IPA, etc.  Try to see the forest for the trees.) 

When it comes to new items, mathematical symbols may be more similar. Because of existing, parallel technologies like TeX, it's possible for that notation to innovate in advance of standardization by Unicode. However, de facto, the collection is unbounded and actively being added to. Not all fields of mathematics will expand with equal popularity, so there's a similar issue: additions are not equally guaranteed to be of the same importance/popularity/longevity. 

Yeah, that's a good example, though math symbols also have to show usage before being encoded.  They have better mechanisms for avoiding the chicken-and-egg problem. 

When it comes to immediate support, currency symbols come to mind. They form an unbounded set of their own, with active innovation happening, but users do not really have a choice whether or not to use a new symbol (the only caveat is that the currency could fail and all usage become historical). 

This is probably a better example: there is built-in demand that we know is there, and it's adding a symbol to an "alphabet" that's already supported. 
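A concrete recent case of a currency symbol joining that already-supported "alphabet" is U+20BF BITCOIN SIGN, added to the Currency Symbols block in Unicode 10.0 (2017). A quick check with Python's standard library (this assumes Python 3.7 or later, whose `unicodedata` tables include Unicode 10.0):

```python
import unicodedata

# U+20BF BITCOIN SIGN entered Unicode 10.0 in the existing Currency
# Symbols block (U+20A0..U+20BF) -- a new member of a supported set,
# rather than a whole new writing system.
sign = "\u20bf"

print(unicodedata.name(sign))      # BITCOIN SIGN
print(unicodedata.category(sign))  # Sc (Symbol, currency)
```

Because the symbol slots into an existing general category (`Sc`), line breaking, number formatting, and similar algorithms already know roughly what to do with it, which is why support for new currency symbols tends to arrive quickly.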

So, yeah, emoji are weird, but I don't think they can be generalized. 

They fit the intersection between pictographic writing systems with unbounded collection and writing systems (symbol collections) with active innovation. 

To the extent that no other system shows just that combination of trends, you can't derive any parallels; on the other hand, they have a defined place in any Venn diagram of writing systems. 

Yes.  By "generalized" I meant you can't generalize Unicode's treatment of them to other situations.  I think we're saying the same thing. 

