Power and Responsibility and Cultural Respect

It took me a while to fully appreciate the issue that Twitter user Yoshimasa Niwa was describing. At first glance I, like many others, assumed that setting Japanese above English in the language preferences would solve his app library sorting issue.

Then I realized that wasn’t his point at all. The app in the screenshot is Yahoo Japan’s ‘Norikae Annai’ transit app, one of the most popular free standalone transit apps in Japan. I use it all the time. It’s a Japanese app with a Japanese name, but the basic iOS English sorting algorithm ignores this and assumes that Chinese characters used anywhere must follow modern mainland China’s Simplified Chinese rules for reading and sorting.

This is as ridiculous as assuming that all Roman-based character sets everywhere must follow modern Italian reading and sorting rules. Westerners often assume the flow of kanji culture was always one way, from China outward, but it was not: Japan developed different and unique readings and usages, and Japanese-created kanji like shitsuke 躾 traveled the other way over the centuries. The same is true for other cultures that adapted the Chinese writing system to their languages.

It amounts to cultural destruction by neglect and ignorance from large Western technology companies who think things are ‘good enough’, or are just bugs to fix in a later software update that usually never appears. Modern computer software has pretty much destroyed traditional kanji publishing culture this way, with many countries abandoning mainstream traditional vertical text layout for Western-style layout because ‘it’s easier’, i.e. Western tech companies couldn’t be bothered to get Asian language typography right. All these years later web browsers still can’t do vertical text worth a damn.

A veteran Japanese font engineer whose entire career was devoted to preserving high-end Japanese typography in the digital age recently told me, “I don’t think anybody cares anymore.” In the end it all too often comes down to this: ‘I don’t care’ cultural death, brought on by ‘I don’t care’ companies who have the money and power to care.

That’s a bitter irony in an age that purports to champion cultural diversity.

Apple’s Once and Future Japanese Variable System Font

2020 is the coming-out party for Apple-designed OpenType variable fonts: both the SF Pro and SF Compact system fonts and the all-new New York font ship in iOS 14, watchOS 7 and macOS 11. The Apple-created variable font technology is not new, of course. It has been around since the QuickDraw GX days, along with the TrueType GX enhanced Skia font. It was due to be standard in Mac OS Copland system fonts, including a Japanese variable font created by FontWorks. Then Steve Jobs returned to Apple and everything changed.

Yes, it has taken 25 years for an Apple-created technology to make it into the basic system. It proves my long-stated belief that font technology doesn’t matter unless it is built into every nook and cranny of the OS foundation. The TrueType GX Skia variable font has been with us all this time, but only matters now because the SF Pro system font has gone variable.

Why is It Taking So Long?
iOS 14 and macOS 11 variable font basics are covered in an excellent WWDC20 video, ‘The Details of UI Typography’. It’s important to remember that while OpenType variable font technology is ‘world ready’, at this stage it only applies to Roman-based font sets. It’s going to be a long time before we see a Japanese-language system font in variable format.
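
As a side note, the variation machinery is already visible to developers through Core Text. Here is a minimal Swift sketch, purely illustrative, that lists whatever variation axes an installed font reports from its fvar table. The font name used is an assumption; the actual PostScript name of the SF Pro face on a given OS release may differ.

    import CoreText
    import Foundation

    // List the variation axes (weight, width, etc.) a font reports from its fvar table.
    // "SFPro-Regular" is a guess at a PostScript name; substitute any installed
    // variable font you want to inspect.
    let font = CTFontCreateWithName("SFPro-Regular" as CFString, 17, nil)

    if let axes = CTFontCopyVariationAxes(font) as? [[String: Any]] {
        for axis in axes {
            let name = axis[kCTFontVariationAxisNameKey as String] as? String ?? "?"
            let min  = axis[kCTFontVariationAxisMinimumValueKey as String] as? Double ?? 0
            let max  = axis[kCTFontVariationAxisMaximumValueKey as String] as? Double ?? 0
            let def  = axis[kCTFontVariationAxisDefaultValueKey as String] as? Double ?? 0
            print("\(name): \(min) to \(max), default \(def)")
        }
    } else {
        print("No variation axes reported (not a variable font).")
    }

A Roman variable font will typically report a weight axis here; the point of this section is that no shipping Japanese system font is variable yet, so there is nothing for code like this to report.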

There are many reasons. In the WWDC20 video Loïc Sander of the Apple design team drops a big hint when he explains that while digital technology (PostScript fonts) “gave us a lot more flexibility in handling text,” it also “made typography a bit more crude than it used to be.” The statement shows how clueless designers and engineers outside of Japan can be about Japanese fonts and typography.

While a ‘bit more crude’ might be true for Roman-based fonts and text layout, PostScript fonts completely broke traditional Japanese font design and composition models. Everything was thrown out because Adobe made no accommodation for needs outside Western typography when creating the PostScript font DTP foundation.

Japanese DTP forced users to adapt to the technology rather than the technology solving their production problems. I know this because every day at work I had to deal with the endless problems and limitations of Japanese PostScript fonts that could only reside on the output device.

Another big problem was that Adobe’s relations with Japanese PostScript licensees in the 1990s were not healthy. Adobe stuck with closed print-device font licensing for far too long and discouraged independent font production wherever it could. Because of this, digital font progress in Japan was slow and very expensive.

Here are some challenges facing Japanese variable fonts.

Once Upon a Time
One basic flaw of OpenType font outline technology is that it’s extremely inefficient for kanji glyph production and storage. Every glyph has to be created and stored separately, and that approach doesn’t scale well. This is why OpenType CJK fonts and tiny devices like Apple Watch are a match made in hell. One solution to this problem is stroke fonts, which use a library of basic glyph parts to efficiently build complex glyphs.

Stroke fonts are a perfect fit for kanji font production and for small constrained devices like Apple Watch because reusable parts don’t take up precious resources. On the desktop, stroke fonts can do weight variations over the full range from Light through Ultra Bold without losing typographic details, all in a single 4 MB font while an equivalent OpenType variable font can weigh in around 18 MB.
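
To make the reuse idea concrete, here is a toy Swift sketch of the general concept, not any actual stroke font format; all the type and property names are my own invention. A glyph becomes a short list of references into a shared part library plus placement transforms, instead of a full set of outline points per glyph.

    import CoreGraphics

    // Toy model of the stroke font idea: glyphs reference shared parts
    // instead of carrying their own complete outlines.
    struct StrokePart {
        let name: String
        let path: CGPath                 // outline of one reusable stroke or radical
    }

    struct PartPlacement {
        let partIndex: Int               // index into the shared part library
        let transform: CGAffineTransform // position and scale inside the em square
    }

    struct StrokeGlyph {
        let character: Character
        let placements: [PartPlacement]
    }

    // Rendering just stamps shared parts into place. Thousands of kanji can
    // share a few hundred parts, which is why the storage footprint stays small.
    func render(_ glyph: StrokeGlyph, from library: [StrokePart], into context: CGContext) {
        for placement in glyph.placements {
            context.saveGState()
            context.concatenate(placement.transform)
            context.addPath(library[placement.partIndex].path)
            context.fillPath()
            context.restoreGState()
        }
    }

Real stroke font systems typically go further and derive weights parametrically from the stroke skeletons, which is presumably how a single small font can cover Light through Ultra Bold without separate masters.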

The technology has been around for a long time and was supported up until Mac OS 9, but lost out when Apple quietly dropped the QuickDraw GX derived Open Font Scaler architecture in the migration from classic Mac OS to Mac OS X.

While stroke fonts are not supported in the current Apple OS lineup, on the font tool side stroke font technology has appeared in software such as the classic Mac OS Gaiji Master from FontWorks. The lead engineer of that effort is currently working independently on a similar gaiji glyph tool for Windows based on stroke font technology, one that is much more advanced than the old and long-unavailable FontWorks software. I plan to cover developments in a future post.

The Japanese Font Production Challenge
The Hiragino iOS/macOS Japanese system font was not created by Apple; it is licensed from Screen Holdings (SH) and was originally created by the independent font design studio Jiyukobo in the early 1990s. There is much more work involved in creating a Japanese font than a Roman-based one. Hand-drawn glyphs are created, scanned and cleaned up for digital production.

The Adobe-Japan1-7 glyph collection requires 23,060 glyphs for a single weight; multiply this work by the different weights in one family and you get an idea how massive the undertaking is. From Osamu Torinoumi, one of the key designers of the Apple-licensed Hiragino font, on its creation:

On average, one person would (hand) draw 12 or 13 glyphs a day, which is not much change of pace from the days of creating block type…the whole process, from start to finish, took three years.

One might think that a single CJK (Chinese-Japanese-Korean) font sharing a common design can streamline the process, but this is a huge misconception. Each culture has centuries’ worth of different design aesthetics that good design must incorporate: what looks good to a Chinese designer and works well in a Chinese text design looks terrible in a Japanese context. I have yet to see a decent digital ‘kana’ design from a Chinese font designer. Osamu Torinoumi on the differences in creating the Simplified Chinese Hiragino Sans GB:

“We worked with the Adobe GB 1-4 character set (29,064 glyphs) at 2 weights. Basically we had to finish one weight in 6 months. One year for the entire project. At first we only thought we would be there as backup, but Screen kept passing us all the questions from Beijing. It turned out to be a lot more work than we anticipated.”

Jiyukobo sent all the original Hiragino design data to Hanyi Keyin through Screen, and they adapted the designs for China. Torinoumi said that one of the major differences is that Chinese design demands that Gothic (sans serif) characters mimic handwritten style. This means the character should be slightly off center within the virtual body. “Even after the project was over, I still didn’t understand the difference between the Japanese and Chinese ‘kokoro’ glyphs, which the Chinese designers insisted were different.”

The Variable Font UI Challenge
Finally we get to a problem on the Apple OS platform side that has been around since the GX days: how to present advanced typography features in a useful, easy-to-understand system UI that works everywhere. What works on macOS obviously won’t work on iOS, but iPadOS will need some degree of advanced typography feature access. Sliders have their place, but I agree with Adobe Type Senior Manager Dan Rhatigan, who made a very good point in his TYPO Talk 2016 presentation: there has to be a better UI control concept out there.

fvar sliders: Dear Apple, didn’t Adobe tell us not to use sliders?

This is because there are many more OpenType Japanese variable font features than just weights. There are glyph variations, vertical layout variations, and horizontal and vertical compression for tatechuyoko instances. In macOS Catalina these are hidden away in the crusty old Font Palette that is desperately in need of a major overhaul. Please tell me that macOS 11 fixes this, or that Apple at least has a vision of how to.
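
For what it’s worth, the variation axes themselves can be driven programmatically, whatever the Font Palette does or doesn’t expose. A minimal Core Text sketch in Swift, purely illustrative: the ‘wght’ tag is the registered OpenType weight axis, the value 650 is arbitrary, and the font name is an assumption; a Japanese variable font would expose whatever axes its designers choose.

    import CoreText
    import Foundation

    // Request a specific value on a variation axis by its four-character tag.
    // A font only honors axes its fvar table actually declares.
    // "SFPro-Regular" is a placeholder PostScript name.
    let baseFont = CTFontCreateWithName("SFPro-Regular" as CFString, 17, nil)

    let wghtTag: UInt32 = 0x77676874        // 'wght', the registered weight axis tag
    let variation = [NSNumber(value: wghtTag): NSNumber(value: 650.0)]

    let attributes: [String: Any] = [kCTFontVariationAttribute as String: variation]
    let descriptor = CTFontDescriptorCreateWithAttributes(attributes as CFDictionary)
    let customWeight = CTFontCreateCopyWithAttributes(baseFont, 17, nil, descriptor)

The hard part isn’t the API; it’s presenting this, plus the glyph and layout variations above, in a UI a normal user can understand.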

Oh where can my glyph variations be?

Japanese typography is unique in that it has preserved its own print ‘moji bunka’ cultural history and vision, one that China and Korea have largely abandoned in the face of a Western-centric computer culture that all too often pretends to care about such things but does not. If it did, we’d have vertical text in web browsers by now that actually works. I hope a rich text culture can be preserved and conveyed to future generations, even in such small details as a well designed and executed Japanese variable font for computers and smart devices.


Japanese Typography and Font Posts

Unicode needs a new Mission

With Unicode adding more and more useless emoji, and seemingly doing little else, it’s time to ask an important question: what the fuck is the Unicode Consortium supposed to be doing anyway?

It’s time to dust off Howard Oakley’s excellent blog post Why we can’t keep stringing along with Unicode, and think about the Normalization problem for file names and the Glyph Variation problem of CJK font sets. The two problems fit together surprisingly well; my take is that they must be tackled together as one thing to find a solution. Let’s take a look at the essential points that Oakley makes:

Unicode is one of the foundations of digital culture. Without it, the loss of world languages would have accelerated greatly, and humankind would have become the poorer. But if the effect of Unicode is to turn a tower of Babel into a confusion of encodings, it has surely failed to provide a sound encoding system for language.

Neither is normalisation an answer. To perform normalisation sufficient to ensure that users are extremely unlikely to confuse any characters with different codes, a great many string operations would need to go through an even more laborious normalisation process than is performed patchily at present.

Pretending that the problem isn’t significant, or will just quietly go away, is also not an answer, unless you work in a purely English linguistic environment. With increasing use of Unicode around the world, and increasing global use of electronic devices like computers, these problems can only grow in scale…

Having grown the Unicode standard from just over seven thousand characters in twenty-four scripts, in Unicode 1.0.0 of 1991, to more than an eighth of a million characters in 135 scripts now (Unicode 9.0), it is time for the Unicode Consortium to map indistinguishable characters to the same encodings, so that each visually distinguishable character is represented by one, and only one, encoding.

The Normalization Problem and the Glyph Variation Problem
As Oakley explains earlier in the post, the problem for file system naming boils down to the fact that Unicode represents many visually identical characters using different encodings. Older file systems like HFS+ used Normalization to resolve the problem, but it is incomplete and inefficient. Modern file systems like APFS avoid Normalization to improve performance.
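
A concrete Swift example of the file-name side of this: が can be encoded either precomposed (U+304C) or as か plus a combining dakuten (U+304B U+3099). They render identically, and Swift’s own string comparison papers over the difference, but the raw scalars a non-normalizing file system stores are different.

    import Foundation

    // Two encodings of the same visible character が (ga).
    let precomposed = "\u{304C}"            // U+304C HIRAGANA LETTER GA
    let decomposed  = "\u{304B}\u{3099}"    // U+304B + U+3099 combining dakuten

    // Swift string comparison already applies canonical equivalence, so these match:
    print(precomposed == decomposed)                                              // true

    // But the underlying scalars differ, which is what a file system that skips
    // normalization stores and compares:
    print(Array(precomposed.unicodeScalars) == Array(decomposed.unicodeScalars))  // false

    // NFC normalization maps the decomposed form onto the precomposed one:
    let nfc = decomposed.precomposedStringWithCanonicalMapping
    print(Array(nfc.unicodeScalars) == Array(precomposed.unicodeScalars))         // true

This is exactly the ambiguity Oakley wants pushed back into the encoding itself instead of being patched at every string comparison.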

Glyph variations are the other side of the coin. Instead of identical-looking characters using different encodings, we have different-looking characters that are variations of the same ‘glyph’. They have the same encoding but have to be distinguished as variation 1, 2, 3, etc. of the parent glyph. Because this is a CJK problem, Western software developers traditionally see it as a separate problem for the OpenType partners to solve and not worth considering.
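
Unicode’s existing mechanism for this is the Ideographic Variation Sequence: the same base code point followed by a variation selector picks out a registered glyph variant. A short Swift illustration; 葛 U+845B is the textbook example, but which selector maps to which registered glyph is defined in the Ideographic Variation Database, and whether anything visibly changes depends on the font and text engine.

    import Foundation

    // Same base code point, different glyph: base + variation selector.
    let base     = "\u{845B}"               // 葛
    let variant1 = "\u{845B}\u{E0100}"      // base + VS17
    let variant2 = "\u{845B}\u{E0101}"      // base + VS18

    for s in [base, variant1, variant2] {
        let scalars = s.unicodeScalars.map { String(format: "U+%04X", $0.value) }
        print(scalars.joined(separator: " "))
    }
    // U+845B
    // U+845B U+E0100
    // U+845B U+E0101

So the encoding mechanism exists; the complaint here is that it gets treated as a font-vendor problem rather than as part of the same mapping question that normalization raises.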

Put another way, there needs to be an unambiguous 1-to-1 mapping for identical-looking characters and an unambiguous 1-1/1-2/1-3-to-1 mapping for glyph variations. I say the problems are two sides of the same coin and must be solved together. Unicode has done a good job of mapping things, but it is way past time for Unicode to evolve beyond that and tackle bigger things: lose the Western-centric problem-solving worldview (i.e. let’s fix Western encoding issues first and deal with CJK issues later), and start solving problems from a truly global viewpoint.

Japanese Text Layout for the Future* (hint: there isn’t one)

I finally had time to catch the ATypI Tokyo 2019 presentation from Adobe’s Nat McCully. He covers a topic that I have covered in depth many times before: the (sad) state of CJK typography. As Nat points out, most software developers and system engineers talk about CJK support as typography without any idea of what it means. Throwing CJK glyphs on a screen is not typography; they are not the same thing at all.

The defining feature of CJK typography and layout in general, and Japanese typography in particular, is that space is an essential composition element equal with text and graphics, with fine control of space elements way beyond a baseline. Instead of thinking about how much space should be between the text, flip it around and think about how much text should be between the space. Baseline font metrics will never deliver great CJK typography because there are too many limitations. So everybody implements the missing stuff on the fly, and everybody does it differently. Unfortunately the irony of it all is that Adobe played a huge role in how these limitations played out in the evolution of digital fonts, desktop publishing (DTP) and the situation we have today.

QuickDraw GX was probably the only time in computer history that fonts, the layout engine and the basic OS came together to solve these limitations for all language systems, treating all language typography as equal from the bottom up. Parts of that effort survived, such as Apple’s San Francisco variable system font based on the TrueType GX model, and the inclusion of the TrueType GX model as the base technology for OpenType Variable fonts. Nice as this is, it’s only a tiny sliver of the GX vision pie that survived; all the other baseline font metric and CJK typography limitations still exist. Outside of a handful of people like Nat at Adobe, and the Adobe CJK typography ghetto approach of keeping all the good stuff corralled in InDesign J, very little is being done to address them.

Call me a pessimist, but after 20 years of watching things slide sideways I don’t see much hope for the future evolution of great CJK typography on digital devices. Most Western software development people think that having CJK glyphs on a screen is ‘good enough’ CJK typography, end of story.

Already I see the OpenType Variable Font effort devolving into a bauble for web developer geeks, always stuck in demo hell, never going mainstream. It is the same story for quality CJK typography on digital devices. When the current Adobe CJK leaders like McCully and Ken Lunde, who have devoted their careers to fixing these problems, reach retirement age, I think it will be the end of an era. In many ways we are already there.

Apple prides itself on having good typography but cannot be bothered with such Japanese typography basics as not mixing Gothic and Ryumin Japanese font styles seen here in the Photos app

UPDATE
Ken Lunde posted a wonderful overview of his Adobe career to date, as well as his ATypI Tokyo 2019 presentation.

Tokyo Olympics Apple Maps: Death by Point of Interest

Apple tells Engadget Japan reporter Masaichi Honda that Apple Maps Japan will be ready for the Tokyo Olympics in the summer of 2020. Apple Maps will have robust indoor maps for tall buildings and underground station malls in Tokyo, and real-time transit for better transit route searching and updates. That is exactly one year from now, far into the iOS 13 life cycle. Honda-san also reports that Apple is not ready to show Japanese reporters a demo yet, not an encouraging sign.

In addition to the Apple Maps image collection vans combing Japan right now, WWDC19 unveiled the Indoor Maps Program for registered developers and building owners to map indoor areas and encode the data using (Apple’s?) Indoor Mapping Data Format (IMDF). Once the data is encoded in IMDF, surveyed and validated, developers and building owners can use the data in their apps and designate indoor areas to share on Apple Maps.
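
IMDF itself is GeoJSON based, so the data encoding side is at least approachable for developers. A rough Swift sketch of decoding a single feature; the property names here are simplified placeholders for illustration, not the full IMDF schema.

    import Foundation

    // Rough sketch: IMDF files are GeoJSON, so Codable handles the plumbing.
    // The properties below are simplified placeholders, not the real schema.
    struct IMDFFeature: Decodable {
        struct Properties: Decodable {
            let category: String?           // e.g. a unit, opening or anchor type
            let name: [String: String]?     // localized labels keyed by language code
        }
        let id: String
        let properties: Properties
    }

    struct FeatureCollection: Decodable {
        let features: [IMDFFeature]
    }

    let json = """
    { "features": [ { "id": "unit-001", "type": "Feature",
      "properties": { "category": "unit",
                      "name": { "ja": "改札口", "en": "Ticket Gate" } } } ] }
    """.data(using: .utf8)!

    let collection = try! JSONDecoder().decode(FeatureCollection.self, from: json)
    print(collection.features.first?.properties.name?["en"] ?? "n/a")   // Ticket Gate

The encoding is the easy part, though; getting the data surveyed and validated, and deciding who does it, is another matter.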

That’s great for building owners who want to indoor map their own building. What about shared public places like Shinjuku Station, which is spread out and shared by 8 different owners? There is also the localization problem. It’s one thing to indoor map for Japanese users, but who’s going to localize all those Point of Interest (POI) icons and information sheets into English, Chinese, Spanish, etc.? That costs serious time and money.

Let’s compare indoor maps of the primary entrance gate for inbound visitors coming to the Tokyo Olympics next year, Tokyo Station, across Yahoo Japan Maps, Apple Maps and Google Maps.

Yahoo Japan Maps
Yahoo Japan Maps only offers Japanese language support, but it has the best cartography and attention to the small details that matter, like yellow station exit signage colors that exactly match what you find on the ground. Apple and Google don’t.

Yahoo Japan Maps Tokyo Station Indoor mapping

Apple Maps Japan
Apple Maps does not offer indoor station mapping in Japan. It does offer multilingual support, but judging from the English Point of Interest information, it’s not robust. As usual, Apple Maps Japan overwhelms users with Point of Interest icons: it’s map death by Point of Interest. There’s a lot of fixing Apple needs to do if it wants to present a good map product in time for the Olympics.

Apple Maps Japan Tokyo Station (as yet no indoor station mapping)

Google Maps Japan
Google Maps offers indoor mapping for Tokyo Station in multiple languages. For all the detail Google offers here, it’s much less helpful than Yahoo Japan Maps. For high-density areas like Tokyo, good cartography and smart editing make all the difference between a good map and a lousy one.

Google Maps Japan Tokyo Station