MacRumors posted an interesting comment that Tim Cook made in the 4Q 2020 earnings call:
As you can imagine in this environment, people are less wont to hand over a card. Contactless payment has taken on a different level of adoption and I don’t think we’ll go back. The United States has been lagging in contactless payments and I think the pandemic may very well put the U.S. on a different trajectory there. We are very bullish on this area and there are more things that Apple can do in this space so this is an area of great interest to us.
What exactly are the ‘more things that Apple can do in this space’ Tim is talking about? There are two iOS 14 Apple Pay features that haven’t arrived yet: App Clips and Apple Pay QR Code Payments.
We are cashless… App Clips at Tailored Cafe, but the nifty Apple-designed App Clip Code stickers aren’t available in Japan yet (Coral Capital blog)
The problem is that the Apple-designed App Clip Codes aren’t fully ready yet and require a future iOS 14 update (iOS 14.3?) to enable optical code reading, as noted in the iOS 14 web page fine print. Also note the two flavors of NFC tag reading iPhones: (1) automatic NFC with reader mode (iPhone XS and later), (2) manual Control Center NFC scan mode (pre-iPhone XS).
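For apps that want to read tags on demand (the route older, pre-XS hardware has to take), Core NFC offers in-app NDEF scanning. Here is a minimal sketch; the session and delegate calls are standard Core NFC, while the class name and what you do with the payload are placeholders. The app also needs the Near Field Communication Tag Reading capability.

```swift
import CoreNFC

// Minimal in-app NDEF tag reading with Core NFC (NFC-capable iPhones, broadly iPhone 7 and later).
// Background tag reading without a session is limited to iPhone XS and later.
final class TagReader: NSObject, NFCNDEFReaderSessionDelegate {
    private var session: NFCNDEFReaderSession?

    func beginScanning() {
        guard NFCNDEFReaderSession.readingAvailable else { return }
        session = NFCNDEFReaderSession(delegate: self, queue: nil,
                                       invalidateAfterFirstRead: true)
        session?.alertMessage = "Hold your iPhone near the tag."
        session?.begin()
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didDetectNDEFs messages: [NFCNDEFMessage]) {
        // Each message contains one or more records, e.g. a URL payload on an App Clip sticker.
        for message in messages {
            for record in message.records {
                print("Payload:", String(data: record.payload, encoding: .utf8) ?? "binary")
            }
        }
    }

    func readerSession(_ session: NFCNDEFReaderSession,
                       didInvalidateWithError error: Error) {
        // Called on cancel, timeout, or after the first successful read.
        self.session = nil
    }
}
```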
I expect iOS 14 Apple Pay QR Code Payments to arrive at the same time. It only makes sense to enable and launch App Clip Codes + Apple Pay QR Code Payments together as one rollout. The only question is announcement timing. We already have the ‘soft’ App Clip Code October 22 launch in Japan and the USA. If Apple holds another event this year, I think there’s a very good chance we’ll hear about it.
Many thanks to Hagiwara san for getting Apple to do that NFC scan mode Control Center thingy for older iPhones
Only iPhone XS and later have automatic NFC with reader mode
Apple Developer site App Clip Code closeup
ExxonMobil App Clip Code + NFC tag sticker closeup; note the optical code differences
App Clip Code stickers combine the Apple-designed optical code with an NFC tag
UPDATE iOS 14.3 beta has support for Apple-designed App Clip Code scanning. Here is a quick screen recording of the scan process and animation. The App Clip Code is a photo of the ExxonMobil gas pump stickers that launched October 22. The App Clip does not load because the ExxonMobil app is not available in Japan.
The timing makes sense now that iOS 14 is nearing official release, but Apple has not officially announced Apple Pay Code Payments yet, though they may reveal something at the online September 15 Apple Event. Things will be damn awkward if they don’t.
There are lots of questions: will LINE Pay on Apple Pay be an NFC/FeliCa + QR Code Payment in one Wallet ‘card’, or will it be the Apple Pay flavor of the LINE Pay JCB prepaid card already on Google Pay that works on the FeliCa QUICPay network?
LINE Pay implied the intention of leveraging both NFC and Apple Pay QR Code Payments, but there isn’t much to go on at this point, except that LINE said Apple Pay will ‘complete the LINE PAY contactless payment platform.’ Whatever that means.
Now that LINE has made their Apple Pay move, PayPay is sure to follow at some point. The trend to offer flexible NFC + QR payment solutions started with Toyota Wallet and will gain momentum with iOS 14 Apple Pay, especially with App Clips.
UPDATE 12/22/2020 LINE Pay and SMBC announced the VISA Prepaid LINE Pay card for Google Pay and Apple Pay. It’s basically a replacement for the JCB Prepaid card, which will be phased out along with plastic issue cards sold in convenience stores. The instant issue VISA Prepaid LINE Pay card works on the iD and EMV contactless payment networks with Google Pay but is limited to iD on Apple Pay due to the long-running Apple/VISA Japan feud that may, or may not, be thawing soon. Nobu Ringo has posted his usual intro video.
The Apple-designed App Clip Code combines a visual code and an NFC tag
When the AliPay Apple Pay leak surfaced earlier this year, the stock story was that Apple Pay must support AliPay and WeChat Pay if Apple Pay is to have any relevance for iPhone users in China. The real story is more interesting and is centered on App Clips, not AliPay or any other specific QR code payment player.
iOS 14 is the first time Apple Pay is moving beyond NFC. CarKey will incorporate Ultra Wideband when the Car Connectivity Consortium Digital Key 3.0 spec is finalized, and ‘Code Payments’ are coming at some point in the iOS 14 cycle.
Tap or Scan Simplicity
The strength of code payments is simplicity and low cost. iPhone is both a radio (NFC) and a camera (scanner). NFC always has an advantage over a scanner in that it works without light and can be activated just by the user pointing their device at an NFC reader or tag.
The downside is the NFC reader side of the equation: the reader + cash register/transit gate + transaction software has a higher initial investment than a code scanner attached to a POS system. The promise of App Clips is that they finally put NFC, specifically NFC tags, on the same low-cost entry bar as QR codes.
App Clips are activated by:
App Clip Codes
NFC Tags
QR Codes
Safari App Banners
Links in Messages
Place Cards in Maps
Let’s examine the ‘real world’ App Clip activation triggers: Apple App Clip Codes, NFC tags, and QR codes. For Apple-designed App Clip Codes, “You can scan them with your camera or tap one using NFC.14” The #14 footnote is interesting: “Camera support for scanning an App Clip code will be made available in an iOS 14 software update later this year.”
This means those fancy Apple-designed App Clip Codes are coming after the initial iOS 14 launch, and when they do, Apple Pay Code Payments will certainly be coming with them. It boils down to one thing: making App Clips a simple tap or scan process. NFC tags still enjoy the ’point here’ advantage as the App Clip does the rest. For visual codes the user has to launch the camera and scan before the App Clip takes over.
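Whatever the trigger, be it App Clip Code, NFC tag, QR code, or link, the launched App Clip receives the invocation URL as a web-browsing NSUserActivity and uses it to decide which experience to show. A minimal SwiftUI sketch; the app and view names are placeholders and the routing logic is left as an illustrative print:

```swift
import SwiftUI

@main
struct DemoAppClip: App {   // hypothetical App Clip target
    var body: some Scene {
        WindowGroup {
            ContentView()
                // The system hands the invocation URL from an App Clip Code,
                // NFC tag, QR code, or link to the App Clip as a user activity.
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url, resolvingAgainstBaseURL: true)
                    else { return }
                    // e.g. https://example.com/pump?id=7 -> show the experience for pump 7
                    print("Invoked with path \(components.path), query \(components.queryItems ?? [])")
                }
        }
    }
}

struct ContentView: View {
    var body: some View {
        Text("App Clip experience goes here")
    }
}
```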
The Code Payment/App Clip Network Connection Requirement
Apple Pay Wallet NFC payment cards have three major features that payment apps do not:
Direct side button Wallet activation with automatic Face/Touch ID authentication and payment at the reader
Device transactions without a network connection
Ability to set a default main card for Apple Pay use
Apple Pay Code Payments could possibly offer these for dynamic code payments, where a scanner reads the code off the iPhone screen. However, static code payments are messy because Apple Pay requires a network connection to process the payment, just like apps do. In the Apple Pay code payment scenario suggested by the AliPay screenshot leaks, a static code scan directly activates the appropriate Apple Pay code payment (AliPay, etc.), the user enters the amount, taps ‘Pay’, authenticates, and Apple Pay does the transaction over the network connection. It’s a similar scenario for NFC tag payments.
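To make the network dependency concrete, here is a purely hypothetical sketch of that static-code flow. None of these types are real Apple Pay APIs (Apple has published nothing for Code Payments); it only illustrates why the final step cannot complete offline.

```swift
import Foundation

// Hypothetical illustration only: Apple has not published a Code Payments API.
// A static code identifies the merchant, not the amount, so the transaction
// itself must be built and settled over the network.
struct StaticCodePayment {
    let merchantID: String      // decoded from the scanned static QR code
    let provider: String        // e.g. "AliPay" in the leaked screenshots

    func pay(amount: Decimal, completion: @escaping (Result<String, Error>) -> Void) {
        // 1. User enters the amount and authenticates with Face ID / Touch ID (not shown).
        // 2. The payment request goes to the provider's backend; without a
        //    network connection this step simply cannot happen.
        var request = URLRequest(url: URL(string: "https://example.invalid/pay")!) // placeholder endpoint
        request.httpMethod = "POST"
        request.httpBody = try? JSONEncoder().encode(["merchant": merchantID, "amount": "\(amount)"])
        URLSession.shared.dataTask(with: request) { _, _, error in
            if let error = error {
                completion(.failure(error))
            } else {
                completion(.success("receipt-id")) // placeholder receipt
            }
        }.resume()
    }
}
```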
It’s because of this network connection requirement that I believe Apple is pushing Apple Pay NFC tag and code payments wrapped in the App Clip experience. They will work by themselves of course, but they work better as part of the total App Clip experience. This is where App Clip codes come in.
App Clip codes are Apple-designed identifiers that are uniquely paired to specific App Clips and provide an easy way to find and launch an app experience at the exact place and moment you need it. You can scan an App Clip code with your camera or by tapping one using NFC.14 We will be adding support for them in an iOS 14 software update later this year.
How is this any different from regular NFC tags or QR codes? I suspect it’s a mini qualification program for developers, payment providers and merchants to supply the ultimate App Clip experience. It also works as App Clip branding and advertising for Apple.
Are there special App Clip Code tags that push the App Clip experience further than regular NFC tags and QR codes? I suspect so, and that could be fun. Think about it: what if the Apple-designed App Clip Code NFC tag activated an App Clip with code payment? A QR payment without the static QR code. That would be the ultimate App Clip experience indeed.
As crazy as that sounds, according to Kenta Yamaguchi’s piece on ASCII, that’s exactly what is happening. The point of his story is that starting today, second-brand carriers Y!mobile and UQ Mobile are selling iPhone SE instead of iPhone 8. Until yesterday they only offered the budget Apple Pay Suica-capable iPhone 7, and normally they would offer iPhone 8, but iPhone 8 is nowhere to be seen in the budget lineup. Instead they are offering iPhone SE only 4 months after it went on sale at first-tier carriers.
Yamaguchi san says the SE is so popular that major carriers are bitching it will slow down the 5G migration in Japan…while still selling as many iPhone SE units as they can. 5G will just have to wait until Apple comes up with a budget 5G Touch ID iPhone SE.
2020 is the coming-out party for Apple-designed OpenType variable fonts: the SF Pro and SF Compact system fonts and the all-new New York font all ship as variable fonts in iOS 14, watchOS 7 and macOS 11. The Apple-created variable font technology is not new of course. It has been around since the QuickDraw GX days, along with the TrueType GX-enhanced Skia font. It was due to be standard in the Mac OS Copland system fonts, including a Japanese variable font created by FontWorks. Then Steve Jobs returned to Apple and everything changed.
Yes, it has taken 25 years for an Apple-created technology to make it into the basic system. It proves my long-stated belief that font technology doesn’t matter unless it is a standard feature built into every nook and cranny of the OS foundation. The TrueType GX Skia variable font has been with us all this time, but it only matters now because the SF Pro system font has gone variable.
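For anyone who wants to poke at the new variable system font directly, the standard OpenType ‘wght’ axis can be set through a font descriptor. A minimal sketch, assuming SF Pro exposes the usual weight axis (the exact axis range and any additional axes depend on the font):

```swift
import UIKit
import CoreText

// Request the system font at an arbitrary point on the variable 'wght' axis,
// instead of the fixed UIFont.Weight steps.
func systemFont(size: CGFloat, variableWeight: CGFloat) -> UIFont {
    let weightAxisID = 0x77676874  // OpenType axis tag 'wght' as a 32-bit value
    let baseDescriptor = UIFont.systemFont(ofSize: size).fontDescriptor
    let descriptor = baseDescriptor.addingAttributes([
        UIFontDescriptor.AttributeName(rawValue: kCTFontVariationAttribute as String):
            [weightAxisID: variableWeight]
    ])
    return UIFont(descriptor: descriptor, size: size)
}

// Usage: a weight value between the named steps (exact values depend on the font's axis range).
let label = UILabel()
label.font = systemFont(size: 17, variableWeight: 650)
label.text = "SF Pro at an in-between weight"
```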
Why is It Taking So Long?
iOS 14 and macOS 11 variable font basics are covered in an excellent WWDC20 video, ‘The Details of UI Typography’. It’s important to remember that while OpenType variable font technology is ‘world ready’, at this stage it only applies to Roman-based font sets. It’s going to be a long time before we see a Japanese language system font in variable format.
There are many reasons. In the WWDC20 video Loïc Sander of the Apple design team drops a big hint when he explains that while digital technology (PostScript fonts) “gave us a lot more flexibility in handling text,” it also “made typography a bit more crude than it used to be.” The statement shows how clueless designers and engineers outside of Japan can be about Japanese fonts and typography.
While ‘a bit more crude’ might be true for Roman-based fonts and text layout, PostScript fonts completely broke traditional Japanese font design and composition models. Everything was thrown out because Adobe made no accommodation for Japanese kanji-based font metrics. Western-created DTP layout is graphics-driven and calculated by margins and font baselines.
The Western baseline typography model and its font metrics are the foundation on which PostScript and OpenType fonts, and all layout engines, evolved. Adobe was well acquainted with the shortcomings of its own font technology, and InDesign J got around the problems by adding proprietary kanji virtual body font metrics and Japanese line break algorithms. None of this exists as an open standard that benefits everybody.
Because of this situation, Japanese DTP forced users to adapt to limited font technology rather than technology solving their production problems. I know this because every day at work I had to deal with the endless problems and limitations of Japanese PostScript fonts that could only reside on the output device.
Another big problem was that Adobe’s relations with Japanese PostScript licensees in the 1990s were not healthy. Adobe stuck with closed print device font licensing for far too long and discouraged independent font production wherever it could. Because of this, digital font progress in Japan was slow and very expensive.
Here are some challenges facing Japanese variable fonts.
Once Upon a Time
One basic flaw of OpenType outline font technology is that it’s extremely inefficient for kanji glyph production and storage. Every glyph has to be created and stored separately, which doesn’t scale well. This is why OpenType CJK fonts on tiny devices like Apple Watch are a match made in hell. One solution to this problem is stroke fonts. Stroke fonts use a library of basic glyph parts to efficiently create complex glyphs.
Stroke fonts are a perfect fit for kanji font production and for small, constrained devices like Apple Watch because reusable parts don’t take up precious resources. On the desktop, stroke fonts can do weight variations over the full range from Light through Ultra Bold without losing typographic details, all in a single 4 MB font, while an equivalent OpenType variable font can weigh in around 18 MB.
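To illustrate the reuse idea behind stroke fonts, here is a toy model (not any real stroke font format): glyphs reference shared stroke components plus placement transforms, so adding a complex kanji costs a few references rather than a full new outline.

```swift
import CoreGraphics

// Toy model of stroke-font reuse: a shared library of stroke parts,
// with each glyph stored as references to parts plus placement transforms.
struct StrokePart {
    let id: String
    let centerline: [CGPoint]   // the stroke skeleton; weight is applied at render time
}

struct GlyphComponent {
    let partID: String
    let transform: CGAffineTransform
}

struct StrokeGlyph {
    let character: Character
    let components: [GlyphComponent]
}

let partLibrary: [String: StrokePart] = [
    "horizontal": StrokePart(id: "horizontal", centerline: [CGPoint(x: 0, y: 0), CGPoint(x: 1, y: 0)]),
    "vertical":   StrokePart(id: "vertical",   centerline: [CGPoint(x: 0, y: 0), CGPoint(x: 0, y: 1)])
]

// "十" reuses two shared parts instead of storing a unique outline.
let juu = StrokeGlyph(character: "十", components: [
    GlyphComponent(partID: "horizontal", transform: CGAffineTransform(translationX: 0, y: 0.5)),
    GlyphComponent(partID: "vertical",   transform: CGAffineTransform(translationX: 0.5, y: 0))
])
```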
The technology has been around for a long time and was supported up until Mac OS 9, but it lost out when Apple quietly dropped the QuickDraw GX-derived Open Font Scaler architecture in the migration from classic Mac OS to Mac OS X.
While stroke fonts are not supported in the current Apple OS lineup, on the font tool side stroke font technology has appeared in software such as the classic Mac OS Gaiji Master from FontWorks. The lead engineer of that effort is currently working independently on a similar gaiji glyph tool for Windows based on stroke font technology that is much more advanced than the old and long-unavailable FontWorks software. I plan to cover developments in a future post.
The Japanese Font Production Challenge
The Hiragino iOS/macOS Japanese system font was not created by Apple; it was licensed from Screen Holdings (SH) and originally created by the independent font design studio Jiyukobo in the early 1990s. There is much more work involved in creating a Japanese font compared to Roman-based languages. Hand-drawn glyphs are created, scanned and cleaned up for digital production.
The Adobe Japan 1-7 glyph collection requires 23,060 glyphs for a single weight; multiply this work by the different weights for one family and you get an idea how massive the undertaking is. From Osamu Torinoumi, one of the key designers of the Apple-licensed Hiragino font, on its creation:
On average, one person would (hand) draw 12 or 13 glyphs a day, which is not much change of pace from the days of creating block type…the whole process, from start to finish, took three years.
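To put those numbers in rough perspective: at 12 or 13 glyphs per designer per day, the 23,060 glyphs of a single Adobe Japan 1-7 weight work out to something like 1,800–1,900 designer-days, so a three-year schedule only works with a team drawing in parallel, and every additional weight multiplies the load.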
One might think that a single CJK (Chinese-Japanese-Korean) font sharing a common design could streamline the process, but this is a huge misconception. Each culture has centuries’ worth of different design aesthetics that good design must incorporate: what looks good to a Chinese designer and works well in a Chinese text design looks terrible in a Japanese context. I have yet to see a decent digital kana design from a Chinese font designer. Osamu Torinoumi on the differences in creating the Simplified Chinese Hiragino Sans GB:
“We worked with the Adobe GB 1-4 character set (29,064 glyphs) at 2 weights. Basically we had to finish one weight in 6 months. One year for the entire project. At first we only thought we would be there as backup, but Screen kept passing us all the questions from Beijing. It turned out to be a lot more work than we anticipated.”
Jiyukobo sent all the original Hiragino design data to Hanyi Keyin through Screen, and they adapted the designs for China. Torinoumi said that one of the major differences is that Chinese design demands that Gothic (sans serif) characters mimic handwritten style. This means the character should be slightly off-center within the virtual body. “Even after the project was over I still didn’t understand the difference between the Japanese and Chinese ‘kokoro’ glyphs, which the Chinese designers insisted were different.”
The Variable Font UI Challenge
Finally we get to a problem on the Apple OS platform side that has been around since the GX days: how to present advanced typography features in a useful, easy-to-understand system UI that works everywhere. What works on macOS obviously won’t work on iOS, but iPadOS will need some degree of advanced typography feature access. Sliders have their place, but I agree with Adobe Type Senior Manager Dan Rhatigan, who made a very good point in his TYPO Talk 2016 presentation: there has to be a better UI control concept out there.
Dear Apple, didn’t Adobe tell us not to use sliders?
This is because there are many more OpenType Japanese variable font features than just weights. There are glyph variations, vertical layout variations, and horizontal and vertical compression for tatechuyoko instances. In macOS Catalina these are hidden away in the crusty old Font Palette that is desperately in need of a major overhaul. Please tell me that macOS 11 fixes this or that Apple has a vision of how to.
Oh where can my glyph variations be?
Japanese typography is unique in that it has preserved its own print ‘moji bunka’ cultural history and vision, one that China and Korea have largely abandoned in the face of a western-centric computer culture that often pretends to care about such things when it really doesn’t. If western developers cared about good typography for everybody, we would have vertical text in web browsers that actually works. I hope a rich text culture can be preserved and conveyed to future generations, even in such small details as a well-designed and well-executed Japanese variable font for computers and smart devices.
Japanese Typography and Font Posts
This is a collection of long-form Japanese typography posts. They were written as stand-alone pieces, so there is some background explanation overlap, always a weak point of the blog format.