On my monoglot Macintosh…

The Mac has always done a pretty good job of dealing with different languages, ever since the advent of WorldScript in System 7.1, way back in 1992 – which, looking at it now, was groundbreaking, especially when you consider that the PC world didn’t do it properly until Windows 2000. And Mac OS X’s multilingual functionality is unparalleled: great Unicode support, beautiful fonts, fluid input methods, every interface language included in every copy of the operating system (unlike Windows, where Microsoft makes you pay a fortune for language packs), and supremely simple localisation, making extending your software to a wider audience a breeze. For developers and users alike, it’s a great platform for the world that doesn’t speak English.

And since day one, the Mac has been a talker. When Steve Jobs demoed the Macintosh in 1984, he was proud to showcase its speech synthesiser, because it was pretty special stuff. Fast-forward twenty years, and the Mac can still talk in that same, now semi-famous voice, as well as various others, reading documents and web pages with impressive accuracy. And to improve accessibility, Tiger is bringing in a technology Apple are calling VoiceOver, which builds a fully comprehensive screen reader on top of the speech synthesiser. This should be a boon for a sizeable number of users.

But it only works in English. Although Mac OS 9 had support for speaking Chinese (and possibly other languages I’m not aware of), Mac OS X speaks only English. Feeding it French produces very painful output, and were I better versed in other languages, I would experiment with them too, doubtless with similar results. Basically, though its reading and writing skills are excellent, the Mac is bound to fail Elementary French – it can’t talk.

The challenge is hardly insurmountable, either. Apple have obviously done it for Chinese in the past, and Microsoft, apparently by licensing technology from Lernout &amp; Hauspie (since acquired by ScanSoft), have done it at least for Japanese, and probably for other languages as well (although the capability is sadly wasted, because nothing apart from Microsoft Word appears to take advantage of it). So why have Apple put speech synthesis out to pasture since the advent of Mac OS X?

The main reason is, of course, pretty obvious. Wonderful though it is that the Mac can talk, it’s not its biggest selling point. Yes, it speaks, reading things aloud with a choice of voices, but you’re not going to choose it over Windows on that basis, unless you have extremely peculiar needs. Until now, the return on investment would simply not have been sufficient to justify developing (or licensing) foreign language speech synthesis.

But now we have VoiceOver. And given that Apple’s second biggest market is Japan, the fact that the Mac only does English starts to look like a bit of an oversight. It’s a bit pathetic that Apple have devoted time to translating the VoiceOver page into Japanese, adding only a couple of parenthesised notes that it works solely in English. At least the French page redirects to the English one, so the French are under no illusion as to where Apple stands on the whole issue.

Ethnocentricity has always been a problem in computing, an industry that has traditionally been centred on the United States. In many ways, Apple have done much to combat this, even providing a British version of the Mac OS up until Mac OS X, as well as, obviously, a multitude of other localisations. But it seems a bit much for them to trumpet VoiceOver’s brilliance to the Japanese, when the people who will actually be able to use it – blind, English-speaking Mac users living in Japan – probably number fewer than one hundred.

Speech synthesis and (from what I can tell) VoiceOver are good. But that’s not enough. Apple ought to be making them Insanely Great.