Through a Glass, Darkly
January 19, 2011 - In: Machine Translation

Word Lens enables iPhone users to instantly translate Spanish into English. Heard about it? A Google search reveals about 1.4 million pages, so market awareness is pretty good. The video is great, so great that it made me want to pick up the phone and order some ceramic steak knives, but since that would have required a couple of swipes and ten keystrokes to dial the toll-free number for the steak knives, I decided instead to download the translation app onto my new iPhone and test it out.

I test these language tools all the time. You don’t read about it too often because most of the tests are perfect fizzles. It goes something like this: I load the app. If I can figure it out, I use it until I get bored with it―10 minutes tops―then spend another half-hour pulling it off my PC. So now, to reduce fizzling, I ignore them all. But that video on YouTube was so cool, even though, looking at it, I can see how rigged it is.

Word Lens allows you to hold your iPhone up to a sign in Spanish, which it then translates into English on your iPhone screen. In the demo, someone holds up various signs, which the app then translates into a pitch for the app itself. The idea is that the app will allow you to read signs and menus when traveling. That’s if you are traveling in a Spanish-speaking country, of course, since English and Spanish are the only languages the tool can read. This type of interface is called “augmented reality”: a live, direct or indirect view of a physical, real-world environment whose elements are augmented by virtual, computer-generated sensory input such as sound or graphics.

In the demo, all the signs are printed using highly readable fonts against clear backgrounds, face-on and held at waist height about 6 feet from the lens. This is what I call augmented demo reality, as in you set up a “real world” demo that augments reality so that the device can actually work in a setting made completely and deceptively artificial.

So this app would be great for reading Burma Shave signs in Mexico, with little road signs rolling straight past you (and with your passenger holding the smartphone)… please. Otherwise, what have you got besides roadkill? Optical character recognition from hell, where every misread letter is a monkey wrench tossed into the gearbox of the translation engine, so that each misread character blows up a sentence’s worth of translation.
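The failure mode is easy to see in miniature. Here is a toy sketch of how one misread character knocks out a word-for-word translation; the dictionary, sign text, and simulated OCR glitch are all invented for illustration and have nothing to do with how Word Lens is actually built:

```python
# Toy illustration: how a single OCR misread breaks word-for-word translation.
# The dictionary, sign text, and simulated OCR error are invented for this example;
# this is not Word Lens's actual pipeline.

ES_EN = {
    "peligro": "danger",
    "no": "no",
    "pasar": "entry",      # loose gloss for a "no pasar" sign
    "salida": "exit",
}

def translate(sign_text: str) -> str:
    """Translate a Spanish sign word-for-word, marking unknown tokens."""
    words = sign_text.lower().split()
    return " ".join(ES_EN.get(w, f"[?{w}?]") for w in words)

clean_ocr = "peligro no pasar"        # what the sign actually says
glitched_ocr = "pel1gro no pasar"     # one character misread ("i" -> "1")

print(translate(clean_ocr))     # -> "danger no entry"
print(translate(glitched_ocr))  # -> "[?pel1gro?] no entry"
```

One garbled character, and the most important word on the sign is the one that doesn’t come through.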

These kinds of apps have been around for years and, well, fizzle is as fizzle does.

But who cares, really? If you are planning on using a smartphone to find your way out of the museum or to tell your Goya from your Dalí, I’m not so sure that Word Lens will give you the language leg up you are looking for. If you reach a point where you need a camera to order dinner, best take your nose out of the small screen and keep your eyes on the road.

And now on to the final part of the demo, where this fizzle of a review will encourage the 50-cent-party members of Word Lens to comment on the multiple benefits of what they are paid to perceive as a useful, even life-saving, tool. I prepare to stand corrected.

This is because all language technology demos must be massaged into mimicking functionality for a few brief moments during the presentation. Otherwise, the bilingual experience will be alienating and frustrating, just like it’s supposed to be. (Isn’t that what different languages are for, after all, to keep us all from speaking to strangers?) So I don’t test. But that video was so cool.
