This week, we posted a blog on Microsoft’s recent breakthrough in machine interpreting, and it prompted some interesting feedback. Namely, the question of whether human interpreters will be replaced by computer software in our lifetime.

Obviously, Language Insight is biased in thinking this won’t happen. However, we have some good reasons for coming to this conclusion.

It’s only words

When Superstorm Sandy hit the east coast of the US in late October, it caused untold devastation. The cost of the damage was in the tens of billions, according to early estimates, while dozens of people lost their lives.

In the midst of all this, mayor of New York Michael Bloomberg gave a press conference where he was joined by his official sign language interpreter, Lydia Callis. Ms Callis quickly became a bright light in the otherwise dismal news coverage, with the New York Times claiming she “gave New Yorkers a legitimate reason to smile”. The reason? The emotion with which she signed.

“Unlike Bloomberg’s own stilted Spanish, another highlight of the updates, Callis’ signing is both lightning-fast and emotive, her animated face lighting up and contorting happily as she goes, not unlike a guitarist during a blistering solo,” the publication explained.

Obviously sign language interpreting is different to spoken language interpreting, but what Ms Callis proves is that a human interpreter can have a hugely positive impact on the viewer in a way that a machine simply can’t. However far Microsoft gets with its interpreting software, it will not be able to compete with the passion and emotion demonstrated by Ms Callis.

Yes, human interpreters make mistakes, particularly when interpreting simultaneously. However, won’t a listener be far more forgiving of a person than of a machine?

Communication is not just the words we say; it’s about so much more. In fact, the words we say account for only a small portion of the message the listener receives. Other factors that influence communication include eye contact, posture, facial expressions, body language, and the pitch, tone and volume of our voice. These factors might not seem to carry much weight when you’re making a presentation at a conference or speaking to someone in a shop while on holiday, but they are all things a machine is unable to replicate.

Non-verbal communication can reinforce the message you want your audience to get. It can even be more effective than the words you are saying, as proven by the many people who became so caught up in Ms Callis’ interpreting (despite being unable to understand sign language) they lost track of what Mayor Bloomberg was saying.

We can safely say that it will be some time before a machine is able to put in a performance like a human interpreter!

Expect the unexpected

Another reason why we think interpreters have a while to wait before they lose their jobs to machines is that a computer cannot cope with surprises. Should the speaker depart completely from what was prepared, it is likely to take a machine a long time to catch up – if it can at all. Of course, this would also throw a human interpreter, but they will probably be able to recover far faster.

If machines are unable to cope with anything unplanned, that means taking questions from members of the audience who speak a different language to you could be incredibly difficult. Should your simultaneous interpreter be replaced by a machine, you will still need a human interpreter to act as a language liaison between you and your audience.

Localised interpreting

Then there’s localisation. We recently wrote about Republican presidential candidate Mitt Romney’s faux pas during the build-up to the election. When asked on a Cuban-American radio station what his favourite fruit was, he innocently replied that he was a fan of papaya, with no idea that in Cuba this is the slang term for vagina.

A human interpreter could stop you from making a mistake like this. Because they are experts in your target language, they will be able to interpret your words in a way that makes sense to the audience – and is free of anything that might unwittingly send the wrong message. While the Microsoft interpreter can translate the words you say and reorder them to basically make sense in the target language, a human interpreter can do this and also correctly interpret any idioms or technical terminology that might cause a computer to blow a circuit.

These are just a few of the reasons why, at Language Insight, we think it will be quite some time before interpreter technology can realistically put human linguists’ jobs at risk. Find out more about the interpreting services we offer here.