Imagine travelling to a foreign country without having to worry about communicating in a different language. This may soon be a possibility thanks to Google's new wireless headphones, which have the power to translate between 40 languages in real time.
The ability to communicate freely with people from all over the world has the potential to make the world seem much smaller and could open up huge foreign business opportunities.
While Google isn't the first company to offer in-ear translation devices, it is the largest, suggesting we might be at the tipping point where this type of technology goes mainstream.
Much like the Babel Fish from the Hitchhiker's Guide To The Galaxy, in-ear translation systems allow people who speak different languages to understand and communicate with each other without having to know the other person's (or alien's) language.
As a frequent traveller, I am already a user of machine translation services. My Google Translate app easily translates signs and menus in real time using augmented reality. By looking through my smartphone screen, I can instantly see a translation of the words or characters in front of me.
However, there is a big difference between being able to read a menu in a Chinese restaurant and having colloquial conversations with a local!
In-ear translation devices have suffered from teething problems in the past. Complexities around context of language still arise. For example, if you hear the sentence "I saw red" would you assume the viewing of a colour or the feeling of anger?
Machine translation devices struggle with examples like this, but with the combination of artificial intelligence, human error correction and natural language processing, the technology is rapidly advancing for deeper context understanding.
We live in a world where self-driving cars may mean that our children will never learn to drive. The question now arises around whether translation devices mean that our children won't need to learn a second language.
As it is, relatively few Kiwis learn a second language today. According to the last census, the majority of us speak only English and only 18 per cent of us can speak more than one language.
With such a low uptake in language learning, the ability to use technology to communicate with speakers of other languages - without having any knowledge of how their languages work - sounds tempting.
The question is, if advances in speech recognition reach the point where a second language isn't required, are there other benefits to learning a language?
Scientifically, learning a language has been shown to boost thinking skills, improve mental agility and delay the ageing of the brain. These mental improvements have been shown to occur regardless of the age that the person is when they start to learn the new language.
Cultural nuances are also not picked up by machine translation devices. There are many non-verbal cues that can cause miscommunication, or even offence, if the cultural politeness rules are not observed.
Technologically, machines are still hardware that can fail at any time. With flat batteries, frozen software and dropped headphones providing physical challenges, human language skills are still needed as a backup, if not as a central system.
New technologies will probably keep changing the way that we approach how we learn languages.
However, although the way that we learn maths has changed since calculators became commonplace, we still learn maths to help us understand the core principles needed to solve problems.
With that in mind, translation technology may eventually help us to raise our awareness of the differences across more languages while we still learn the complementary cross-cultural knowledge that goes with them.
Dr Michelle Dickinson, also known as Nanogirl, is an Auckland University nanotechnologist who is passionate about getting Kiwis hooked on science. Tweet her your science questions @medickinson.