In 2011, Google released an April Fools' video pranking viewers into believing that a new Gmail feature allowed users to interact with their inboxes using gestures and body language. On December 31, 2018, the company's motion-sensing technology secured approval from the Federal Communications Commission (FCC), paving the way for a new mode of human-device interaction – and that's no joke.
Radar technology plays a major role in this type of communication. Human-computer interaction (HCI) with radar sensors is a relatively new development, but experimentation with radio waves dates back to the 1880s. In the case of Google’s ATAP Project Soli (the very same one that received FCC approval), a commercialized outlet for motion-sensing with radar is now in the works. Because of this innovative technology, we can now expect similar developments from Google’s competitors. Controlling phones, watches, and other IoT devices with swift hand gestures is now a reality.
Radar works by emitting radio-frequency waves and detecting their reflections. One of the benefits of radar technology is that light and atmospheric conditions do not degrade its performance. Radar sensing is precise enough to recognize gestures with sub-millimeter accuracy, which means it can pick up slight hand movements and let you perform functions without physical contact. A radar sensor embedded in a device such as a smart watch captures raw data and generates a signal. Your device then interprets this signal, allowing you to control it with the swipe of a hand.
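To make the capture-and-interpret loop concrete, here is a toy sketch in Python. It is not Soli's actual signal-processing pipeline; the function name, threshold, and gesture labels are invented for illustration. It classifies a motion purely from successive range readings:

```python
# Toy sketch only -- not Google's actual pipeline. We pretend the sensor
# delivers a short series of hand-to-sensor distances in millimeters.

def classify_gesture(range_samples_mm, threshold_mm=5.0):
    """Classify a hand motion from successive radar range readings (mm).

    Returns "push" if the hand moved toward the sensor, "pull" if it
    moved away, and "none" if net displacement is below the threshold.
    """
    if len(range_samples_mm) < 2:
        return "none"
    displacement = range_samples_mm[-1] - range_samples_mm[0]
    if displacement <= -threshold_mm:
        return "push"   # hand ended up closer to the sensor
    if displacement >= threshold_mm:
        return "pull"   # hand ended up farther from the sensor
    return "none"

print(classify_gesture([120.0, 112.5, 104.0, 97.5]))  # hand approaching: "push"
```

A real system would of course work on Doppler spectrograms and learned models rather than a single displacement threshold, but the shape of the problem, raw signal in, discrete gesture label out, is the same.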
While radar-based motion sensing is a win for users with mobility and speech impairments, the language nerds among us are left pondering the linguistic implications that will be ushered in by this type of human-computer interaction technology. Some argue that a new language is in the making. Others say that virtual swipes and sliders are but gestures and cannot technically be defined as a language. Many are also drawing a connection to sign language.
Like sign language, motion-sensing technology allows meaning to be interpreted through hand gestures. But it would be a mistake to assume that sign language can be boiled down to hand movements alone. Sign languages use complex combinations of hand motions, body movements, and facial expressions to express meaning.
It seems that every year, a new, innovative endeavor presents us with a solution for interpreting sign language. In the last 2 months, at least 7 patents were filed with the World Intellectual Property Organization (WIPO) for technology designed specifically to interpret sign language. But the truth is, technology has not yet reached a point of truly capturing the complexities of sign language.
Just how important is sign language for communication? When we consider how many people live with hearing loss around the world, we get a pretty clear picture of the importance of alternate forms of communication.
Information is communicated in various ways, but not all means of communication qualify as a language. Ants communicate information chemically, some mammals use odor to mark territory, and bees dance. What makes human language unique is productivity: human language, both spoken and signed, can use limited structure to produce an unlimited number of sentences. With an estimated 170,000-plus words in the English language, we can create sentences that no one has ever heard before, and they will still be understood.
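This generative capacity is easy to demonstrate: even a toy grammar with three subjects, three verbs, and three objects (all invented here for illustration) already yields 27 distinct, perfectly understandable sentences, and the count grows multiplicatively with every word or pattern added.

```python
from itertools import product

# A deliberately tiny grammar: a handful of words and one sentence
# pattern still produce combinations a listener has likely never heard.
subjects = ["the linguist", "the radar", "the bee"]
verbs = ["describes", "detects", "ignores"]
objects = ["the gesture", "the signal", "the dance"]

sentences = [" ".join(parts) for parts in product(subjects, verbs, objects)]
print(len(sentences))   # 3 * 3 * 3 = 27 sentences
print(sentences[0])     # "the linguist describes the gesture"
```

Scale the same idea up to a 170,000-word vocabulary with recursive structure, and the space of possible sentences is effectively unbounded.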
In the case of human-computer interaction technology, a sort of translation process is taking place – a device is programmed to interpret the limited commands given by a human. In natural language, concepts are represented by terms, definitions, and symbols while artificial language uses codes and formulae to achieve this.
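A hypothetical command table makes the contrast concrete: an artificial gesture "language" maps a closed set of codes to device actions, and anything outside that set simply carries no meaning. The gesture names and actions below are invented for illustration, not taken from any real device's API.

```python
# Hypothetical command table: a finite, fixed vocabulary, unlike the
# open-ended productivity of natural language.
GESTURE_COMMANDS = {
    "swipe_left": "previous_track",
    "swipe_right": "next_track",
    "pinch": "select",
    "rub": "adjust_volume",
}

def interpret(gesture):
    # Unknown gestures are ignored -- there is no way to "coin" a new one
    # on the fly, as speakers and signers constantly do in natural language.
    return GESTURE_COMMANDS.get(gesture, "no_op")

print(interpret("swipe_left"))  # previous_track
print(interpret("wave"))        # no_op
```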
If we know anything, it's that language evolves. Natural language evolves with culture, geopolitics, historical events, and contact with other languages. But the gesture languages of radar-based human-computer interaction will also evolve in their own right, because the range of functions we can perform with technology will continue to develop. It seems we really are entering a new paradigm.
The World Federation of the Deaf estimates that there are around 70 million deaf people in the world. The international deaf community uses approximately 300 different sign languages, and new ones are popping up all the time.