Report written by Özge Ünlü.
It is a given nowadays that when organizations expand internationally or consolidate their existing global presence, they need to invest in localization. This means not only linguistic resources but also technological investments that contribute to translation productivity, among other efficiencies. When it comes to translation productivity, two technologies are an absolute must: translation management systems (TMS) and machine translation (MT). While a TMS helps optimize processes around linguistic tasks and project management, MT saves time by contributing to the scalability and speed of translation. Indeed, these two tools are often integrated and work seamlessly side by side. For this reason, it is worth analyzing the common ways they are combined and what this means for the future of language technology.
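To make the TMS–MT integration concrete, here is a minimal, hypothetical sketch of a common pattern: a TMS pre-translates only those segments that have no translation-memory match by calling an MT engine. The `Segment` structure, the `pretranslate` function, and the stand-in engine are illustrative assumptions, not any specific vendor's API.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Segment:
    source: str
    target: Optional[str] = None  # filled from translation memory, if available
    origin: str = "none"          # "tm", "mt", or "none"

def pretranslate(segments: list[Segment], mt_engine: Callable[[str], str]) -> list[Segment]:
    """Send only segments without a TM match to the MT engine; keep TM matches untouched."""
    for seg in segments:
        if seg.target is None:
            seg.target = mt_engine(seg.source)
            seg.origin = "mt"
    return segments

# A stand-in MT engine for demonstration; a real TMS would call a vendor API here.
def fake_mt(text: str) -> str:
    return f"[MT] {text}"

docs = [Segment("Hello", "Bonjour", origin="tm"), Segment("Goodbye")]
result = pretranslate(docs, fake_mt)
```

In this setup the TMS remains the system of record for segments and project state, while the MT engine is a pluggable service behind a single function call, which is why the two tools combine so naturally.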
Neural machine translation (NMT) uses machine learning to teach systems to produce translations of near-human quality and fluency. Its algorithms learn from large pools of corpora how best to translate a text. It is the latest MT technology and the most widely used today.
NMT engines can be generic, domain-based, customized, or adaptive.
Generic MT is a non-customized, non-specialized system — free publicly available systems such as Google Translate fall into this category. But MT engines can also be trained on a domain or customized for a specific organization. In this case, we refer to them, respectively, as domain-based and customized MT.
Since the NMT market is continually expanding, there is an ever-growing choice of generic MT solutions, such as Microsoft Translator, DeepL, and the like. Because generic engines don't have a specific domain or specialization, they often lack context, specific terminology, and stylistic features. The vocabulary may, for example, be imprecise or inconsistent. This kind of MT usually requires thorough human post-editing. For this reason, a well-trained customized engine may be a better fit than a generic MT system, even if the latter is provided by a big name like Google or Microsoft.
Source: Nimdzi Language Technology Atlas 2022
Remote interpreting solutions have been both in development and in use for a long time now. However, prior to the COVID-19 pandemic, uptake was slow. The onset of the pandemic changed this drastically, and, ever since, the growth, innovation, and investment in this field have seemed unstoppable. Once considered an afterthought or sub-par alternative to onsite services, remote interpreting has stepped out of the shadows to become the key to continuity of business and care in many industries.
The language services industry is undergoing a profound transformation with the emergence of cutting-edge technologies such as ChatGPT and large language models (LLMs). These powerful language generation models have captivated the attention of businesses and language professionals alike, offering exciting possibilities for translation, localization, and content creation. In this article, we will explore the […]
The year is 2023. Six years after the big neural MT push of 2017, it seems appropriate to say that machine translation (MT) has finally found its way into the localization industry. Most MT providers are producing reasonably acceptable baseline quality, and MT solutions have never been more accessible. As a result, MT is becoming a reality in many organizations. What's more, MT technology has reached a certain level of maturity in terms of customization and training.
Developing your own approach to using generative AI models such as ChatGPT — one that is both practical AND ethically sound — is perhaps the best way of proving naysayers wrong and ensuring that you get the most out of this promising piece of technology. Perhaps surprisingly, the first key to success with generative AI models is to learn how to talk to them.
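Learning to "talk" to a generative model mostly comes down to structuring prompts deliberately rather than typing ad-hoc requests. The sketch below shows one way to assemble a reusable translation prompt with a tone instruction and glossary constraints; the function name, wording, and parameters are illustrative assumptions, not a recommended standard or any vendor's API.

```python
def build_translation_prompt(source_text: str, target_lang: str,
                             glossary: dict[str, str], tone: str = "formal") -> str:
    """Assemble a structured prompt for a generative model from reusable parts."""
    glossary_lines = "\n".join(f"- {src} -> {tgt}" for src, tgt in glossary.items())
    return (
        f"Translate the following text into {target_lang}.\n"
        f"Tone: {tone}.\n"
        f"Use these glossary terms:\n{glossary_lines}\n\n"
        f"Text:\n{source_text}"
    )

prompt = build_translation_prompt("Welcome to our store.", "German", {"store": "Filiale"})
```

Keeping the instruction, constraints, and source text in clearly separated sections like this tends to make the model's output more predictable, and it lets a team version and refine its prompts the same way it versions any other localization asset.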