Smartling supports the use of Large Language Models (LLMs) as translation providers.
The process for setting up LLM translation is similar to setting up a traditional MT provider, with a few additions: a well-crafted translation prompt, customized parameter settings, and ongoing testing and monitoring to ensure the translation output aligns with your brand messaging.
MT engines vs. LLMs
Unlike MT engines, which are generally ready to use out of the box, LLMs rely heavily on well-designed prompts to produce the desired translation output. Additionally, LLMs are prone to hallucinations: cases where they generate nonsensical, incorrect, or inconsistent translations. Because of this, MT engines are often a more reliable and recommended option for translation, while LLMs are better suited for smoothing and refining the translation output (for example, by using Smartling's AI Toolkit).
However, with the right prompt and tools like RAG technology, the translation quality produced by LLMs often rivals or even exceeds that of traditional NMT providers, and can deliver more customized results.
Check out this post in the Smartling Community to learn about best practices for translating with LLMs and how they compare to MT engines.
Supported LLMs as translation providers
Smartling supports the following LLMs as translation providers:
Like traditional MT providers, these LLMs can be used to translate content with any workflow or integration that is part of Smartling's AI Hub.
Built your own in-house LLM service? You can use it in Smartling: learn more in Bring Your Own MT or LLM Service.
Benefits of translating with LLMs in Smartling
To achieve optimal results with LLM translation, Smartling offers highly specialized features to enhance and customize the translation output, and to prevent potential issues due to hallucinations.
Efficient prompt creation and testing
- Smartling's prompt management interface enables you to easily store, test, and adjust your prompt and configuration details.
- A side-by-side view of your translation prompt and a testing interface allows you to adjust your prompt in real time based on the test results.
- Adjustable translation parameters allow you to tailor the translation output to your preferences, for example by adapting the level of output creativity, sampling range, and repetition tolerance. For more information, see Translation Parameters for LLM Translation.
- Prompt tooling with RAG (Retrieval-Augmented Generation) allows you to automatically inject your translation prompt with example translations and glossary terms from your linguistic assets. This has been shown to significantly enhance the quality of the translation output and to ensure that translations adhere to your organization's preferred style and terminology. For more information, see Prompt Tooling with RAG.
- Smartling allows you to create dynamic translation prompts with conditional logic: Jinja2 conditions can be included in your prompt to dynamically adapt it to specific circumstances based on predefined rules.
Tip: For more information about Smartling's prompt interface and syntax, see Managing LLM Profiles and Prompts.
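As an illustration, a conditional prompt using Jinja2 syntax might look like the fragment below. The variable names (`target_locale`, `formality`) are hypothetical placeholders, not Smartling's actual prompt variables; see Managing LLM Profiles and Prompts for the supported syntax.

```jinja
Translate the following text into {{ target_locale }}.
{% if formality == "formal" %}
Address the reader with a formal register and polite forms of address.
{% else %}
Use a casual, conversational tone.
{% endif %}
Preserve all placeholders and HTML tags exactly as they appear in the source.
```

At translation time, the conditions are evaluated against the current context, so a single stored prompt can produce different instructions per locale or content type.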
String batching
- When strings are sent to an LLM for translation, they are bundled into a string batch, allowing your content to be processed more efficiently.
- Since the translation prompt is sent to the LLM only once per string batch (rather than once per individual string), string batching reduces the token count for your translation request.
- The exact size of each string batch depends on the model used for translation, but won't exceed the maximum supported token count.
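To make the batching behavior concrete, here is a minimal sketch in Python. The token estimate, prompt cost, and budget are made-up placeholders rather than Smartling's actual values; the point is that the shared prompt is counted once per batch instead of once per string.

```python
# Illustrative sketch of string batching: strings are grouped greedily so each
# batch stays under a model-specific token budget, and the shared prompt is
# counted once per batch rather than once per string.

def estimate_tokens(text: str) -> int:
    # Crude placeholder heuristic: roughly 1 token per 4 characters.
    return max(1, len(text) // 4)

def batch_strings(strings, prompt_tokens: int, max_tokens: int):
    batches, current, used = [], [], prompt_tokens
    for s in strings:
        cost = estimate_tokens(s)
        if current and used + cost > max_tokens:
            # Close the current batch and start a new one (prompt cost repeats).
            batches.append(current)
            current, used = [], prompt_tokens
        current.append(s)
        used += cost
    if current:
        batches.append(current)
    return batches

strings = [
    "Add to cart",
    "Your order has shipped.",
    "Sign in to continue",
    "Free returns within 30 days",
]
batches = batch_strings(strings, prompt_tokens=50, max_tokens=60)
```

With these placeholder numbers, the four strings fit into two batches, so the prompt's 50 tokens are spent twice instead of four times.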
Mitigate hallucination issues
- When used as a translation provider, LLMs may at times generate nonsensical, incorrect, or inconsistent translations. This behavior is referred to as "hallucinating".
- Smartling's hallucination detection feature automatically flags potential issues due to LLM hallucinations, allowing you to route affected strings to an alternative provider or workflow.
- Hallucination detection helps catch problematic translations before they are published and can negatively impact your brand.
Tip: For more information, see Hallucination Detection for LLM Translations.
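The routing idea can be sketched as follows. The `looks_hallucinated` heuristic below is a hypothetical stand-in (Smartling's actual detection logic is internal to the platform); it only illustrates how a flagged string is diverted to an alternative workflow instead of being published.

```python
# Illustrative routing sketch: strings flagged as potential hallucinations are
# sent to a fallback workflow instead of being published. The flagging rules
# here (empty output, or output wildly longer than the source) are placeholder
# heuristics, not Smartling's actual detection criteria.

def looks_hallucinated(source: str, translation: str) -> bool:
    if not translation.strip():
        return True
    # A translation many times longer than its source is suspicious.
    return len(translation) > 5 * max(1, len(source))

def route(source: str, translation: str) -> str:
    return "fallback-workflow" if looks_hallucinated(source, translation) else "publish"
```

For example, `route("Checkout", "Zur Kasse")` would publish the string, while an empty or runaway translation would be routed to the fallback workflow.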
Customize LLM translations with your linguistic assets
- AI-Enhanced Glossary Term Insertion inserts your glossary terms into LLM translations and adapts them to the surrounding sentence structure, preserving your brand terminology.
- For strings where a translation memory match is available, the existing translation from the TM can be inserted and used instead of an LLM translation. If the AI Toolkit is enabled, available TM matches can be optimized with the help of AI, to provide a translation that is better adapted to the current source text.
Optional add-on: Smartling's AI Toolkit
Smartling's AI Toolkit can be used in combination with LLM translation workflows as an optional add-on. This bundle of AI-powered features optimizes the LLM translation output and workflow.
- Adjust the formality register to address your audience with the correct formality level.
- Use the AI Post-Editing Agent to further optimize LLM translations by referencing your linguistic assets and locale-specific rules.
- Use AI Adaptive TM to increase your translation memory leverage by optimizing available matches, which can then be inserted and used instead of an LLM translation.
- Use the Language Quality Estimation Agent to predict translation quality and route LLM translations to the appropriate workflow steps.
Measurable translation quality
- Smartling's Linguistic Quality Assurance (LQA) tools can help facilitate an objective evaluation process, providing an MQM quality score for easy assessment.
- An AI-powered LQA Agent will become available in 2026, allowing you to automatically assess LLM translation quality.
How to get started with LLM translation
- To begin translating with an LLM in Smartling, obtain provider credentials from your preferred provider and store them in Smartling. For more information, see MT and LLM Provider Credentials (BYOK).
- Once you have set up your provider credentials, use them to create an LLM Profile. This is where you store your translation prompt and further customize the translation output. For more information, see Managing LLM Profiles and Prompts.
- The LLM Profile can then be used to translate your content in the Smartling platform (in a translation workflow, or to provide translation suggestions in the CAT Tool), or with one of Smartling's instant MT integrations.