Smartling Auto Select LLM is Smartling's fully managed LLM translation Profile: a pre-configured LLM translation provider with Retrieval-Augmented Generation (RAG). Smartling manages the prompt, model selection, and all parameters, so no setup or LLM expertise is required.
Smartling continuously benchmarks across leading LLMs and updates the model selection as higher-quality options become available, ensuring your content is always translated by the best available model. If you need direct control over model selection, prompts, or parameters, create a standard LLM Profile instead for a specific LLM provider.
How it works
Auto Select LLM uses leading models from Vertex AI (Gemini), OpenAI, and Amazon Bedrock.
Smartling continuously benchmarks against new models as they are released and updates the model as higher-quality options become available, so you are never locked into a single model or provider.
The Profile also includes fallback models. If the primary model is unavailable, Smartling automatically routes the request to a backup model to ensure uninterrupted translation.
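The fallback behavior described above can be sketched as a simple ordered retry chain. This is an illustrative sketch only: the model names and the `call_model` stub are assumptions, not Smartling's internal implementation.

```python
class ModelUnavailableError(Exception):
    """Raised when a provider cannot serve the request."""

# Hypothetical model identifiers; Smartling's actual primary and
# fallback models are managed internally and not exposed.
MODEL_CHAIN = ["primary-model", "fallback-model-1", "fallback-model-2"]

def call_model(model: str, text: str) -> str:
    """Stand-in for a real provider call; fails for the primary
    model here to demonstrate the fallback path."""
    if model == "primary-model":
        raise ModelUnavailableError(model)
    return f"[{model}] translated: {text}"

def translate_with_fallback(text: str) -> str:
    """Try each model in order until one succeeds."""
    last_error = None
    for model in MODEL_CHAIN:
        try:
            return call_model(model, text)
        except ModelUnavailableError as err:
            last_error = err  # model unavailable: route to the next one
    raise RuntimeError("all models unavailable") from last_error

print(translate_with_fallback("Hello, world"))
# -> "[fallback-model-1] translated: Hello, world"
```

Because the chain is tried in order, the request only reaches a backup model when every model before it fails, which is what keeps translation uninterrupted during a provider outage.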
Retrieval-Augmented Generation (RAG)
Auto Select LLM supports RAG, which is enabled by default and enhances translation quality by referencing the following linguistic assets from your project:
- Translation Memory Examples
- Glossary Terms
- Style Rules for AI (see the rule categories chart for details on supported rules)
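Conceptually, RAG grounds the translation prompt in the retrieved assets listed above. The sketch below shows one way such a prompt could be assembled; the function name, asset shapes, and prompt wording are assumptions for illustration, not Smartling's actual retrieval or prompt.

```python
def build_rag_prompt(source_text, tm_examples, glossary_terms, style_rules):
    """Assemble a translation prompt grounded in retrieved linguistic assets
    (illustrative only; the real prompt is managed by Smartling)."""
    sections = []
    if tm_examples:  # prior human translations of similar strings
        sections.append("Translation memory examples:\n" + "\n".join(
            f"- {src} -> {tgt}" for src, tgt in tm_examples))
    if glossary_terms:  # enforced terminology
        sections.append("Glossary (use these translations):\n" + "\n".join(
            f"- {term}: {translation}"
            for term, translation in glossary_terms.items()))
    if style_rules:  # Style Rules for AI
        sections.append("Style rules:\n" + "\n".join(
            f"- {rule}" for rule in style_rules))
    sections.append(f"Translate into the target language:\n{source_text}")
    return "\n\n".join(sections)

prompt = build_rag_prompt(
    "Sign in to your account.",
    tm_examples=[("Sign in", "Se connecter")],
    glossary_terms={"account": "compte"},
    style_rules=["Use formal register (vous)."],
)
print(prompt)
```

The point of the sketch is the grounding step: the model sees your TM examples, glossary terms, and style rules alongside the source text, rather than translating from the source text alone.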
Set up Auto Select LLM
An Auto Select LLM Profile is pre-provisioned in your account. You can find it by going to AI Hub > Profiles.
Edit the pre-provisioned Profile
You can edit the existing Auto Select LLM Profile to configure linguistic packages for non-project-based integrations (also known as MT integrations) such as the MT API or Smartling Translate:
- Go to AI Hub > Profiles.
- Click the Smartling Auto Select LLM Profile name to open its settings.
- Select one or more linguistic packages to define which linguistic assets (translation memory, glossary, and style guide) are used for RAG. For workflow translations within a project, the project's linguistic package is used by default.
- Click Test Integration to confirm the connection is working. To test the actual translation output, create a test job and submit content for translation using the Profile.
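Beyond the built-in connection test, you can exercise the Profile from a non-project-based integration such as the MT API. The sketch below only assembles an example request body; the field names, UIDs, and payload shape are assumptions for illustration, not the documented Smartling MT API contract, so consult the API reference for the actual request format.

```python
import json

# Hypothetical placeholder values; substitute your real identifiers.
ACCOUNT_UID = "your-account-uid"
PROFILE_UID = "auto-select-llm-profile-uid"

def build_mt_request(texts, source_locale, target_locale):
    """Assemble an illustrative MT translation request body that routes
    through a specific translation Profile (field names are assumed)."""
    return {
        "sourceLocaleId": source_locale,
        "targetLocaleId": target_locale,
        "translationProfileUid": PROFILE_UID,  # assumed field name
        "items": [{"key": str(i), "sourceText": t}
                  for i, t in enumerate(texts)],
    }

body = build_mt_request(["Hello, world"], "en-US", "fr-FR")
print(json.dumps(body, indent=2))
```

Submitting a small batch like this and inspecting the returned translations is a quick way to verify that the Profile, locales, and linguistic package behave as expected before routing production content through it.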
Create an additional Profile
You can create additional Auto Select LLM Profiles if needed. For example, you may want separate Profiles to use different linguistic packages for non-project-based integrations or to set different TM Match Insertion thresholds.
- Go to AI Hub > Profiles.
- Click Create Profile and select Smartling Auto Select LLM.
- Enter a Profile Name. We recommend a name that is easy to identify, such as one that includes "Auto Select LLM".
- Select one or more linguistic packages to define which linguistic assets are used for RAG with non-project-based integrations such as the MT API or Smartling Translate. For workflow translations within a project, the linguistic package in the project is used by default.
- Click Test Integration to confirm the connection is working. To test the actual translation output, create a test job and submit content for translation using the Profile.
Considerations
Cost
For customers with a Smartling AI Subscription, Auto Select LLM usage is billed as AI Hub words at the Smartling key rate. By comparison, standard LLM Profiles where you use your own API keys are billed at the AI Hub BYOK rate. If you do not have an AI subscription, usage counts as MT words.
Language support
Auto Select LLM supports all source and target languages. High-resource languages generally produce very good results, but quality can vary for lower-resource languages.
Fine-tuning and custom models
Auto Select LLM uses generic models. Fine-tuned or custom models are not supported. If you would like to use a fine-tuned custom model, you will need to set up a separate LLM Profile specifically for that model. If you are interested in Smartling's LLM fine-tuning services, contact your Customer Success Manager.
Bring Your Own Key (BYOK)
BYOK is not supported for Auto Select LLM. You cannot use your own API keys with this Profile.
Language Adaptation
You can use Auto Select LLM for Language Adaptation, but it currently uses the translation prompt, which may produce more changes than expected. A dedicated Language Adaptation function is planned for a future release.
Hallucination detection
Smartling includes hallucination detection to flag unexpected changes in LLM output. While this reduces risk, LLM translations may still occasionally produce unexpected results. A human review step in your workflow is recommended.
FAQ
What providers and models does Auto Select LLM use?
Smartling uses a primary model selected for the best cost-quality ratio, with two fallback models if the primary model is unavailable. The model is chosen from leading providers such as Vertex AI (Gemini), OpenAI, and Amazon Bedrock. Smartling benchmarks against new models as they are released and updates the selection when a better option is available. See How it works for more detail.
Is the translation prompt visible or customizable?
No. The prompt is not visible and cannot be customized. You cannot add your own prompt messages or modify LLM parameters. If you need direct control over the prompt, create a standard LLM Profile for a specific LLM provider.
Is the quality of Auto Select LLM better than Auto Select MT?
Yes. Smartling has benchmarked translation quality across frequently used languages such as French, German, Spanish, Japanese, Chinese, Italian, and Dutch, and Auto Select LLM outperformed traditional MT engines in all of them. Auto Select MT remains available for customers who cannot use LLMs, but we recommend Auto Select LLM for the best translation quality. If you use a trained MT engine, results may be closer; for a fair assessment, compare a trained MT engine to a fine-tuned LLM model.
Can I restrict which LLM providers are used?
Yes. If company policy or preference prevents the use of a particular LLM provider, you can restrict that provider in the AI Hub Provider settings.