Amazon Bedrock allows you to translate using the latest Amazon models, such as the Amazon Nova models. Amazon Bedrock can be used as a translation provider in Smartling, either as part of a translation workflow or with any of Smartling's instant machine translation integrations.
Prerequisites
Before setting up Amazon Bedrock in Smartling, ensure you have:
- An API key for the Amazon Bedrock API with access to your desired model(s)
- Access to Smartling as an Account Owner
Refer to the Amazon documentation for instructions on generating an API key for Amazon Bedrock.
Supported models
All text models supported by Amazon Bedrock can be used in Smartling, provided your API key has access to them. Check the Amazon Bedrock documentation for a complete list of available models, or list them programmatically as sketched below.
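If you want to see which model IDs are visible to your AWS account without searching the documentation, the Bedrock control-plane API exposes a ListFoundationModels operation. Below is a minimal sketch using boto3; the region and the TEXT output filter are illustrative choices, and access to individual models still depends on your account's permissions.

```python
# Minimal sketch: list text-capable foundation models visible to your AWS account
# in one region. Assumes boto3 is installed and AWS credentials are configured;
# the region below is an example, not a requirement.
import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")

response = bedrock.list_foundation_models(byOutputModality="TEXT")
for summary in response["modelSummaries"]:
    print(summary["modelId"], "-", summary["providerName"])
```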
Supported languages
Amazon Bedrock models support numerous languages. Different models may have varying results and language coverage. Consult the documentation for your specific model to determine whether a language or specific locale is supported.
Setting up Amazon Bedrock in Smartling
Step 1: Setup in Amazon
From Amazon, obtain an Amazon Bedrock API key with access to the model you wish to use for translation. This key is required to access the model within Smartling. Smartling-provisioned credentials are not available for Amazon Bedrock.
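Before storing the key in Smartling, it can be worth confirming that it actually reaches the model you plan to use. The following is a rough sketch, assuming a recent boto3 release that reads Bedrock API keys from the AWS_BEARER_TOKEN_BEDROCK environment variable; the model ID, region, and prompt are placeholders.

```python
# Rough sketch: confirm that a Bedrock API key can invoke the chosen model.
# Assumes a recent boto3/botocore version that picks up Bedrock API keys from
# the AWS_BEARER_TOKEN_BEDROCK environment variable; the model ID and region
# are placeholders -- substitute your own values.
import os
import boto3

os.environ["AWS_BEARER_TOKEN_BEDROCK"] = "<your-bedrock-api-key>"

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="amazon.nova-lite-v1:0",  # example base model ID
    messages=[{"role": "user", "content": [{"text": "Translate 'Good morning' into French."}]}],
)
print(response["output"]["message"]["content"][0]["text"])
```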
Step 2: Add provider credentials in Smartling
You will need to store your Amazon Bedrock credentials (API key) in Smartling in order to use an Amazon Bedrock model as a translation provider.
- From the top navigation of your Smartling dashboard, access the AI Hub.
- Navigate to the Credentials section.
- Follow the instructions outlined here to store your provider credentials in Smartling.
Step 3: Set up an LLM Profile using the provider credentials
Once your provider credential is saved and tested successfully, create a Profile to configure translation settings. A Profile allows you to configure your translation prompt, as well as additional preferences to further customize the translation output.
- From the top navigation of your Smartling dashboard, access the AI Hub.
- Navigate to the Profiles page.
- Click Create Profile and select LLM Profile (RAG).
- Follow steps 2-9 in How to create an LLM Profile.
- These steps guide you through creating an LLM Profile for Amazon Bedrock, optionally adjusting translation parameters, and configuring and testing your translation prompt using Prompt Tooling with RAG, along with other optional settings for customizing the translation output.
- In addition to these general steps, refer to the Amazon Bedrock–specific information below.
Amazon Bedrock-specific setup information
When creating the LLM Profile, you must enter the model ID or ARN of the Amazon Bedrock model you want to use for translation in the Bedrock model field.
This field tells the system which model to use when generating a response. The exact value you enter depends on the type of resource you're using (example identifier formats are shown after the list below):
From the Amazon Bedrock documentation:
- If you use a base model, specify the model ID or its ARN. For a list of model IDs for base models, see Amazon Bedrock base model IDs (on-demand throughput) in the Amazon Bedrock User Guide.
- If you use an Amazon Bedrock Marketplace model, specify the ID or ARN of the marketplace endpoint that you created. For more information about Amazon Bedrock Marketplace and setting up an endpoint, see Amazon Bedrock Marketplace in the Amazon Bedrock User Guide.
- If you use an inference profile, specify the inference profile ID or its ARN. For a list of inference profile IDs, see Supported Regions and models for cross-region inference in the Amazon Bedrock User Guide.
- If you use a provisioned model, specify the ARN of the Provisioned Throughput. For more information, see Run inference using a Provisioned Throughput in the Amazon Bedrock User Guide.
- If you use a custom model, specify the ARN of the custom model deployment (for on-demand inference) or the ARN of your provisioned model (for Provisioned Throughput). For more information, see Use a custom model in Amazon Bedrock in the Amazon Bedrock User Guide.
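For illustration, the examples below show the general shape of the identifiers Bedrock accepts for a few common resource types. The account IDs, resource IDs, and model names are placeholders; take the exact values for your account from the Amazon Bedrock console or API.

```python
# Representative identifier formats for the Bedrock model field.
# Account IDs and resource IDs below are placeholders, not real resources.

# Base model: model ID or its foundation-model ARN
base_model_id = "amazon.nova-pro-v1:0"
base_model_arn = "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-pro-v1:0"

# Cross-region inference profile: inference profile ID or its ARN
inference_profile_id = "us.amazon.nova-pro-v1:0"
inference_profile_arn = (
    "arn:aws:bedrock:us-east-1:111122223333:inference-profile/us.amazon.nova-pro-v1:0"
)

# Provisioned Throughput: ARN of the provisioned model
provisioned_model_arn = "arn:aws:bedrock:us-east-1:111122223333:provisioned-model/abc123example"
```

For Marketplace endpoints and custom models, copy the corresponding ARN directly from the Amazon Bedrock console.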
Step 4: Use the LLM Profile for translation
After creating the LLM Profile, you can use it to translate content within the Smartling platform (using an MT workflow or generating MT suggestions in the CAT Tool), or use the profile with one of Smartling's MT integrations to display machine translations directly where needed, such as in connected third-party applications.
For more information, see Using the LLM Profile to translate your content.
Troubleshooting Amazon Bedrock translation issues
Common causes and solutions
Error appears on Profile page in the AI Hub
- Check if you've exceeded token or rate limits in your Amazon Bedrock account
- Verify that your API key has access to the selected model
- Confirm that your API key is still valid (a quick way to check these conditions directly against Bedrock is sketched below)
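If the AI Hub shows an error, calling Bedrock directly with the same credentials and model identifier can help narrow down which of the above is the cause. This is a rough sketch using boto3; the exception names are standard Bedrock runtime error codes, while the model ID and region are placeholders.

```python
# Rough sketch: classify common Bedrock errors using the same credentials and
# model identifier configured in Smartling. Model ID and region are placeholders.
import boto3
from botocore.exceptions import ClientError

client = boto3.client("bedrock-runtime", region_name="us-east-1")

try:
    client.converse(
        modelId="amazon.nova-lite-v1:0",
        messages=[{"role": "user", "content": [{"text": "ping"}]}],
    )
    print("Model is reachable with these credentials.")
except ClientError as err:
    code = err.response["Error"]["Code"]
    if code == "ThrottlingException":
        print("Rate or token limits exceeded - check your Bedrock quotas.")
    elif code == "AccessDeniedException":
        print("These credentials do not have access to the selected model.")
    elif code == "ResourceNotFoundException":
        print("The model ID or ARN was not found - check the Bedrock model field.")
    else:
        print(f"Bedrock returned {code}: {err.response['Error']['Message']}")
```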
Translations aren't appearing
- Verify that the LLM Profile has been added to your workflow's translation step, or that the Profile has been selected for your Instant MT integration
- Check for error messages on the Profiles page in the AI Hub
- Confirm that your fallback provider is working (if you have one configured)
Poor translation quality
- Review and refine your translation prompt and/or assess the quality of your linguistic assets if you are using Prompt Tooling with RAG. The quality of your linguistic assets is also crucial if you are using other features that reference your translation memory, glossary, and style guides, such as AI-Enhanced Glossary Term Insertion or TM Match Insertion.
- Verify that the target language is well supported by your chosen model. LLMs available through providers like Amazon Bedrock typically produce better-quality translations for high-resource languages than for less common, low-resource languages.
- Try a different Amazon Bedrock model
- Consider using Smartling's AI Toolkit features or add a human review step to your workflow
Considerations
Compared to traditional machine translation providers, LLMs like those offered through Amazon Bedrock provide more flexibility. However, they also come with a number of challenges that should be considered.
Read about important considerations for translating with LLMs in Translating with LLMs in Smartling.