Amazon Bedrock allows you to translate using the latest Amazon models, such as Amazon Nova. Amazon Bedrock can be used as a translation provider in Smartling, either as part of a translation workflow or with any of Smartling's instant machine translation integrations.
Prerequisites
Once you have set up an AWS user with IAM policies that allow the use of Amazon Bedrock (i.e., the AWS permission bedrock:InvokeModel), you can generate an Access Key ID and Secret Access Key. Refer to the Amazon documentation for more information. You will need to store these credentials in Smartling and use them later when creating an LLM Profile.
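As an illustrative sketch, a minimal IAM policy granting the required permission could look like the following. The `"Resource": "*"` scope is a placeholder for simplicity; you can restrict it to the ARNs of specific models:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["bedrock:InvokeModel"],
      "Resource": "*"
    }
  ]
}
```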
If you plan to use models from third-party providers on Amazon Bedrock, you may need to enable access to those models in your AWS account first.
You must also have Account Owner access in Smartling to complete the setup process.
Supported models
All text models supported by Amazon Bedrock can be used in Smartling, provided your AWS credentials have access to them. Check the Amazon Bedrock documentation for a complete list of all models.
Supported languages
Amazon Bedrock models support numerous languages. Different models may have varying results and language coverage. Consult the documentation for your specific model to determine whether a language or specific locale is supported.
Setting up Amazon Bedrock in Smartling
Step 1: Setup in Amazon
Once you have set up an AWS user with IAM policies that allow the use of Amazon Bedrock (i.e., AWS permission bedrock:InvokeModel), you can generate an Access Key ID and Secret Access Key, which you will need when creating the provider credential in Smartling.
Refer to the AWS documentation for instructions on generating access keys for your IAM user. Smartling-provisioned credentials are not available for Amazon Bedrock.
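For example, access keys for an IAM user can be generated with the AWS CLI. The user name below is a hypothetical example; substitute the IAM user you created for Bedrock access:

```shell
# Create an Access Key ID / Secret Access Key pair for the IAM user
# that will call Amazon Bedrock ("smartling-bedrock" is a placeholder).
aws iam create-access-key --user-name smartling-bedrock
```

The Secret Access Key is shown only once in the command output, so store it securely before closing your session.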
Step 2: Add provider credentials in Smartling
You will need to store your Amazon Bedrock credentials (Access Key ID and Secret Access Key) in Smartling in order to use an Amazon Bedrock model as a translation provider.
- From the top navigation of your Smartling dashboard, access the AI Hub.
- Navigate to the Credentials section.
- Follow the instructions outlined here to store your provider credentials in Smartling.
Step 3: Set up an LLM Profile using the provider credentials
Once your provider credential is saved and tested successfully, create a Profile to configure translation settings. A Profile allows you to configure your translation prompt, as well as additional preferences to further customize the translation output.
- From the top navigation of your Smartling dashboard, access the AI Hub.
- Navigate to the Profiles page.
- Click Create Profile and select LLM Profile (RAG).
- Follow steps 2-9 in How to create an LLM Profile.
- These steps guide you through creating an LLM Profile for Amazon Bedrock, optionally adjusting translation parameters, and configuring and testing your translation prompt using Prompt Tooling with RAG, along with other optional settings for customizing the translation output.
- In addition to these general steps, refer to the Amazon Bedrock–specific information below.
Amazon Bedrock-specific setup information
When creating the LLM Profile, you must enter the model ID or ARN of the Amazon Bedrock model you want to use for translation in the Bedrock model field.
This field tells the system which model to use when generating a response. The exact value you enter depends on the type of resource you're using:
From the Amazon Bedrock documentation
- If you use a base model, specify the model ID or its ARN. For a list of model IDs for base models, see Supported Regions and models for inference profiles.
- If you use an Amazon Bedrock Marketplace model, specify the ID or ARN of the marketplace endpoint that you created. For more information about Amazon Bedrock Marketplace and setting up an endpoint, see Amazon Bedrock Marketplace in the Amazon Bedrock User Guide.
- If you use an inference profile, specify the inference profile ID or its ARN. For a list of inference profile IDs, see Supported Regions and models for cross-region inference in the Amazon Bedrock User Guide.
- If you use a provisioned model, specify the ARN of the Provisioned Throughput. For more information, see Run inference using a Provisioned Throughput in the Amazon Bedrock User Guide.
- If you use a custom model, specify the ARN of the custom model deployment (for on-demand inference) or the ARN of your provisioned model (for Provisioned Throughput). For more information, see Use a custom model in Amazon Bedrock in the Amazon Bedrock User Guide.
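The value accepted by the Bedrock model field can be sketched as follows. This is an illustrative helper only; the function name and return labels are hypothetical and not part of Smartling's or AWS's API:

```python
def classify_bedrock_model_field(value: str) -> str:
    """Hypothetical sketch: classify a "Bedrock model" field value."""
    if value.startswith("arn:"):
        # ARNs cover base models, marketplace endpoints, inference
        # profiles, Provisioned Throughput, and custom model deployments.
        return "arn"
    # Plain IDs look like "amazon.nova-pro-v1:0" (base model) or
    # "us.amazon.nova-pro-v1:0" (cross-region inference profile).
    return "model-or-inference-profile-id"

print(classify_bedrock_model_field(
    "arn:aws:bedrock:us-east-1::foundation-model/amazon.nova-pro-v1:0"))
print(classify_bedrock_model_field("amazon.nova-pro-v1:0"))
```

Either form is acceptable in the field, as long as your AWS credentials have access to the underlying resource.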
Step 4: Use the LLM Profile for translation
After creating the LLM Profile, you can use it to translate content within the Smartling platform (using an MT workflow or generating MT suggestions in the CAT Tool), or use the profile with one of Smartling's MT integrations to display machine translations directly where needed, such as in connected third-party applications.
For more information, see Using the LLM Profile to translate your content.
Troubleshooting Amazon Bedrock translation issues
Common causes and solutions
Error appears on Profile page in the AI Hub
- Check if you've exceeded token or rate limits in your Amazon Bedrock account
- Verify your AWS user has the required permissions (e.g., bedrock:InvokeModel) and access to the selected model
- Confirm your Access Key ID and Secret Access Key are still valid
Translations aren't appearing
- Verify that the LLM Profile has been added to your workflow's translation step or the Profile has been selected for your Instant MT integration
- Check for error messages on the Profiles page in the AI Hub
- Confirm that your fallback provider is working (if you have one configured)
Poor translation quality
- Review and refine your translation prompt and/or assess the quality of your linguistic assets if you are using Prompt Tooling with RAG. The quality of your linguistic assets is also crucial if you are using other features that reference your translation memory, glossary, and style guides, such as AI-Enhanced Glossary Term Insertion or TM Match Insertion.
- Verify that the target language is well supported by your chosen model. Models available through Amazon Bedrock typically produce better-quality translations for high-resource languages than for less common, low-resource languages.
- Try a different Amazon Bedrock model
- Consider using Smartling's AI Toolkit features or add a human review step to your workflow
Considerations
Compared to traditional machine translation providers, LLMs like those offered through Amazon Bedrock provide more flexibility. However, they also come with a number of challenges that should be considered.
Read about important considerations for translating with LLMs in Translating with LLMs in Smartling.