Search Engine Optimization

Customize Your robots.txt File

It's important to allow search engines to index your localized sites. The Smartling Global Delivery Network (GDN) automatically duplicates your existing robots.txt file on each localized site, so any exclusions you have set up on your source site are also applied to your localized sites.

If you need to modify the robots.txt contents for a localized site only, the process depends on the domain or traffic-routing strategy used for your localized sites. The process for each strategy is described below:


Subfolder

For GDN setups using localized subfolders, the robots.txt file at your site root can handle any URL exclusions, including localized folder URLs, so you only need to update this file on your origin (source) server.
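As a sketch of what this might look like, a single root robots.txt could exclude both global paths and paths under a localized subfolder. The specific paths and subfolder names below (such as /fr-fr/) are hypothetical examples, not values from your site:

```
# robots.txt at the origin (source) server root — hypothetical example
User-agent: *
# Exclusion applied to the source site and all localized sites
Disallow: /admin/
# Exclusion scoped to a hypothetical French subfolder only
Disallow: /fr-fr/internal/
```

Because the GDN duplicates this file on the localized sites, the subfolder rule takes effect without any per-locale file.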

Subdomain or ccTLD/TLD

For GDN setups using localized subdomains or top-level domains, the source robots.txt is duplicated at the root of each localized domain. To customize it for each localized domain, create customized files at your origin server's root with the language-locale code in the filename (e.g. robots-fr-FR.txt, robots-de-DE.txt). Once this is done, contact your Smartling Customer Success Manager to map these files to your localized domains.
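For example, a locale-specific file such as robots-fr-FR.txt, placed at your origin root, would be served as robots.txt on the mapped French domain once Smartling completes the mapping. The directives and domain below are hypothetical placeholders:

```
# robots-fr-FR.txt at the origin root — served as robots.txt on the fr-FR domain (hypothetical)
User-agent: *
Disallow: /promotions/
Sitemap: https://fr.example.com/sitemap.xml
```

Each localized domain can have its own file (robots-de-DE.txt, robots-ja-JP.txt, etc.), while the unmodified source robots.txt continues to serve any domain without a mapped file.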
