Note
The Retail Interest Group by Dynamics 365 Commerce has moved from Yammer to Viva Engage. If you don't have access to the new Viva Engage community, fill out this form (https://aka.ms/JoinD365commerceVivaEngageCommunity) to be added and stay engaged in the latest discussions.
This article describes how to manage robots.txt files in Microsoft Dynamics 365 Commerce.
Note
Starting with the Dynamics 365 Commerce version 10.0.48 preview release, all HTTP responses served from internal Commerce-generated domains (.dynamics365commerce.ms) include an X-Robots-Tag: noindex, nofollow response header. This header instructs search engines not to index pages on internal domains and replaces the earlier deny-all robots.txt approach. This change automatically applies to all tenants and doesn't require any version upgrade. For more information, see X-Robots-Tag response header for internal domains.
The robots exclusion standard, or robots.txt, is a standard that websites use to communicate with web robots. It instructs web robots about any areas of a website that shouldn't be visited. Robots are often used by search engines to index websites.
To exclude robots from a server, create a file on the server. In this file, specify an access policy for robots. The file must be accessible via HTTP at the local URL /robots.txt. The robots.txt file helps search engines index the content on your site.
Dynamics 365 Commerce lets you upload a robots.txt file for your domain. For each domain in your Commerce environment, you can upload one robots.txt file and associate it with that domain.
For more information about the robots.txt file, visit The Web Robots Pages.
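For illustration, a minimal robots.txt file that follows the exclusion standard might look like the following. The disallowed paths and sitemap URL are placeholders, not values that Commerce requires:

```text
# Allow all crawlers, but keep them out of cart and checkout pages.
User-agent: *
Disallow: /cart
Disallow: /checkout

# Optionally, point crawlers at your sitemap.
Sitemap: https://www.fabrikam.com/sitemap.xml
```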
How robots.txt works with different domain types
Robots.txt behavior differs according to which type of domain is used to access your Commerce site.
Custom production domains
When you upload a robots.txt file for a custom domain such as www.fabrikam.com, that file is served when search engines or users access /robots.txt on your production site. This behavior allows search engines to index your site according to the rules you configure.
Internal Commerce-generated domains
Internal domains that use the .dynamics365commerce.ms format serve the uploaded robots.txt file, similar to custom production domains. However, all HTTP responses from these domains include the X-Robots-Tag: noindex, nofollow header, which instructs search engines not to index or follow links on these pages. For more information, see X-Robots-Tag response header for internal domains.
X-Robots-Tag response header for internal domains
Starting with Commerce version 10.0.48 preview release, all HTTP responses from internal Commerce-generated domains (.dynamics365commerce.ms) include an X-Robots-Tag: noindex, nofollow response header. This change automatically applies to all tenants and doesn't require any version upgrade. The header instructs search engines not to index the page and not to follow any links on it. It also helps search engine crawlers recrawl pages and discover the noindex directive, which removes previously indexed content from search results.
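To make the header's effect concrete, the following sketch shows how a crawler might interpret an X-Robots-Tag value such as the one Commerce adds to responses from internal domains. The function names are illustrative, not part of any Commerce API; the parsing follows the general comma-separated X-Robots-Tag convention.

```python
# Sketch: interpreting an X-Robots-Tag response header value.
# The header value "noindex, nofollow" is the one the article describes
# for internal .dynamics365commerce.ms domains; the helpers are illustrative.

def parse_x_robots_tag(header_value: str) -> set[str]:
    """Split a comma-separated X-Robots-Tag value into individual directives."""
    return {d.strip().lower() for d in header_value.split(",") if d.strip()}

def is_indexable(header_value: str) -> bool:
    """A page is indexable only if neither 'noindex' nor 'none' is present."""
    directives = parse_x_robots_tag(header_value)
    return "noindex" not in directives and "none" not in directives

print(parse_x_robots_tag("noindex, nofollow"))   # {'noindex', 'nofollow'}
print(is_indexable("noindex, nofollow"))         # False
```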
Note
The X-Robots-Tag header only applies to responses from internal Commerce-generated domains. It doesn't affect custom production domains.
Test robots.txt for a specific domain
To preview how your robots.txt file appears for a specific custom domain when you access the site through an internal Commerce-generated domain, append the ?domain= query parameter to the robots.txt URL.
For example, if your internal domain is https://<e-commerce-tenant-name>.dynamics365commerce.ms and your custom domain is <your-custom-domain>, use the following URL:
https://<e-commerce-tenant-name>.dynamics365commerce.ms/robots.txt?domain=<your-custom-domain>
This method allows you to verify the robots.txt configuration for any of your supported host names before you go live with your production domain. For more information about Commerce-generated URLs, see Commerce-generated URLs. For more information about configuring custom domains, see Configure your domain name.
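The preview URL above can be assembled programmatically. The following sketch builds it with the standard library; the helper name and the sample tenant name are hypothetical, while the ?domain= parameter is the one the article describes.

```python
from urllib.parse import urlencode

def robots_preview_url(internal_domain: str, custom_domain: str) -> str:
    """Build the URL that previews a custom domain's robots.txt file
    from an internal Commerce-generated domain (helper name is illustrative)."""
    query = urlencode({"domain": custom_domain})
    return f"https://{internal_domain}/robots.txt?{query}"

# Sample values: tenant name is hypothetical; the custom domain matches
# the fabrikam example used earlier in the article.
url = robots_preview_url("fabrikam.dynamics365commerce.ms", "www.fabrikam.com")
print(url)
# https://fabrikam.dynamics365commerce.ms/robots.txt?domain=www.fabrikam.com
```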
Upload a robots.txt file
After you create and edit your robots.txt file according to the robots exclusion standard, make sure that the file is accessible on the computer where you use the Commerce authoring tools. The file must be named robots.txt. For best results, use the format that the standard defines. Each Commerce customer is responsible for validating and maintaining the contents of its robots.txt file.
To upload a robots.txt file in Commerce, follow these steps:
- Sign in to Commerce as a system admin.
- In the left navigation pane, select Tenant Settings (next to the gear symbol) to expand it.
- Under Tenant Settings, select Robots.txt. A list of all the domains that are associated with your environment appears in the main part of the window.
- Select Manage to upload a robots.txt file for a domain in your environment.
- On the menu on the right, select the Upload button (the upward-pointing arrow) next to the domain that is associated with the robots.txt file. A file browser dialog box appears.
- In the dialog box, browse to and select the robots.txt file that you want to upload for the associated domain, and then select Open to complete the upload.
Note
During upload, Commerce verifies that the file is a text file, but it doesn't validate the file's contents.
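Because Commerce doesn't validate the file's contents, you may want to sanity-check the rules yourself before uploading. One way is a quick local check with Python's standard-library robots.txt parser; the sample rules and paths below are illustrative.

```python
from urllib.robotparser import RobotFileParser

# Sketch: verify locally that a robots.txt file behaves as intended
# before uploading it, since Commerce only verifies that the upload
# is a text file, not that its directives are well formed.
rules = """\
User-agent: *
Disallow: /checkout
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Confirm the rules allow and block the paths you expect for a generic crawler.
print(parser.can_fetch("*", "https://www.fabrikam.com/products"))   # True
print(parser.can_fetch("*", "https://www.fabrikam.com/checkout"))   # False
```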
Uploaded robots.txt files are served on both custom production domains and internal Commerce-generated domains. However, internal domains also include the X-Robots-Tag: noindex, nofollow response header on all HTTP responses to prevent search engine indexing. For more information, see X-Robots-Tag response header for internal domains.
Download a robots.txt file
To download a robots.txt file in Commerce, follow these steps:
- Sign in to Commerce as a system admin.
- In the left navigation pane, select Tenant Settings (next to the gear symbol) to expand it.
- Under Tenant Settings, select Robots.txt. A list of all the domains that are associated with your environment appears in the main part of the window.
- Select Manage to download a robots.txt file for a domain in your environment.
- On the menu on the right, select the Download button (the downward-pointing arrow) next to the domain that is associated with the robots.txt file. A file browser dialog box appears.
- In the dialog box, go to the desired location on your local drive, confirm or enter a file name, and then select Save to complete the download.
Note
You can use this procedure to download only robots.txt files that you previously uploaded through the Commerce authoring tools.
Delete a robots.txt file
To delete a robots.txt file in Commerce, follow these steps:
- Sign in to Commerce as a system admin.
- In the left navigation pane, select Tenant Settings (next to the gear symbol) to expand it.
- Under Tenant Settings, select Robots.txt. A list of all the domains that are associated with your environment appears in the main part of the window.
- Select Manage to delete a robots.txt file for a domain in your environment.
- On the menu on the right, select the Delete button (the trash can symbol) next to the domain that is associated with the robots.txt file. A file browser window appears.
- In the file browser window, browse to and select the robots.txt file that you want to delete for the domain, and then select Open. A warning message box appears.
- In the message box, select Delete to confirm deletion of the robots.txt file.
Note
You can use this procedure to delete only robots.txt files that you previously uploaded through the Commerce authoring tools.
More resources
- Configure your domain name
- Deploy a new e-commerce tenant
- Create an e-commerce site
- Associate a Dynamics 365 Commerce site with an online channel
- Upload URL redirects in bulk
- Set up a B2C tenant in Commerce
- Set up custom pages for user logins
- Configure multiple B2C tenants in a Commerce environment
- Add support for a content delivery network (CDN)
- Enable location-based store detection