Duplicate sitemaps #737
Comments
- Closing this issue due to inactivity.
- would rather not
- I can confirm this, and looking at the code the exclude list is not run over the sitemaps added to the
- https://support.google.com/webmasters/answer/7451001#errors&zippy=%2Csitemap-parsing-errors
- This package is not actively maintained. It auto-closes issues and PRs after a set amount of time.
Describe the bug
The generated sitemaps are listed both in robots.txt and in the root sitemap index, so each one is advertised twice.
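The duplication described above can be checked mechanically. The sketch below is illustrative only: the URLs and file contents are hypothetical stand-ins, assuming robots.txt advertises child sitemaps with `Sitemap:` lines while the root index also lists them in `<loc>` entries.

```python
import re

# Hypothetical sample data illustrating the bug: the same child sitemap
# appears both in robots.txt and in the root sitemap index.
robots_txt = """\
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-0.xml
"""

sitemap_index = """\
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemap-0.xml</loc></sitemap>
</sitemapindex>
"""

# Sitemap URLs advertised in robots.txt (one per "Sitemap:" line)
robots_sitemaps = set(re.findall(r"(?im)^Sitemap:\s*(\S+)", robots_txt))
# Child sitemaps listed in the root sitemap index
index_sitemaps = set(re.findall(r"<loc>(.*?)</loc>", sitemap_index))

# Any URL in both sets is advertised twice
duplicates = robots_sitemaps & index_sitemaps
print(sorted(duplicates))  # → ['https://example.com/sitemap-0.xml']
```

Running a check like this against a live site would confirm whether the overlap exists before filing or triaging the report.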
To Reproduce
With this config:
Expected behavior
Sitemaps should not be duplicated between robots.txt and the sitemap index.
Example
See https://embarc.com/robots.txt:
And see https://embarc.com/sitemap.xml:
My preference would be to have the sitemaps listed only in the sitemap index and not in robots.txt. How can that be done?
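Until the package handles this itself, one possible workaround is a small post-build step that rewrites the generated robots.txt so only the root index is advertised. This is a sketch under assumptions: `ROOT_INDEX` and the robots.txt contents here are hypothetical, and `keep_only_root_index` is a helper invented for illustration, not part of any library.

```python
import re

# Assumed root sitemap index URL (hypothetical)
ROOT_INDEX = "https://example.com/sitemap.xml"

# Hypothetical generated robots.txt with duplicated child sitemaps
robots_txt = """\
User-agent: *
Allow: /
Sitemap: https://example.com/sitemap.xml
Sitemap: https://example.com/sitemap-0.xml
Sitemap: https://example.com/sitemap-1.xml
"""

def keep_only_root_index(text: str, root: str) -> str:
    """Drop every Sitemap: line except the root index itself."""
    kept = []
    for line in text.splitlines():
        m = re.match(r"(?i)^Sitemap:\s*(\S+)", line)
        if m and m.group(1) != root:
            # Child sitemaps are discoverable via the index; omit them here.
            continue
        kept.append(line)
    return "\n".join(kept) + "\n"

cleaned = keep_only_root_index(robots_txt, ROOT_INDEX)
print(cleaned)  # only the root index's Sitemap: line remains
```

In practice this would run after the sitemap generator, overwriting the robots.txt file on disk before deployment.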