Robots.txt showing local environment / Disallow all when LIVE #383

Open

mark-chief opened this issue May 2, 2019 · 1 comment

@mark-chief commented May 2, 2019

Just leaving this here as Discord doesn't have threads and I don't want to lose it :)

I've noticed that the robots.txt files on a couple of sites are all displaying the local version, even though the environment in the settings is set to live.

Sitemap: https://www.domain.com/sitemaps-1-sitemap.xml
# local - disallow all
User-agent: *
Disallow: /
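
For comparison, when the environment resolves to live, SEOmatic's rendered output should look something like the sketch below. This is a hedged example; the exact directives come from your robots.txt template, and the allow-all form shown here is an assumption:

Sitemap: https://www.domain.com/sitemaps-1-sitemap.xml
# live - allow all
User-agent: *
Disallow:

An empty Disallow directive permits crawling of the entire site.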

Is there another setting I should be aware of? I cannot see any other settings relating to robots.txt. I have cleared all caches too. Thanks in advance.

  • Craft 3.1.25
  • SEOmatic 3.1.50 on both sites
  • devMode set to false
@cap-akimrey

Adding this in case someone else lands here via search. I had a similar situation with a newly live site. After changing the .env value to ENVIRONMENT=live, I needed to disable the robots.txt output and then re-enable it. That refresh got things working as expected.
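
For anyone retracing that fix, the sequence was roughly the following. This is a hedged sketch: the ENVIRONMENT variable name comes from the comment above, the robots.txt toggle lives in SEOmatic's plugin settings, and the console command is Craft 3's built-in cache-clearing utility:

# 1. In .env, point the environment at live:
ENVIRONMENT="live"

# 2. In the control panel, disable SEOmatic's robots.txt output, save,
#    then re-enable it and save again to regenerate the rendered template.

# 3. Optionally, clear Craft's caches from the console as well:
php craft clear-caches/all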
