
[FEAT]: robots.txt per routes #228

Open
readtedium opened this issue Apr 1, 2024 · 3 comments
Labels
enhancement New feature or request

Comments


readtedium commented Apr 1, 2024

What happened?

I’m attempting to set a custom robots.txt on a Ghost server running on my domain, but nothing works and I can’t figure out why. No matter what I do, it remains:

User-agent: *
Disallow: /

I would like this server to be accessible to search engines, but I cannot make it so. I see that a recent change to robots.txt handling was made in Cosmos that produces this exact output, and I want to confirm it is not the cause.
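For reference, what I expected is closer to Ghost’s stock robots.txt, roughly the following (the domain is a placeholder, and the exact contents depend on the theme):

User-agent: *
Sitemap: https://example.com/sitemap.xml
Disallow: /ghost/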

What should have happened?

When I uploaded a robots.txt file as part of my Ghost theme, it should have been served, but it was not. The issue may be related to Cosmos rather than Ghost.

How to reproduce the bug?

  1. Set up a Ghost server
  2. Upload a theme that includes a dedicated robots.txt file
  3. Request /robots.txt (e.g. with curl, as shown below)
  4. See the disallow rule
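For step 3, a quick way to see what is actually served (example.com stands in for the real domain):

curl -i https://example.com/robots.txt

The -i flag includes the response headers, which can also show whether the response came from Cloudflare’s cache (cf-cache-status).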

Relevant log output

No response

Other details

I see that a recent feature added robots.txt support to prevent the Cosmos server from being indexed by search engines, but even after clearing the site’s cache and removing cookies, the old file is still served.
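To rule out the Cloudflare edge cache entirely, it may help to query the origin server directly; ORIGIN_IP below is a placeholder for the server’s real address:

curl --resolve example.com:443:ORIGIN_IP https://example.com/robots.txt

If the origin already returns the disallow rule, the problem is in Cosmos rather than in caching.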

System details

Client:

  • OS: Nobara Linux
  • Browser: Vivaldi

Server:

  • OS: Pop!_OS
  • Version: 15.0
@readtedium readtedium added the bug Something isn't working label Apr 1, 2024

readtedium commented Apr 1, 2024

Update: I turned on the “Allow search engines to index your server” feature in Cosmos and cleared my Cloudflare cache, and the correct robots.txt appeared. I then turned the checkbox back off and cleared the cache again, and the disallow rule came back.

This really should be a per-domain setting, because some of my sites should be accessible to search engines and others should not.
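For example, a per-route flag in the route configuration would cover this; a hypothetical sketch (the field names are illustrative, not Cosmos’s actual schema):

{
  "Routes": [
    { "Name": "blog",  "Host": "blog.example.com",  "AllowSearchIndexing": true  },
    { "Name": "admin", "Host": "admin.example.com", "AllowSearchIndexing": false }
  ]
}

Each route with AllowSearchIndexing set to false would get the current User-agent: * / Disallow: / response, while the others would pass robots.txt through to the upstream app.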


azukaar (Owner) commented Apr 1, 2024

Yes, I will probably add a per-domain checkbox.

readtedium commented

Good call—thank you!

@azukaar azukaar added enhancement New feature or request and removed bug Something isn't working labels Apr 15, 2024
@azukaar azukaar changed the title [BUG]: Issue with robots.txt files [BUG]: robots.txt per routes Apr 15, 2024
@azukaar azukaar changed the title [BUG]: robots.txt per routes [FEAT]: robots.txt per routes Apr 15, 2024