Generates a robots.txt (JavaScript, updated Nov 1, 2019)
A tool for debugging robots.txt
The repository contains a Google robots.txt parser and matcher as a C++ library (compliant with C++17).
🚫🤖 Overrides /robots.txt to disallow all web crawlers, regardless of the settings stored in the database. Compatible with Liferay 7.0, 7.1, 7.2, 7.3, and 7.4.
A Python crawler that disregards robots.txt rules and downloads disallowed resources.
🌐 A Google Chrome extension that displays the contents of a website's robots.txt and sitemap.xml files.
Robots.txt parser / generator
Front-end workflow to start a new project with Eleventy and Webpack.
Robots.txt parser and generator - Work in progress
Robots Scanner
A ready-to-use template to quickly start selling a domain with minimal setup.
Optimizes your site's robots.txt to reduce server load and CO2 footprint by blocking unnecessary crawlers while allowing major search engines and specific tools.
🤖 Robots.txt generator done right.
A lightweight and simple robots.txt parser in node
A small, tested, no-frills parser of robots.txt files in Swift.
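The node and Swift parsers above implement the same core matching logic. As a rough sketch of that logic (illustrative only, in Python rather than either library's actual API, and assuming the longest-match precedence described in RFC 9309 without wildcard or percent-encoding handling):

```python
# Minimal robots.txt matcher -- a sketch, not a production parser.
# Real parsers (e.g. Google's C++ library) also handle wildcards,
# percent-encoding, and crawl-delay. Precedence rule: the longest
# matching path wins; no match at all defaults to "allow".

def parse_robots(text: str) -> dict:
    """Parse robots.txt text into {user_agent: [(directive, path), ...]}."""
    groups: dict = {}
    current = None
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()   # drop comments and whitespace
        if ":" not in line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            # Start (or extend) the rule group for this agent.
            current = groups.setdefault(value.lower(), [])
        elif field in ("allow", "disallow") and current is not None:
            current.append((field, value))
    return groups

def is_allowed(groups: dict, user_agent: str, path: str) -> bool:
    """True if `path` may be fetched by `user_agent` under the parsed rules."""
    rules = groups.get(user_agent.lower(), groups.get("*", []))
    best_directive, best_len = "allow", -1    # default: everything allowed
    for directive, rule_path in rules:
        # Empty rule paths (e.g. "Disallow:") match nothing, i.e. allow all.
        if rule_path and path.startswith(rule_path) and len(rule_path) > best_len:
            best_directive, best_len = directive, len(rule_path)
    return best_directive == "allow"
```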
A simple Python program that finds any website's robots.txt file.
A lightweight crawler frontier implementation in TypeScript using Redis.
This is a collection of robots.txt templates
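A typical template of this kind (this sample is illustrative, not copied from the repository) allows named search-engine crawlers while disallowing everything else:

```txt
# Allow specific major search engines
User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

# Block all other crawlers
User-agent: *
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` line grants a crawler access to the whole site, while `Disallow: /` blocks it entirely.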