Robotstxt
What is it?
Robotstxt is a lightweight HTTP server that serves a disallow-all robots.txt file, written in the Zig programming language (https://ziglang.org/).

Robots.txt basically works like a "No Trespassing" sign: it tells robots whether we want them to crawl the website or not. With this file, we disallow all robots from crawling the site, to avoid it being indexed in search engines.
Details | | | |
---|---|---|---|
Project home | Docs | Github | Docker |
1. Installation
sb install sandbox-traefik_robotstxt
2. Result
Requesting `*.yourdomain.tld/robots.txt` returns:

HTTP/1.1 200 OK
Content-Length: 26

User-agent: *
Disallow: /
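To see how a crawler interprets the served file, here is a minimal sketch using Python's standard-library `urllib.robotparser`, fed the exact 26-byte body shown in the result above (the domain and paths are illustrative):

```python
from urllib.robotparser import RobotFileParser

# The body served by Robotstxt, as shown in the Result section
body = "User-agent: *\nDisallow: /\n"
assert len(body.encode()) == 26  # matches the Content-Length header

rp = RobotFileParser()
rp.parse(body.splitlines())

# "User-agent: *" + "Disallow: /" blocks every crawler from every path
print(rp.can_fetch("Googlebot", "/"))        # False
print(rp.can_fetch("*", "/some/page.html"))  # False
```

Any well-behaved crawler that honors the Robots Exclusion Protocol will therefore skip the site entirely; note that robots.txt is advisory only and does not block clients that ignore it.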