Block search engines via robots.txt (zedeus#631)
Prevents instances from being rate limited due to being senselessly
crawled by search engines. Since there is no reason to index Nitter
instances, simply block all robots. Notably, this does *not* affect link
previews (e.g. in various chat software).
minus7 authored Jun 4, 2022
1 parent 778c6c6 commit c543a1d
Showing 1 changed file with 2 additions and 0 deletions.
2 changes: 2 additions & 0 deletions public/robots.txt
@@ -0,0 +1,2 @@
+User-agent: *
+Disallow: /
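The two-line robots.txt above tells every compliant crawler that no path on the instance may be fetched. A minimal sketch of how a crawler interprets it, using Python's standard urllib.robotparser (the hostname and paths below are illustrative assumptions, not part of the commit):

```python
from urllib import robotparser

# Parse the exact rules added by this commit.
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",   # applies to all robots
    "Disallow: /",     # forbids the entire site
])

# Any well-behaved crawler, regardless of its user agent,
# is denied access to every path on the instance.
print(rp.can_fetch("Googlebot", "https://nitter.example/some/profile"))  # False
print(rp.can_fetch("AnyOtherBot", "https://nitter.example/"))            # False
```

Note that robots.txt only restrains compliant crawlers; link-preview fetchers in chat software typically request individual pages directly and do not consult robots.txt, which is why previews keep working.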