Mirror of https://git.zx2c4.com/cgit/ (synced 2024-11-12 01:10:27 +00:00)
cgit / robots.txt at commit 91f25909b9: 5 lines, 68 B, plaintext
robots.txt: disallow access to snapshots

My dmesg is filled with the oom killer bringing down processes while the
Bingbot downloads every snapshot for every commit of the Linux kernel in
tar.xz format. Sure, I should be running with memory limits, and now I'm
using cgroups, but a more general solution is to prevent crawlers from
wasting resources like that in the first place.

Suggested-by: Natanael Copa <ncopa@alpinelinux.org>
Suggested-by: Julius Plenz <plenz@cis.fu-berlin.de>
Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
2013-05-28 12:17:00 +00:00
User-agent: *
Disallow: /*/snapshot/*
ui-tree,ui-blame: bail from blame if blob is binary

This avoids piping binary blobs through the source-filter. Also prevent
robots from crawling it, since it's expensive.

Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
2019-12-18 21:30:12 +00:00
Disallow: /*/blame/*
(The Allow: / line below dates from the same 2013-05-28 commit, "robots.txt: disallow access to snapshots".)
Allow: /
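The Disallow rules above rely on `*` wildcard matching in path patterns, which major crawlers such as Googlebot and Bingbot honor (and which RFC 9309 later standardized): `/*/snapshot/*` blocks the snapshot URL of any repository at any commit, and `/*/blame/*` does the same for blame pages. As a minimal sketch of how such a pattern matches, the wildcard can be translated to a regex; the example URLs below are hypothetical cgit-style paths, not taken from the source:

```python
import re

def rule_matches(pattern: str, path: str) -> bool:
    """Check a URL path against a robots.txt path pattern,
    treating '*' as 'any sequence of characters' (RFC 9309 style)."""
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    # re.match anchors at the start of the path, as robots.txt rules do.
    return re.match(regex, path) is not None

# Hypothetical cgit URLs for illustration:
rule_matches("/*/snapshot/*", "/cgit/snapshot/cgit-1.0.tar.xz")  # matched: blocked
rule_matches("/*/blame/*", "/cgit/blame/ui-tree.c")              # matched: blocked
rule_matches("/*/snapshot/*", "/cgit/tree/robots.txt")           # not matched: allowed
```

Because `Allow: /` is less specific than either Disallow pattern, ordinary tree, log, and commit pages stay crawlable while only the expensive snapshot and blame endpoints are excluded.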