Archives / cgit — mirror of https://git.zx2c4.com/cgit/
Branch: master (commit 6feb1b669b); release tags v0.1 through v1.2.3
cgit / robots.txt (5 lines, 68 B, Plaintext)
File contents:

User-agent: *
Disallow: /*/snapshot/*
Disallow: /*/blame/*
Allow: /

History:

robots.txt: disallow access to snapshots (12 years ago)

    My dmesg is filled with the oom killer bringing down processes while
    the Bingbot downloads every snapshot for every commit of the Linux
    kernel in tar.xz format. Sure, I should be running with memory
    limits, and now I'm using cgroups, but a more general solution is to
    prevent crawlers from wasting resources like that in the first place.

    Suggested-by: Natanael Copa <ncopa@alpinelinux.org>
    Suggested-by: Julius Plenz <plenz@cis.fu-berlin.de>
    Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>

ui-tree,ui-blame: bail from blame if blob is binary (5 years ago)

    This avoids piping binary blobs through the source-filter. Also
    prevent robots from crawling it, since it's expensive.

    Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
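The Disallow lines above use `*` wildcards, an extension to the original robots.txt prefix-matching rules that most major crawlers honor; real crawlers also apply longest-match precedence between Allow and Disallow rules. As a rough sketch of how a polite crawler might interpret these two Disallow globs (this is an illustration using Python's `fnmatch`, not cgit's or any crawler's actual implementation; note that `fnmatch`'s `*` matches across `/`, which happens to line up with robots.txt wildcard semantics here):

```python
import fnmatch

# The two Disallow patterns from the robots.txt above.
DISALLOW = ["/*/snapshot/*", "/*/blame/*"]

def is_blocked(path: str) -> bool:
    """Return True if a crawler honoring these rules should skip `path`.

    Simplified: only the Disallow globs are checked; everything else
    falls through to the `Allow: /` catch-all.
    """
    return any(fnmatch.fnmatchcase(path, pat) for pat in DISALLOW)
```

For example, `is_blocked("/cgit/snapshot/cgit-1.2.3.tar.xz")` returns `True` (snapshot downloads, the expensive tarballs the commit message complains about), while ordinary tree pages like `/cgit/tree/robots.txt` remain crawlable.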