mirrors / cgit
Mirror of https://git.zx2c4.com/cgit, synced 2024-11-29 11:56:21 +00:00
cgit / robots.txt at c1733e28d9 (5 lines, 68 B, Plaintext)
Blame view: each line of the file is preceded by the commit that introduced it.
robots.txt: disallow access to snapshots (2013-05-28 12:17:00 +00:00)

My dmesg is filled with the oom killer bringing down processes while the
Bingbot downloads every snapshot for every commit of the Linux kernel in
tar.xz format. Sure, I should be running with memory limits, and now I'm
using cgroups, but a more general solution is to prevent crawlers from
wasting resources like that in the first place.

Suggested-by: Natanael Copa <ncopa@alpinelinux.org>
Suggested-by: Julius Plenz <plenz@cis.fu-berlin.de>
Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
User-agent: *
Disallow: /*/snapshot/*
ui-tree,ui-blame: bail from blame if blob is binary (2019-12-18 21:30:12 +00:00)

This avoids piping binary blobs through the source-filter. Also prevent
robots from crawling it, since it's expensive.

Signed-off-by: Jason A. Donenfeld <Jason@zx2c4.com>
Disallow: /*/blame/*
robots.txt: disallow access to snapshots (2013-05-28 12:17:00 +00:00, same commit as above)
Allow: /
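
The * in these rules is a path wildcard. The original robots.txt specification only defines prefix matching, but major crawlers (including the Bingbot mentioned in the commit message above) treat * as matching any run of characters, so /*/snapshot/* covers the snapshot URL of every repository served by the cgit instance. The following is a minimal Python sketch of how a wildcard-aware crawler might evaluate these three rules; the tie-breaking logic (longest matching pattern wins) and the example cgit paths are illustrative assumptions, not part of cgit or of any particular crawler.

import re

# The three rules from this robots.txt. The matching below assumes
# wildcard-aware handling where the longest matching pattern wins;
# this is an illustrative assumption, not part of cgit itself.
RULES = [
    ("Disallow", "/*/snapshot/*"),
    ("Disallow", "/*/blame/*"),
    ("Allow", "/"),
]

def pattern_to_regex(pattern):
    # Translate a robots.txt path pattern into a regex:
    # '*' matches any run of characters, a trailing '$' anchors the end.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def allowed(path):
    # The longest matching pattern decides; by default everything is allowed.
    best_kind, best_pattern = "Allow", ""
    for kind, pattern in RULES:
        if pattern_to_regex(pattern).match(path) and len(pattern) > len(best_pattern):
            best_kind, best_pattern = kind, pattern
    return best_kind == "Allow"

# Hypothetical paths on a cgit instance, used only as examples:
print(allowed("/linux/snapshot/linux-3.9.tar.xz"))  # False, snapshot rule
print(allowed("/cgit/blame/robots.txt"))            # False, blame rule
print(allowed("/cgit/tree/robots.txt"))             # True, only "Allow: /" matches

Under that reading, crawlers can still index the ordinary summary, tree, and log pages (Allow: /) while skipping the expensive snapshot and blame endpoints.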