[gopher] gopher proxies
Brian Koontz
brian at pongonova.net
Wed Jul 24 16:47:03 UTC 2013
On Wed, Jul 24, 2013 at 10:56:25AM +0200, Jacob Dahl Pind wrote:
> user-agent: Lightspeed
> user-agent: SISTRIX Crawler
> user-agent: Baiduspider
> user-agent: YandexBot
> user-agent: Ezooms
> user-agent: Exabot
> user-agent: AhrefsBot
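(For anyone following along: these look like robots.txt `User-agent` directives. Assuming the intent is to turn these crawlers away entirely, each would be paired with a blanket `Disallow` rule, e.g.:

```
User-agent: AhrefsBot
Disallow: /

User-agent: Baiduspider
Disallow: /
```

That only works for bots that honor robots.txt, of course, which is exactly the problem raised below.)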
AhrefsBot is relentless. It doesn't seem to obey robots.txt, and
will just hammer a server with requests for non-existent links. So
are you suggesting that the proxy should be the one responsible for
filtering these out in some way?
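(If the proxy did take on that job, one approach is a simple User-Agent
blocklist checked before serving each HTTP request. A minimal sketch,
assuming the proxy can see the incoming User-Agent header; the function
name and bot list are illustrative, the bot names taken from the
robots.txt excerpt above:

```python
# Hypothetical blocklist check an HTTP-to-gopher proxy might apply
# before proxying a request. Bots that ignore robots.txt can still
# be refused here, since the proxy controls the HTTP side.
BLOCKED_AGENTS = (
    "Lightspeed",
    "SISTRIX Crawler",
    "Baiduspider",
    "YandexBot",
    "Ezooms",
    "Exabot",
    "AhrefsBot",
)

def is_blocked(user_agent: str) -> bool:
    """Return True if the User-Agent header matches a blocked crawler."""
    ua = user_agent.lower()
    return any(bot.lower() in ua for bot in BLOCKED_AGENTS)
```

A request with `User-Agent: Mozilla/5.0 (compatible; AhrefsBot/5.0)`
would match and could be answered with a 403 instead of being proxied.)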
--Brian
--
Don't have gopher? Visit the world's first wiki-based gopher proxy!
http://www.pongonova.org/gopherwiki
IRC: Freenode.net channel #gopherproject