[gopher] Torrent update

Kim Holviala kim at holviala.com
Mon May 3 13:07:24 UTC 2010


On 3.5.2010 14:19, Martin Ebnoether wrote:

>> PLEASE set up robots.txt to prevent robots from re-archiving this stuff!
>> I don't want to end up with 20 copies of the 30gig archive!
>>
>> Put "robots.txt" in your gopher root with something like this in it:
>>
>> User-agent: *
>> Disallow: /archives
>
> Does this really work for gopherspace?

Yup.

> Besides, are there any popular search engines that index
> gopherspace? Google[1] does not, neither does Bing.

Popular... *cough*, Veronica-2 is pretty popular and it respects 
robots.txt.

I'm building my own search engine which will have a much broader scope, but 
it's not finished yet. The crawler works, and I have crawled through maybe 
50% of gopherspace, but the indexer and search are still works in 
progress.
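
For the curious, here's roughly how a crawler can honor robots.txt over 
gopher: the file is just another selector served from the gopher root, so 
you fetch it like any other document and check each selector against the 
Disallow rules before descending. A minimal sketch in Python follows; the 
hostname, selectors and user-agent string are placeholders, and this is 
not Veronica-2's code or mine, just an illustration:

import socket
import urllib.robotparser

def gopher_fetch(host, selector, port=70, timeout=10):
    """Plain Gopher request: send the selector plus CRLF, read until EOF."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(selector.encode("ascii", "replace") + b"\r\n")
        chunks = []
        while True:
            data = s.recv(4096)
            if not data:
                break
            chunks.append(data)
    return b"".join(chunks)

def allowed(host, selector, agent="example-crawler"):
    """Fetch /robots.txt from the gopher root and test the selector against it."""
    rp = urllib.robotparser.RobotFileParser()
    try:
        robots = gopher_fetch(host, "/robots.txt").decode("utf-8", "replace")
        rp.parse(robots.splitlines())
    except OSError:
        # No robots.txt reachable: treat the server as crawlable.
        return True
    # robotparser matches on paths, so the selector can be passed directly.
    return rp.can_fetch(agent, selector)

if __name__ == "__main__":
    # Hypothetical host and selector, only to show the check in use.
    if allowed("gopher.example.org", "/archives"):
        print(gopher_fetch("gopher.example.org", "/archives").decode("utf-8", "replace"))
    else:
        print("robots.txt disallows /archives; skipping")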


- Kim


