webrobots

Latest release: 0.64 (no release in over 3 years; low commit activity in the last 3 years)
This library helps write robots.txt compliant web robots in Ruby.
Dependencies

Development

  • >= 1.2, ~> 1.2

  • >= 1.4.7, ~> 1.4

  • >= 0

  • >= 0.9.2.2

  • > 2.4.2

  • >= 0
Project Readme

webrobots

This is a library to help write robots.txt compliant web robots.

Usage

require 'webrobots'
require 'uri'
require 'net/http'

robots = WebRobots.new('MyBot/1.0')

uri = URI('http://digg.com/news/24hr')
if robots.disallowed?(uri)
  STDERR.puts "Access disallowed: #{uri}"
  exit 1
end
body = Net::HTTP.get(uri)
# ...

Requirements

  • Ruby 1.8.7 or 1.9.2+

Contributing to webrobots

  • Check out the latest master to make sure the feature hasn’t been implemented or the bug hasn’t been fixed yet

  • Check out the issue tracker to make sure someone hasn't already requested and/or contributed it

  • Fork the project

  • Start a feature/bugfix branch

  • Commit and push until you are happy with your contribution

  • Make sure to add tests for it. This is important so I don’t break it in a future version unintentionally.

  • Please try not to mess with the Rakefile, version, or history. If you want to have your own version, or it is otherwise necessary, that is fine, but please isolate it to its own commit so I can cherry-pick around it.

Copyright © 2010-2016 Akinori MUSHA. See LICENSE.txt for further details.