Project

robotstxt

0.01
No commit activity in last 3 years
No release in over 3 years
Robotstxt Parser allows you to check the accessibility of URLs and retrieve other data. Full support for the robots.txt RFC, wildcards and Sitemap: rules.
Dependencies

Development

~> 3.1
~> 0.8
Project Readme

Robotstxt

Robotstxt is a Ruby robots.txt file parser.

Robotstxt Parser allows you to check the accessibility of URLs and retrieve other data.

Full support for the robots.txt RFC, wildcards and Sitemap: rules.

Features

  • Check whether a URL may be crawled by your robot

  • Analyze the robots.txt file and return an Array of XML Sitemap URLs
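The gem's internals are not shown on this page, but the two features above boil down to matching Allow/Disallow rules (with * wildcards and $ end anchors) against a URL path and collecting Sitemap: lines. The following minimal, self-contained Ruby sketch illustrates those mechanics; the class and method names are hypothetical and do not represent the gem's actual API:

```ruby
# Minimal sketch of robots.txt checking -- NOT the robotstxt gem's API.
# Rule matching is simplified: the first matching rule wins.
class TinyRobotstxt
  Rule = Struct.new(:allow, :pattern)

  attr_reader :sitemaps

  def initialize(body, user_agent)
    @rules = []
    @sitemaps = []
    applies = false
    body.each_line do |line|
      line = line.sub(/#.*/, "").strip   # drop comments and whitespace
      next if line.empty?
      field, _, value = line.partition(":")
      value = value.strip
      case field.strip.downcase
      when "user-agent"
        applies = (value == "*" || user_agent.downcase.include?(value.downcase))
      when "disallow"
        @rules << Rule.new(false, value) if applies && !value.empty?
      when "allow"
        @rules << Rule.new(true, value) if applies && !value.empty?
      when "sitemap"
        @sitemaps << value   # Sitemap: lines apply to every user agent
      end
    end
  end

  # Is this path allowed for our user agent? No matching rule means allowed.
  def allowed?(path)
    rule = @rules.find { |r| to_regexp(r.pattern).match?(path) }
    rule.nil? || rule.allow
  end

  private

  # Translate a robots.txt pattern ('*' wildcard, '$' end anchor) to a Regexp.
  def to_regexp(pattern)
    anchored = pattern.end_with?("$")
    src = Regexp.escape(pattern.chomp("$")).gsub('\*', ".*")
    Regexp.new("\\A" + src + (anchored ? "\\z" : ""))
  end
end

robots = <<~ROBOTS
  User-agent: *
  Disallow: /private/
  Disallow: /*.pdf$
  Sitemap: https://example.com/sitemap.xml
ROBOTS

bot = TinyRobotstxt.new(robots, "rubytest")
bot.allowed?("/index.html")     # => true
bot.allowed?("/private/a.html") # => false
bot.sitemaps                    # => ["https://example.com/sitemap.xml"]
```

A real parser follows the RFC more closely (longest-match rule precedence, grouped User-agent records), but the pattern-to-regexp translation above is the core of wildcard support.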

Requirements

  • Ruby >= 1.8.7

Installation

This library is intended to be installed via the RubyGems system.

$ gem install robotstxt

You might need administrator privileges on your system to install it.

Author

Simone Rinzivillo <srinzivillo@gmail.com>

License

Copyright © 2009 Simone Rinzivillo. Robotstxt is released under the MIT license.