mwcrawler

0.01
This gem provides a Ruby API for scraping HTML pages from the Matricula Web system, returning content that can be more easily processed by a program.
Dependencies

Development

~> 1.16
~> 0.11
~> 13.0
~> 3.0
~> 0.59.2
~> 4.0
~> 3.4

Runtime

~> 1.8
Project Readme

Mwcrawler

Mwcrawler is a gem for parsing UnB's Matricula Web data into consumable hashes.


Installation

Add this line to your application's Gemfile:

gem 'mwcrawler'

And then execute:

bundle

Or install it yourself as:

gem install mwcrawler

Usage

First, instantiate a new crawler with crawler = Mwcrawler::Crawler.new; then you can crawl like so:

courses_hash = crawler.courses
# example return value
[{"type"=>"Presencial",
  "code"=>"19",
  "name"=>"ADMINISTRAÇÃO",
  "shift"=>"Diurno",
  "curriculums"=>
   [{"name"=>"Administração",
     "degree"=>"Bacharel",
     "semester_max"=>"8",
     "semester_min"=>"16",
     "credits"=>"200"}]},
 {"type"=>"Presencial",
  "code"=>"701",
  "name"=>"ADMINISTRAÇÃO",
  "shift"=>"Noturno",
  "curriculums"=>
   [{"name"=>"Administração",
     "degree"=>"Bacharel",
     "semester_max"=>"8",
     "semester_min"=>"16",
     "credits"=>"200"}]}
]

The default crawled campus is :darcy_ribeiro, but you can specify another, e.g. crawler.classes(:planaltina).

The available resources are:

  • classes
  • courses
  • departments
  • curriculum

While classes and curriculum take a course_code as a parameter, courses and departments take any of the four campuses as a parameter: :darcy_ribeiro, :planaltina, :ceilandia, and :gama.

The utility method semester returns the current semester.
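
Putting the above together, here is a minimal usage sketch. It assumes course codes are passed as strings (the code '19' comes from the sample output above) and that semester is called on the crawler instance; neither detail is confirmed by this README.

require 'mwcrawler'

crawler = Mwcrawler::Crawler.new

# Campus-scoped resources: pass one of the four campus symbols.
courses     = crawler.courses(:gama)
departments = crawler.departments(:gama)

# Course-scoped resources: pass a course code ('19' is ADMINISTRAÇÃO above).
classes    = crawler.classes('19')
curriculum = crawler.curriculum('19')

# Utility: returns the current semester.
puts crawler.semester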

Development

After checking out the repo, run bin/setup to install dependencies. Then, run rake spec to run the tests. You can also run bin/console for an interactive prompt that will allow you to experiment.

To install this gem onto your local machine, run bundle exec rake install. To release a new version, update the version number in version.rb, and then run bundle exec rake release, which will create a git tag for the version, push git commits and tags, and push the .gem file to rubygems.org.

Contributing

Bug reports and pull requests are welcome on GitHub at https://github.com/danilodelyima/mwcrawler.

Guidelines

When developing new features, the interface must reflect how much scraping is necessary. In other words, if many pages are crawled, the user must call many methods. This way we don't overload a single method with functionality, and the developer using the gem can more easily grasp the cost of scraping that information.
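
For illustration, a hypothetical sketch (the one-shot method shown is not part of the gem's API): rather than a single call that crawls the course listing and every curriculum page at once, each page crawl stays behind its own method call:

# Hypothetical one-shot method that would hide several page crawls:
# all_data = crawler.courses_with_curriculums(:gama)

# Preferred: the caller composes the calls, so the scraping cost is visible.
courses = crawler.courses(:gama)
curriculums = courses.map { |course| crawler.curriculum(course['code']) }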

License

The gem is available as open source under the terms of the MIT License.