The 'Robots Exclusion Protocol' <https://www.robotstxt.org/orig.html> documents a set of standards for allowing or excluding robot/spider crawling of different areas of site content. Tools are provided which wrap the 'rep-cpp' <https://github.com/seomoz/rep-cpp> C++ library for processing these 'robots.txt' files.
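A minimal usage sketch (assuming the robxp(), can_fetch(), and crawl_delays() functions documented in the package reference manual; the robots.txt rules shown are made up for the example):

    library(spiderbar)

    # Build a robots.txt object from an in-memory ruleset
    # (illustrative rules; robxp() also accepts a connection)
    rt <- robxp(c(
      "User-agent: *",
      "Crawl-delay: 5",
      "Disallow: /private/"
    ))

    # Ask whether a path may be crawled by a given user agent
    can_fetch(rt, "/private/secret.html", "mybot")  # expected: FALSE
    can_fetch(rt, "/index.html", "mybot")           # expected: TRUE

    # Retrieve crawl-delay directives, one row per user agent
    crawl_delays(rt)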
Version: 0.2.3
Depends: R (≥ 3.2.0)
Imports: Rcpp
LinkingTo: Rcpp
Suggests: covr, robotstxt, tinytest
Published: 2020-05-30
Author: Bob Rudis (bob@rud.is) [aut, cre], SEOmoz, Inc [aut]
Maintainer: Bob Rudis <bob@rud.is>
BugReports: https://gitlab.com/hrbrmstr/spiderbar/issues
License: MIT + file LICENSE
URL: https://gitlab.com/hrbrmstr/spiderbar
NeedsCompilation: yes
SystemRequirements: C++11
Materials: NEWS
In views: WebTechnologies
CRAN checks: spiderbar results
Reference manual: spiderbar.pdf
Package source: spiderbar_0.2.3.tar.gz
Windows binaries: r-devel: spiderbar_0.2.3.zip, r-release: spiderbar_0.2.3.zip, r-oldrel: spiderbar_0.2.3.zip
macOS binaries: r-release: spiderbar_0.2.3.tgz, r-oldrel: spiderbar_0.2.3.tgz
Old sources: spiderbar archive
Reverse imports: robotstxt
Please use the canonical form https://CRAN.R-project.org/package=spiderbar to link to this page.