The 'Robots Exclusion Protocol' <https://www.robotstxt.org/orig.html> documents a set of standards for allowing or excluding robot/spider crawling of different areas of site content. Tools are provided that wrap the 'rep-cpp' <https://github.com/seomoz/rep-cpp> C++ library for processing these 'robots.txt' files.
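A minimal usage sketch may help; the calls below use the functions the package exports (robxp(), can_fetch(), crawl_delay(), sitemaps()), and the robots.txt content is an invented inline example rather than a fetched file:

    library(spiderbar)

    # Parse a robots.txt document into a 'robxp' object
    # (a hypothetical robots.txt, supplied as a character vector)
    rt <- robxp(c(
      "User-agent: *",
      "Crawl-delay: 10",
      "Disallow: /private/",
      "Sitemap: https://example.com/sitemap.xml"
    ))

    # Test whether a given user agent may fetch a given path
    can_fetch(rt, "/index.html", "*")    # TRUE
    can_fetch(rt, "/private/page", "*")  # FALSE

    # Retrieve crawl-delay directives and any sitemap URLs
    crawl_delay(rt)
    sitemaps(rt)

In practice the robots.txt content would typically be retrieved from a live site (e.g. via a package such as httr) before being passed to robxp().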
Version: 0.2.5
Depends: R (≥ 3.2.0)
Imports: Rcpp
LinkingTo: Rcpp
Suggests: covr, robotstxt, tinytest
Published: 2023-02-11
DOI: 10.32614/CRAN.package.spiderbar
Author: Bob Rudis (bob at rud.is) [aut, cre], SEOmoz, Inc [aut]
Maintainer: Bob Rudis <bob at rud.is>
BugReports: https://github.com/hrbrmstr/spiderbar/issues
License: MIT + file LICENSE
URL: https://github.com/hrbrmstr/spiderbar
NeedsCompilation: yes
In views: WebTechnologies
CRAN checks: spiderbar results
Reference manual: spiderbar.pdf
Package source: spiderbar_0.2.5.tar.gz
Windows binaries: r-devel: spiderbar_0.2.5.zip, r-release: spiderbar_0.2.5.zip, r-oldrel: spiderbar_0.2.5.zip
macOS binaries: r-release (arm64): spiderbar_0.2.5.tgz, r-oldrel (arm64): spiderbar_0.2.5.tgz, r-release (x86_64): spiderbar_0.2.5.tgz, r-oldrel (x86_64): spiderbar_0.2.5.tgz
Old sources: spiderbar archive
Reverse imports: robotstxt
Please use the canonical form https://CRAN.R-project.org/package=spiderbar to link to this page.