libwww-robotrules-perl 6.02-1 source package in Ubuntu
Changelog
libwww-robotrules-perl (6.02-1) unstable; urgency=medium

  [ Ansgar Burchardt ]
  * debian/control: Convert Vcs-* fields to Git.

  [ Salvatore Bonaccorso ]
  * debian/copyright: Replace DEP5 Format-Specification URL from
    svn.debian.org to anonscm.debian.org URL.

  [ Alessandro Ghedini ]
  * New upstream release
  * Bump debhelper compat level to 8
  * Bump Standards-Version to 3.9.2

  [ Salvatore Bonaccorso ]
  * Change Vcs-Git to canonical URI (git://anonscm.debian.org)
  * Change search.cpan.org based URIs to metacpan.org based URIs

  [ Axel Beckert ]
  * debian/copyright: migrate pre-1.0 format to 1.0 using "cme fix dpkg-copyright"

  [ gregor herrmann ]
  * debian/control: remove Nicholas Bamber from Uploaders on request of the MIA team.
  * Strip trailing slash from metacpan URLs.

  [ Salvatore Bonaccorso ]
  * Update Vcs-Browser URL to cgit web frontend
  * debian/control: Use HTTPS transport protocol for Vcs-Git URI

  [ gregor herrmann ]
  * debian/copyright: change Copyright-Format 1.0 URL to HTTPS.

  [ Salvatore Bonaccorso ]
  * Update Vcs-* headers for switch to salsa.debian.org

  [ gregor herrmann ]
  * Add debian/upstream/metadata.
  * Mark package as autopkgtest-able.
  * Declare compliance with Debian Policy 4.1.4.
  * Bump debhelper compatibility level to 10.
  * Add /me to Uploaders.

 -- gregor herrmann <email address hidden>  Sat, 14 Apr 2018 19:07:49 +0200
Upload details
- Uploaded by: Debian Perl Group
- Uploaded to: Sid
- Original maintainer: Debian Perl Group
- Architectures: all
- Section: perl
- Urgency: Medium
Downloads
File | Size | SHA-256 Checksum |
---|---|---|
libwww-robotrules-perl_6.02-1.dsc | 2.2 KiB | 8419a4bac65737229e54cf2356e2f0ab90a8738d7fefb82a1883480a5747b469 |
libwww-robotrules-perl_6.02.orig.tar.gz | 8.8 KiB | 46b502e7a288d559429891eeb5d979461dd3ecc6a5c491ead85d165b6e03a51e |
libwww-robotrules-perl_6.02-1.debian.tar.xz | 2.2 KiB | d9a0bde5423038c69616c5099a8c03158bfa8bdb6ae99eba3edbe76b8018ceeb |
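The downloaded files can be verified against the SHA-256 sums listed above. A minimal sketch using the core Digest::SHA module, assuming the tarball sits in the current directory:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Digest::SHA;

# Expected SHA-256 sum of the upstream tarball, taken from the table above.
my $file     = 'libwww-robotrules-perl_6.02.orig.tar.gz';
my $expected = '46b502e7a288d559429891eeb5d979461dd3ecc6a5c491ead85d165b6e03a51e';

my $sha = Digest::SHA->new(256);
$sha->addfile($file);
my $got = $sha->hexdigest;

print $got eq $expected ? "$file: OK\n" : "$file: MISMATCH ($got)\n";
```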
Available diffs
- diff from 6.01-1 (in Ubuntu) to 6.02-1 (2.6 KiB)
No changes file available.
Binary packages built by this source
- libwww-robotrules-perl: database of robots.txt-derived permissions
WWW::RobotRules parses /robots.txt files as specified in "A Standard for
Robot Exclusion", at <http://www.robotstxt.org/wc/norobots.html>. Webmasters
can use the /robots.txt file to forbid conforming robots from accessing parts
of their web site.
.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files
on any number of hosts.
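A minimal usage sketch of the module's new, parse and allowed methods; the agent name and host are illustrative, and LWP::Simple (from libwww-perl) is assumed for fetching:

```perl
#!/usr/bin/perl
use strict;
use warnings;

use WWW::RobotRules;
use LWP::Simple qw(get);

# Create a rules object identifying the robot by its User-Agent name
# (the name is a placeholder).
my $rules = WWW::RobotRules->new('ExampleBot/1.0');

# Fetch and parse the robots.txt of a host we intend to crawl
# (example.org is a placeholder).
my $robots_url = 'http://example.org/robots.txt';
my $robots_txt = get($robots_url);
$rules->parse($robots_url, $robots_txt) if defined $robots_txt;

# Check a URL against the parsed rules before fetching it.
my $url = 'http://example.org/some/page.html';
if ($rules->allowed($url)) {
    my $content = get($url);
    # ... process $content ...
} else {
    print "Fetching $url is disallowed by robots.txt\n";
}
```

The same $rules object can parse further /robots.txt files from other hosts and will apply the matching rules when allowed() is called for URLs on those hosts.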