parsero 0.0+git20140929.e5b585a-3 source package in Ubuntu


parsero (0.0+git20140929.e5b585a-3) unstable; urgency=medium

  * debian/control: added python3-pkg-resources to the Depends field.
  * debian/salsa-ci.yml: removed allow_failure from autopkgtest,
      no longer needed.

 -- Thiago Andrade Marques <email address hidden>  Fri, 21 Feb 2020 14:14:06 -0300

Upload details

Uploaded by:
Debian Security Tools on 2020-02-21
Uploaded to:
Original maintainer:
Debian Security Tools
Medium Urgency


Series Pocket Published Component Section
Focal release on 2020-02-22 universe misc


Focal: [FULLYBUILT] amd64


File Size SHA-256 Checksum
parsero_0.0+git20140929.e5b585a-3.dsc 2.1 KiB 6ae8f92632f38d1e4cf00eae768d8705321ff140116317b42c3efa555a13ceb9
parsero_0.0+git20140929.e5b585a.orig.tar.gz 11.5 KiB b93f493e9785281a35cf4542f0f77e8f66ae7cd9cf742dcca6952596bbdc8bdb
parsero_0.0+git20140929.e5b585a-3.debian.tar.xz 3.8 KiB 33f982a78e08288b8080ca1e1bd0e857f65a008fcbaa0e05f74ee81d73d573f9

No changes file available.

Binary packages built by this source

parsero: Audit tool for robots.txt of a site

 Parsero is a free script written in Python which reads the robots.txt file
 of a web server over the network and looks at the Disallow entries. The
 Disallow entries tell search engines which directories or files hosted
 on a web server must not be indexed. For example, "Disallow: /portal/login"
 means that the content at /portal/login is not allowed to be indexed by
 crawlers such as Google, Bing, or Yahoo. This is how administrators keep
 sensitive or private information out of the search engines.
 Parsero is useful for pentesters, ethical hackers, and forensics experts.
 It can also be used in security tests.
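The Disallow parsing described above can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not parsero's actual implementation; the function name and sample input are hypothetical.

```python
# Sketch (not parsero's code): extract the Disallow paths that a
# robots.txt auditing tool would inspect.

def parse_disallow_entries(robots_txt: str) -> list:
    """Return the paths listed in Disallow directives of a robots.txt body."""
    entries = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()      # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:                               # an empty Disallow allows everything
                entries.append(path)
    return entries

sample = """User-agent: *
Disallow: /portal/login
Disallow: /admin/   # private area
Disallow:
"""
print(parse_disallow_entries(sample))  # ['/portal/login', '/admin/']
```

A tool like parsero would then request each extracted path and report which ones actually respond, since Disallow entries often point at exactly the content an administrator wanted to hide.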