[Python-apps-commits] r8431 - in packages/wapiti/trunk/debian (changelog wapiti.1)
adejong at users.alioth.debian.org
Fri Apr 6 13:55:06 UTC 2012
Date: Friday, April 6, 2012 @ 13:55:04
Author: adejong
Revision: 8431
* Fix spelling error and small wording change in manual page (thanks
lintian).
Modified:
packages/wapiti/trunk/debian/changelog
packages/wapiti/trunk/debian/wapiti.1
Modified: packages/wapiti/trunk/debian/changelog
===================================================================
--- packages/wapiti/trunk/debian/changelog 2012-04-06 13:48:44 UTC (rev 8430)
+++ packages/wapiti/trunk/debian/changelog 2012-04-06 13:55:04 UTC (rev 8431)
@@ -13,8 +13,10 @@
* Switch to dh command sequencer and install file with dh_install instead
of a custom setup.py.
* Update Vcs-Browser field.
+ * Fix spelling error and small wording change in manual page (thanks
+ lintian).
- -- Arthur de Jong <adejong at debian.org> Fri, 06 Apr 2012 15:47:12 +0200
+ -- Arthur de Jong <adejong at debian.org> Fri, 06 Apr 2012 15:53:56 +0200
wapiti (1.1.6-3) unstable; urgency=low
Modified: packages/wapiti/trunk/debian/wapiti.1
===================================================================
--- packages/wapiti/trunk/debian/wapiti.1 2012-04-06 13:48:44 UTC (rev 8430)
+++ packages/wapiti/trunk/debian/wapiti.1 2012-04-06 13:55:04 UTC (rev 8431)
@@ -68,7 +68,7 @@
.br
print help page
.SH EFFICIENCY
-Wapiti is developed in python and use a library I made called lswww. This web spider library does the most of the work. Unfortunately, the html parsers module within python only works with well formated html pages so lswww fails to extract informations from bad-coded webpages. Tidy can clean these webpages on the fly for us so lswww will give pretty good results. In order to make Wapiti far more efficient, you should:
+Wapiti is developed in Python and use a library called lswww. This web spider library does the most of the work. Unfortunately, the html parsers module within python only works with well formed html pages so lswww fails to extract information from bad-coded webpages. Tidy can clean these webpages on the fly for us so lswww will give pretty good results. In order to make Wapiti far more efficient, you should:
.PP
apt-get install python-utidylib python-ctypes
.SH AUTHOR