[wicked] Horde :: Wicked :: Patches to solve problems with web robots

Jan Schneider jan at horde.org
Sat Dec 18 04:30:23 PST 2004


Quoting Chuck Hagenbuch <chuck at horde.org>:

> Quoting kracker <thekracker at gmail.com>:
>
>> Since web robots crawl through a web page and hit every link in the
>> order presented, this creates a problem for wicked: its unlock link
>> comes before the history link, and the history page contains links to
>> revert to a previous revision without confirmation. Robots that crawl
>> through a site can easily and unknowingly revert large chunks of your
>> content, leaving your wicked installation in total disarray.
>
> This is a clumsy way of doing it; you should just use Browser::isRobot()
> to disallow robots from making changes.
>
>> I started this email after I wrote and tested a wicked patch that locks
>> a wiki page immediately after it is saved (after an edit, see below),
>> so that wiki pages are always locked by default and must be unlocked
>> before editing. This reduces the chance of the wiki being unlocked and
>> then reverted (only unlocked pages can be reverted).
>
> That's unnecessary and bad for general wiki usability.
>
>> Below are two patches. The first adds the auto_lock feature; the second
>> skips processing wiki pages for the IP address of the fastsearch.net
>> robot. That one is nice because that particular robot kills the app if
>> it tries to use it, while other robots that are not so destructive can
>> continue to crawl the wiki (I kinda like the Google cache :)).
>
> Again, use Browser::isRobot().
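For reference, a rough sketch of that approach. Only Browser::isRobot() is
actual Horde API here; the action names, the dispatch code, and the list of
write actions are illustrative, not real Wicked code:

```php
<?php
// Illustrative sketch, not actual Wicked code: refuse state-changing
// actions when the client looks like a web robot.
require_once 'Horde/Browser.php';
require_once 'Horde/Util.php';

$browser = &Browser::singleton();
$actionID = Util::getFormData('actionID');

// Hypothetical list of actions that modify wiki state.
$writeActions = array('unlock', 'revert', 'edit', 'save');

if (in_array($actionID, $writeActions) && $browser->isRobot()) {
    // Crawlers get a 403 instead of silently mutating pages; read-only
    // page views stay crawlable, so caches like Google's keep working.
    header('HTTP/1.0 403 Forbidden');
    exit;
}
```

This keeps the page indexable for well-behaved robots while blocking only
the destructive actions, which is less intrusive than locking every page
after each save.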

And post your patches to bugs.horde.org, where they can be tracked more easily.

Jan.

-- 
Do you need professional PHP or Horde consulting?
http://horde.org/consulting/

