[BlueOnyx:02301] Re: DFix update

Ken Marcus - Precision Web Hosting, Inc kenlists at precisionweb.net
Wed Sep 2 17:01:17 -05 2009


----- Original Message ----- 
From: "Michael Aronoff" <maronoff at gmail.com>
To: "'BlueOnyx General Mailing List'" <blueonyx at blueonyx.it>
Sent: Wednesday, September 02, 2009 2:42 PM
Subject: [BlueOnyx:02300] Re: DFix update


> Florian wrote:
>> I am getting the impression that it tends to block/unblock search
>> engine crawlers/robots that look for (non-existent) robots.txt files
>> here on my machine...
>
> I like the feature.  I raised ERRORHITS from 10 to 50, so many more
> errors are required before blocking happens.  I also keep an eye on the
> emails and added a few IPs I kept seeing that were clearly googlebot and
> some other search engines, too. Now it appears to be behaving well and
> has caught a number of bots or zombies looking for an exploitable PHP
> file that will run a remote inclusion.
>
> Good stuff.
> _______________________________________________________________
> Michael Aronoff - West Hills, CA
>
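
I have not read the DFix code, but as I understand Michael's description,
the counting amounts to roughly the Python sketch below. The log path, the
ERRORHITS name and the whitelist set are my own assumptions for
illustration, not DFix's actual internals:

import re
from collections import defaultdict

ERROR_LOG = "/var/log/httpd/error_log"   # assumed location
WHITELIST = set(["66.249.66.1"])         # crawler IPs you never want blocked
ERRORHITS = 50                           # threshold Michael raised from 10 to 50

# Apache error-log lines for missing files look roughly like:
# [Wed Sep 02 ...] [error] [client 1.2.3.4] File does not exist: /home/.../foo.php
LINE_RE = re.compile(r"\[client ([0-9.]+)\] File does not exist: (\S+)")

hits = defaultdict(int)
with open(ERROR_LOG) as log:
    for line in log:
        m = LINE_RE.search(line)
        if not m:
            continue
        ip, path = m.groups()
        if ip in WHITELIST:
            continue
        hits[ip] += 1

for ip, count in sorted(hits.items(), key=lambda kv: -kv[1]):
    if count >= ERRORHITS:
        print("%s would be blocked (%d missing-file errors)" % (ip, count))

Whatever DFix then does with the offending addresses (firewall rules, the
notification emails Michael mentions) is a separate step I am not trying
to show here.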


I don't actually use DFix, but if I did, I would not want it to look at 
the httpd error logs unless it had a complete search engine IP address 
whitelist.  The common scenario would be: major site redesign --> all the 
search engines trip DFix --> they all drop you from their index since 
they can no longer get to your site.
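
Rather than maintaining a complete, hand-kept list of crawler addresses,
one way to build that whitelist is to verify a suspect IP with a reverse
DNS lookup and then a forward lookup of the returned name. This is only a
sketch of the general technique; the list of crawler domains is
illustrative and I have no idea whether DFix does anything of the sort:

import socket

# Hostname suffixes accepted for well-known crawlers (illustrative only).
CRAWLER_SUFFIXES = (".googlebot.com", ".google.com", ".crawl.yahoo.net",
                    ".search.msn.com")

def is_search_engine(ip):
    # Reverse lookup: what name does this IP claim to have?
    try:
        host = socket.gethostbyaddr(ip)[0]
    except socket.error:
        return False
    if not host.lower().endswith(CRAWLER_SUFFIXES):
        return False
    # Forward lookup: the claimed name must resolve back to the same IP,
    # otherwise anyone could publish a fake googlebot.com PTR record.
    try:
        return socket.gethostbyname(host) == ip
    except socket.error:
        return False

A scanner could call is_search_engine() just before counting an IP and
skip anything that verifies, which would avoid the redesign scenario
above.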

Possibly the error-log scanning could be an option you can turn on or 
off.  Also, I don't think it should count 404s on .html or .htm files, 
since that is not where someone would be looking for an exploitable 
script.
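
If error-log scanning stays in, something like the small filter below
(which could slot into the earlier sketch, right after the whitelist
check) would keep plain page 404s out of the count; the extension list is
only an example:

# Only count 404s on script-type files; a missing .html or .htm page is
# normal after a redesign and not what exploit scanners are probing for.
SCRIPT_EXTENSIONS = (".php", ".php3", ".cgi", ".pl", ".asp")

def counts_as_probe(path):
    return path.lower().endswith(SCRIPT_EXTENSIONS)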


----
Ken Marcus
Ecommerce Web Hosting by
Precision Web Hosting, Inc.
http://www.precisionweb.net





