How to block some bots and referer pages at the Apache root level

When I was using Lighttpd, this was easy to achieve with entries like the ones below, so all of the sites were protected.

The Wget bot:

$HTTP["useragent"] =~ "Wget" { $HTTP["url"] =~ "^/tagi(.*)" { # $HTTP["url"] =~ "" { url.access-deny = ( "" ) } $HTTP["url"] =~ "^/tags(.*)" { url.access-deny = ( "" ) } $HTTP["url"] =~ "^/kom.php(.*)" { url.access-deny = ( "" ) } $HTTP["querystring"] =~ "^(.*)strony(.*)" { url.access-deny = ( "" ) } $HTTP["querystring"] =~ "^(.*)page(.*)" { url.access-deny = ( "" ) } $HTTP["url"] =~ "^(.*)/www/delivery/lg.php(.*)" { url.access-deny = ( "" ) } $HTTP["url"] =~ "^(.*)/reklamy/(.*)" { url.access-deny = ( "" ) } $HTTP["url"] =~ "^(.*)/ads/(.*)" { url.access-deny = ( "" ) } $HTTP["url"] =~ "^(.*)/www/delivery/ck.php(.*)" { url.access-deny = ( "" ) } } 

Sites sending fake traffic:

 $HTTP["referer"] =~ "(.*)surfing.php(.*)" { url.access-deny = ( "" ) } $HTTP["referer"] =~ "(.*)promote.php(.*)" { url.access-deny = ( "" ) } $HTTP["referer"] =~ "(.*)trafficadder.php(.*)" { url.access-deny = ( "" ) } $HTTP["referer"] =~ "(.*)traffic.php(.*)" { url.access-deny = ( "" ) } $HTTP["referer"] =~ ".*loic*." { url.access-deny = ( "" ) } $HTTP["referer"] =~ ".*autosurf*." { url.access-deny = ( "" ) } 

How can I do the same thing in Apache? I don't want to add this to .htaccess.

You can do this with mod_rewrite, although it takes a little effort. Here are some starting points:

http://httpd.apache.org/docs/2.4/rewrite/access.html

Pay particular attention to the "Blocking of Robots" section: http://httpd.apache.org/docs/2.4/rewrite/access.html#blocking-of-robots
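As a rough sketch of that approach, assuming the rules go in the main server config (httpd.conf or the relevant <VirtualHost> block, so they cover every site without touching .htaccess) and that mod_rewrite is loaded, the Wget rules above could translate to something like:

RewriteEngine On

# Deny Wget on the paths and query strings from the lighttpd rules above.
RewriteCond %{HTTP_USER_AGENT} Wget
RewriteCond %{REQUEST_URI} ^/tagi [OR]
RewriteCond %{REQUEST_URI} ^/tags [OR]
RewriteCond %{REQUEST_URI} ^/kom\.php [OR]
RewriteCond %{REQUEST_URI} /www/delivery/(lg|ck)\.php [OR]
RewriteCond %{REQUEST_URI} /(reklamy|ads)/ [OR]
RewriteCond %{QUERY_STRING} (strony|page)
RewriteRule . - [F]

The [OR]-flagged URL and query-string conditions form one group, which is then ANDed with the User-Agent condition, and [F] answers with 403 Forbidden.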

See also: http://en.linuxreviews.org/HOWTO_stop_automated_spam-bots_using_.htaccess
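The referer checks translate the same way. A minimal sketch, again assuming the server config and mod_rewrite, with the patterns condensed from the lighttpd rules above:

RewriteEngine On

# Deny requests whose Referer points at one of the fake-traffic scripts.
RewriteCond %{HTTP_REFERER} (surfing|promote|trafficadder|traffic)\.php [NC,OR]
RewriteCond %{HTTP_REFERER} (loic|autosurf) [NC]
RewriteRule . - [F]

Blocking on Referer is only a heuristic, since the header is trivially spoofed, but it matches what the lighttpd rules were doing.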