<html>
<head>
<style>
.hmmessage P
{
margin:0px;
padding:0px
}
body.hmmessage
{
font-size: 10pt;
font-family:Verdana
}
</style>
</head>
<body class='hmmessage'>
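A minimal sketch of the throttling approach Matt describes in this thread: keep a sliding window of recent request timestamps per IP and temporarily ban any IP that exceeds a speed limit. All names and limits here (`WINDOW_SECONDS`, `MAX_REQUESTS`, `BAN_SECONDS`) are illustrative assumptions, not anything the thread specifies.

```python
# Hypothetical sliding-window throttle; thresholds are assumptions.
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # only count requests made in the last 10 seconds
MAX_REQUESTS = 12     # "a dozen or so rapid requests" before blocking
BAN_SECONDS = 300     # length of the temporary ban

_requests = defaultdict(deque)   # ip -> timestamps of recent requests
_banned_until = {}               # ip -> time at which the ban expires

def allow_request(ip, now=None):
    """Return True if the request should be served, False if throttled."""
    now = time.time() if now is None else now
    # Still inside a temporary ban?
    if _banned_until.get(ip, 0) > now:
        return False
    q = _requests[ip]
    q.append(now)
    # Drop timestamps that have fallen out of the window.
    while q and q[0] <= now - WINDOW_SECONDS:
        q.popleft()
    if len(q) > MAX_REQUESTS:
        # Speed limit exceeded: ban temporarily and start fresh afterwards.
        _banned_until[ip] = now + BAN_SECONDS
        q.clear()
        return False
    return True
```

Note that a ban keyed on IP alone will also block every other machine sharing that address behind a NAT router, which is exactly the trade-off raised above. The "preferred delay" fair warning can be published with the non-standard `Crawl-delay` directive in robots.txt (e.g. `Crawl-delay: 10`); well-behaved crawlers may honor it, and the throttle exists for the ones that do not.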
<br>If I were to ban by IP, what if it were only one bad machine on a large network behind a router? Would it block the entire network?<br><br>
Thanks again,<br><br>
Jeremy<br><br>
> Date: Mon, 23 Feb 2009 16:13:03 -0800<br>
> From: matt@elrod.ca<br>
> To: jeremygwa@hotmail.com<br>
> CC: victoria-pm@pm.org<br>
> Subject: Re: [VPM] link 'bot' protection<br>
> <br>
> <br>
> I should think you would want to "throttle" bots by timing<br>
> their requests and temporarily banning IPs that exceed a<br>
> speed limit. You can specify a preferred delay in your<br>
> robots.txt file to give fair warning.<br>
> <br>
> Granted, giving bots a chance to exceed your speed limit<br>
> gives them a chance to slurp some of your data, but if<br>
> your code blocks them after a dozen or so rapid requests,<br>
> they won't get far.<br>
> <br>
> The user-agent header is easily forged, so the speed of<br>
> requests is the only reliable way of spotting bots that<br>
> I am aware of.<br>
> <br>
> Matt Elrod<br>
> <br>
> Jer A wrote:<br>
> > Hi all,<br>
> > <br>
> > I am designing a website service.<br>
> > <br>
> > How do I prevent automated bots, link scrapers, and cross-site scripts <br>
> > from accessing the site, without hindering the user experience or the <br>
> > performance of the host/server/site?<br>
> > <br>
> > My site is not graphics-intensive, and I do not think anyone would be <br>
> > interested in grabbing anything graphical, only information/data.<br>
> > <br>
> > I have thought of banning IPs by parsing log files, but what should I <br>
> > look for that is 'fishy'?<br>
> > <br>
> > Thanks in advance for all advice/help.<br>
> > <br>
> > Regards,<br>
> > Jeremy<br>
> <br></body>
</html>