why do we block bad robots?

greg

He has: 1,581 posts

Joined: Nov 2005

From the tutorials I have read on many sites, and what all the 'experts' seem to tell me, the common practice is to block the known bad bots

wouldn't it be much better to have an international database of the GOOD bots?
and have a .htaccess or robots.txt file that allows all those good bots and blocks anything else
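
For example, a whitelist-style robots.txt could be as simple as this (just a sketch, with Googlebot standing in for whatever bots made the list):

    User-agent: Googlebot
    Disallow:

    User-agent: *
    Disallow: /

Mind you, robots.txt only stops bots that choose to obey it, so the enforcing would really have to happen in .htaccess.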

then if you're not good, your name doesn't go on the list
and "if your name's not on the list, you're not getting in"

so any new bots get reviewed: if bad, they don't go on the list; if good, the reviewing body/organisation adds them to the bot list
site owners then update their copy of the list every so often

seems to me it's done the wrong way around!

JeevesBond

He has: 3,956 posts

Joined: Jun 2002

greg wrote: then if you're not good, your name doesn't go on the list
and "if your name's not on the list, you're not getting in"

This is an interesting idea, but there are a couple of problems with it:

  1. How do you tell the difference between a human and a bot? If your name's not on the list, just pretend to be a human using Opera/Firefox/some other browser and you'll be let through (see the one-liner after this list).
  2. Who controls the list? Is it going to be like Hotmail, where Microsoft charge $1,400 to be removed from their spammers' blacklist?
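
On point 1: any script can claim to be a normal browser just by sending a browser's User-Agent string. Something like this one-liner (example.com and the UA string are placeholders, of course) would sail straight past a name-based whitelist:

    curl -A "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-GB; rv:1.8.1) Gecko/20061010 Firefox/2.0" http://example.com/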

The easiest way is to keep a blacklist of spamming IPs and bot names. It might be a good idea to have an international blacklist, but again, who controls it?
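
For the curious, a blacklist in .htaccess usually looks something like this (the bot name and the IP range here are made-up examples, Apache 2.x syntax):

    # block a known bad bot by its User-Agent string
    SetEnvIfNoCase User-Agent "EvilScraper" bad_bot
    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    # and shut out a spamming IP range entirely
    Deny from 192.0.2.0/24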

Interesting stuff to think about though! :)

a Padded Cell, our articles site!
