Why do we block bad robots?
From the tutorials I have read on many sites, and what all the 'experts' seem to tell me, the common practice is to block the known bad bots.
Wouldn't it be much better to have an international database of the GOOD bots,
and have an .htaccess or robots.txt file that allows all those good bots and blocks anything else? (Something like the sketch below.)
Then if you're not good, your name doesn't go on the list,
and "if your name's not on the list, you're not getting in."
So any new bots get reviewed: if bad, they don't go on the list; if good, the reviewing body/organisation adds them to the bot list.
Site owners update their copy of the list every so often.
Seems to me it's done the wrong way around!
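Something like this in .htaccess is roughly what I'm picturing (only a sketch, in Apache 2.2 mod_setenvif syntax; the bot names here are placeholders for whatever the official list would contain):

    # Hypothetical whitelist: only user-agents matched below get in.
    # These names stand in for the imagined international good-bot list.
    SetEnvIfNoCase User-Agent "Googlebot" good_bot
    SetEnvIfNoCase User-Agent "Slurp"     good_bot
    SetEnvIfNoCase User-Agent "msnbot"    good_bot
    # Ordinary browsers would need a pass too, of course:
    SetEnvIfNoCase User-Agent "Mozilla"   good_bot

    # Deny everything, then let the matched agents back in
    Order Deny,Allow
    Deny from all
    Allow from env=good_bot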
JeevesBond posted this at 01:39 — 24th July 2007.
This is an interesting idea, but there are a couple of problems with it:
A bot can put whatever it likes in its User-Agent header, so a badly behaved bot could simply impersonate one of the names on the good list (and robots.txt is purely advisory anyway; bad bots just ignore it).
The easiest way is to keep a blacklist of spamming IPs and bot names. It might be a good idea to have an international blacklist, but again there's the problem of who controls it.
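In .htaccess terms, the blacklist route usually looks something like this (again just a sketch in Apache 2.2 syntax; the bot names and the address range are made-up examples):

    # Allow everyone, then deny known-bad bot names and IP ranges
    SetEnvIfNoCase User-Agent "EmailSiphon" bad_bot
    SetEnvIfNoCase User-Agent "WebZIP"      bad_bot

    Order Allow,Deny
    Allow from all
    Deny from env=bad_bot
    Deny from 192.0.2.0/24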
Interesting stuff to think about though!