I just read "Microsoft mistakenly asks Google to block CNN, Wikipedia and U.S. Govt sites, among others" on Yahoo! News. Frankly, this does not surprise me all that much.

While working in public relations measurement research, I often saw multiple attempts at automation: computer-automated tone, automated subject population using keyword searches, and automated data collection. The one common denominator across all of the attempts I witnessed was, in a word, accuracy, or rather the lack of it. Sure, the automation was spot-on occasionally, but it never came close to 100%.

The key to accurate automated data lies in the human factor: it relies on humans programming the automation and humans interpreting the results. Even when those humans do their jobs carefully, there’s still a problem, namely human error. So should automation be dumped altogether? No way! It just needs constant testing and human interaction, along with a qualifier that says, “this automated data is great, you can rely on it, but you should know that there ARE potentials for error, as with nearly all testing, scientific or otherwise.”
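To make that concrete, here is a minimal sketch, purely illustrative and not any vendor's actual system, of the kind of automation I'm describing: a naive keyword-based tone scorer (the word lists and articles are made up) that routes anything it isn't confident about to a human reviewer instead of trusting itself.

```python
# Hypothetical keyword lists a human analyst would maintain and refine.
POSITIVE = {"praise", "award", "growth", "innovative"}
NEGATIVE = {"lawsuit", "recall", "outage", "complaint"}

def score_tone(text: str):
    """Crude keyword-count tone score: returns (label, confidence margin)."""
    words = set(text.lower().split())
    pos, neg = len(words & POSITIVE), len(words & NEGATIVE)
    if pos > neg:
        return "positive", pos - neg
    if neg > pos:
        return "negative", neg - pos
    return "neutral", 0

articles = [
    "Company wins innovation award as analysts praise growth",
    "Recall announced after customer complaint and outage",
    "Quarterly report released on schedule",
]

for text in articles:
    tone, margin = score_tone(text)
    # Thin margins go to a person, per the "potentials for error" qualifier above.
    if margin < 2:
        print("NEEDS HUMAN REVIEW:", text)
    else:
        print(f"{tone.upper():>8}:", text)
```

Even this toy version shows the point: the automation handles the obvious cases quickly, but a human still has to program the keyword lists, check the output, and catch everything the machine can't score confidently.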

So, did Microsoft screw up? No, not really. Errors in automation are to be expected. Can they fix this? Sure: they need humans to tell their systems which sites to trust, and to monitor and test the software consistently and constantly!
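Again, just as a sketch of what "humans telling the system which sites to trust" might look like in practice (this is an assumption about the approach, not Microsoft's actual process), a human-curated allowlist could sit between the automation and the block request, as below.

```python
from urllib.parse import urlparse

# Hypothetical, human-maintained list of domains that must never be submitted
# for blocking, no matter what the automated scan flags.
TRUSTED_DOMAINS = {"cnn.com", "wikipedia.org", "usa.gov"}

def should_submit_for_blocking(url: str) -> bool:
    """Return True only if the URL's domain is not on the trusted list."""
    domain = urlparse(url).netloc.lower()
    # Strip a leading "www." so "www.cnn.com" matches "cnn.com".
    if domain.startswith("www."):
        domain = domain[4:]
    return domain not in TRUSTED_DOMAINS

flagged_urls = [
    "http://www.cnn.com/some-article",
    "http://example-piracy-site.com/download",
]

for url in flagged_urls:
    if should_submit_for_blocking(url):
        print("Submit for blocking:", url)
    else:
        print("Held for human review (trusted domain):", url)
```

The allowlist itself is the human factor: someone has to decide what goes on it, keep it current, and keep testing the automation around it.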