Matt Cutts To The Quick: What Should Google Do About Webspam


Matt Cutts is asking what the Google webspam team should focus on. He's getting quite a few responses to the question, though you'd think more thought would have gone into some of them. For instance, here's the second comment on the blog thread:
I think a paid link reporting firefox plugin would be helpful. Right click on the suspect link, select report and report – the plugin could then pass you extra detail like the location on the page, etc, and might save your team time and encourage more reports.
Perhaps this commenter isn’t aware that Google isn’t responsible for creating Firefox plugins. Or maybe they expect Googlers to stop what they are doing and start creating them. Come on, how about a real answer to Matt’s question?
OK, glad you asked:
Google should greatly punish the webspam through decreasing their PR.
You mean, like how they're doing that with paid links? How would that work exactly? Who's going to judge whether a given search result is spam or not? I think a better solution is to fix it so that spam doesn't show up in the search results at all.
But I do concur with this comment from Chris Bartow:
As far as results go, better local business results. If I search for [pizza small town, usa], the results are usually 90% full of yellow page, directory and review type sites. I think it would be more useful to show the “official” web sites of the businesses first.
I don’t know how many times I’ve conducted a local search and ended up getting the Yellow Pages or Yelp ahead of everything else. I’ve even seen the results that I was looking for on page 2 or 3. I’d definitely like to see better local results.
Here’s another comment I agree with:
My comment is mostly on being a new business with a ton of great content that gets stuck behind older businesses with less and perhaps more out of date content. It appears that PageRank trumps content in many instances (though not all).
I don’t know that I agree that PageRank trumps content, but I definitely agree that some older search results should not be there. Google’s age factor needs tweaking to eliminate information that is out of date and incorrect. This is particularly true in the SEO industry, where out-of-date information can rise to the top simply because it is older and lives on an “authority” site. There needs to be some allowance for more recent information to outrank older information.
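To make that idea a little more concrete, here is a rough sketch of how a recency adjustment could be blended with an authority-style score so that fresher pages get a chance to outrank stale ones. This is purely illustrative and not anything Google has published; the score weights, the half_life_days parameter, and the blended_score function are all assumptions made up for the example.

    import math
    from datetime import datetime, timezone

    def blended_score(authority, published, now, half_life_days=365.0, freshness_weight=0.3):
        """Illustrative only: mix an authority-style score with a recency boost.

        authority        -- normalized 0..1 score standing in for PageRank/authority
        published        -- datetime the page was published or last updated
        half_life_days   -- assumed decay rate: freshness halves every this many days
        freshness_weight -- assumed share of the final score that comes from recency
        """
        age_days = max((now - published).total_seconds() / 86400.0, 0.0)
        freshness = 0.5 ** (age_days / half_life_days)  # exponential decay toward 0
        return (1.0 - freshness_weight) * authority + freshness_weight * freshness

    # With these made-up numbers, a newer page with less authority can edge out
    # an older "authority" page whose content has gone stale:
    now = datetime(2008, 7, 1, tzinfo=timezone.utc)
    old_page = blended_score(0.9, datetime(2005, 1, 1, tzinfo=timezone.utc), now)  # ~0.66
    new_page = blended_score(0.7, datetime(2008, 6, 1, tzinfo=timezone.utc), now)  # ~0.77

The point isn't the specific formula; it's that some explicit freshness signal has to be allowed to offset raw authority, or stale pages on big sites will keep winning by default.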
If you’d like to put in your two cents and tell Matt Cutts what you think the webspam team at Google should focus on, stop by Matt’s blog and drop a note.
