
Thursday, July 28, 2011

DaniWeb Claims 110% Recovery from Google Panda Update

This week, it was confirmed that Google had made a minor adjustment to its Panda algorithm update, which has drastically altered the search engine’s results several times since its first iteration in February. 

Because Google makes hundreds of algorithmic changes each year, the company downplayed this as any major shift. The official statement, as obtained by Barry Schwartz, was:

“We’re continuing to iterate on our Panda algorithm as part of our commitment to returning high-quality sites to Google users. This most recent update is one of the roughly 500 changes we make to our ranking algorithms each year.”

It appears that it may be more major than we originally thought. We had seen a few comments from webmasters indicating that their rankings had somewhat improved, but now Dani Horowitz, whose DaniWeb discussion forum was an apparently innocent casualty of the Panda update’s wrath, tells WebProNews that the site has made a full “110% recovery” as a result of this most recent Panda tweak.

When we interviewed Horowitz back in May, she told us about various tactics she was employing, which seemed to be having positive effects on her site’s search referrals.

While what she was seeing was far from a full recovery, it was enough to give webmasters hope that they may be able to climb their way back up into Google’s good graces, despite having been victimized by the update. In other words, there were enough other ranking factors that sites could use to improve their rankings to avoid being totally deprived of search referrals at the hands of Panda – good news for those sites with quality content that were casualties of Google’s war on poor content.

At the time, however, DaniWeb had a long way to go to reach the levels of traffic it was seeing from Google before. Even more interesting, perhaps, was the fact that Google seemed to be ranking DaniWeb well for queries that didn’t make sense, while queries it had previously ranked well for that did make sense were sending traffic elsewhere.

“Panda 2.3 went live on July 23rd and traffic just instantly jumped back up to normal that very day,” Horowitz now tells us. “We’re now seeing traffic at the same pre-Panda highs in some countries, while other countries are even better than ever. Overall, we’re seeing more pageviews than ever before.”

Here’s a look at global visitors and US visitors, respectively, since the beginning of the year (that’s visitors, not pageviews):



“Notice that US visitors were affected on February 24th while global traffic wasn’t severely impacted until a month and a half later,” Horowitz points out. “The decline coincided exactly with the first iteration of Panda and the recovery coincided exactly with the latest iteration of Panda.”

“All of the changes I’ve made were documented in the official Google Support thread or in the video interview I did with you guys,” she tells us. “In fact, I hadn’t made any recent changes immediately before the recovery. I haven’t yet had a chance to investigate any specific long tail keywords yet either. Google Webmaster Tools looks very different from what it looked like back in March as a result of all the work I’ve done, but nothing that stands out between this month and last.”

She did add in the Google Support thread, “There were no big changes made immediately before the site came back, with the exception of a significant increase in my Google AdWords budget.” She followed this up shortly after with, “I mentioned AdWords because we use it heavily to increase registrations, which directly results in an increase in posts per day. If there was a correlation, then it was a sudden increase in new content followed by the penalty reversal.”

Here’s our previous interview with Dani, so you can gain more insight into the kinds of things she was doing in the first place:
We’ll keep our eyes peeled for more reports of full recoveries. I have to wonder how many wrongfully impacted sites have seen their rankings jump back up. Either way, provided that DaniWeb’s recovery was indeed a direct result of this latest Panda tweak, other victims might find hope in the fact that Google does continue to “iterate” on the Panda algorithm.

Have you noticed a significant change in rankings since the latest iteration of the Panda update? Any more full recoveries? Let us know.

Wednesday, June 29, 2011

Google Launches Google+ To Battle Facebook


Google has finally unveiled Google+, the company’s top secret social layer that turns all of Google into one giant social network.

Google+, which begins rolling out a very limited field test on Tuesday, is the culmination of a year-long project led by Vic Gundotra, Google’s senior vice president of social. The project, which has been delayed several times, constitutes Google’s answer to Facebook.

The search giant’s new social project will be omnipresent on its products, thanks to a complete redesign of the navigation bar. The familiar gray strip at the top of every Google page will turn black, and come with several new options for accessing your Google+ profile, viewing notifications and instantly sharing content. The notification system is similar to how Facebook handles notifications, complete with a red number that increases with each additional notice.





Circles

That’s where Google+ begins to diverge from Facebook, though. The focus of this social project is not on sharing with a mass group of friends, but on targeted sharing with your various social groups. To do this, Google uses a system called Circles.

Gundotra explained that most social media services (read: Facebook, Twitter) haven’t been successful with friend lists because they’ve been designed as a “tack-on” product rather than being integrated at every level. Gundotra also believes that current friend list products are awkward and not rewarding to use.



Google+ Circles is an attempt to address that challenge. The HTML5 system allows users to drag-and-drop their friends into different social circles for friends, family, classmates, co-workers and other custom groups. Users can drag groups of friends in and out of these circles.

One of the nice things about the product is its whimsical nature — a puff of smoke and a -1 animation appears when you remove a friend, and when you remove a social circle, it rolls away off the screen.


Photos & Group Video Chat

It’s clear from the extended demo that Gundotra and his team have thought about every aspect and detail of Google+ thoroughly. The photo, video and mobile experiences are no exception.

Google has created a section specifically for viewing, managing and editing multimedia. The photo tab takes a user to all of the photos he or she has shared, as well as the ones he or she is tagged in. It’s not just photo tagging, though: Google+ includes an image editor (complete with Instagram-like photo effects), privacy options and sharing features.

The video chat feature might be one of the most interesting aspects of Google+. Gundotra and his team thought about why group chat hasn’t become a mainstream phenomenon. He compared it to knocking on a neighbor’s door at 8 p.m. — most people don’t do it because it isn’t a social norm. However, if a group of friends are sitting on a porch and you just happen to walk by, it’s almost rude not to say hi.

That’s the concept behind “Hangouts,” Google’s new group chat feature. Instead of directly asking a friend to join a group chat, users instead click “start a hangout” and they’re instantly in a video chatroom alone. At the same time, a message goes out to their social circles, letting them know that their friend is “hanging out.” The result, Google has found in internal testing, is that friends quickly join.

One cool feature of Hangouts is that it doesn’t place a chat window on the screen for each participant. Instead, Google changes the chat screen to whoever is currently talking. It quickly switches from video feed to video feed, moving faster in bigger groups. The maximum number of members in any video Hangout is 10, though users can get on a waitlist and wait for someone to leave.


Content Discovery Through Sparks



To spur sharing, Google has added a recommendation engine for finding interesting content. The feature, Google+ Sparks, is a collection of articles, videos, photos and other content grouped by interest. For example, the “Movies” spark will have a listing of recent and relevant content for that topic.

The system is algorithmic — it relies on information from other Google products (e.g. Google Search) as well as what is being shared via Google+ and through +1 buttons.

The goal, according to Gundotra, is to make it dead-simple for users to explore their interests and share what they find with their friends. Google+ is attempting to become the one-stop shop not only for sharing content, but for finding it as well. In some ways, it reminds us of Twitter and its mission to become an information network, and “instantly connect people everywhere to what’s most important to them.”



Mobile

Google will also be launching mobile apps for Google+, starting with Android. The Android app includes access to the Stream, Circles, Sparks and multimedia.

The addition of these features in a mobile app isn’t a surprise. What is a surprise, though, is the app’s auto-upload feature. Any photo or video you take on your phone through Google+ will automatically be uploaded to the cloud, ready to share. These uploads aren’t public, but the next time you log on from your desktop, the photos button in the status bar will have a number indicating how many new uploads are available for sharing. It keeps these photos and videos available for sharing for eight hours after upload.

Gundotra says that Google intends to launch apps for Google+ on other platforms in the future.




Conclusion

Google freely admitted to me during our conversation that its previous attempt at social, Google Buzz, did not live up to expectations. Bradley Horowitz, Google’s vice president of product, says that part of the problem was that Buzz was just “tacked on” as a link on millions of Gmail accounts, something that Google won’t be repeating. Horowitz also says that, unlike the Buzz rollout, Google+ is a project that will roll out in stages.

In many ways, it reminds us of Gmail’s rollout. Invites to Google’s email service were so sought after at one point that people were selling them for $50 or more on eBay. While that type of fervor may not hit Google+, we expect the artificial scarcity will drive up interest while giving Google time to work out the kinks.

No matter what Google says, Google+ is the company’s response to the rise of Facebook. The two companies are in heated competition for talent, page views and consumers. While Google controls the search market and has a strong presence on mobile with Android, it hasn’t been able to crack the social nut. Its most successful social product, YouTube, had to be acquired, and it still ranks as one of the most expensive acquisitions in the company’s history.

Has Google finally nailed social with Google+? We’re going to publish more of our thoughts on Google’s new social network in the next few hours, but we will say this: Google no longer gets a free pass in social. It must prove that it can draw users and keep them engaged in a way that doesn’t replicate Facebook or Twitter’s functionality. Only time will tell if Google has finally found its magical arrow.








Tuesday, June 28, 2011

How Google's Panda Update Changed SEO Best Practices Forever

It's here! Google has released Panda update 2.2, just as Matt Cutts said they would at SMX Advanced here in Seattle a couple of weeks ago. This time around, Google has, among other things, improved its ability to detect scraper sites and banish them from the SERPs. Of course, the Panda updates are changes to Google's algorithm and are not merely manual reviews of sites in the index, so there is room for error (causing devastation for many legitimate webmasters and SEOs).
A lot of people ask what parts of their existing SEO practice they can modify and emphasize to recover from the blow, but alas, it's not that simple. In this week's Whiteboard Friday, Rand discusses how the Panda updates work and, more importantly, how Panda has fundamentally changed the best practices for SEO. Have you been Panda-abused? Do you have any tips for recuperating? Let us know in the comments!
 






Howdy, SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week, we're talking about the very exciting, very interesting, very controversial Google Panda update.

Panda, also known as Farmer, was this update that Google came out with in early 2011 that rejiggered a bunch of search results and pushed a lot of websites down in the rankings, pushed some websites up in the rankings, and people have been concerned about it ever since. It has actually had several updates and new versions of that implementation and algorithm come out. A lot of people have all these questions like, "Ah, what's going on around Panda?" There have been some great blog posts on SEOmoz talking about some of the technical aspects. But I want to discuss in this Whiteboard Friday some of the philosophical and theoretical aspects and how Google Panda really changes the way a lot of us need to approach SEO.

So let's start with a little bit of Panda history. Google employs an engineer named Navneet Panda. The guy has done some awesome work. In fact, he was part of a patent application that Bill Slawski looked into where he found a great way to scale some machine learning algorithms. Now, machine learning algorithms, as you might be aware, are very computationally expensive and they take a long time to run, particularly if you have extremely large data sets, both of inputs and of outputs. If you want, you can research machine learning. It is an interesting, fun technique that computer scientists and programmers use to find solutions to problems. But basically before Panda, machine learning scalability at Google was at level X, and after it was at the much higher level Y. So that was quite nice. Thanks to Navneet, right now they can scale up this machine learning.
What Google can do based on that is take a bunch of sites that people like more and a bunch of sites that people like less, and when I say like, what I mean is essentially what the quality raters, Google's quality raters, tell them this site is very enjoyable. This is a good site. I'd like to see this high in the search results. Versus things where the quality raters say, "I don't like to see this." Google can say, "Hey, you know what? We can take the intelligence of this quality rating panel and scale it using this machine learning process."

Here's how it works. Basically, the idea is that the quality raters tell Googlers what they like. They answer all these questions, and you can see Amit Singhal and Matt Cutts were interviewed by Wired Magazine. They talked about some of the things that were asked of these quality raters, like, "Would you trust this site with your credit card? Would you trust the medical information that this site gives you with your children? Do you think the design of this site is good?" All sorts of questions around the site's trustworthiness, credibility, quality, how much they would like to see it in the search results. Then they compare the difference.

The sites that people like more, they put in one group. The sites that people like less, they put in another group. Then they look at tons of metrics. All these different metrics, numbers, signals, all sorts of search signals that many SEOs suspect come from user and usage data metrics, which Google has not historically used as heavily. But they think that they use those in a machine learning process to essentially separate the wheat from the chaff. Find the ones that people like more and the ones that people like less. Downgrade the ones they like less. Upgrade the ones they like more. Bingo, you have the Panda update.
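To make the process Rand describes concrete, here's a toy sketch of rater-trained classification. Everything in it is invented for illustration: the signals, the ratings, and the simple from-scratch logistic regression all stand in for whatever Google actually uses.

```python
import math

# Toy illustration (NOT Google's actual system): quality raters label a
# handful of sites, and a from-scratch logistic regression learns which
# measurable signals separate "liked" sites from "disliked" ones.
# Feature order: [ad_density, words_per_page / 1000, bounce_rate]
# -- all hypothetical signals invented for this example.

rated_sites = [
    ([0.1, 1.20, 0.30], 1),  # raters liked: few ads, deep content
    ([0.2, 0.90, 0.35], 1),
    ([0.7, 0.15, 0.85], 0),  # raters disliked: ad-heavy, thin content
    ([0.8, 0.10, 0.90], 0),
]

def predict(weights, bias, features):
    z = bias + sum(w * f for w, f in zip(weights, features))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid -> "quality" score in (0, 1)

def train(data, epochs=2000, lr=0.05):
    # plain stochastic gradient descent on log loss
    weights, bias = [0.0, 0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, label in data:
            error = predict(weights, bias, x) - label
            weights = [w - lr * error * xi for w, xi in zip(weights, x)]
            bias -= lr * error
    return weights, bias

weights, bias = train(rated_sites)

# Score an unrated site: modest ads, substantial content, average bounce.
score = predict(weights, bias, [0.3, 0.80, 0.50])
print(f"predicted quality score: {score:.2f}")
```

The real system presumably uses vastly more signals and a much larger rater corpus; the point is just the pattern: human labels in, a scalable model out, applied to every site in the index.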

So, Panda kind of means something new and different for SEO. As SEOs, for a long time you've been doing the same kind of classic things. You've been building good content, making it accessible to search engines, doing good keyword research, putting those keywords in there, and then trying to get some links to it. But you have not, as SEOs, we never really had to think as much or as broadly about, "What is the experience of this website? Is it creating a brand that people are going to love and share and reward and trust?" Now we kind of have to think about that.

It is almost like the job of SEO has been upgraded from SEO to web strategist. Virtually everything you do on the Internet with your website can impact SEO today. That is especially true following Panda. The things that they are measuring is not, oh, these sites have better links than these sites. Some of these sites, in fact, have much better links than these sites. Some of these sites have what you and I might regard, as SEOs, as better content, more unique, robust, quality content, and yet, people, quality raters in particular, like them less or the things, the signals that predict that quality raters like those sites less are present in those types of sites.

Let's talk about a few of the specific things that we can be doing as SEOs to help with this new sort of SEO, this broader web content/web strategy portion of SEO.

First off, design and user experience. I know, good SEOs have been preaching design and user experience for years because it tends to generate more links, people contribute more content to it, it gets more social signal shares and tweets and all this other sort of good second order effect. Now, it has a first order effect impact, a primary impact. If you can make your design absolutely beautiful, versus something like this where content is buffeted by advertising and you have to click next, next, next a lot. The content isn't all in one page. You cannot view it in that single page format. Boy, the content blocks themselves aren't that fun to read, even if it is not advertising that's surrounding them, even if it is just internal messaging or the graphics don't look very good. The site design feels like it was way back in the 1990s. All that stuff will impact the ability of this page, this site to perform. And don't forget, Google has actually said publicly that even if you have a great site, if you have a bunch of pages that are low quality on that site, they can drag down the rankings of the rest of the site. So you should try and block those from us or take them down. Wow. Crazy, right? That's what a machine learning algorithm, like Panda, will do. It will predictively say, "Hey, you know what? We're seeing these features here, these elements, push this guy down."

Content quality matters a lot. So a lot of time, in the SEO world, people will say, "Well, you have to have good, unique, useful content." Not enough. Sorry. It's just not enough. There are too many people making too much amazing stuff on the Internet for good and unique and grammatically correct and spelled properly and describes the topic adequately to be enough when it comes to content. If you say, "Oh, I have 50,000 pages about 50,000 different motorcycle parts and I am just going to go to Mechanical Turk or I am going to go outsource, and I want a 100 word, two paragraphs about each one of them, just describe what this part is." You think to yourself, "Hey, I have good unique content." No, you have content that is going to be penalized by Panda. That is exactly what Panda is designed to do. It is designed to say this is content that someone wrote for SEO purposes just to have good unique content on the page, not content that makes everyone who sees it want to share it and say wow. Right?

If I get to a page about a motorcycle part and I am like, "God, not only is this well written, it's kind of funny. It's humorous. It includes some anecdotes. It's got some history of this part. It has great photos. Man, I don't care at all about motorcycle parts, and yet, this is just a darn good page. What a great page. If I were interested, I'd be tweeting about this, I'd share it. I'd send it to my uncle who buys motorcycles. I would love this page." That's what you have to optimize for. It is a totally different thing than optimizing for did I use the keyword at least three times? Did I put it in the title tag? Is it included in there? Is the rest of the content relevant to the keywords? Panda changes this. Changes it quite a bit.

Finally, you are going to be optimizing around user and usage metrics. Things like, when people come to your site, generally speaking compared to other sites in your niche or ranking for your keywords, do they spend a good amount of time on your site, or do they go away immediately? Do they spend a good amount of time? Are they bouncing or are they browsing? If you have a good browse rate, people are browsing 2, 3, 4 pages on average on a content site, that's decent. That's pretty good. If they're browsing 1.5 pages on some sites, like maybe specific kinds of news sites, that might actually be pretty good. That might be better than average. But if they are browsing like 1.001 pages, like virtually no one clicks on a second page, that might be weird. That might hurt you. Your click-through rate from the search results. When people see your title and your snippet and your domain name, and they go, "Ew, I don't know if I want to get myself involved in that. They've got like three hyphens in their domain name, and it looks totally spammy. I'm not going to get involved." Then that click-through rate is probably going to suffer and so are your rankings.

They are going to be looking at things like the diversity and quantity of traffic that comes to your site. Do lots of people from all around the world or all around your local region, your country, visit your website directly? They can measure this through Chrome. They can measure it through Android. They can measure it through the Google toolbar. They have all these user and usage metrics. They know where people are going on the Internet, where they spend time, how much time they spend, and what they do on those pages. They know about what happens from the search results too. Do people click from a result and then go right back to the search results and perform another search? Clearly, they were unhappy with that. They can take all these metrics and put them into the machine learning algorithm and then have Panda essentially recalculate. This is why Google doesn't issue updates every day or every week. It is about every 30 or 40 days that a new Panda update will come out, because they are rejiggering all this stuff.
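For a concrete sense of the "user and usage metrics" in question, here's a minimal sketch that computes a browse rate and bounce rate from analytics-style rows. The log format, thresholds, and numbers are all made up for the example; they aren't anything Google has published.

```python
# Hypothetical analytics rows: (visitor_id, pages_viewed, seconds_on_site)
visits = [
    ("a", 4, 320), ("b", 1, 15), ("c", 3, 210),
    ("d", 1, 10), ("e", 2, 95), ("f", 5, 400),
]

def usage_metrics(visits):
    """Aggregate per-visit rows into the site-level signals Rand mentions."""
    total = len(visits)
    pages = sum(v[1] for v in visits)
    bounces = sum(1 for v in visits if v[1] == 1)  # single-page visits
    return {
        "browse_rate": pages / total,    # average pages per visit
        "bounce_rate": bounces / total,  # share of one-and-done visits
        "avg_time": sum(v[2] for v in visits) / total,  # seconds on site
    }

m = usage_metrics(visits)
print(m)
```

A content site averaging 2-3 pages per visit is in the "decent" range Rand describes; a browse rate barely above 1.0 would be the warning sign.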

One of the things that people who get hit by Panda come up to me and say, "God, how are we ever going to get out of Panda? We've made all these changes. We haven't gotten out yet." I'm like, "Well, first off, you're not going to get out of it until they rejigger the results, and then there is no way that you are going to get out of it unless you change the metrics around your site." So if you go into your Analytics and you see that people are not spending longer on your pages, they are not enjoying them more, they are not sharing them more, they are not naturally linking to them more, your branded search traffic is not up, your direct type in traffic is not up, you see that none of these metrics are going up and yet you think you have somehow fixed the problems that Panda tries to solve for, you probably haven't.

I know this is frustrating. I know it's a tough issue. In fact, I think that there are sites that have been really unfairly hit. That sucks and they shouldn't be and Google needs to work on this. But I also know that I don't think Google is going to be making many changes. I think they are very happy with the way that Panda has gone from a search quality perspective and from a user happiness perspective. Their searchers are happier, and they are not seeing as much junk in the results. Google likes the way this is going. I think we are going to see more and more of this over time. It could even get more aggressive. I would urge you to work on this stuff, to optimize around these things, and to be ready for this new form of SEO.

Thanks everyone for watching. Look forward to some great comments, questions, feedback in the post. 



 

Saturday, May 21, 2011

SERP Alert: Google Social Search Goes Global


Google announced via its new official Search Blog that it is rolling out Social Search around the globe. This comes just days after Bing upped the ante in the social search game by integrating Facebook data in much more elaborate ways. Google’s social search, however, may prove useful in some cases, but you may see more content from strangers than you do from your real friends.




Does Google’s Social Search make results less relevant? Comment here. 

Google has been doing social search since 2009, and earlier this year it was updated to be more useful, with social results appearing throughout the SERP, as opposed to just in a cluster at the bottom of the SERP. Google says they’re mixed in based on relevance.


“For example, if you’re looking for information about low-light photography and your friend Marcin has written a blog post about it, that post may show up higher in your results with a clear annotation and picture of Marcin,” says Google software engineer Yohann Coppel.

“Social Search can help you find pages your friends have created, and it can also help you find links your contacts have shared on Twitter and other sites. If someone you’re connected to has publicly shared a link, we may show that link in your results with a clear annotation,” says Coppel. “So, if you’re looking for information about modern cooking and your colleague Adam shared a link about Modernist Cuisine, you’ll see an annotation and picture of Adam under the result. That way when you see Adam in the office, you’ll know he might be a good person to ask about his favorite modern cooking techniques.”

                            How Google Determines What to Show In Social Search Results

First of all, users must be logged into Google to get the benefits of social search. “If you’re signed in, Google makes a best guess about whose public content you may want to see in your results, including people from your Google Chat buddy list, your Google Contacts, the people you’re following in Google Reader and Buzz, and the networks you’ve linked from your Google profile or Google Account. For public networks like Twitter, Google finds your friends and sees who they’re publicly connected to as well,” explains Coppel.

Google deserves credit for giving users a great deal of control over which people are included here, though they could still go further. You can go to your Google Dashboard, find the Social Circle and Content section, and edit accordingly. If you go to the “view social circle” link, you can see every single person listed by:

  • Direct connections from your Google Chat buddies and contacts. It even shows you which of these people have content and which don’t. For the ones that do, it shows you which sites they have content on. One important thing to note: it actually does include Facebook Page content. For example, I’m connected to Danny Sullivan in my social circle, and Google will show me updates from his Facebook page, as he has it linked to his Google Profile. What’s missing, however, is your personal Facebook network of friends (which in my opinion is the most valuable social data there currently is on the web, if you’re a Facebook user).
  • Direct connections from links through Google Profiles or Connected Accounts “For example, if you listed your Twitter account on your profile or if your Twitter posts appear in your public Buzz stream, then relevant content from people you follow on Twitter will show up in your search results,” Google explains in that section. “You can change these relationships by visiting the corresponding services and adding or removing connections.”
  • Secondary connections that are publicly associated with your direct connections. In other words – friends of friends (at least public friends of friends). There is a little less control here, unfortunately. You can’t remove these people from your social circle unless you remove the friend that’s connecting you to them.
    To me, this actually seems like a step backwards in relevancy of social search. You’re probably a lot less likely to care about what someone thinks just because they know someone you know, than you are if you actually know them. A lot of people don’t even care about what the people they actually do know think.
    Naturally, this is the biggest list and potential source of material for Google to draw from, making it more likely that you see results from people you don’t know than people you do.
A cool thing about the entire list is that you can click “show paths” next to any name that has content, and it will show you exactly how you’re connected. You can be linked to someone via Twitter, and if that person links their Twitter account to their Quora account, you might see their Quora content too. If that Quora account links to their Facebook account, you might see stuff from their Facebook account if you have permission to see that content (which if set to public or if you’re Facebook friends, you should be able to see it). 
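The “show paths” feature presumably traverses a graph of linked accounts. Here's a toy sketch of how such a connection chain could be computed; the account names, the graph, and the breadth-first approach are all assumptions for illustration, not Google's actual implementation.

```python
from collections import deque

# Hypothetical connection graph: an edge means "account A links to account B"
# (e.g. a Twitter profile linking to the same person's Quora account).
links = {
    "me:twitter": ["friend:twitter"],
    "friend:twitter": ["friend:quora"],
    "friend:quora": ["friend:facebook"],
}

def show_path(start, target):
    """Breadth-first search returning the shortest chain of linked accounts."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == target:
            return path
        for nxt in links.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no public chain connects the two accounts

print(" -> ".join(show_path("me:twitter", "friend:facebook")))
```

Visibility rules (public vs. friends-only content) would then be applied on top of whatever chain the traversal finds.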

Where are my friends?
I notice one gaping hole in Google’s social search strategy besides the lack of comprehensive Facebook integration (though it’s certainly connected to that): the absence of a substantial number of my actual closest friends. I can only assume that many users have a similar issue.

That’s exactly why Bing’s Facebook integration is a very important factor in its competition with Google. Bing, unlike Google, does tap into your actual Facebook friends for search relevancy (though there is plenty of room for improvement on Bing’s part as well). The Wajam browser extension is still currently a better solution to the problem, if you ask me. It will add your Facebook and Twitter friends to your results on both Google and Bing.


It is also for this reason (at least partially) that Google is competing more directly with Facebook now in social. Google wants users to develop the kinds of relationships among friends that people currently have on Facebook, on Google’s own network (which runs throughout various products, but is ultimately centered on the Google account, which sits at the center of nearly everything: Gmail, YouTube, Buzz, Docs, Chrome OS, and so on).

As long as Google and Facebook aren’t going to play nice together, Google needs to succeed in social to have the best search relevancy in the social part of search. And that part of search is clearly becoming more and more important. That’s simply one competitive advantage Bing has over Google right now. It’s also why Facebook itself is a threat to Google search in some ways.


It will be very interesting to see how far Google takes social search over time. We know Google is currently working on increasing its presence as a force in social, and the upcoming +1 button should play a significant part in that. As search gets more social, however, it presents new challenges for search engine optimization, and perhaps less significance on algorithm updates (like Panda) from the webmaster point of view. 


Social can not only be a signal of relevance on a personalized level, but if content is shared a lot, it can also be seen as a signal of quality, because people don’t share content that sucks, unless they’re doing it as a joke or using it as an example of what not to do (like I said, it’s just a “signal”). This is nothing new, but it shows the importance of diversifying your traffic sources.


If you rely heavily on search, as many of the big victims of the Panda update have, you will always be at the mercy of the search engines. If you can find ways to get more love from social networks and links from others, it’s bound to help you in search as well. 


Is Google’s social search helpful or does it miss the mark? Tell us what you think.

Friday, May 13, 2011

Despite New Panda Guidelines, Google Still Burying Authoritative Results


There are a lot of elements of Google’s Panda update to discuss, and we’ve certainly discussed many of them over the last few months, but let’s not lose sight of the reason the update was launched to begin with – to improve search quality. 

Do you think Google’s search results are better now? Tell us what you think.

While quality is often in the eye of the beholder, there are certain kinds of queries where the information being retrieved is simply more important than others. We’ve talked about this before, as it’s been a problem in some Google results.  One example we’ve looked at a few times is where an eHow article written by a freelance writer with no clear authority on cancer (and whose body of work includes a lot of plumbing-related articles) was ranking at the top of Google’s results for the query “level 4 brain cancer” above numerous other sources that would seem to be of greater authority on such a subject.




In fact, the article did get bumped down after the Panda update, but it does still rank number 2, followed by another result from eHow. Granted, this is just one example, and Demand Media has efforts in motion to improve its own content quality, but you get the point.
Queries related to things like health or law demand authoritative advice. Not SEO’d content.
We had a conversation with Mark Britton, founder and CEO of Avvo about this subject. Avvo is a site that offers Q&A forums where consumers can ask medical or legal questions and get responses from qualified doctors and lawyers. It provides apparently authoritative content in these two areas from certified professionals.

This seems like the kind of content that should be ranking well for a lot of these types of queries. Does it not? Britton thinks it’s “very important” for commentary from experts in the medical and legal fields to surface high in search results for relevant topics.
“There is a lot of noise both online and offline regarding health and legal issues,” he tells us. “This comes in the form of lay people, professional commentators and even celebrities who often offer advice that is well-intentioned but inherently inferior to that of a doctor or lawyer trained in the area. However, it is not always easy to get doctors and lawyers to speak. Some still look down on the Internet as a publishing or marketing vehicle. Others just downright fear it, as they have seen too many movies where someone says something on the Internet and they are subsequently hunted and killed by terrorist hackers.”

“There is always room for improvement — especially with our newer pages,” he says of Avvo’s own search rankings. “We just launched our doctor ratings directory and our free medical question and answer forum in November, and it will take some time for those pages to rank as well as our legally related pages.” Look at the results for a query like “Does type 2 diabetes shorten life expectancy?” Avvo’s page on the subject ranks on the second page, while eHow ranks at the top of the first. The Avvo result has actually fallen since I began writing this article. It used to be right below the number one result from eHow and the number 2 from Yahoo Answers.


eHow’s is an article (not very long by any means) by a guy whose bio says he “has been a freelance writer since 2007. He writes extensively in the fitness, mental health and travel sectors and his work has appeared in a range of print and online publications including Scazu Fitness and USAToday Travel Tips…[and] holds a Master of Arts in community psychology.”

Keep in mind that USA Today has a deal with Demand Media for travel tips. So that presumably means his Demand Media content is simply published by USA Today. Does “Master of Arts in community psychology” indicate more authority to answer a life/death question about type 2 diabetes than say a licensed and practicing MD? That’s who provided an answer on Avvo’s page, which just got pushed further down in the search results. 

If you change the query to something simpler like “type 2 diabetes life expectancy” eHow still ranks close to the top, and Avvo’s result slips to….get ready for it….page 18! That’s with various articles from places like eHow, EzineArticles and Suite101 (all victims of the Panda update) ranking ahead of it. Now, I’m not saying that Avvo’s result is necessarily the one ultimate result for this query and should necessarily be the highest ranked, but come on. Interestingly enough, the result was on page 3 for this query when I started writing the article (yesterday) and it’s slipped that much further into obscurity just since then. I wonder where it will be in another day. 

Google has given publishers a list of questions to ask themselves about their content, as guidelines the company goes by as it writes its algorithms. The very top one is “Would you trust the information presented in this article?” While neither of the articles provide any helpful links to sources of information, the Avvo article comes from a medical doctor. I think most people would find that slightly more trustworthy, even if the article isn’t as long or as well SEO’d. Here’s the eHow article. Here’s the Avvo one.

The second question on Google’s list is, “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?”
While Google makes it clear that these questions aren’t actual ranking signals, they must at least inform the signals, and you have to wonder just how much weight authority on a topic carries.

Britton maintains that ALL of the site’s advice comes from qualified professionals, claiming that this is one of the site’s “greatest differentiators.”

“We CERTIFY every doctor and lawyer offering free advice on the site in two principal ways: First, we verify with the state licensing authorities that the answering doctors or lawyers are licensed and in good standing,” he explains. “Second, we rate the professionals from 1 (“Extreme Caution”) to 10 (“Superb”), which was unheard of prior to Avvo’s entry into the professional ratings arena. We are big believers that not every doctor or lawyer is ‘Super’ or ‘Best’ which was the steady-state in professional ratings for decades.”
“This was really just an extension of the Yellow Pages model, where the ‘recommended’ professional is the one paying the most money to advertise,” he continues. “But consumers are getting wise and demanding greater transparency regarding the qualifications of their doctors and lawyers.”

“We have three ratings that speak to the expertise of our contributors: The Avvo Rating, client/patient ratings and peer endorsements,” says Britton. “For the Avvo Rating, we start with the state licensing authorities and collect all the information we can regarding a professional. We then load that information into our proprietary web crawler, which we call ‘Hoover.’ Hoover goes out and finds all the additional information it can regarding the professional. We match the licensing data with the Hoover data and then we score it. The scoring is based on those indicators of the professional’s reputation, experience and quality of work.”

Britton says Avvo was not really affected by Google’s Panda update. “We saw a small dip, but things came back fairly quickly.”
“While I understand the intent of Google’s latest update, I’m not sure they entirely hit their mark,” he says. “We noticed a number of pure lead-generation sites – i.e., sites that are selling leads to the highest bidder — jump ahead of us in certain key terms, which is not good for consumers.”
Avvo encourages people to ask questions on the site, claiming its Q&A boasts a 97% response rate. Avvo asked us to let readers know that in support of Skin Awareness Month, it is donating $5 to the Melanoma Research Foundation for every doctor review during the month of May.

Should authority and certification of expertise carry greater weight in Google’s search rankings? Comment here.

 

Wednesday, April 20, 2011

Google Panda Update Helps Local Search Results..........




As we continue to look at the fallout of the Google Panda update (more so since its international roll-out), we have yet more data to sink our teeth into.

Have you noticed an increase in local results since the Panda roll-out? Let us know.
CNET has now released some data, as it compiled about 100,000 Google results, testing Google.com in March, and then last week, just after the new update. The data reflects earlier reported data from SearchMetrics in terms of news sites benefiting and Demand Media’s eHow sliding.

One interesting element CNET’s data brings to the table, however, is that of Google’s localization and its relationship to the update.

“We also tested what happens if you connect to Google.com from an overseas Internet address. We picked one in London. We performed the same searches on the same day–the only variable that should have changed, in other words, was our location,” explains CNET chief political correspondent Declan McCullagh. “The results? Google engages in significant localization efforts, as you might imagine, with Yelp.com being the largest beneficiary by far.”

“In searches originating from the U.K., Yelp appeared only twice,” he adds. “In U.S. searches, by contrast, it was the ninth-most popular Web site, with both its topic and individual business pages weaved seamlessly into the main search results.”

SearchMetrics’ data did show yelp.co.uk as having a 29.59% boost in visibility.

Yelp picked up 45 first-page appearances for generic searches like “chocolate,” “cleaning,” “food,” “lights,” “laundry,” “tv,” and “weddings,” from a California address, according to McCullagh, while Davidsbridal.com, BarnesandNoble.com, and Walgreens.com also benefited in the U.S. from localization.

It’s not all just big brands though.

Local-based results won big too, based on CNET’s testing. Not just local locations for big brands or local businesses, but locally-themed results.

“For our U.S. tests, we used an Internet address near Palo Alto, Calif., which prompted Google to rank nearby businesses and municipal Web sites near the top of search results,” McCullagh explains. “The City of Palo Alto’s Web site appears in the first page of search results for terms including ‘adventures,’ ‘art,’ ‘business,’ ‘gas,’ and ‘jobs.’ PaloAltoOnline.com makes repeat appearances (‘budget cuts,’ ‘restaurants’), as do Stanford, the Palo Alto Medical Foundation, and Mike’s Bikes.”

It’s no secret that Google has put a great deal more emphasis on local in recent times, but it’s interesting to see how this is playing out in light of the Panda update, which seemingly had little to do with local (being aimed more at content farms).

We saw how news sites and video sites appeared to come out as big winners, but this research does seem to indicate even more wins for local.

The benefits to Yelp are interesting, considering the tension there has been between Yelp and Google, regarding Google Place Pages and their use of Yelp reviews. Google’s own reviews system – Hotpot – has now found its way into Places, and right into organic search results themselves.

Google is also finding more ways to improve its local listings themselves. See the “open now” and local product listings, for example. Oh, and by the way, Google just launched Map Maker for the US, so users can add their “local knowledge” to the map.

Interestingly enough, as Google focuses more on local, the competition for local eyeballs is already heavily increasing. This is not just about search in the traditional sense. You have to factor in entities like Groupon, LivingSocial, Facebook, Foursquare, and others in these companies’ respective spaces (the lines between which are getting blurrier).

The more ways people obtain the information related to local businesses from sources outside of Google, the less they’ll need to search for that information with Google.

Are Google’s results better now? Tell us what you think.


Friday, April 1, 2011

Meet the Plus One, Google's Version

Google is trying hard to move in on Facebook's social sharing territory. Yesterday, it unveiled the "Plus One" button. It's pretty much the same as the Facebook "Like" button, except it's for Google searches. When you click the "Plus One" button next to a Google search result, your friends will see that you've endorsed it in their own results.
If you want to start "plus oneing" things—yes Google is already using "plus one" as a verb, see video below—you have to opt in to the "experiment" here. If you're a regular Google user, you've probably noticed that some search results already have an extra line at the bottom telling you a Twitter or Facebook friend has mentioned that link. This new feature lets you annotate searches you want to share right there on the Google page, without actually cutting and pasting into Facebook or Twitter.
All of this social search information is enticing ambient information on your friends' lives, but the actual benefit of this feature is that "Plus Ones" are data that can help guide your clicking in a cluttered web world. Well, that's the theory anyway.
Google writes on its blog:
Say, for example, you’re planning a winter trip to Tahoe, Calif. When you do a search, you may now see a +1 from your slalom-skiing aunt next to the result for a lodge in the area. Or if you’re looking for a new pasta recipe, we’ll show you +1’s from your culinary genius college roommate. And even if none of your friends are baristas or caffeine addicts, we may still show you how many people across the web have +1’d your local coffee shop.



One useful way the "Plus One" is different from the "Like" button is that, over time, your "Plus Ones" will be archived as a kind of bookmarks file of endorsed searches. So you can check back on all the links you've endorsed this way.

And so can everyone else, unless you set your privacy settings right. So, like everything else in social media, "Plus One" (as a verb) with care.

Right now, you'll only see "Plus Ones" from people connected to you through Google contacts, like gchat or gmail, but the company may soon expand that to Twitter or other sites, they say. To find out just how public your Google search result endorsements will be, go to the Google Dashboard and check your connections and your settings.

Your network is probably much bigger than you think. That's fine if you want to become a web-lebrity brand. But not so great for job hunters with a few private hobbies best left off the C.V.


Gmail Motion

 A new way to communicate 
The mouse and keyboard were invented before the Internet even existed. Since then, countless technological advancements have allowed for much more efficient human-computer interaction. Why then do we continue to use outdated technology? Introducing Gmail Motion -- now you can control Gmail with your body.


Sunday, March 6, 2011

Google Panda Algorithm Update

Google “Panda” Algorithm Update – What’s Known & What’s Possible 
Google Shares Some Clues, Impacted Sites Left Guessing

Google’s recent algorithm update aimed at improving the quality of search results has captured a great deal of attention – both positive and negative. The general consensus seems to be that the results are in fact better now, but still not perfect. Perfection will likely never be achieved, but there are still some glaring criticisms out there about Google’s most recent attempt.
Having had some time to reflect, what is your opinion of the update? Let us know in the comments.

Despite the improvement in overall search quality, there have been many sites to suffer the consequences of the update – some deservedly and others maybe not so much. As Google will never reveal its secret recipe in its entirety, there are plenty of clues out there, and even facts that Google will share. You can criticize Google’s mystique all you want, but there’s no denying that they do communicate with the webmaster community to a great extent, even if they don’t always tell you everything you want to hear.
Google’s Matt Cutts and Amit Singhal – two of the most instrumental voices in the recent update – shared some clues and insights in an interview with Wired this week. Before we get to specifics, there were some interesting things mentioned by the two that are worth noting. For example, Caffeine, which sped Google’s indexing, led to a flood of content – both good and bad. This seems to have helped the “shallow” kinds of content that this most recent update targeted – not stuff that is quite spam, but…well, shallow. We also learned that Google calls the update “Panda”.

They revealed that prior to the update, they sent out documents to outside testers/raters, and asked them questions about quality. It would be interesting to know who these raters were, but no such luck there. Users were asked things like whether they would feel comfortable giving a site their credit card info or giving medicine from the site to their kids (I wonder if anyone was asked if they felt comfortable getting their brain cancer information from a freelance eHow writer with no credentials in the field), whether they considered the site to be authoritative, whether it would be ok in a magazine, whether it has “excessive” ads, and other questions. It would be great to be able to know more of those questions, but we can only work with what Google has revealed.
“And based on that, we basically formed some definition of what could be considered low quality,” Singhal is quoted as saying.
“We actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side,” said Cutts. “And you can really see mathematical reasons…”
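Cutts’s “classifier” remark can be pictured with a toy sketch. To be clear, the features, weights, and threshold below are entirely made up for illustration; Google has never published its actual signals:

```python
# Toy illustration of a binary quality classifier of the kind Cutts
# describes: score a site's features, then draw a line in the sand.
# Every feature name and weight here is hypothetical.
import math

def quality_score(features):
    """Logistic score in (0, 1); higher suggests 'high quality'."""
    weights = {
        "avg_words_per_article": 0.004,   # longer articles nudge the score up
        "ad_blocks_per_page": -0.8,       # heavy advertising pulls it down
        "duplicate_ratio": -3.0,          # duplicated content pulls it down hard
    }
    z = sum(weights[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

def classify(features, threshold=0.5):
    return "high quality" if quality_score(features) >= threshold else "low quality"

# A site with substantial articles, light ads, and little duplication
print(classify({"avg_words_per_article": 900,
                "ad_blocks_per_page": 1,
                "duplicate_ratio": 0.05}))  # → high quality
```

The interesting part of Cutts’s comment is that the separation is “mathematical”: the sites aren’t judged one by one, they just land on one side of the line or the other once their features are scored.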
“I got an e-mail from someone who wrote out of the blue and said, ‘Hey, a couple months ago, I was worried that my daughter had pediatric multiple sclerosis, and the content farms were ranking above government sites,’” Cutts is later quoted as saying. “Now, she said, the government sites are ranking higher. So I just wanted to write and say thank you.’”
Again, why is eHow still ranking for “level 4 brain cancer”?
Google says it still looks at feedback, and Cutts even said that if someone has a specific question about why a site dropped, he thinks it’s “fair and justifiable and defensible to tell them why that site dropped.” He also said that Google’s most recent algorithm contains signals that can be gamed (hence the lack of full transparency). In other words, it can still be optimized for.
Finally, the site Suite101, which data from SearchMetrics lists as the biggest loser in percentage (in its organic performance index) was brought up in the interview. Suite101 and eHow are often compared and labeled as “content farm” type sites. When asked why Suite101 took a much bigger hit than eHow, Cutts simply said, “I feel pretty confident about the algorithm on Suite 101.”
It would be very helpful to understand the differences Google sees between these two sites. It doesn’t seem very clear by looking through the sites that there are obvious differences in quality. I’m sure it varies on both sites.
We reached out to Suite101 a few days ago for comment on the update and its impact, but have yet to receive a response. I’m even more interested to hear what they have to say, now that these comments have come out. Update: Suite101 referred us to an open letter from CEO Peter Berger to Google’s Matt Cutts.
CEO Peter Berger stressed the importance of quality in content when we spoke with him last year.
“Every week, several thousand people apply to become Suite101 writers,” he told us. “While we only accept a portion of applicants based on our non-negotiable quality standards, we do have many successful writers on our site who do not consider themselves ‘writers’.”
“We see it as Suite101′s mission to enable people – anyone who can write well and with deep understanding of a subject – to achieve their goals,” he said. “These might be earning money, addressing large audiences, building up a personal professional brand, or simply enjoying creative freedom in a nurturing, peer-oriented environment.”
Results from people with a deep understanding of a subject should lend themselves to quality. Whether or not Suite101 delivers on this is open for debate. Clearly Google doesn’t think so, practically making the site the poster-child of what not to do. The mysteries continue…
What we know Google is looking at with the Panda update:
User comfort level in the trust area (think credit card/medicine comments)
 Is it considered authoritative? (This would imply some indication of expertise on topics covered, I would think)
 Is the content quality good enough for print? (I’ve seen plenty of crap printed)
 Are there too many ads? (How many are too many, and does the ad network matter?)
 We know Google has its definition of what could be considered low quality
 Google uses a “classifier” to draw a line in the sand
 We know that so far, Google has not used indications from the Chrome Extension (emphasis on so far. Google hinted in the past that this data could potentially be used to tweak the algorithm).
 Google looks at feedback, at least to some extent
 Based on comments from Cutts, Google will tell you why your site dropped (getting that communication flow going may not be the easiest thing to do, but I have personally witnessed Cutts sit down with someone at a conference and look at their site with them.)
 The algorithm can still be gamed. It can still be optimized for. (If you were hit by the update, there are things you can do to get back in Google’s good graces. In other words, you’re not necessarily banned just because of your brand.)
 Most of the changes in rankings will be done algorithmically, but Google will take manual action in some instances (see JC Penney)
If you use any auto-generated content keep it separated from the original high quality stuff, and block it from search engines. Google’s John Mu said recently, “If you do have such high-quality, unique and compelling content, I’d recommend separating it from the auto-generated rest of the site, and making sure that  the auto-generated part is blocked from crawling and indexing, so that search engines can focus on what makes your site unique and valuable to users world-wide.”
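For reference, the crawling half of John Mu’s advice could be implemented with a robots.txt rule. This is only a sketch: the /auto/ path is a made-up example standing in for wherever a site’s auto-generated pages actually live.

```text
# robots.txt sketch: keep crawlers out of an auto-generated section
# (the /auto/ directory is hypothetical; substitute the real path)
User-agent: *
Disallow: /auto/
```

Note that blocking crawling alone doesn’t guarantee the pages stay out of the index; a noindex robots meta tag on those pages covers the indexing half of Mu’s advice.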
If you can think of anything else that is “known” about this update, please feel free to comment.
We won’t jump to any conclusions, but here are…
Some things that are possible that may be worth considering:
 Old-fashioned design may play a role. Simply from the aesthetic point of view, this may make a site appear less trustworthy (less likely that consumers will be willing to give up their credit card info). We wonder if this played a role in the EzineArticles and Foner Books examples we looked at.
 There is possibly a threshold that can be crossed for what is considered too many ads before your site gets points taken off for quality. Some have tried reducing the number of ads (again, see EzineArticles) to try and boost rankings.
 Duplicate content (to some degree) may play a role in this recent update. EzineArticles, again, is a prime candidate for this. Articles from the site are published in other places – probably the majority of the content from the site is duplicated elsewhere (besides just scrapers). eHow content is uniquely written for eHow. There are plenty of people who will suggest much of this content is rewritten based on other existing articles, but that’s beside the point. The content itself is unique to eHow (again, scrapers aside).
Other sites like Business Insider, The Huffington Post, and even the New York Times, CNN, and the Wall Street Journal will syndicate content from other blogs, but this duplicate content does not make up the majority of the content from these sites, and this is probably why it’s not frowned upon in these cases. Even WebProNews has had a blog partner program in place for years, in which we syndicate select posts from our partners, but this content has never dominated WebProNews. It’s never been the majority of what we publish, but a small percentage.
Excessive amounts of very short articles may be a factor taken into consideration, because if that’s the majority of what you put out, the majority of your content is likely “shallow”. Now sometimes, short posts are sufficient. Sometimes there’s just not that much to say, but if these kinds of posts dominate, there’s a good chance there IS more to say about a lot of it, and someone else probably IS saying it, which makes those people better candidates for better rankings.
eHow may still be ranking well at least partially because it has established a lot of backlinks over time. The nature of these links could come into play. There is some interesting discussion about this in a WebmasterWorld thread.
 Better, consistent page structure could also play a role (as brought up in that same thread – look at eHow vs. HubPages, which was hit by the update).
Update: PotPieGirl.com has some very interesting data, after running a test on seven key phrases that attract large amounts of spammy content. This might be very telling of at least one aspect of the Panda update. The following chart says it all. Look at the difference in percentages between EzineArticles and eHow.
Another dataset looks at the same phrases for articles just from the last month:
“In the last month, Ezine Articles has had close to 39,000 urls found/crawled in the Google index that have one of these 7 phrases on them. That means that 2.82% of the EzineArticles.com urls Google has found/crawled in the last month have this phrase on them,” says Jennifer (Pot Pie Girl), who put this data together. “That is almost 39 THOUSAND web pages in the Google index in the past month with one of those 7 phrases on them – from ONE SITE.”
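As a quick sanity check on Jennifer’s figures (illustrative arithmetic only, using the numbers she reports): if roughly 39,000 URLs represent 2.82% of the EzineArticles.com URLs Google crawled in the month, the implied crawl total is on the order of 1.4 million URLs.

```python
# Back-of-the-envelope check on the Pot Pie Girl figures.
matching_urls = 39_000   # EzineArticles URLs containing one of the 7 phrases
share = 0.0282           # stated as 2.82% of URLs crawled in the last month

implied_total = matching_urls / share
print(round(implied_total))  # → 1382979, i.e. roughly 1.38 million crawled URLs
```

That scale is worth keeping in mind when reading the percentages: even a small share of a very large crawl is a lot of pages.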

If you have any insight into more things Google may be looking at (specific to this update), discuss these in the comments as well.

Thursday, March 3, 2011

Google Algorithm Update Casualties Speak.

 
Last week, Google launched a major update to its algorithm, which was positioned as one that would go after content farms. While some sites that are often attached to that label were in fact hurt by the update, some other sites that aren’t generally considered content farms became casualties as well.

Was your site impacted by Google’s algorithm update? For better or worse? Let us know.


Now, it’s important to note that Google did not come out and use the phrase “content farm” when it announced the update, but the company used language similar to what it has used in the past when talking about content farms. In a nutshell, the algorithm was supposed to be aimed at reducing rankings for lower quality content. Those who found their rankings impacted negatively are not thrilled with having their content deemed as such, and some of the sites that were apparently devalued do raise some eyebrows.

Take, for example, Cult of Mac. This is a tech blog that covers Apple news. It is often linked to by other sources, and frequently appears on Techmeme as a source. A lot of Apple enthusiasts visit the site on a regular basis for updates. Leander Kahney, the site’s editor and publisher, wrote a scathing post about Google’s update, proclaiming, “We’ve become a civilian casualty in the war against content farms...Why us? We have no idea. The changes Google has made to its system are secret. What makes it worse is that Google’s tinkering seems to have actually improved Demand Media’s page rank, while killing ours...We’re a blog, so we aggregate news stories like everyone else. But our posts are 100% original and we do a ton of original reporting...”

“We can go toe-to-toe with any other tech news site out there,” he wrote. “We break a ton of stuff. Go take a look at MacRumors, which is very good at giving credit, and see how often we're cited as the source of stories...Yes, we report other's stories, just like Engadget, MacRumors, AppleInsider, Wired, Daring Fireball and everyone else. That's the news business on the Web. It's a flow, a conversation...The question is whether we add value -- figure out what it means, if a rumor is credible, what the historical context is. We do that and we do it well. Plus we give clear credit where credit is due (unlike the original content stealers like Engadget and Mashable. Try to figure out what stories they ripped off from us).”  Note: those accusations appear to have been removed from the post. 

Even PRNewswire, the press release distribution service was devalued by Google’s update. Kahney also defended that site, after a commenter on his post mentioned it. He said, “...and for your information, PR newswire isn't a content farm either. It published press releases for thousands of companies. Crappy spam websites pull releases from its RSS feeds and republish it as pretend content -- which may be why it was down ranked by Google.”

Technorati got hit too. This site was once considered a darling among bloggers, and now apparently it’s been reduced to a low quality site clogging up the search results, based on Google’s doings. CEO Richard Jalichandra doesn’t appear to have acknowledged this.

Other sites more often associated with the content farm label, though they’ll pretty much all do everything they can to distance themselves from it, were also hit by the update - sites like Associated Content (run by Yahoo), Suite101, HubPages, Mahalo, EzineArticles, and others. Reports have indicated that Demand Media’s eHow - the site most often associated with the label, was actually helped by the update.

The notion that eHow was helped has been questioned. Erik Sherman at CBS looks at Compete data, and writes, “What seems to be a jump may be a normal increase, which raises the question of whether it would have been larger without the algorithm changes.”

However, if you do some searching in Google, you’ll probably notice that there is still a great deal of eHow content ranking well - and still under questionable circumstances (see “level 4 brain cancer” example discussed previously).

Still, Demand Media as a whole was not immune from the update. At least three of their sites were negatively impacted: Trails.com, Livestrong.com, and AnswerBag.com. After the update was announced,  Larry Fitzgibbon, Demand Media's EVP of Media and Operations, said: “As might be expected, a content library as diverse as ours saw some content go up and some go down in Google search results. This is consistent with what Google discussed on their blog post. It’s impossible to speculate how these or any changes made by Google impact any online business in the long term – but at this point in time, we haven’t seen a material net impact on our Content & Media business.”

Pia Chatterjee of HubPages tells us, “On our end we think that its really too soon to tell, as after any large update, all the traffic undergoes pretty serious upheaval. All these numbers will be very different in about 7/10 days. What is worrying is that the update did not seem to do what it was supposed to, which was penalize poor content. The fact that e-how has remained untouched is proof of that!”

“Our CEO, Paul Edmondson, says: ‘We are confident that over time the proven quality of our writers' content will be attractive to users. We have faith in Google's ability to tune results post major updates and are optimistic that the cream will rise back to the top in the coming weeks, which has been our experience with past updates.’”

EzineArticles CEO Chris Knight wrote a blog post about how his site was affected, and what he is doing to try and get back up in the rankings.  "While we adamantly disagree with anyone who places the 'Content Farm' label on EzineArticles.com, we were not immune to this algorithm change," he wrote. "Traffic was down 11.5% on Thursday and over 35% on Friday. In our life-to-date, this is the single most significant reduction in market trust we've experienced from Google."

To try and get back into Google's good graces, EzineArticles is doing things like reducing the number of article submissions accepted by over 10% - rejecting articles that "are not unique enough". It will no longer accept article submissions through a WordPress plugin. They're reducing the number of ads per page. They're raising the minimum article word count to 400. They're "raising the bar" on keyword density limits. They're removing articles considered "thin and spammy", and will put greater focus on rejection of advertorial articles. Submitted articles are required to be exclusive to the submitter (but won't be required to be unique to EzineArticles).

Knight also considered adding a Nofollow attribute to links in the article, as “icing on the cake to further prove to Matt Cutts and Google” that they’re not trying to “game Google” or let their authors do so. Interestingly enough, Knight decided to hold off on adding Nofollow after complaints from authors.

The first author to complain, in fact, even said, “Not sure what Pollyanna planet you're from but let me assure you, EzineArticles does not exist 'to provide information that is beneficial to the readers.' EzineArticles is a business, not a government organization or charity. EzineArticles was created to make its owner(s) money. There's nothing wrong with that, but don't fool yourself into thinking they're a bunch of do-gooders. By the same token, the majority of us who publish on EzineArticles don't do so to benefit readers. We too are running businesses, and EzineArticles helps our own websites get traffic and ultimately sales."

Yeah, I think Google frowns upon that whole “we’re not writing to benefit readers” thing.

Another element of this whole algorithm update is that so far, it is only active in the U.S. Once Google expands it into other countries, the sites that have seen their traffic drop off so far may be in for an even bigger shock.

By the way, there are a lot more sites impacted than those discussed in this article.

In an interview with On the Media, Google’s Matt Cutts was asked: “You have so much market share; you are so much the only game in town at this point that you can enforce these things unilaterally, without hearing or due process, putting the whole online world more or less at your mercy. Is there any process by which the people who are affected by algorithm changes and updates can make a case for themselves?”

Cutts responded:
We have a webmaster forum where you can show up and ask questions, and Google employees keep an eye on that forum. And, in fact, if you've been hit with a, what we call a “manual action,” there’s something called a “reconsideration request,” which essentially is an appeal that says, ah, I'm sorry that I was hiding text or doing keyword stuffing and I've corrected the problem, could you review this?

And over time, we've, I think, done more communication than any other search engine in terms of sending messages to people whose site has been hacked or who have issues and then trying to be open so that if people want to give us feedback, we listen to that.

Cutts later said, “Any change will have some losses, but hopefully a lot more wins than losses.”

It does seem that Google may be willing to acknowledge some errors in judgment on the matter, if this exchange between Cutts and Kahney is any indication.

Were there more wins than losses with this update? How's the search quality looking to you? Tell us what you think.