Sunday, March 6, 2011

Google Panda Algorithm Update

Google “Panda” Algorithm Update – What’s Known & What’s Possible 
Google Shares Some Clues, Impacted Sites Left Guessing

Google’s recent algorithm update aimed at improving the quality of search results has captured a great deal of attention – both positive and negative. The general consensus seems to be that the results are in fact better now, but still not perfect. Perfection will likely never be achieved, but there are still some glaring criticisms out there about Google’s most recent attempt.
Having had some time to reflect, what is your opinion of the update? Let us know in the comments.

Despite the improvement in overall search quality, many sites have suffered the consequences of the update, some deservedly and others maybe not so much. While Google will never reveal its secret recipe in its entirety, there are plenty of clues out there, and even some facts that Google will share. You can criticize Google’s mystique all you want, but there’s no denying that the company communicates with the webmaster community to a great extent, even if it doesn’t always tell you everything you want to hear.
Google’s Matt Cutts and Amit Singhal – two of the most instrumental voices in the recent update – shared some clues and insights in an interview with Wired this week. Before we get to specifics, the two mentioned some interesting things worth noting. For example, Caffeine, which sped up Google’s indexing, led to a flood of content – both good and bad. This seems to have helped the “shallow” kind of content that this most recent update targeted – not quite spam, but…well, shallow. We also learned that Google calls the update “Panda”.

They revealed that prior to the update, they sent out documents to outside testers/raters and asked them questions about quality. It would be interesting to know who these raters were, but no such luck there. The raters were asked things like whether they would feel comfortable giving a site their credit card info or giving medicine from the site to their kids (I wonder if anyone was asked whether they felt comfortable getting their brain cancer information from a freelance eHow writer with no credentials in the field), whether they considered the site to be authoritative, whether the content would be okay in a magazine, whether the site has “excessive” ads, and other questions. It would be great to know more of those questions, but we can only work with what Google has revealed.
“And based on that, we basically formed some definition of what could be considered low quality,” Singhal is quoted as saying.
“We actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side,” said Cutts. “And you can really see mathematical reasons…”
“I got an e-mail from someone who wrote out of the blue and said, ‘Hey, a couple months ago, I was worried that my daughter had pediatric multiple sclerosis, and the content farms were ranking above government sites,’” Cutts is later quoted as saying. “Now, she said, the government sites are ranking higher. So I just wanted to write and say thank you.’”
Again, why is eHow still ranking for “level 4 brain cancer”?
Google says it still looks at feedback, and Cutts even said that if someone has a specific question about why a site dropped, he thinks it’s “fair and justifiable and defensible to tell them why that site dropped.” He also said that Google’s most recent algorithm contains signals that can be gamed (hence the lack of full transparency). In other words, it can still be optimized for.
Finally, the site Suite101, which data from SearchMetrics lists as the biggest loser in percentage terms (in its organic performance index), was brought up in the interview. Suite101 and eHow are often compared and labeled as “content farm” type sites. When asked why Suite101 took a much bigger hit than eHow, Cutts simply said, “I feel pretty confident about the algorithm on Suite 101.”
It would be very helpful to understand the differences Google sees between these two sites. Looking through the sites, it isn’t clear that there are obvious differences in quality. I’m sure quality varies on both.
We reached out to Suite101 a few days ago for comment on the update and its impact, but have yet to receive a response. I’m even more interested to hear what they have to say, now that these comments have come out. Update: Suite101 referred us to an open letter from CEO Peter Berger to Google’s Matt Cutts.
CEO Peter Berger stressed the importance of quality in content when we spoke with him last year.
“Every week, several thousand people apply to become Suite101 writers,” he told us. “While we only accept a portion of applicants based on our non-negotiable quality standards, we do have many successful writers on our site who do not consider themselves ‘writers’.”
“We see it as Suite101’s mission to enable people – anyone who can write well and with deep understanding of a subject – to achieve their goals,” he said. “These might be earning money, addressing large audiences, building up a personal professional brand, or simply enjoying creative freedom in a nurturing, peer-oriented environment.”
Results from people with a deep understanding of a subject should lend themselves to quality. Whether or not Suite101 delivers on this is open for debate. Clearly Google doesn’t think so; the update has practically made the site the poster child for what not to do. The mysteries continue…
What we know Google is looking at with the Panda update:
User comfort level in the trust area (think credit card/medicine comments)
 Is it considered authoritative? (This would imply some indication of expertise on the topics covered, I would think.)
 Is the content quality good enough for print? (I’ve seen plenty of crap printed)
 Are there too many ads? (How many are too many, and does the ad network matter?)
 We know Google has its definition of what could be considered low quality
 Google uses a “classifier” to draw a line in the sand (a purely illustrative sketch of what such a classifier might look like follows this list)
 We know that so far, Google has not used data from the Chrome site-blocking extension (emphasis on so far; Google has hinted in the past that this data could potentially be used to tweak the algorithm).
 Google looks at feedback, at least to some extent
 Based on comments from Cutts, Google will tell you why your site dropped (getting that communication flow going may not be the easiest thing to do, but I have personally witnessed Cutts sit down with someone at a conference and look at their site with them.)
 The algorithm can still be gamed. It can still be optimized for. (If you were hit by the update, there are things you can do to get back in Google’s good graces. In other words, you’re not necessarily banned just because of your brand.)
 Most of the changes in rankings will be done algorithmically, but Google will take manual action in some instances (see JC Penney)
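Google obviously hasn’t published the classifier itself, but to make the idea concrete, here is a purely illustrative sketch of what “drawing a line in the sand” can look like in code. The features, example numbers, and labels below are invented for illustration; they are not Google’s actual signals.

```python
# Purely illustrative: a handful of hand-rated example pages, a few invented
# features, and a model that separates "high quality" from "low quality".
from sklearn.linear_model import LogisticRegression

# Hypothetical per-page features: [words per article, ads per page, share of duplicated text]
pages = [
    [1800, 2, 0.05],   # e.g., an in-depth reference article
    [2400, 1, 0.02],   # e.g., a well-sourced news story
    [250, 9, 0.60],    # e.g., a thin, ad-heavy page
    [300, 7, 0.80],    # e.g., a largely duplicated article
]
labels = [1, 1, 0, 0]  # 1 = rated high quality, 0 = rated low quality

clf = LogisticRegression().fit(pages, labels)

# Score an unseen page; values near 0 fall on the "low quality" side of the line.
print(clf.predict_proba([[320, 8, 0.70]])[0][1])
```

The point is simply that once enough rater-labeled examples exist, a model like this can score pages it has never seen on the same scale the human raters used.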
If you use any auto-generated content, keep it separated from the original, high-quality stuff, and block it from search engines. Google’s John Mu said recently, “If you do have such high-quality, unique and compelling content, I’d recommend separating it from the auto-generated rest of the site, and making sure that the auto-generated part is blocked from crawling and indexing, so that search engines can focus on what makes your site unique and valuable to users world-wide.”
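To make John Mu’s advice concrete, here is a minimal sketch assuming a hypothetical Flask site where auto-generated pages live under their own /auto/ path; the paths and templates are invented for illustration. Keeping such pages on a separate path also makes them easy to block in robots.txt.

```python
# A minimal sketch, assuming a hypothetical Flask site. Original articles stay
# indexable; auto-generated pages are marked noindex via the X-Robots-Tag header.
from flask import Flask, render_template

app = Flask(__name__)

@app.route("/articles/<slug>")
def original_article(slug):
    # Hand-written, original content: left fully crawlable and indexable.
    return render_template("article.html", slug=slug)

@app.route("/auto/<slug>")
def auto_generated_page(slug):
    # Auto-generated content: still served to users, but marked noindex so
    # search engines focus on the original material. Keeping it under /auto/
    # also makes it easy to disallow in robots.txt (Disallow: /auto/).
    response = app.make_response(render_template("auto.html", slug=slug))
    response.headers["X-Robots-Tag"] = "noindex, nofollow"
    return response
```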
If you can think of anything else that is “known” about this update, please feel free to comment.
We won’t jump to any conclusions, but here are…
Some things that are possible that may be worth considering:
 Old-fashioned design may play a role. Simply from an aesthetic point of view, this may make a site appear less trustworthy (and consumers less willing to give up their credit card info). We wonder if this played a role in the EzineArticles and Foner Books examples we looked at.
 There is possibly a threshold for what is considered too many ads, beyond which your site gets points taken off for quality (a rough sketch of one way to measure this follows this list). Some have tried reducing the number of ads (again, see EzineArticles) to try to boost rankings.
 Duplicate content (to some degree) may play a role in this recent update. EzineArticles, again, is a prime candidate for this. Articles from the site are published in other places – probably the majority of the content from the site is duplicated elsewhere (beyond just scrapers). eHow content, on the other hand, is written uniquely for eHow. Plenty of people will suggest that much of that content is rewritten from other existing articles, but that’s beside the point. The content itself is unique to eHow (again, scrapers aside).
Other sites like Business Insider, The Huffington Post, and even the New York Times, CNN, and the Wall Street Journal will syndicate content from other blogs, but this duplicate content does not make up the majority of the content from these sites, and this is probably why it’s not frowned upon in these cases. Even WebProNews has had a blog partner program in place for years, in which we syndicate select posts from our partners, but this content has never dominated WebProNews. It’s never been the majority of what we publish, but a small percentage.
Excessive amounts of very short articles may be a factor taken into consideration, because if that’s the majority of what you put out, the majority of your content is likely “shallow”. Now sometimes, short posts are sufficient. Sometimes there’s just not that much to say, but if these kinds of posts dominate, there’s a good chance there IS more to say about a lot of it, and someone else probably IS saying it, which makes those people better candidates for better rankings.
eHow may still be ranking well at least partially because it has established a lot of backlinks over time. The nature of these links could come into play. There is some interesting discussion about this in a WebmasterWorld thread.
 Better, consistent page structure could also play a role (as brought up in that same thread – look at eHow vs. HubPages, which was hit by the update).
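On the ad-threshold idea above, here is a rough, hypothetical heuristic for putting a number on “too many ads”: compare ad-like elements to the amount of visible text. The marker list and threshold are assumptions for illustration, not a known Google signal.

```python
# Hypothetical heuristic: ratio of ad-like elements to visible words on a page.
from bs4 import BeautifulSoup

AD_MARKERS = ("doubleclick", "googlesyndication", "adsbygoogle")  # illustrative list

def ad_to_text_ratio(html: str) -> float:
    soup = BeautifulSoup(html, "html.parser")
    ad_elements = [
        tag for tag in soup.find_all(["iframe", "script", "ins"])
        if any(marker in str(tag).lower() for marker in AD_MARKERS)
    ]
    word_count = len(soup.get_text(" ", strip=True).split())
    return len(ad_elements) / max(word_count, 1)

# Hypothetical threshold: flag a page once it carries more than one ad-like
# element per 150 words of visible text.
def looks_ad_heavy(html: str, threshold: float = 1 / 150) -> bool:
    return ad_to_text_ratio(html) > threshold
```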
Update: PotPieGirl.com has some very interesting data, after running a test on seven key phrases that attract large amounts of spammy content. This might be very telling of at least one aspect of the Panda update. The following chart says it all. Look at the difference in percentages between EzineArticles and eHow.
Another dataset looks at the same phrases for articles just from the last month:
“In the last month, Ezine Articles has had close to 39,000 urls found/crawled in the Google index that have one of these 7 phrases on them. That means that 2.82% of the EzineArticles.com urls Google has found/crawled in the last month have this phrase on them,” says Jennifer (Pot Pie Girl), who put this data together. “That is almost 39 THOUSAND web pages in the Google index in the past month with one of those 7 phrases on them – from ONE SITE.”

If you have any insight into more things Google may be looking at (specific to this update), discuss these in the comments as well.
marketingseo

Thursday, March 3, 2011

Google Algorithm Update Casualties Speak.

Google Algorithm Update Casualties Speak
 
Last week, Google launched a major update to its algorithm, which was positioned as one that would go after content farms. While some sites that are often attached to that label were in fact hurt by the update, some other sites that aren’t generally considered content farms became casualties as well.

Was your site impacted by Google’s algorithm update? For better or worse? Let us know.


Now, it’s important to note that Google did not come out and use the phrase “content farm” when it announced the update, but the company used language similar to what it has used in the past when talking about content farms. In a nutshell, the algorithm was supposed to be aimed at reducing rankings for lower-quality content. Those who found their rankings impacted negatively are not thrilled with having their content deemed as such, and some of the sites that were apparently devalued do raise some eyebrows.

Take, for example, Cult of Mac. This is a tech blog that covers Apple news. It is often linked to by other sources, and frequently appears on Techmeme as a source. A lot of Apple enthusiasts visit the site on a regular basis for updates. Leander Kahney, the site’s editor and publisher, wrote a scathing post about Google’s update, proclaiming, “We’ve become a civilian casualty in the war against content farms...Why us? We have no idea. The changes Google has made to its system are secret. What makes it worse is that Google’s tinkering seems to have actually improved Demand Media’s page rank, while killing ours...We’re a blog, so we aggregate news stories like everyone else. But our posts are 100% original and we do a ton of original reporting...”

“We can go toe-to-toe with any other tech news site out there,” he wrote. “We break a ton of stuff. Go take a look at MacRumors, which is very good at giving credit, and see how often we're cited as the source of stories...Yes, we report other's stories, just like Engadget, MacRumors, AppleInsider, Wired, Daring Fireball and everyone else. That's the news business on the Web. It's a flow, a conversation...The question is whether we add value -- figure out what it means, if a rumor is credible, what the historical context is. We do that and we do it well. Plus we give clear credit where credit is due (unlike the original content stealers like Engadget and Mashable. Try to figure out what stories they ripped off from us).”  Note: those accusations appear to have been removed from the post. 

Even PRNewswire, the press release distribution service, was devalued by Google’s update. Kahney also defended that site after a commenter on his post mentioned it. He said, “...and for your information, PR newswire isn't a content farm either. It published press releases for thousands of companies. Crappy spam websites pull releases from its RSS feeds and republish it as pretend content -- which may be why it was down ranked by Google.”

Technorati got hit too. This site was once considered a darling among bloggers, and now, based on Google’s doings, it has apparently been reduced to a low-quality site clogging up the search results. CEO Richard Jalichandra doesn’t appear to have publicly acknowledged this.

Other sites more often associated with the content farm label, though they’ll pretty much all do everything they can to distance themselves from it, were also hit by the update – sites like Associated Content (owned by Yahoo), Suite101, HubPages, Mahalo, EzineArticles, and others. Reports have indicated that Demand Media’s eHow – the site most often associated with the label – was actually helped by the update.

The notion that eHow was helped has been questioned. Erik Sherman at CBS looks at Compete data, and writes, “What seems to be a jump may be a normal increase, which raises the question of whether it would have been larger without the algorithm changes.”

However, if you do some searching in Google, you’ll probably notice that there is still a great deal of eHow content ranking well - and still under questionable circumstances (see “level 4 brain cancer” example discussed previously).

Still, Demand Media as a whole was not immune from the update. At least three of their sites were negatively impacted: Trails.com, Livestrong.com, and AnswerBag.com. After the update was announced,  Larry Fitzgibbon, Demand Media's EVP of Media and Operations, said: “As might be expected, a content library as diverse as ours saw some content go up and some go down in Google search results. This is consistent with what Google discussed on their blog post. It’s impossible to speculate how these or any changes made by Google impact any online business in the long term – but at this point in time, we haven’t seen a material net impact on our Content & Media business.”

Pia Chatterjee of HubPages tells us, “On our end we think that its really too soon to tell, as after any large update, all the traffic undergoes pretty serious upheaval. All these numbers will be very different in about 7/10 days. What is worrying is that the update did not seem to do what it was supposed to, which was penalize poor content. The fact that e-how has remained untouched is proof of that!”

“Our CEO, Paul Edmondson says:  We are confident that over time the proven quality of our writers' content will be attractive to users. We have faith in Google's ability to tune results post major updates and are optimistic that the cream will rise back to the top in the coming weeks, which has been our experience with past updates.”

EzineArticles CEO Chris Knight wrote a blog post about how his site was affected, and what he is doing to try and get back up in the rankings.  "While we adamantly disagree with anyone who places the 'Content Farm' label on EzineArticles.com, we were not immune to this algorithm change," he wrote. "Traffic was down 11.5% on Thursday and over 35% on Friday. In our life-to-date, this is the single most significant reduction in market trust we've experienced from Google."

To try to get back into Google's good graces, EzineArticles is doing things like reducing the number of article submissions accepted by over 10%, rejecting articles that "are not unique enough". It will no longer accept article submissions through a WordPress plugin. They're reducing the number of ads per page. They're raising the minimum article word count to 400. They're "raising the bar" on keyword density limits. They're removing articles considered "thin and spammy", and will put greater focus on rejecting advertorial articles. Submitted articles are required to be exclusive to the submitter (but won't be required to be unique to EzineArticles).
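For a sense of what checks like these look like mechanically, here is an illustrative sketch of a 400-word minimum and a keyword density cap (occurrences of the keyword divided by total words). The 2% cap and the exact formula are assumptions for illustration; EzineArticles hasn't published its precise rules.

```python
# Illustrative editorial checks: minimum length and a cap on keyword density.
import re

WORD_RE = re.compile(r"[A-Za-z0-9']+")

def word_count(text: str) -> int:
    return len(WORD_RE.findall(text))

def keyword_density(text: str, keyword: str) -> float:
    words = [w.lower() for w in WORD_RE.findall(text)]
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / max(len(words), 1)

def passes_basic_checks(article: str, keyword: str,
                        min_words: int = 400, max_density: float = 0.02) -> bool:
    return word_count(article) >= min_words and keyword_density(article, keyword) <= max_density
```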

Knight also considered adding a Nofollow attribute to links in the article, as “icing in the cake to further prove to Matt Cutts and Google” that they’re not trying to “game Google” or let their authors do so. Interestingly enough, Knight decided to hold off on adding Nofollow after complaints from authors.

The first author to complain, in fact, even said, “Not sure what Pollyanna planet you're from but let me assure you, EzineArticles does not exist 'to provide information that is beneficial to the readers.' EzineArticles is a business, not a government organization or charity. EzineArticles was created to make its owner(s) money. There's nothing wrong with that, but don't fool yourself into thinking they're a bunch of do-gooders. By the same token, the majority of us who publish on EzineArticles don't do so to benefit readers. We too are running businesses, and EzineArticles helps our own websites get traffic and ultimately sales."

Yeah, I think Google frowns upon that whole “we’re not writing to benefit readers” thing.

Another element of this whole algorithm update is that so far, it is only active in the U.S. Once Google expands it into other countries, the sites that have seen their traffic drop off so far may be in for an even bigger shock.
By the way, there are a lot more sites impacted than those discussed in this article.

In an interview with On the Media, Google’s Matt Cutts was asked: “You have so much market share; you are so much the only game in town at this point that you can enforce these things unilaterally, without hearing or due process, putting the whole online world more or less at your mercy. Is there any process by which the people who are affected by algorithm changes and updates can make a case for themselves?”

Cutts responded:
We have a webmaster forum where you can show up and ask questions, and Google employees keep an eye on that forum. And, in fact, if you've been hit with a, what we call a “manual action,” there’s something called a “reconsideration request,” which essentially is an appeal that says, ah, I'm sorry that I was hiding text or doing keyword stuffing and I've corrected the problem, could you review this?

And over time, we've, I think, done more communication than any other search engine in terms of sending messages to people whose site has been hacked or who have issues and then trying to be open so that if people want to give us feedback, we listen to that.

Cutts later said, “Any change will have some losses, but hopefully a lot more wins than losses.”
It does seem that Google may be willing to acknowledge some errors in judgment on the matter, if this exchange between Cutts and Kahney is any indication.

Were there more wins than losses with this update? How's the search quality looking to you? Tell us what you think.


Friday, February 25, 2011

Social Network for Kids

A Safe Social Network for Kids

Since it seems that nearly everyone is on Facebook, it is natural that kids would want to get on the site that their older siblings, parents, and even grandparents consider fun. The content on Facebook, however, is not geared toward children. In fact, Facebook has a policy that prevents children under the age of 13 from joining the site, and it strongly recommends parental participation for minors.

Since kids always find ways to do what they want, many children are joining the site by lying about their age, which is a growing concern for parents. First Lady Michelle Obama is one of those concerned parents and even said on The Today Show that she didn’t want either of her girls on Facebook.

Social networking site Everloop hopes to provide a solution for both parents and children. It is said to be just like Facebook, but with content geared toward children between the ages of 8 and 13. In addition, it contains controls that allow parents to monitor what their children are doing on the site.

“One of the things that Everloop is solving is really giving children under the age of 13 their own social utility, or what we call their own social graph,” said Tim Donovan, Everloop’s CSO.
Everloop is in compliance with the Children’s Online Privacy Protection Act (COPPA), which means that a parent must authenticate a child before the child is permitted on the site. This compliance also ensures that Everloop cannot gather personal information from kids for marketing purposes. Getting parental consent additionally helps prevent sexual predators from obtaining access to the site.
For children, the experience is very similar to that of a user on Facebook. Children have access to a video network and game arcade, and they can create and join groups based on entertainment, learning, and more. They can also customize their own profiles and can take part in IM chatting, SMS, and VoIP.
“Right now, a child under 13, their community experience is disparate. So, they go to YouTube to watch videos, they’ll go to Nickelodeon or Disney XD to play casual games, they’ll go to Facebook and they’ll sneak onto Facebook to be part of a larger social experience, so we’re collapsing all of that into one experience on Everloop,” said Donovan.
Parents can also customize what their child does on Everloop and give him or her the power to email, IM, chat, etc. They can also enable reporting settings that notify them when their child takes certain actions on the site.
“How do I keep my child’s privacy and information protected? How do I have more insight into the activities that my child is engaged in when they’re online? How do I have more controls over their behavior and their engagement in the social community? So, Everloop solves all those problems for parents,” Donovan points out.
Not only does Everloop want to give parents control and make the process convenient for them, but it also wants to let children feel like they have control as well. If children didn’t have some level of power, they would not be interested in the site at all. Donovan also said that Everloop has to be as cool as Facebook in order to attract kids.
“The bottom line is this, if it’s not cool, kids won’t use it. So, coolness comes from being relevant, coolness comes from having the bleeding edge of technology, coolness comes from… thousands and thousands of opportunities and experiences,” he said.
Everloop also recently announced that it is partnering with i-Safe, a leading publisher of media literacy and digital citizenship education materials, in an effort to bring social media to the classroom. The two organizations will begin to roll out their platform in April.
marketingseo

Sunday, February 13, 2011

Google Vs. Bing: Competition Is Heating Up

Google Vs. Bing: Competition Is Heating Up

You’ve probably heard by now that Google recently accused Microsoft’s Bing of stealing its search results. Bing (sort of) denied the claim, but came back and accused Google of click fraud, a practice often associated with spammers. A back-and-forth stream of strong words and accusations has resulted, beginning what appears to be a long, drawn-out saga.



It all began when Danny Sullivan published an article exposing a Google experiment in which it tested Bing. According to Michael Gray of Atlas Web Service, the test essentially showed that Bing used Microsoft’s browser toolbar data to duplicate Google’s search results, a move that Google considers “copying.”
Gray went on to explain to WebProNews that the accusation of click fraud is “a little far-reaching.” Although the mechanics were similar, it didn’t cost Bing any money since there weren’t any PPC campaigns involved. He said that if Google did suspect that Bing was copying it, this method was the only way it would have found out the truth. So, who’s right, and who’s wrong? Gray believes that both companies are in the wrong to an extent. Based on his analysis, Microsoft was wrong to take the data from the toolbar and use it in its ranking algorithm without testing it further.

Google’s wrongdoing, on the other hand, stems from past events. As he explains, Ask introduced universal search long before Google did, and Yahoo introduced Yahoo Instant long before Google released its version of it. In addition, Gray points out that Google seems to make product announcements at other people’s press events and play it off as a coincidence. Although Google typically says that it has been working on these products for long periods of time, some people interpret its actions in each of these scenarios differently.

The timing of this latest turn of events seemed to be somewhat of a coincidence as well, since Sullivan’s article was published just before both companies were set to take the stage at the Farsight Summit.
“Google’s playing hardball and they’re a serious, competitive company; they like to hold onto their market share, and they’re not taking things laying down,” he said.

As for the lesson for marketers in all this, Gray said that marketers need to expand their efforts beyond SEO to include other areas, such as social media. He also pointed out that this situation is “good news for Bing” because it means that Google considers them a viable competitor.
How do you think this saga will play out, and how will it impact the search industry?

Thursday, February 3, 2011

Advancing Strategy Social Marketing

Advancing Strategy for Social Marketing

"When it comes to digital marketing I believe marketers need to be more strategists & research minded than idea evaluators and implementers."
After discussing social media this year with senior marketers from several large brands, the implementer reference in the above tweet by Shiv Singh really resonates with me.
More brands are taking (social) community management activities back in house while seeking outside expertise to continue guiding decisions around social strategies and applications.
When it comes to the day-to-day of social marketing, corporate competence is rising -- and the "yeah, I get that, but what's next?" mentality is placing a higher demand on strategy with expectations of research (or at least experience) to back it up.
As I've been preparing to speak about Facebook marketing with custom applications at next week's Online Marketing Summit, I've found that a common thread in the key takeaways pertains more to strategy than to turnkey tactics. The following is a preview of a couple of key topics I'll discuss as part of that presentation.
Game Mechanics for Custom Facebook Applications
For those of you sick of hearing about it, I'll start by saying game mechanics are not a magic silver bullet -- and I took great delight in hearing Gowalla CEO Josh Williams proclaim "we don't need no stinkin' badges" at an event last month.
However, like Williams, those who have an established understanding of game mechanics are better positioned to get ahead. Why? Because it's a matter of better knowing how human behavior works.
If you're aware of certain ingredients that foster a higher propensity for sharing a social experience on Facebook, then you may realize higher fan growth and engagement as a result of implementation.
I touched on the Sanrio/Hello Kitty gifts application as an example of this when discussing social intelligence for Facebook marketing.
Another recent and impressive implementation of game mechanics (and overall digital strategy) is Vail Resorts' EpicMix, which is also promoted on the company's Facebook page.
Although the application doesn't reside on Facebook, the Connect functionality takes full advantage of Facebook sharing via passive, automated check-ins at six separate ski resorts, all enabled by an RFID chip embedded in your ski pass.
"Passive" means you don't need to pull out a mobile device for checking in. Updates to your Facebook feed are automatically posted based on your location with the pass, and one-time Facebook authorization.
A leading game mechanic in play for EpicMix is the use of more than 200 ski pins (digital "stinkin' badges") you can earn based on the locations you ski at each resort, total vertical feet skied and more. Although Vail Resorts' CEO, Rob Katz, wasn't specific about the adoption rate when asked last month, he was very clear that the number of users signing on to share on Facebook exceeded expectations.
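To illustrate the mechanic itself (not EpicMix's actual implementation), here is a minimal sketch of elevation-based pins: each passive lift scan adds vertical feet to a skier's running total, and crossing a threshold earns a pin. The pin names and thresholds are invented.

```python
# Purely illustrative: award "pins" when a skier's vertical-feet total crosses thresholds.
ELEVATION_PINS = [            # (pin name, vertical feet required) -- invented examples
    ("First Chair", 1_000),
    ("10K Club", 10_000),
    ("Triple Everest", 87_000),
]

def pins_earned(total_vertical_feet: int) -> set:
    return {name for name, needed in ELEVATION_PINS if total_vertical_feet >= needed}

def record_run(profile: dict, vertical_feet: int) -> list:
    """Record a run (e.g., from a passive RFID lift scan) and return newly earned pins."""
    before = pins_earned(profile.get("vertical", 0))
    profile["vertical"] = profile.get("vertical", 0) + vertical_feet
    return sorted(pins_earned(profile["vertical"]) - before)

skier = {"vertical": 0}
print(record_run(skier, 1_200))   # ['First Chair']
print(record_run(skier, 9_000))   # ['10K Club'] once the running total passes 10,000 ft
```

Whether or not those are the exact ingredients, the sharing hook is the same: each newly earned pin is a natural, low-effort moment to post to the skier's Facebook feed.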
Game on.
Strategic Modeling for Social Strategies
While game mechanics address specific strategies from a human behavior perspective, the bigger and equally important picture pertains to how all elements of social marketing work together for the good of a business.
A valuable but often overlooked practice is to adopt a model that provides a framework for strategy. There is a range of options for strategic models, but the one I follow is a layered ("Four Cs") approach:

 
• Content: This is the foundational element, focusing not only on the type of content (video, infographic, written, etc.) but also on how to apply supporting research to guide its development and/or justification.



• Context: Think of this second layer as platforms enabling the display and distribution of your content. Facebook, for example, would be an element of context in this model.

• Campaigns: This layer puts the context in action, addressing key variables around planning, implementation, supporting applications, visibility efforts, communication, and measurement.

• Community: As the top layer, the strategic focus centers on loyalty achieved through specific campaigns, advocacy, or customer experiences. Community should be viewed as long-term, with the expectation of learning that can be applied to future iterations of strategy and research.

Practically speaking, we as marketers should be both implementers and "idea evaluators." But as strategists, we're called to a higher accountability -- one that distinguishes originality from repurposing, and activity from productivity. 




marketingseo