Tuesday, June 28, 2011

How Google's Panda Update Changed SEO Best Practices Forever
It's here! Google has released Panda update 2.2, just as Matt Cutts said they would at SMX Advanced here in Seattle a couple of weeks ago. This time around, Google has - among other things - improved their ability to detect scraper sites and banish them from the SERPs. Of course, the Panda updates are changes to Google's algorithm and are not merely manual reviews of sites in the index, so there is room for error (causing devastation for many legitimate webmasters and SEOs).
A lot of people ask what parts of their existing SEO practice they can modify and emphasize to recover from the blow, but alas, it's not that simple. In this week's Whiteboard Friday, Rand discusses how the Panda updates work and, more importantly, how Panda has fundamentally changed the best practices for SEO. Have you been Panda-abused? Do you have any tips for recuperating? Let us know in the comments!

Howdy, SEOmoz fans. Welcome to another edition of Whiteboard Friday. This week, we're talking about the very exciting, very interesting, very controversial Google Panda update.

Panda, also known as Farmer, was the update that Google rolled out in February of this year, 2011, that rejiggered a bunch of search results, pushed a lot of websites down in the rankings, and pushed some websites up. People have been concerned about it ever since, and several updates and new versions of that implementation and algorithm have come out since. A lot of people have questions like, "What's going on around Panda?" There have been some great blog posts on SEOmoz talking about some of the technical aspects. But in this Whiteboard Friday I want to discuss some of the philosophical and theoretical aspects, and how Google Panda really changes the way a lot of us need to approach SEO.

So let's start with a little bit of Panda history. Google employs an engineer named Navneet Panda. The guy has done some awesome work. In fact, he was named on a patent application, which Bill Slawski looked into, that describes a great way to scale some machine learning algorithms. Machine learning algorithms, as you might be aware, are very computationally expensive and take a long time to run, particularly if you have extremely large data sets, both of inputs and of outputs. If you want, you can research machine learning; it is an interesting, fun set of techniques that computer scientists and programmers use to find solutions to problems. But basically, before Panda, machine learning scalability at Google was at level X, and afterward it was at the much higher level Y. So that was quite nice. Thanks to Navneet, they can now scale up this machine learning.
What Google can do based on that is take a bunch of sites that people like more and a bunch of sites that people like less. When I say "like," what I mean is essentially what Google's quality raters tell them: "This site is very enjoyable. This is a good site. I'd like to see this high in the search results," versus, "I don't like to see this." Google can say, "Hey, you know what? We can take the intelligence of this quality rating panel and scale it using this machine learning process."

Here's how it works. Basically, the idea is that the quality raters tell Googlers what they like. They answer all these questions, and you can see Amit Singhal and Matt Cutts were interviewed by Wired Magazine. They talked about some of the things that were asked of these quality raters, like, "Would you trust this site with your credit card? Would you trust the medical information that this site gives you with your children? Do you think the design of this site is good?" All sorts of questions around the site's trustworthiness, credibility, quality, how much they would like to see it in the search results. Then they compare the difference.

The sites that people like more, they put in one group. The sites that people like less, they put in another group. Then they look at tons of metrics: all these different numbers and signals, many of which SEOs suspect come from user and usage data, which Google has not historically relied on as heavily. They feed those into a machine learning process to essentially separate the wheat from the chaff: find the ones that people like more and the ones that people like less, downgrade the ones they like less, upgrade the ones they like more. Bingo, you have the Panda update.
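
In engineering terms, this is a description of supervised classification: human quality-rater labels on one side, measurable site signals on the other, and a model trained to separate the two groups. Here is a minimal sketch of that idea in Python, assuming scikit-learn is available; the feature names, numbers, and model choice are invented for illustration, since Google's actual signals and models are not public.

    # Toy version of the "quality classifier" idea described above.
    # Features and data are invented; Google's real signals are unknown.
    from sklearn.linear_model import LogisticRegression

    # Hypothetical per-site signals: [ad density, avg. seconds on page, design score]
    sites = [
        [0.10, 180, 0.9],  # light ads, long visits, clean design
        [0.55, 25, 0.2],   # heavy ads, short visits, dated design
        [0.20, 140, 0.8],
        [0.60, 15, 0.3],
    ]
    # Labels distilled from human quality raters: 1 = "liked", 0 = "disliked"
    rater_labels = [1, 0, 1, 0]

    model = LogisticRegression().fit(sites, rater_labels)

    # Score an unseen site on the same signals
    print(model.predict_proba([[0.45, 30, 0.4]])[0][1])  # estimated probability of "liked"

Once trained, a model like this can be applied cheaply to every site in the index, which is exactly why the scalability work mattered.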

So, Panda means something new and different for SEO. As SEOs, for a long time you've been doing the same kinds of classic things: building good content, making it accessible to search engines, doing good keyword research, putting those keywords in there, and then trying to get some links to it. But as SEOs, we never really had to think as much or as broadly about, "What is the experience of this website? Is it creating a brand that people are going to love and share and reward and trust?" Now we have to think about that.

It is almost like the job has been upgraded from SEO to web strategist. Virtually everything you do on the Internet with your website can impact SEO today. That is especially true following Panda. What they are measuring is not simply, "Oh, these sites have better links than those sites." Some of the demoted sites, in fact, have much better links than the sites that outrank them. Some of them have what you and I might regard, as SEOs, as better content: more unique, robust, quality content. And yet people, quality raters in particular, like them less, or the signals that predict that quality raters will like those sites less are present on those sites.

Let's talk about a few of the specific things that we can be doing as SEOs to help with this new sort of SEO, this broader web content/web strategy portion of SEO.

First off, design and user experience. I know good SEOs have been preaching design and user experience for years, because it tends to generate more links, people contribute more content to it, it gets more social shares and tweets, and all those other good second-order effects. Now it has a first-order, primary impact. Compare a design that is absolutely beautiful with something where the content is buffeted by advertising and you have to click next, next, next a lot: the content isn't all on one page, you cannot view it in a single-page format, and the content blocks themselves aren't fun to read, even if it isn't advertising surrounding them, even if it is just internal messaging, or the graphics don't look very good, or the site design feels like it is from way back in the 1990s. All that stuff will impact the ability of this page, this site, to perform. And don't forget, Google has actually said publicly that even if you have a great site, a bunch of low-quality pages on that site can drag down the rankings of the rest of the site, so you should try to block those from being indexed or take them down. Wow. Crazy, right? That's what a machine learning algorithm like Panda will do. It will predictively say, "Hey, you know what? We're seeing these features, these elements, here. Push this guy down."

Content quality matters a lot. A lot of the time in the SEO world, people will say, "Well, you have to have good, unique, useful content." Not enough. Sorry, it's just not enough. There are too many people making too much amazing stuff on the Internet for "good, unique, grammatically correct, properly spelled, and adequately descriptive of the topic" to be enough when it comes to content. If you say, "Oh, I have 50,000 pages about 50,000 different motorcycle parts, and I am going to go to Mechanical Turk, or I am going to outsource it, and I want a 100-word, two-paragraph description of each one," you may think to yourself, "Hey, I have good unique content." No, you have content that is going to be penalized by Panda. That is exactly what Panda is designed to do. It is designed to catch content that someone wrote for SEO purposes, just to have good unique content on the page, rather than content that makes everyone who sees it want to share it and say wow. Right?

If I get to a page about a motorcycle part and I am like, "God, not only is this well written, it's kind of funny. It's humorous. It includes some anecdotes. It's got some history of this part. It has great photos. Man, I don't care at all about motorcycle parts, and yet this is just a darn good page. What a great page. If I were interested, I'd be tweeting about this. I'd share it. I'd send it to my uncle who buys motorcycles. I would love this page." That's what you have to optimize for. It is a totally different thing than optimizing for: did I use the keyword at least three times? Did I put it in the title tag? Is the rest of the content relevant to the keywords? Panda changes this. Changes it quite a bit.

Finally, you are going to be optimizing around user and usage metrics. Things like: when people come to your site, do they spend a good amount of time on it, or do they go away immediately, compared to other sites in your niche or other sites ranking for your keywords? Are they bouncing or are they browsing? If you have a good browse rate and people are viewing 2, 3, 4 pages on average on a content site, that's decent. That's pretty good. If they're browsing 1.5 pages on some sites, like maybe specific kinds of news sites, that might actually be pretty good, better than average. But if they are browsing like 1.001 pages, where virtually no one clicks on a second page, that might be weird. That might hurt you. Then there's your click-through rate from the search results. When people see your title and your snippet and your domain name, and they go, "Ew, I don't know if I want to get myself involved in that. They've got like three hyphens in their domain name, and it looks totally spammy. I'm not going to get involved," then that click-through rate is probably going to suffer, and so are your rankings.
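
If you want to check where you stand on one of these metrics yourself, browse rate is easy to compute from an analytics export. This is a hedged sketch: the CSV file name and column names are hypothetical, so adapt them to whatever your analytics tool actually produces.

    # Rough browse-rate (pages per visit) check from a hypothetical analytics export.
    import csv

    def pages_per_visit(path):
        pageviews = visits = 0
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                pageviews += int(row["pageviews"])
                visits += int(row["visits"])
        return pageviews / visits if visits else 0.0

    rate = pages_per_visit("analytics_export.csv")
    print(f"Pages per visit: {rate:.2f}")
    # Per the discussion above: ~2-4 is healthy for a content site, ~1.5 may be
    # fine for some news sites, and ~1.0 (no one views a second page) is a red flag.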

They are going to be looking at things like the diversity and quantity of traffic that comes to your site. Do lots of people from all around the world, or all around your local region or country, visit your website directly? They can measure this through Chrome, through Android, and through the Google toolbar. They have all this user and usage data. They know where people go on the Internet, where they spend time, how much time they spend, and what they do on those pages. They know what happens from the search results too. Do people click on a result and then go right back to the search results and perform another search? Clearly, they were unhappy with that result. They can take all these metrics, put them into the machine learning algorithm, and have Panda essentially recalculate. This is why Google doesn't issue Panda updates every day or every week; it's about every 30 or 40 days that a new Panda update comes out, because they are rejiggering all this stuff.

People who get hit by Panda come up to me and say, "God, how are we ever going to get out of Panda? We've made all these changes. We haven't gotten out yet." I'm like, "Well, first off, you're not going to get out of it until they rejigger the results, and even then, you're not going to get out of it unless you change the metrics around your site." So go into your analytics. If people are not spending longer on your pages, not enjoying them more, not sharing them more, not naturally linking to them more, your branded search traffic is not up, and your direct type-in traffic is not up, and yet you think you have somehow fixed the problems that Panda tries to solve for, you probably haven't.

I know this is frustrating. I know it's a tough issue. In fact, I think there are sites that have been really unfairly hit. That sucks; they shouldn't have been, and Google needs to work on this. But I also don't think Google is going to be making many changes. I think they are very happy with the way Panda has gone from a search quality perspective and from a user happiness perspective. Their searchers are happier, and they are not seeing as much junk in the results. Google likes the way this is going. I think we are going to see more and more of this over time. It could even get more aggressive. I would urge you to work on this stuff, to optimize around these things, and to be ready for this new form of SEO.

Thanks everyone for watching. Look forward to some great comments, questions, feedback in the post. 

Saturday, May 21, 2011

SERP Alert: Google Social Search Goes Global


Google announced via its new official Search Blog that it is rolling out Social Search around the globe. This comes just days after Bing upped the ante in the social search game by integrating Facebook data in much more elaborate ways. Google’s social search, however, may prove useful in some cases, but you may see more content from strangers than you do from your real friends.

Does Google’s Social Search make results less relevant? Comment here. 

Google has been doing social search since 2009, and earlier this year it was updated to be more useful, with social results appearing throughout the SERP, as opposed to just in a cluster at the bottom of the SERP. Google says they’re mixed in based on relevance.


“For example, if you’re looking for information about low-light photography and your friend Marcin has written a blog post about it, that post may show up higher in your results with a clear annotation and picture of Marcin,” says Google software engineer Yohann Coppel.

“Social Search can help you find pages your friends have created, and it can also help you find links your contacts have shared on Twitter and other sites. If someone you’re connected to has publicly shared a link, we may show that link in your results with a clear annotation,” says Coppel. “So, if you’re looking for information about modern cooking and your colleague Adam shared a link about Modernist Cuisine, you’ll see an annotation and picture of Adam under the result. That way when you see Adam in the office, you’ll know he might be a good person to ask about his favorite modern cooking techniques.”

How Google Determines What to Show In Social Search Results

First of all, users must be logged into Google to get the benefits of social search. “If you’re signed in, Google makes a best guess about whose public content you may want to see in your results, including people from your Google Chat buddy list, your Google Contacts, the people you’re following in Google Reader and Buzz, and the networks you’ve linked from your Google profile or Google Account. For public networks like Twitter, Google finds your friends and sees who they’re publicly connected to as well,” explains Coppel.

Google deserves credit for giving users a great deal of control over which people are included here, though they could still go further. You can go to your Google Dashboard, find the Social Circle and Content section, and edit accordingly. If you click the “view social circle” link, you can see every single person, listed by:

  • Direct connections from your Google Chat buddies and contacts. It even shows you which of these people have content and which don’t. For the ones that do, it shows you which sites they have content on. One important thing to note: it actually does include Facebook Page content. For example, I’m connected to Danny Sullivan in my social circle, and Google will show me updates from his Facebook page, as he has it linked to his Google Profile. What’s missing, however, is your personal Facebook network of friends (which in my opinion is the most valuable social data there currently is on the web, if you’re a Facebook user).
  • Direct connections from links through Google Profiles or Connected Accounts. “For example, if you listed your Twitter account on your profile or if your Twitter posts appear in your public Buzz stream, then relevant content from people you follow on Twitter will show up in your search results,” Google explains in that section. “You can change these relationships by visiting the corresponding services and adding or removing connections.”
  • Secondary connections that are publicly associated with your direct connections. In other words – friends of friends (at least public friends of friends). There is a little less control here, unfortunately. You can’t remove these people from your social circle unless you remove the friend that’s connecting you to them.
    To me, this actually seems like a step backwards in relevancy of social search. You’re probably a lot less likely to care about what someone thinks just because they know someone you know, than you are if you actually know them. A lot of people don’t even care about what the people they actually do know think.
    Naturally, this is the biggest list and potential source of material for Google to draw from, making it more likely that you see results from people you don’t know than people you do.
A cool thing about the entire list is that you can click “show paths” next to any name that has content, and it will show you exactly how you’re connected. You can be linked to someone via Twitter, and if that person links their Twitter account to their Quora account, you might see their Quora content too. If that Quora account links to their Facebook account, you might see stuff from their Facebook account if you have permission to see that content (if it’s set to public or you’re Facebook friends, you should be able to see it).

Where are my friends?
I notice one gaping hole in Google’s social search strategy besides the lack of comprehensive Facebook integration (though it’s certainly connected to that): the lack of a substantial number of my actual closest friends. I can only assume that many users have a similar issue.

That’s exactly why Bing’s Facebook integration is a very important factor in its competition with Google. Bing, unlike Google, does tap into your actual Facebook friends for search relevancy (though there is plenty of room for improvement on Bing’s part as well). The Wajam browser extension is still currently a better solution to the problem, if you ask me. It will add your Facebook and Twitter friends to your results on both Google and Bing.


It is also for this reason (at least partially) that Google is now competing more directly with Facebook in social. Google wants users to develop, on Google’s own network, the kinds of relationships among friends that people currently have on Facebook (a network which runs throughout various products, but ultimately centers on the Google account, which is at the center of nearly everything: Gmail, YouTube, Buzz, Docs, Chrome OS, etc. The list goes on).

As long as Google and Facebook aren’t going to play nice together, Google needs to succeed in social to have the best search relevancy in the social part of search. And that part of search is clearly becoming more and more important. That’s simply one competitive advantage Bing has over Google right now. It’s also why Facebook itself is a threat to Google search in some ways.


It will be very interesting to see how far Google takes social search over time. We know Google is currently working on increasing its presence as a force in social, and the upcoming +1 button should play a significant part in that. As search gets more social, however, it presents new challenges for search engine optimization, and perhaps makes algorithm updates (like Panda) less significant from the webmaster point of view.


Social can not only be a signal of relevance on a personalized level, but if content is shared a lot, it can also be seen as a signal of quality, because people don’t share content that sucks, unless they’re doing it as a joke or using it as an example of what not to do (like I said, it’s just a “signal”). This is nothing new, but it shows the importance of diversifying your traffic sources.


If you rely heavily on search, as many of the big victims of the Panda update have, you will always be at the mercy of the search engines. If you can find ways to get more love from social networks and links from others, it’s bound to help you in search as well. 


Is Google’s social search helpful or does it miss the mark? Tell us what you think

Friday, May 13, 2011

Despite New Panda Guidelines, Google Still Burying Authoritative Results

There are a lot of elements of Google’s Panda update to discuss, and we’ve certainly discussed many of them over the last few months, but let’s not lose sight of the reason the update was launched to begin with – to improve search quality. 

Do you think Google’s search results are better now? Tell us what you think.

While quality is often in the eye of the beholder, there are certain kinds of queries where the information being retrieved is simply more important than others. We’ve talked about this before, as it’s been a problem in some Google results. One example we’ve looked at a few times is where an eHow article written by a freelance writer with no clear authority on cancer (and whose body of work includes a lot of plumbing-related articles) was ranking at the top of Google’s results for the query “level 4 brain cancer”, above numerous other sources that would seem to be of greater authority on such a subject.

In fact, the article did get bumped down after the Panda update, but it does still rank number 2, followed by another result from eHow. Granted, this is just one example, and Demand Media has efforts in motion to improve its own content quality, but you get the point.
Queries related to things like health or law demand authoritative advice. Not SEO’d content.
We had a conversation with Mark Britton, founder and CEO of Avvo, about this subject. Avvo is a site that offers Q&A forums where consumers can ask medical or legal questions and get responses from qualified doctors and lawyers. It provides apparently authoritative content in these two areas from certified professionals.

This seems like the kind of content that should be ranking well for a lot of these types of queries. Does it not? Britton thinks it’s “very important” for commentary from experts in the medical and legal fields to surface high in search results for relevant topics.
“There is a lot of noise both online and offline regarding health and legal issues,” he tells us. “This comes in the form of lay people, professional commentators and even celebrities who often offer advice that is well-intentioned but inherently inferior to that of a doctor or lawyer trained in the area. However, it is not always easy to get doctors and lawyers to speak. Some still look down on the Internet as a publishing or marketing vehicle. Others just downright fear it, as they have seen too many movies where someone says something on the Internet and they are subsequently hunted and killed by terrorist hackers.”

“There is always room for improvement — especially with our newer pages,” he says of Avvo’s own search rankings. “We just launched our doctor ratings directory and our free medical question and answer forum in November, and it will take some time for those pages to rank as well as our legally related pages.” Look at the results for a query like “Does type 2 diabetes shorten life expectancy?” Avvo’s page on the subject ranks on the second page, while eHow ranks at the top of the first. The Avvo result has actually fallen since I began writing this article. It used to be right below the number one result from eHow and the number 2 from Yahoo Answers.


eHow’s is an article (not very long by any means) by a guy whose bio says he “has been a freelance writer since 2007. He writes extensively in the fitness, mental health and travel sectors and his work has appeared in a range of print and online publications including Scazu Fitness and USAToday Travel Tips…[and] holds a Master of Arts in community psychology.”

Keep in mind that USA Today has a deal with Demand Media for travel tips. So that presumably means his Demand Media content is simply published by USA Today. Does “Master of Arts in community psychology” indicate more authority to answer a life/death question about type 2 diabetes than say a licensed and practicing MD? That’s who provided an answer on Avvo’s page, which just got pushed further down in the search results. 

If you change the query to something simpler like “type 2 diabetes life expectancy” eHow still ranks close to the top, and Avvo’s result slips to….get ready for it….page 18! That’s with various articles from places like eHow, EzineArticles and Suite101 (all victims of the Panda update) ranking ahead of it. Now, I’m not saying that Avvo’s result is necessarily the one ultimate result for this query and should necessarily be the highest ranked, but come on. Interestingly enough, the result was on page 3 for this query when I started writing the article (yesterday) and it’s slipped that much further into obscurity just since then. I wonder where it will be in another day. 

Google has given publishers a list of questions to ask themselves about their content, as guidelines the company goes by as it writes its algorithms. The very top one is “Would you trust the information presented in this article?” While neither of the articles provide any helpful links to sources of information, the Avvo article comes from a medical doctor. I think most people would find that slightly more trustworthy, even if the article isn’t as long or as well SEO’d. Here’s the eHow article. Here’s the Avvo one.

The second question on Google’s list is, “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?”
While Google makes it clear that these questions aren’t actual ranking signals, they must be used to determine the signals at least, and you have to wonder just how much weight authority on a topic carries. Britton maintains that ALL of the site’s advice comes from qualified professionals, claiming that this is one of the site’s “greatest differentiators.”

“We CERTIFY every doctor and lawyer offering free advice on the site in two principal ways: First, we verify with the state licensing authorities that the answering doctors or lawyers are licensed and in good standing,” he explains. “Second, we rate the professionals from 1 (“Extreme Caution”) to 10 (“Superb”), which was unheard of prior to Avvo’s entry into the professional ratings arena. We are big believers that not every doctor or lawyer is ‘Super’ or ‘Best’ which was the steady-state in professional ratings for decades.”
“This was really just an extension of the Yellow Pages model, where the ‘recommended’ professional is the one paying the most money to advertise,” he continues. “But consumers are getting wise and demanding greater transparency regarding the qualifications of their doctors and lawyers.”

“We have three ratings that speak to the expertise of our contributors: The Avvo Rating, client/patient ratings and peer endorsements,” says Britton. “For the Avvo Rating, we start with the state licensing authorities and collect all the information we can regarding a professional. We then load that information into our proprietary web crawler, which we call ‘Hoover.’ Hoover goes out and finds all the additional information it can regarding the professional. We match the licensing data with the Hoover data and then we score it. The scoring is based on those indicators of the professional’s reputation, experience and quality of work.”

Britton says Avvo was not really affected by Google’s Panda update. “We saw a small dip, but things came back fairly quickly.”
“While I understand the intent of Google’s latest update, I’m not sure they entirely hit their mark,” he says. “We noticed a number of pure lead-generation sites – i.e., sites that are selling leads to the highest bidder — jump ahead of us in certain key terms, which is not good for consumers.”
Avvo encourages people to ask questions on the site, claiming its Q&A boasts a 97% response rate. Avvo asked us to let readers know that in support of Skin Awareness Month, it is donating $5 to the Melanoma Research Foundation for every doctor review during the month of May.

Should authority and certification of expertise carry greater weight in Google’s search rankings? Comment here.

Wednesday, April 20, 2011

Google Panda Update Helps Local Search Results

As we continue to look at the fallout of the Google Panda update (more so since its international roll-out), we have yet more data to sink our teeth into.

Have you noticed an increase in local results since the Panda roll-out? Let us know.
CNET has now released some data, compiled from about 100,000 Google results, testing Google.com in March and then again last week, just after the new update. The data reflects earlier reported data from SearchMetrics in terms of news sites benefiting and Demand Media’s eHow sliding.

One interesting element CNET’s data brings to the table, however, is that of Google’s localization and its relationship to the update.

“We also tested what happens if you connect to Google.com from an overseas Internet address. We picked one in London. We performed the same searches on the same day–the only variable that should have changed, in other words, was our location,” explains CNET chief political correspondent Declan McCullagh. “The results? Google engages in significant localization efforts, as you might imagine, with Yelp.com being the largest beneficiary by far.”

“In searches originating from the U.K., Yelp appeared only twice,” he adds. “In U.S. searches, by contrast, it was the ninth-most popular Web site, with both its topic and individual business pages weaved seamlessly into the main search results.”

SearchMetrics’ data did show yelp.co.uk as having a 29.59% boost in visibility.

Yelp picked up 45 first-page appearances for generic searches like “chocolate,” “cleaning,” “food,” “lights,” “laundry,” “tv,” and “weddings,” from a California address, according to McCullagh, while Davidsbridal.com, BarnesandNoble.com, and Walgreens.com also benefited in the U.S. from localization.

It’s not all just big brands though.

Local-based results won big too, based on CNET’s testing. Not just local locations for big brands or local businesses, but locally-themed results.

“For our U.S. tests, we used an Internet address near Palo Alto, Calif., which prompted Google to rank nearby businesses and municipal Web sites near the top of search results,” McCullagh explains. “The City of Palo Alto’s Web site appears in the first page of search results for terms including ‘adventures,’ ‘art,’ ‘business,’ ‘gas,’ and ‘jobs.’ PaloAltoOnline.com makes repeat appearances (‘budget cuts,’ ‘restaurants’), as do Stanford, the Palo Alto Medical Foundation, and Mike’s Bikes.”

It’s no secret that Google has put a great deal more emphasis on local in recent times, but it’s interesting to see how this is playing out in light of the Panda update, which was seemingly unrelated (being aimed more at content farms).

We saw how news sites and video sites appeared to come out as big winners, but this research does seem to indicate even more wins for local.

The benefits to Yelp are interesting, considering the tension there has been between Yelp and Google, regarding Google Place Pages and their use of Yelp reviews. Google’s own reviews system – Hotpot – has now found its way into Places, and right into organic search results themselves.

Google is also finding more ways to improve its local listings themselves. See the “open now” and local product listings, for example. Oh, and by the way, Google just launched Map Maker for the US, so users can add their “local knowledge” to the map.

Interestingly enough, as Google focuses more on local, the competition for local eyeballs is already heavily increasing. This is not just about search in the traditional sense. You have to factor in entities like Groupon, LivingSocial, Facebook, Foursquare, and others in these companies’ respective spaces (the lines between which are getting blurrier).

The more ways people obtain the information related to local businesses from sources outside of Google, the less they’ll need to search for that information with Google.

Are Google’s results better now? Tell us what you think


Friday, April 1, 2011

Meet the Plus One, Google's Version of the Like Button
Google is trying hard to move in on Facebook's social sharing territory. Yesterday, it unveiled the "Plus One" button. It's pretty much the same as the Facebook "Like" button, except it's for Google searches. When you click the "Plus One" button next to a Google search result, your friends will see that you've endorsed it in their own results.
If you want to start "plus oneing" things—yes Google is already using "plus one" as a verb, see video below—you have to opt in to the "experiment" here. If you're a regular Google user, you've probably noticed that some search results already have an extra line at the bottom telling you a Twitter or Facebook friend has mentioned that link. This new feature lets you annotate searches you want to share right there on the Google page, without actually cutting and pasting into Facebook or Twitter.
All of this social search information is enticing ambient information about your friends' lives, but the actual benefit of this feature is that "Plus Ones" are data that can help guide your clicking in a cluttered web world. Well, that's the theory anyway.
Google writes on its blog:
Say, for example, you’re planning a winter trip to Tahoe, Calif. When you do a search, you may now see a +1 from your slalom-skiing aunt next to the result for a lodge in the area. Or if you’re looking for a new pasta recipe, we’ll show you +1’s from your culinary genius college roommate. And even if none of your friends are baristas or caffeine addicts, we may still show you how many people across the web have +1’d your local coffee shop.

One useful way the "Plus One" is different from the "Like" button is that, over time, your "Plus Ones" will be archived as a kind of bookmarks file of endorsed searches. So you can check back on all the links you've endorsed this way.

And so can everyone else, unless you set your privacy settings right. So, like everything else in social media, "Plus One" (as a verb) with care.

Right now, you'll only see "Plus Ones" from people connected to you through Google contacts, like gchat or gmail, but the company may soon expand that to Twitter or other sites, they say. To find out just how public your Google search result endorsements will be, go to the Google Dashboard and check your connections and your settings.

Your network is probably much bigger than you think. That's fine if you want to become a web-lebrity brand. But not so great for job hunters with a few private hobbies best left off the C.V.


Gmail Motion

 A new way to communicate 
The mouse and keyboard were invented before the Internet even existed. Since then, countless technological advancements have allowed for much more efficient human computer interaction. Why then do we continue to use outdated technology? Introducing Gmail Motion -- now you can control Gmail with your body. 


Sunday, March 6, 2011

Google Panda Algorithm Update

Google “Panda” Algorithm Update – What’s Known & What’s Possible 
Google Shares Some Clues, Impacted Sites Left Guessing

Google’s recent algorithm update aimed at improving the quality of search results has captured a great deal of attention – both positive and negative. The general consensus seems to be that the results are in fact better now, but still not perfect. Perfection will likely never be achieved, but there are still some glaring criticisms out there about Google’s most recent attempt.
Having had some time to reflect, what is your opinion of the update? Let us know in the comments.

Despite the improvement in overall search quality, there have been many sites to suffer the consequences of the update – some deservedly and others maybe not so much. As Google will never reveal its secret recipe in its entirety, there are plenty of clues out there, and even facts that Google will share. You can criticize Google’s mystique all you want, but there’s no denying that they do communicate with the webmaster community to a great extent, even if they don’t always tell you everything you want to hear.
Google’s Matt Cutts and Amit Singhal – two of the most instrumental voices in the recent update – shared some clues and insights in an interview with Wired this week. Before we get to specifics, there were some interesting things mentioned by the two that are worth noting. For example, Caffeine, which sped up Google’s indexing, led to a flood of content – both good and bad. This seems to have helped the “shallow” kinds of content that this most recent update targeted – not stuff that is quite spam, but…well, shallow. We also learned that Google calls the update “Panda”.

They revealed that prior to the update, they sent out documents to outside testers/raters, and asked them questions about quality. It would be interesting to know who these raters were, but no such luck there. Users were asked things like whether they would feel comfortable giving a site their credit card info or giving medicine from the site to their kids (I wonder if anyone was asked if they felt comfortable getting their brain cancer information from a freelance eHow writer with no credentials in the field), whether they considered the site to be authoritative, whether it would be ok in a magazine, whether it has “excessive” ads, and other questions. It would be great to be able to know more of those questions, but we can only work with what Google has revealed.
“And based on that, we basically formed some definition of what could be considered low quality,” Singhal is quoted as saying.
“We actually came up with a classifier to say, okay, IRS or Wikipedia or New York Times is over on this side, and the low-quality sites are over on this side,” said Cutts. “And you can really see mathematical reasons…”
“I got an e-mail from someone who wrote out of the blue and said, ‘Hey, a couple months ago, I was worried that my daughter had pediatric multiple sclerosis, and the content farms were ranking above government sites,’” Cutts is later quoted as saying. “Now, she said, the government sites are ranking higher. So I just wanted to write and say thank you.”
Again, why is eHow still ranking for “level 4 brain cancer”?
Google says it still looks at feedback, and Cutts even said that if someone has a specific question about why a site dropped, he thinks it’s “fair and justifiable and defensible to tell them why that site dropped.” He also said that Google’s most recent algorithm contains signals that can be gamed (hence the lack of full transparency). In other words, it can still be optimized for.
Finally, the site Suite101, which data from SearchMetrics lists as the biggest loser in percentage (in its organic performance index) was brought up in the interview. Suite101 and eHow are often compared and labeled as “content farm” type sites. When asked why Suite101 took a much bigger hit than eHow, Cutts simply said, “I feel pretty confident about the algorithm on Suite 101.”
It would be very helpful to understand the differences Google sees between these two sites. It doesn’t seem very clear from looking through the sites that there are obvious differences in quality. I’m sure it varies on both sites.
We reached out to Suite101 a few days ago for comment on the update and its impact, but have yet to receive a response. I’m even more interested to hear what they have to say, now that these comments have come out. Update: Suite101 referred us to an open letter from CEO Peter Berger to Google’s Matt Cutts.
CEO Peter Berger stressed the importance of quality in content when we spoke with him last year.
“Every week, several thousand people apply to become Suite101 writers,” he told us. “While we only accept a portion of applicants based on our non-negotiable quality standards, we do have many successful writers on our site who do not consider themselves ‘writers’.”
“We see it as Suite101′s mission to enable people – anyone who can write well and with deep understanding of a subject – to achieve their goals,” he said. “These might be earning money, addressing large audiences, building up a personal professional brand, or simply enjoying creative freedom in a nurturing, peer-oriented environment.”
Results from people with a deep understanding of a subject should lend themselves to quality. Whether or not Suite101 delivers on this is open for debate. Clearly Google doesn’t think so, practically making the site the poster-child of what not to do. The mysteries continue…
What we know Google is looking at with the Panda update:
  • User comfort level in the trust area (think credit card/medicine comments)
  • Is it considered authoritative? (this would imply some indication of expertise on topics covered, I would think)
  • Is the content quality good enough for print? (I’ve seen plenty of crap printed)
  • Are there too many ads? (How many are too many, and does the ad network matter?)
  • We know Google has its definition of what could be considered low quality
  • Google uses a “classifier” to draw a line in the sand
  • We know that so far, Google has not used indications from the Chrome extension (emphasis on so far; Google hinted in the past that this data could potentially be used to tweak the algorithm)
  • Google looks at feedback, at least to some extent
  • Based on comments from Cutts, Google will tell you why your site dropped (getting that communication flow going may not be the easiest thing to do, but I have personally witnessed Cutts sit down with someone at a conference and look at their site with them)
  • The algorithm can still be gamed. It can still be optimized for. (If you were hit by the update, there are things you can do to get back in Google’s good graces. In other words, you’re not necessarily banned just because of your brand.)
  • Most of the changes in rankings will be made algorithmically, but Google will take manual action in some instances (see JC Penney)
If you use any auto-generated content, keep it separated from the original high-quality stuff, and block it from search engines. Google’s John Mu said recently, “If you do have such high-quality, unique and compelling content, I’d recommend separating it from the auto-generated rest of the site, and making sure that the auto-generated part is blocked from crawling and indexing, so that search engines can focus on what makes your site unique and valuable to users world-wide.”
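
As a minimal sketch of that advice, assuming the auto-generated pages all live under a single directory (the path here is hypothetical), the crawling half can be handled with a couple of lines of robots.txt:

    # robots.txt at the site root: steer crawlers away from the auto-generated section
    User-agent: *
    Disallow: /auto-generated/

Keep in mind that Disallow only blocks crawling; for pages search engines have already discovered, a noindex robots meta tag on the auto-generated pages themselves is the more direct way to keep them out of the index.
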
If you can think of anything else that is “known” about this update, please feel free to comment.
We won’t jump to any conclusions, but here are…
Some things that are possible that may be worth considering:
  • Old-fashioned design may play a role. Simply from the aesthetic point of view, this may make a site appear less trustworthy (less likely that consumers will be willing to give up their credit card info). We wonder if this played a role in the EzineArticles and Foner Books examples we looked at.
  • There is possibly a threshold that can be crossed for what is considered too many ads before your site gets points taken off for quality. Some have tried reducing the number of ads (again, see EzineArticles) to try to boost rankings.
  • Duplicate content (to some degree) may play a role in this recent update. EzineArticles, again, is a prime candidate for this. Articles from the site are published in other places – probably the majority of the content from the site is duplicated elsewhere (besides just scrapers). eHow content is uniquely written for eHow. There are plenty of people who will suggest much of this content is rewritten based on other existing articles, but that’s beside the point. The content itself is unique to eHow (again, scrapers aside).
    Other sites like Business Insider, The Huffington Post, and even the New York Times, CNN, and the Wall Street Journal will syndicate content from other blogs, but this duplicate content does not make up the majority of the content on these sites, and this is probably why it’s not frowned upon in their cases. Even WebProNews has had a blog partner program in place for years, in which we syndicate select posts from our partners, but this content has never dominated WebProNews. It’s never been the majority of what we publish, just a small percentage.
  • Excessive amounts of very short articles may be a factor taken into consideration, because if that’s the majority of what you put out, the majority of your content is likely “shallow”. Sometimes short posts are sufficient; sometimes there’s just not that much to say. But if these kinds of posts dominate, there’s a good chance there IS more to say about a lot of it, and someone else probably IS saying it, which makes those people better candidates for better rankings.
  • eHow may still be ranking well at least partially because it has established a lot of backlinks over time. The nature of these links could come into play. There is some interesting discussion about this in a WebmasterWorld thread.
  • Better, consistent page structure could also play a role (as brought up in that same thread; look at eHow vs. HubPages, which was hit by the update).
Update: PotPieGirl.com has some very interesting data, after running a test on seven key phrases that attract large amounts of spammy content. This might be very telling of at least one aspect of the Panda update. The following chart says it all. Look at the difference in percentages between EzineArticles and eHow.
Another dataset looks at the same phrases for articles just from the last month:
“In the last month, Ezine Articles has had close to 39,000 urls found/crawled in the Google index that have one of these 7 phrases on them. That means that 2.82% of the EzineArticles.com urls Google has found/crawled in the last month have this phrase on them,” says Jennifer (Pot Pie Girl), who put this data together. “That is almost 39 THOUSAND web pages in the Google index in the past month with one of those 7 phrases on them – from ONE SITE.”

If you have any insight into more things Google may be looking at (specific to this update), discuss these in the comments as well.

Thursday, March 3, 2011

Google Algorithm Update Casualties Speak
 
Last week, Google launched a major update to its algorithm, which was positioned as one that would go after content farms. While some sites that are often attached to that label were in fact hurt by the update, some other sites that aren’t generally considered content farms became casualties as well.

Was your site impacted by Google’s algorithm update? For better or worse? Let us know.


Now, it’s important to note that Google did not come out and use the phrase “content farm” when it announced the update, but the company used language similar to what it has used in the past when talking about content farms. In a nutshell, the algorithm was supposed to be aimed at reducing rankings for lower quality content. Those who found their rankings impacted negatively are not thrilled with having their content deemed as such, and some of the sites that were apparently devalued do raise some eyebrows.

Take, for example, Cult of Mac. This is a tech blog that covers Apple news. It is often linked to by other sources, and frequently appears on Techmeme as a source. A lot of Apple enthusiasts visit the site on a regular basis for updates. Leander Kahney, the site’s editor and publisher, wrote a scathing post about Google’s update, proclaiming, “We’ve become a civilian casualty in the war against content farms...Why us? We have no idea. The changes Google has made to its system are secret. What makes it worse is that Google’s tinkering seems to have actually improved Demand Media’s page rank, while killing ours...We’re a blog, so we aggregate news stories like everyone else. But our posts are 100% original and we do a ton of original reporting...”

“We can go toe-to-toe with any other tech news site out there,” he wrote. “We break a ton of stuff. Go take a look at MacRumors, which is very good at giving credit, and see how often we're cited as the source of stories...Yes, we report other's stories, just like Engadget, MacRumors, AppleInsider, Wired, Daring Fireball and everyone else. That's the news business on the Web. It's a flow, a conversation...The question is whether we add value -- figure out what it means, if a rumor is credible, what the historical context is. We do that and we do it well. Plus we give clear credit where credit is due (unlike the original content stealers like Engadget and Mashable. Try to figure out what stories they ripped off from us).”  Note: those accusations appear to have been removed from the post. 

Even PRNewswire, the press release distribution service was devalued by Google’s update. Kahney also defended that site, after a commenter on his post mentioned it. He said, “...and for your information, PR newswire isn't a content farm either. It published press releases for thousands of companies. Crappy spam websites pull releases from its RSS feeds and republish it as pretend content -- which may be why it was down ranked by Google.”

Technorati got hit too. This site was once considered a darling among bloggers, and now apparently it’s been reduced to a low-quality site clogging up the search results, based on Google’s doings. CEO Richard Jalichandra doesn’t appear to have acknowledged this.

Other sites more often associated with the content farm label, though they’ll pretty much all do everything they can to distance themselves from it, were also hit by the update - sites like Associated Content (run by Yahoo), Suite101, HubPages, Mahalo, EzineArticles, and others. Reports have indicated that Demand Media’s eHow - the site most often associated with the label, was actually helped by the update.

The notion that eHow was helped has been questioned. Erik Sherman at CBS looks at Compete data, and writes, “What seems to be a jump may be a normal increase, which raises the question of whether it would have been larger without the algorithm changes.”

However, if you do some searching in Google, you’ll probably notice that there is still a great deal of eHow content ranking well - and still under questionable circumstances (see “level 4 brain cancer” example discussed previously).

Still, Demand Media as a whole was not immune from the update. At least three of their sites were negatively impacted: Trails.com, Livestrong.com, and AnswerBag.com. After the update was announced, Larry Fitzgibbon, Demand Media's EVP of Media and Operations, said: “As might be expected, a content library as diverse as ours saw some content go up and some go down in Google search results. This is consistent with what Google discussed on their blog post. It’s impossible to speculate how these or any changes made by Google impact any online business in the long term – but at this point in time, we haven’t seen a material net impact on our Content & Media business.”

Pia Chatterjee of HubPages tells us, “On our end we think that it’s really too soon to tell, as after any large update, all the traffic undergoes pretty serious upheaval. All these numbers will be very different in about 7/10 days. What is worrying is that the update did not seem to do what it was supposed to, which was penalize poor content. The fact that e-how has remained untouched is proof of that!”

“Our CEO, Paul Edmondson, says: ‘We are confident that over time the proven quality of our writers' content will be attractive to users. We have faith in Google's ability to tune results post major updates and are optimistic that the cream will rise back to the top in the coming weeks, which has been our experience with past updates.’”

EzineArticles CEO Chris Knight wrote a blog post about how his site was affected, and what he is doing to try and get back up in the rankings.  "While we adamantly disagree with anyone who places the 'Content Farm' label on EzineArticles.com, we were not immune to this algorithm change," he wrote. "Traffic was down 11.5% on Thursday and over 35% on Friday. In our life-to-date, this is the single most significant reduction in market trust we've experienced from Google."

To try and get back into Google's good graces, EzineArticles is doing things like reducing the number of article submissions accepted by over 10% - rejecting articles that "are not unique enough". It will no longer accept article submissions through a Wordpress Plugin. They're reducing the number of ads per page. They're raising the minimum article word count to 400. They're "raising the bar" on keyword density limits. They're removing articles considered "thin and spammy", and will put greater focus on rejection of advertorial articles. Submitted articles are required to be exclusive to the submitter (but won't be required to be unique to EzineArticles). 
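
As a rough illustration of how rules like these can be enforced mechanically, here is a sketch of a submission checker. The 400-word minimum comes from the article; the 2% keyword-density ceiling is an invented placeholder, not EzineArticles' actual threshold.

    # Toy submission checker in the spirit of the changes described above.
    def check_article(text, keyword, min_words=400, max_density=0.02):
        words = text.lower().split()
        if len(words) < min_words:
            return f"Rejected: only {len(words)} words (minimum {min_words})."
        density = words.count(keyword.lower()) / len(words)
        if density > max_density:
            return f"Rejected: keyword density {density:.1%} exceeds limit of {max_density:.0%}."
        return "Accepted."

    print(check_article("buy cheap widgets now " * 200, "widgets"))  # fails the density check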

Knight also considered adding a Nofollow attribute to links in the article, as “icing in the cake to further prove to Matt Cutts and Google” that they’re not trying to “game Google” or let their authors do so. Interestingly enough, Knight decided to hold off on adding Nofollow after complaints from authors.
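
For reference, nofollow is a standard rel attribute value on links; marking up contributors' links this way tells search engines not to count them as endorsements. A typical example (the URL is hypothetical):

    <!-- rel="nofollow" asks search engines not to pass link credit through this link -->
    <a href="http://example.com/" rel="nofollow">Author's website</a>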

The first author to complain, in fact, even said, “Not sure what Pollyanna planet you're from but let me assure you, EzineArticles does not exist 'to provide information that is beneficial to the readers.' EzineArticles is a business, not a government organization or charity. EzineArticles was created to make its owner(s) money. There's nothing wrong with that, but don't fool yourself into thinking they're a bunch of do-gooders. By the same token, the majority of us who publish on EzineArticles don't do so to benefit readers. We too are running businesses, and EzineArticles helps our own websites get traffic and ultimately sales."

Yeah, I think Google frowns upon that whole “we’re not writing to benefit readers” thing.

Another element of this whole algorithm update is that so far, it is only active in the U.S. Once Google expands it into other countries, the sites that have seen their traffic drop off so far may be in for an even bigger shock.
By the way, there are a lot more sites impacted than those discussed in this article.

In an interview with On the Media, Google’s Matt Cutts was asked: “You have so much market share; you are so much the only game in town at this point that you can enforce these things unilaterally, without hearing or due process, putting the whole online world more or less at your mercy. Is there any process by which the people who are affected by algorithm changes and updates can make a case for themselves?”

Cutts responded:
We have a webmaster forum where you can show up and ask questions, and Google employees keep an eye on that forum. And, in fact, if you've been hit with a, what we call a “manual action,” there’s something called a “reconsideration request,” which essentially is an appeal that says, ah, I'm sorry that I was hiding text or doing keyword stuffing and I've corrected the problem, could you review this?

And over time, we've, I think, done more communication than any other search engine in terms of sending messages to people whose site has been hacked or who have issues and then trying to be open so that if people want to give us feedback, we listen to that.

Cutts later said, “Any change will have some losses, but hopefully a lot more wins than losses.”
It does seem that Google may be willing to acknowledge some errors in judgment on the matter, if this exchange between Cutts and Kahney is any indication.

Were there more wins than losses with this update? How's the search quality looking to you? Tell us what you think.


Friday, February 25, 2011

Social Network for Kids

A Safe Social Network for Kids

Since it seems that nearly everyone is on Facebook, it is natural that kids would want to get on the site that their older siblings, parents, and even grandparents consider fun. The content on Facebook, however, is not geared toward children. In fact, Facebook has a policy that prevents children under the age of 13 from joining the site, and it strongly recommends parental participation with minors. Since kids always find ways to do what they want, many children are joining the site by lying about their age, which is a growing concern for parents. First Lady Michelle Obama is one of those concerned parents and even said on The Today Show that she didn't want either of her girls on Facebook.

Social networking site Everloop hopes to provide a solution for both parents and children. It is said to be just like Facebook but with content geared toward children between the ages of 8 and 13. In addition, it contains controls that allow parents to monitor what their children are doing on the site. "One of the things that Everloop is solving is really giving children under the age of 13 their own social utility, or what we call their own social graph," said Tim Donovan, Everloop's CSO.
Everloop is in compliance with the Children's Online Privacy Protection Act (COPPA), which means that a parent must authenticate a child before the child is permitted on the site. This compliance also ensures that Everloop cannot gather personal information from kids for marketing purposes. Getting parental consent additionally helps prevent sexual predators from obtaining access to the site.
For children, the experience is very similar to that of a user on Facebook. Children have access to a video network and game arcade, and they can create and join groups based on entertainment, learning, and more. They can also customize their own profiles and can take part in IM chatting, SMS, and VoIP.
"Right now, a child under 13, their community experience is disparate. So, they go to YouTube to watch videos, they'll go to Nickelodeon or Disney XD to play casual games, and they'll sneak onto Facebook to be part of a larger social experience, so we're collapsing all of that into one experience on Everloop," said Donovan.
Parents can also customize what their child does on Everloop and give him or her the power to email, IM, chat, etc. They can also enable reporting settings that notify them when their child takes certain actions on the site.
“How do I keep my child’s privacy and information protected? How do I have more insight into the activities that my child is engaged in when they’re online? How do I have more controls over their behavior and their engagement in the social community? So, Everloop solves all those problems for parents,” Donovan points out.
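To make that parental-control model concrete, here's a hypothetical Python sketch of the kind of per-child permission record such a site might keep. This is our illustration, not Everloop's actual code; every field name is invented:

```python
from dataclasses import dataclass, field

@dataclass
class ChildPermissions:
    """Parent-managed switches for one child's account (invented model)."""
    parent_verified: bool = False  # COPPA: nothing allowed until a parent authenticates
    can_email: bool = False
    can_im: bool = False
    can_chat: bool = False
    notify_parent_on: set = field(default_factory=lambda: {"new_friend", "group_join"})

    def allowed(self, action: str) -> bool:
        # Every action is gated on parental verification first.
        return self.parent_verified and getattr(self, f"can_{action}", False)

perms = ChildPermissions(parent_verified=True, can_im=True)
print(perms.allowed("im"), perms.allowed("email"))  # True False
```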
Not only does Everloop want to give parents control and make the process convenient for them, but it also wants to let children feel like they have control as well. If children didn’t have some level of power, they would not be interested in the site at all. Donovan also said that Everloop has to be as cool as Facebook in order to attract kids.
“The bottom line is this, if it’s not cool, kids won’t use it. So, coolness comes from being relevant, coolness comes from having the bleeding edge of technology, coolness comes from… thousands and thousands of opportunities and experiences,” he said.
Everloop also recently announced that it is partnering with i-Safe, a leading provider of media literacy and digital citizenship education, in an effort to bring social media to the classroom. The two organizations will begin to roll out their platform in April.

Sunday, February 13, 2011

Google Vs. Bing: Competition Is Heating Up

Google Vs. Bing: Competition Is Heating Up

You've probably heard by now that Google recently accused Microsoft's Bing of stealing its search results. Bing (sort of) denied the claim but came back and accused Google of click fraud, a practice often associated with spammers. A back-and-forth stream of strong words and accusations has resulted, beginning what appears to be a long, drawn-out saga.



It all began when Danny Sullivan published an article exposing a Google experiment in which it tested Bing. According to Michael Gray of Atlas Web Service, the test essentially showed that Bing used data from Google's toolbar to duplicate Google's search results, a move that Google considers "copying."
Gray went on to explain to WebProNews that the accusation of click fraud is “a little far-reaching.” Although the technology was the same, it didn’t cost Bing any money since there weren’t any PPC campaigns involved. He said that if Google did suspect that Bing was copying them, this method was the only way it would have found out the truth. So, who’s right, and who’s wrong? Gray believes that both companies are in the wrong to an extent. Based on his analysis, Microsoft was wrong to take the data from the toolbar and use it in their ranking algorithm without testing it further.

Google's wrongdoing, on the other hand, stems from past events. As he explains, Ask introduced universal search long before Google did, and Yahoo introduced Yahoo Instant long before Google released its version. In addition, Gray points out that Google seems to make product announcements at other people's press events and play it off as a coincidence. Although Google typically says that it has been working on these products for long periods of time, some people interpret its actions in each of these scenarios differently. The timing of this latest turn of events seemed to be somewhat of a coincidence as well, since Sullivan's article was published just before both companies were set to take the stage at the Farsight Summit.
"Google's playing hardball and they're a serious, competitive company; they like to hold onto their market share, and they're not taking things lying down," he said. As for the lesson for marketers in all this, Gray said that marketers need to expand their efforts beyond SEO to include other areas, such as social media. He also pointed out that this situation is "good news for Bing" because it means that Google considers them a viable competitor.
How do you think this saga will play out, and how will it impact the search industry?

Thursday, February 3, 2011

Advancing Strategy for Social Marketing

Advancing Strategy for Social Marketing

"When it comes to digital marketing I believe marketers need to be more strategists & research minded than idea evaluators and implementers."
After discussing social media this year with senior marketers from several large brands, the implementer reference in the above tweet by Shiv Singh really resonates with me.
More brands are taking (social) community management activities back in house while seeking outside expertise to continue guiding decisions around social strategies and applications.
When it comes to the day-to-day of social marketing, corporate competence is rising -- and the "yeah, I get that, but what's next?" mentality is placing a higher demand on strategy with expectations of research (or at least experience) to back it up.
As I've been preparing to speak about Facebook marketing with custom applications at next week's Online Marketing Summit, I've found that a common thread in the key takeaways pertains more to strategy than to turnkey tactics. The following is a preview of a couple of key topics I'll discuss as part of that presentation.
Game Mechanics for Custom Facebook Applications
For those of you sick of hearing about it, I'll start by saying game mechanics are not a magic silver bullet -- and I took great delight in hearing Gowalla CEO Josh Williams proclaim "we don't need no stinkin' badges" at a conference last month.
However, like Williams, those who have an established understanding of game mechanics are better positioned to get ahead. Why? Because it's a matter of better knowing how human behavior works.
If you're aware of certain ingredients that foster a higher propensity for sharing a social experience on Facebook, then you may realize higher fan growth and engagement as a result of implementation.
I touched on the Sanrio/Hello Kitty gifts application as an example of this when discussing social intelligence for Facebook marketing.
Another recent and impressive implementation of game mechanics (and overall digital strategy) is Vail Resorts' EpicMix, which is also promoted on the company's Facebook page.
Although the application doesn't reside on Facebook, the Connect functionality takes full advantage of Facebook sharing via passive, automated check-ins at six separate ski resorts, all enabled by an RFID chip embedded in your ski pass.
"Passive" means you don't need to pull out a mobile device for checking in. Updates to your Facebook feed are automatically posted based on your location with the pass, and one-time Facebook authorization.
A leading game mechanic in play for EpicMix is the use of more than 200 ski pins (digital "stinkin' badges") you can earn based on the locations you ski at each resort, total feet of elevation skied, and more. Although Vail Resorts CEO Rob Katz wasn't specific about the adoption rate when asked last month, he was very clear that the number of users signing on to share in Facebook exceeded expectations.
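As a thought experiment, the pin mechanic might reduce to something like the sketch below, where each passive RFID check-in updates a skier's running totals and pins unlock when thresholds are crossed. The pin names and thresholds are invented; this is not Vail Resorts' code:

```python
# Invented EpicMix-style pin rules: each maps a pin name to an unlock condition.
PIN_RULES = {
    "Peak Bagger":  lambda s: len(s["resorts_visited"]) >= 3,
    "Vertical 25K": lambda s: s["vertical_feet"] >= 25_000,
}

def record_checkin(stats: dict, resort: str, vertical_feet: int) -> list:
    """Apply one passive RFID lift check-in; return any newly earned pins."""
    stats["resorts_visited"].add(resort)
    stats["vertical_feet"] += vertical_feet
    new_pins = [name for name, rule in PIN_RULES.items()
                if rule(stats) and name not in stats["pins"]]
    stats["pins"].update(new_pins)
    return new_pins  # a real system would also post these to the skier's Facebook feed

skier = {"resorts_visited": set(), "vertical_feet": 0, "pins": set()}
print(record_checkin(skier, "Vail", 1_200))  # [] -- no threshold crossed yet
```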
Game on.
Strategic Modeling for Social Strategies
While game mechanics address specific strategies from a human behavior perspective, the bigger and equally important picture pertains to how all elements of social marketing work together for the good of a business.
A valuable but often overlooked practice is to adopt a model that provides a framework for strategy. There are a range of options for strategic models, but the one I follow is a layered ("Four Cs") approach:

• Content: This is the foundational element, focusing not only on the type of content (video, infographic, written, etc.) but also on how to apply supporting research to guide its development and/or justification.

• Context: Think of this second layer as the platforms enabling the display and distribution of your content. Facebook, for example, would be an element of context in this model.

• Campaigns: This layer puts the context into action, addressing key variables around planning, implementation, supporting applications, visibility efforts, communication, and measurement.

• Community: As the top layer, the strategic focus centers on loyalty achieved through specific campaigns, advocacy, or customer experiences. Community should be viewed as long-term, with the expectation of learning that can be applied to future iterations of strategy and research.

Practically speaking, we as marketers should be both implementers and "idea evaluators." But as strategists, we're called to a higher accountability -- one that distinguishes originality from repurposing, and activity from productivity. 





Friday, January 14, 2011

Will Search Drive Mobile Ad Revenues?

Will Search Drive Mobile Ad Revenues?

Last month, BIA/Kelsey released its annual mobile forecast. It projects U.S. mobile ad revenues to grow from $490 million in 2009 to $2.9 billion in 2014, a compound annual growth rate of 43 percent. But more interesting than the total revenue pie is the breakdown of the formats driving this growth. SMS and display ads currently lead in revenue but are projected to be eclipsed by the faster-moving mobile search ad category over the next five years.
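As a quick sanity check on that headline number, the growth rate follows directly from the two endpoint figures -- a back-of-the-envelope calculation, not BIA/Kelsey's model:

```python
start, end, years = 0.49e9, 2.9e9, 5     # $490M in 2009 -> $2.9B in 2014
cagr = (end / start) ** (1 / years) - 1  # compound annual growth rate
print(f"{cagr:.1%}")                     # 42.7% -- the ~43% BIA/Kelsey cites
```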


 

By the Numbers
So why is that? There are intricate formulas behind these projections, unique to the way each of these formats is bought and sold. Inputs include search volume, ad coverage, page views, CPCs, CPMs, etc.
Aggregate revenue for top mobile ad networks is also used to confirm figures. Along these lines, Google's announced $2 billion global mobile run rate was affirming, given its estimated 60 percent share of the U.S. mobile ad market (including AdMob).
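BIA/Kelsey doesn't publish its formulas, but a stripped-down projection built from the inputs named above might look like the sketch below; every number here is a placeholder chosen only to show the shape of the calculation:

```python
def projected_search_revenue(queries: float, ad_coverage: float,
                             ctr: float, cpc: float) -> float:
    """Revenue ~= queries shown ads x click-through rate x price per click."""
    return queries * ad_coverage * ctr * cpc

# Placeholder inputs for a single year -- illustrative only:
rev = projected_search_revenue(
    queries=20e9,      # annual mobile search queries
    ad_coverage=0.25,  # share of queries that display ads
    ctr=0.02,          # mobile ad click-through rate
    cpc=0.50,          # average cost per click, USD
)
print(f"${rev / 1e9:.2f}B")  # $0.05B with these made-up inputs
```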
But looking at why search ad revenue will accelerate so rapidly, a few interesting theories arise. First, the mobile web is expected to grow at a faster pace than the native apps that have so far ruled the smartphone environment.
Because search is the front door to browser-based experiences, this bodes well for search volume and thus revenues. Add in the fact that the immediacy and commercial intent of mobile users drive search ad CTRs and CPCs higher than their desktop equivalents.
Back to the premise that the mobile web will grow faster than apps, this is a bone of contention as the industry-wide "apps vs. mobile web" debate rages on. This is also one of the increasing points of friction between Apple and Google.
Google's core search business compels it to push for a world where the browser is the front door. Comparatively, Apple's app-centric universe spreads content and features into little self-defined buckets where search isn't quite as necessary.
This is much of the reason behind Google's outspoken support for the mobile web and its own practice to "develop first" for the mobile web with products like Gmail, Latitude, YouTube, and others.

World Wild Web
But more so than Google's sway and the rest of the factors above, it could really just end up being a combination of economics and improving mobile browsers that push users and developers toward the mobile web.
Things like HTML5 allow developers to build mobile websites (a.k.a. web apps) with features previously reserved for native apps. And it's much cheaper to build a web app and reach many more users across platforms.
As these factors take hold, the point is that we'll see more and better content fill the mobile web. For now, it resembles the Wild West environment we saw on the desktop 15 years ago: content is lacking, hard to find, and under-optimized.
Mobile ad network Chitika reports that only 4 percent of top online domains have optimized mobile sites. It's no wonder most mainstream mobile users flock to app stores instead.
But this could all change as many of the factors above coalesce and as more content comes online. In parallel, we'll also see mobile users get better and more comfortable at searching the mobile web -- just like they did on the desktop over the past decade.
And don't forget parallel technologies that will make searching easier, such as voice and visual search. These include barcode scanners, voice input, and other methods that are more intuitive than tapping a tiny keyboard.
Google, again in support of boosting mobile search volume, has invested heavily in these areas, with products such as Goggles and Voice Actions for Android. It even announced that a surprisingly high 25 percent of its mobile searches are executed by voice.
To tie all of this to a monetization engine, Google is increasingly adding options to AdWords to build mobile search campaigns. In 2010 it launched mobile pay-per-call ads and hyperlocal ad targeting. 

Repeating History
Through all of this, we'll start to see the mobile web become a much more functional, substantive, and friendly place to search. Monetization will follow. 

Don't forget, desktop computing over the past five years shifted from being a client-centric environment to one that's more browser-based, where content and software reside in the cloud. We'll see a similar shift in mobile.
Not to be so down on apps -- they aren't going away any time soon. If anything, Apple's move to bring them to tablets and now the desktop will ensure a solid future. But the mobile web will see faster growth.
That of course means more search volume. Combined with higher CTRs and CPCs than desktop search, it becomes a matter of arithmetic to plot a fairly healthy roadmap for mobile search ad revenue.
