Saturday, June 7, 2014

Backlinks = Rankings, Rankings = Traffic – Deal With It


Link building, done correctly, is hard work. It's laborious and filled with lots of rejection. In many ways, it's like telemarketing – nobody likes it, but it pays off. In fact, link building still works better than anything else to boost organic rankings.

Google's Matt Cutts recently confirmed the continuing value of links in two Webmaster Help videos. The key quotes:
...backlinks…are a really, really big win in terms of quality for search results ... backlink relevance still really, really helps in making sure we return the best, most relevant, most topical set of search results.

...backlinks still have many, many years left in them ... over time backlinks will become a little less important ... we will continue to use links in order to assess the basic reputation of pages and websites.
So what are the takeaways from these videos? Most commenters picked up on natural language processing and authorship as probable ranking factors moving forward, without acknowledging the core message of the videos: links still matter a lot. And they will continue to matter for many years.
I don't blame anyone for chalking these videos up to Google FUD (fear, uncertainty, and doubt), but the evidence doesn't stop there. I'm sure that you're familiar with the Penguin algorithm. How about manual penalties, for unnatural links? If links didn't matter, would there be an entire industry dedicated to link audits and sanitizing backlink profiles? Would Google spend so much time and resources battling spammy links? Of course not.

Still, there is a whole contingent of "link deniers" proclaiming that "link building is dead." These folks are just as fervent in their beliefs as the "truthers" and "birthers," despite factual evidence to the contrary. (There's a pretty good chance you will read their comments, below.)
So why are so many people running away from the single most important task in building organic rankings? The answer is pretty simple. Not only is link building hard, but if done improperly, it can result in a penalty and in the most extreme cases can even get you sued!
Can you blame SEO professionals for running away from that hot mess? Of course you can – and you should.

Every marketing campaign focused on building organic rankings needs a link building component. Thousands, if not millions, of pages of great content are published on the web daily – most will never be seen by human eyes. Great content alone, in a competitive niche, rarely ranks without links.
There's a big difference between link building (baiting, earning) and link spamming. The kinds of links that matter are the ones that are editorially given. Links with innate value, not necessarily SEO value. These links require human intervention for placement. A link that can be dropped automatically by anyone has little value and often leads to abuse and trouble.
So, what are some effective techniques for building links in 2014? Actually, the same strategies advised by Cutts way back on March 4, 2010 still hold up today:
  • Create controversy: Use it sparingly, like spice. The occasional rant is best; if over-used, it loses its effectiveness.
  • Use humor: Offered as a "softer" alternative to controversy. Can be equally effective – especially if original. (The Oatmeal has built a franchise on funny.)
  • Participate in blog and forum communities: Not as a spammer, but as an interested community member who gives back to the community by answering questions that help people. This builds credibility and opens up opportunities to attract links.
  • Publish original research: Doing a little work to dig into a subject can get a lot of links.
  • Use social media: Think about where your target audience spends their time. Is it Facebook, Twitter, Instagram? You need to be there as well. Like blog and forum communities, getting to know people via social media opens up link opportunities.
  • Create a "Top X List": Like controversy, this is best used sparingly or it can get old fast.
  • Blog frequently and establish yourself as an authority in your field: If authorship had been in place when this video was produced, I'm sure that would have been mentioned, as well.
  • Create how-tos and tutorials: They may not attract a ton of links, but a few good links can have a huge impact – especially on the long tail. These are also a natural for video.
  • Create a useful product and give it away for free: Firefox extensions, Chrome extensions, WordPress plugins, anything open source.

Bottom Line
Cheap, easy, automated link spamming is no longer an option for those in it for the long haul. (Notice that I didn't say that it's dead or doesn't work.)

Editorial link building is alive and well and more powerful than ever before. Getting position one for a keyword is no longer the only KPI to measure, but it's still an important metric to pay attention to as a means to drive organic traffic to your website.
Article resource: http://searchenginewatch.com/article/2347618/Backlinks-Rankings-Rankings-Traffic-Deal-With-It

Thursday, December 19, 2013

Improving URL Removals on Third-Party Sites



Wednesday, December 18, 2013 at 6:55 AM

Webmaster level: all

Content on the Internet changes or disappears, and occasionally it's helpful to have search results for it updated quickly. Today we launched our improved public URL removal tool to make it easier to request updates based on changes on other people's websites.

This tool is useful for removals on other people's websites. You could use this tool if a page has been removed completely, or if it was just changed and you need to have the snippet & cached page removed. If you're the webmaster of the site, then using the Webmaster Tools URL removal feature is faster & easier.

How to request a page be removed from search results

If the page itself was removed completely, you can request that it be removed from Google's search results. For this, it's important that the page returns the proper HTTP result code (403, 404, or 410), has a noindex robots meta tag, or is blocked by robots.txt (blocking via robots.txt may not prevent indexing of the URL permanently). You can check the HTTP result code with an HTTP header checker. While we attempt to recognize "soft-404" errors, having the website use a clear response code is always preferred. Here's how to submit a page for removal:
  1. Enter the URL of the page. As before, this needs to be the exact URL as indexed in our search results. Here's how to find the URL.
  2. The analysis tool will confirm that the page is gone. Confirm the request to complete the submission.
  3. There's no step three!
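To illustrate the status-code requirement above, here's a minimal Python sketch (my own illustration, not part of Google's tooling; function and constant names are hypothetical) that classifies an HTTP result code the way the removal flow expects, with 403, 404, and 410 all counting as a clear "page gone" signal:

```python
from http.client import responses

# Status codes that signal the page is gone and eligible for removal.
REMOVAL_CODES = {403, 404, 410}

def removal_eligible(status_code: int) -> bool:
    """Return True if the HTTP status code is a clear 'page gone' signal."""
    return status_code in REMOVAL_CODES

def describe(status_code: int) -> str:
    """Human-readable verdict for a status code, e.g. '410 Gone: ...'."""
    name = responses.get(status_code, "Unknown")
    verdict = ("eligible for removal" if removal_eligible(status_code)
               else "still live (not eligible)")
    return f"{status_code} {name}: {verdict}"
```

You can fetch a live page's status code with an HTTP header checker (or `curl -I http://example.com/removed-page`) and feed it to `describe()` before submitting the request.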

How to request a page's cache & snippet be removed from search results

If the page wasn't removed, you can also use this tool to let us know that text on a page (such as a name) has been removed or changed. It'll remove the snippet & cached page in Google's search results until our systems have been able to reprocess the page completely (it won't affect title or ranking). In addition to the page's URL, you'll need at least one word that used to be on the page but is now removed. You can learn more about cache removals in our Help Center.
  1. Enter the URL of the page which has changed. This needs to be the exact URL as indexed in our search results. Here's how to find the URL.
  2. Confirm that the page has been updated or removed, and confirm that the cache & snippet are outdated (do not match the current content).
  3. Now, enter a word that no longer appears on the live page, but which is still visible in the cache or snippet. See our previous blog post on removals for more details.

You can find out more about URL removals in our Help Center, as well as in our earlier blog posts on removing URLs & directories, removing & updating cached content, removing content you don't own, and tracking requests + what not to remove.

We hope these changes make it easier for you to submit removal requests! We welcome your feedback in our removals help forum category, where other users may also be able to help with more complicated removal issues.


Wednesday, September 4, 2013

How to Use Google Trends for SEO


Chuck Price, September 3, 2013



Google Trends is one of the best and most versatile tools available for SEO. It is the marketing equivalent of the Leatherman or Swiss Army knife. If you could only use one SEO tool to develop an Internet marketing campaign, this product would be a serious contender.

Working with Google Trends

When performing a search on Google Trends, you have the option to set four variables or parameters (the default is listed first in each row below):

Web Search – Image search – News Search – Product Search – YouTube Search
Worldwide – Option to choose a specific Country
2004-Present – Past 7 Days – 30 Days – 90 Days – 12 Months – Choose a Year
All Categories – Arts & Entertainment – Autos & Vehicles – Beauty & fitness – Books & literature – Business & industrial – Computers & electronics – Finance – Food & drink – Games
You can compare up to five search terms or groupings at one time, with up to 25 search terms in each grouping.

For example:

pen + pencil + paper (grouping 1)
stapler + tape + notebook + ruler (grouping 2)
eraser + paper clip (grouping 3)


By using the + sign between your search terms, you are telling Google that you want to include searches for pen or pencil or paper.

Google also displays Hot Searches and Top Charts in Google Trends, listing the top searches of the day as well as popular searches by category.


Having all of this data available is great, but knowing what to do with it is even better. Following is a guide on how to use this information for SEO.


Keyword Research
Since Google Trends doesn't give actual search numbers, it works best when used in combination with the Keyword Planner. Google Trends will show a "normalized" or relative level of interest over time for a prospective keyword phrase. It also allows you to compare the level of interest among potential target phrases.


Let's say you're selling car parts. When does interest in car parts peak? Which potentially drives more traffic: the search phrase "car parts" or "auto parts"?

Product searches have more than doubled since fall 2010. Clearly, the interest is there, but you should do a competitive analysis before jumping into any space.

Geo-Targeting

Google Trends breaks down the search data by region. As you can see below, there is some level of interest in auto parts across the entire U.S., with the greatest level coming from Georgia and Florida.

Drill down further and you will see that Atlanta is a particularly strong market:


If you're doing local SEO or geo-targeted PPC, this data is invaluable.

News Jacking

Newsjacking is suddenly all the rage in SEO. According to David Meerman Scott, it's "the process by which you inject ideas or angles into breaking news, in real-time, in order to generate media coverage for yourself or your business."

If Hot Searches didn't exist, someone would create it for newsjacking. The newsjacking formula is a simple one:

Choose a trending topic.
Blog about it.
Tweet it (using the established hashtag).
Don't be a moron (e.g., don't try to capitalize on tragedy).
Here's a great example of newsjacking in Bongo Bongo land.

Content Creation


Top Charts is the perfect resource for developing content ideas that people are actually interested in. Sticking with the car parts theme, navigate to Car Companies, click on "BMW", then click on "explore" in the right column.


Looks like a blog post about the BMW i3 and/or the BMW electric car would garner some interest. If the term "Breakout" appears under Rating, searches for that phrase have jumped by more than 5,000 percent.

Link Building

Links are still a primary driver of rankings. By creating content that people are looking for and want to read, you will attract links. Links are a measure of success when reviewing the outcome of your content marketing efforts.

Video Content

Poop. That's right; poop is the top result when I do a Google Trends search for "YouTube" with the search parameter set to YouTube:


I sure wasn't expecting to find an explosion of YouTube Poop (+250 percent since '08) and that's precisely the point of using this tool for video content research. Congratulations to California, with a search volume index of 100 on this one.

To play this game at home:

Navigate to Google Trends.
Enter your keywords.
Change "Web Search" to "YouTube" search.
Brainstorm:
Is there an idea that you can use for your niche?
Is there a trend that you can capitalize on?
This data may also be used for video optimization:

Creating great titles.
Using the right tags.
Optimizing descriptions.

Brand Monitoring
This one only works for big brands with sufficient search volume. In the case cited below, three of four competitors are static, but one company is clearly in the zone: AutoZone.


Takeaway

Google has a voracious appetite for fresh topical content. Google Trends is the single best tool available to develop content ideas that will garner traffic and links. If you haven't been using this tool for SEO purposes, you should check it out now.

Tuesday, May 28, 2013


Matt Cutts On 10 New SEO Changes At Google In Next Few Months

Yesterday, Google's Matt Cutts did something he doesn't often do - he pre-announced changes Google will be implementing to the ranking and indexing algorithms in the next few months. Specifically, he mentioned about ten changes coming to Google's search results and algorithms over this summer - in the "next few months" he said.
Of course, Matt, Google's head of search spam, adds a disclaimer that timelines and priorities may change between now and then - but this is what is scheduled currently.
Here is the video:

Now, I go into detail on each of the ten points at Search Engine Land but here is the summary of those details:
  1. Major Penguin Update
  2. Advertorial Spam
  3. Spammy Queries Being Looked At
  4. Going Upstream At Link Spammers
  5. More Sophisticated Link Analysis
  6. Improvements On Hacked Sites
  7. Authority Boost In Algorithm
  8. Panda To Soften
  9. Domain Clusters In SERPs
  10. Improved Webmaster Communication
Since I will be offline the next two days, I may do more detailed scheduled blog posts about each one of these. For now, read Search Engine Land and watch Matt's video.
Forum discussion at WebmasterWorld & Hacker News.
Article reference: http://www.seroundtable.com/google-seo-changes-16782.html

Google's Major Penguin Update

Google's Major Penguin Update Coming In Weeks. It Will Be Big.....


On Friday, Google's head of search spam, Matt Cutts, announced on Twitter that the Penguin update we are expecting this year will be coming in the next few weeks.
Matt Cutts said, "we do expect to roll out Penguin 2.0 (next generation of Penguin) sometime in the next few weeks."
This has sent shockwaves through the webmaster and SEO industry over the weekend. We know the next-generation Penguin update is a major revision to the existing one; Matt said the previous ones were minor updates. To take you back, we had an update on May 24, 2012 and October 5, 2012. Matt said on Twitter that those were more minor (he would have named them 1.1 and 1.2) and that Google is naming this new update version 2.0.
We are calling it the 4th update to Penguin, but yes, this is expected to be huge. We are past the anniversary of the Penguin update, and many SEOs and webmasters have yet to recover.
Now with the next generation update, many SEOs are hopeful of recovery but terrified that their efforts will end up being futile. Why? Well, even if they did manage to clean up their sites and do everything to warrant a release of the initial Penguin algorithm, with the new algorithm in place, who knows what else they may have triggered.
Danny Sullivan has an excellent write up on this Penguin release and the history around it.
Trust me, I will be all over this when I see signs in the forums about this update. So stay tuned, brace yourself and trust me - webmasters will survive and grow from this.
Forum discussion at Google Webmaster Help, WebmasterWorld and DigitalPoint Forums.
Update: Here is a video from Matt Cutts where he talks about Penguin 2.0, and many other topics. It was released today:

Article reference: http://www.seroundtable.com/google-penguin-four-16775.html

Wednesday, April 17, 2013

Link Networks: Don't Build Your SEO Strategy on a House of Cards

 
Links are important for rankings. We all know that.
However, in the rush to get those rankings (that lead to converting traffic of course) webmasters and business owners can be tempted to take shortcuts for easy wins.
One of the most popular (and dangerous) ways that you can grab up a bunch of links quickly is by using a link network (also called a blog network, site network, or sometimes article network). It's one-stop shopping for links in large quantities.
Sounds great, right? You deal with one person, give them the keywords you want to rank for, and you can get 500 links tomorrow.
Again, an easy win – until you remember that in link building, there really are no easy wins.
Here's how you can better understand what a network is, how to identify networked sites, and whether those easy links are too good to be true.

What is a Link Network?

It seems there is mass confusion about link networks. Heck, even in my own office, we'll occasionally argue about whether we've actually found a true network, and if we have, we'll argue about whether it's a bad one.
Simply put, a link (or site/blog/article) network is a group of sites that are connected. They can be owned by one person or by multiple people, and their connections can be as obvious as a proudly displayed badge identifying the site as a member of X network, or as covert as a footprint uncovered only by lots of digging.
From my experience, there are many immediate site tells that indicate a site may be a member of a network:
  • Language on the site. From "Proud partner in ABC Network" to "See our other networked sites," the key here is the wording about networks.
  • Network badges.
  • Page that lists a ton of other sites. This can be linked with the anchor "Friends" or "Partners" and doesn't always indicate a network, but it does indicate the need for attention.
There are a few others that require some digging once you think sites are connected as network members. I mention these because my experience has been that many webmasters won't be upfront with you and will offer you links on various sites while swearing they aren't connected in any way.
  • Same or very similar template used for multiple sites.
  • Same Google Analytics number or Google AdSense number used. You can use ewhois for this. 
  • Same site owner for loads of different sites.
  • Same IP address. This one is tricky when shared hosting is involved, but it can be useful. Of course, it doesn't guarantee that sites sharing an IP address are networked.
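The analytics/AdSense footprint check above can be automated. Here's a rough Python sketch (my own illustration; the regexes cover the classic "UA-…" Analytics and "pub-…" AdSense ID formats of that era, and all function names are hypothetical) that scans page source from several sites and flags account IDs they share:

```python
import re
from typing import Set

# Classic-era footprints: Google Analytics IDs ("UA-1234567-1")
# and AdSense publisher IDs ("pub-1234567890123456").
GA_RE = re.compile(r"\bUA-\d{4,10}-\d{1,4}\b")
ADSENSE_RE = re.compile(r"\bpub-\d{16}\b")

def extract_footprints(html: str) -> Set[str]:
    """Pull analytics/AdSense account IDs out of a page's HTML source."""
    return set(GA_RE.findall(html)) | set(ADSENSE_RE.findall(html))

def shared_footprints(pages: dict) -> Set[str]:
    """Return the account IDs that appear on more than one site."""
    seen = {}
    for site, html in pages.items():
        for fid in extract_footprints(html):
            seen.setdefault(fid, set()).add(site)
    return {fid for fid, sites in seen.items() if len(sites) > 1}
```

Feeding the raw HTML of each candidate site into `shared_footprints` will surface sites run through the same Analytics or AdSense account, even when the webmaster swears they aren't connected.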
There are also immediate webmaster tells if you're in contact.
  • Email signature lists 10 or more other sites.
  • Webmaster contacts you and says he has some great new sites for you to look at.
  • Webmaster sends you a list of sites he owns without you asking.
 

Recent Issues With Networks

Go search for [network penalized] or [network deindexed] and see those results. Scary stuff, isn't it?
Networks can get deindexed or their links can be devalued, which is the same result for you if you're basing your link foundation off those sites.
My biggest concern with networks is the quality, though. Unless it's a really good one, the quality of the sites connected tends to be pretty low. There's a lot of duplicate content, excessive cross-linking between sites, and duplicate social signals.
One other giant problem? Networked links aren't free. If you get caught using them, you're getting caught for buying links, basically.

What Does Google Think About All This?

Let's not forget to check Google's Webmaster Guidelines, which, as they relate to links, seem to be getting tighter and tighter. They warn against the use of link schemes and specifically mention "using automated programs or services to create links to your site."
Getting links on a network takes very little time and is obviously quite unnatural. If that's not a scheme, I'm not sure what is.
Even if you aren't dealing with a true network (and are instead dealing with a lone webmaster who has an unofficial one where he just happens to own 100 sites and can quickly add your link to each one) the key here is the shortcut taken to get links.
Many times you'll immediately know that the sites putting up your links are members of a network simply because you've contracted with someone for that exact service. However, as with anything, there are unscrupulous companies who will simply not inform you that the sites they're getting links on are networked. Therefore you need to ask questions and do your own research so that you aren't solely relying upon the word of someone who may not have your best interests at heart.
Now, I have no problem with people understanding risks and asking for risky techniques. My problem is with clients not understanding risk and getting talked into doing something detrimental without being properly informed of the danger.
Just as it isn't enough to say "buying links is risky" it's not enough to say that networks can be risky. Clients need to be informed of what can happen if the networks hosting their links get caught and deindexed.
My link building agency runs into networks every single day. Some of them are good but many of them are bad if not downright dangerous.
Many potential clients still ask for that kind of service, too, despite all the publicity surrounding some of the big ones getting caught. I'm much less paranoid about the ones we encounter doing discovery than I am about the ones that come to my link builders in a giant spreadsheet, unsolicited.
In my mind, the worst networks are a house of cards. Think about what would happen if your site ranked well off a network that got deindexed or penalized.
I've known people with sites made up of links that mainly came off a network and when it got hit, they lost a lot of money. Fast forward a year and some of them are still struggling to get back to where they were.

Link Networks: The Good (Or OK For Now)

As I said, some of them seem ok for now. I do worry about the future in case they get nailed, but it would be silly to say that all networked sites are worthless or dangerous.
Members of the network:
  • Are indexed in Google, ranking for key terms and their brand.
  • Are not excessively cross-linked to other network member sites.
  • Don't share the same IP address with the majority of the sites.
  • Aren't all owned by the same person or couple of people.
  • Don't seem to exist just to sell links.
There isn't a giant master list of members posted on every networked site. The majority of the sites have decent Google Toolbar PR. Searching for the network name doesn't generate tons of negative results.

The Not-So-Great Link Networks

A few of the members of the network are not indexed in Google. Some of them don't rank for any terms that you can find. Many of them share the same IP address. Searching for the network name generates lots of negative results.

The Really Bad Link Networks

Most sites have no Toolbar PageRank, are not indexed in Google, and if they are, they don't rank for their brand/URL or any snippets from the homepage. Most sites post a list of the other members and link to them. Duplicate or very thin content is obvious. Wording on the sites is poorly done.

Bottom Line

I am definitely very paranoid about networks and have become much less tolerant of them over the past year, but I do realize that getting a link from a few networked sites here and there isn't going to seriously hurt most sites that otherwise have decent links.
The real danger lies in only working with networks. Some sites might not be a member of a network when you secure a link on them but get bought and added, so it's not something that you can completely control, either.
Just be careful with networks, as they can be too good to be true. Just remember that whether you think a blog network is good or bad, it doesn't mean that the search engines will agree with you. What they think is the bottom line when it comes right down to it.

Article reference: http://searchenginewatch.com/article/2261642/Link-Networks-Dont-Build-Your-SEO-Strategy-on-a-House-of-Cards


Monday, April 8, 2013

SEO Basics: 8 Essentials When Optimizing Your Site




Basic search engine optimization (SEO) is fundamental. And essential. SEO will help you position your website properly to be found at the most critical points in the buying process or when people need your site.

What are search engines looking for? How can you build your website in a way that will please both your visitors/customers, as well as Google, Bing, and other search engines? Most importantly, how can SEO help your web presence become more profitable?

During the Introduction to SEO session at SES New York, Carolyn Shelby (@CShel), Director of SEO, Chicago Tribune/435 Digital, fully explained the extreme value SEO can deliver to a site, and stressed the importance of basic SEO using the following analogy: "Skipping the basics and spending all your time and money on social and 'fancy stuff' is the same as skipping brushing your teeth and showering, but buying white strips and wearing expensive cologne," Shelby said. Although the Introduction to SEO session was intended for industry newcomers, Shelby's tips offer important reminders for even experienced SEO professionals who have been optimizing sites for years.

What is SEO, Exactly?

The goal of foundational SEO isn't to cheat or "game" the search engines. The purpose of SEO is to:
  • Create a great, seamless user experience.
  • Communicate to the search engines your intentions so they can recommend your website for relevant searches.
1. Your Website is Like a Cake

Your links, paid search, and social media act as the icing, but your content, information architecture, content management system, and infrastructure act as the sugar and make the cake. Without them, your cake is tasteless, boring, and gets thrown in the trash.

2. What Search Engines Are Looking For

Search engines want to do their jobs as best as possible by referring users to websites and content that is the most relevant to what the user is looking for. So how is relevancy determined?
  • Content: Determined by the theme of the site, the text on the page, and the titles and descriptions given.
  • Performance: How fast is your site and does it work properly?
  • Authority: Does your site have good enough content to link to or do other authoritative sites use your website as a reference or cite the information that's available?
  • User Experience: How does the site look? Is it easy to navigate around? Does it look safe? Does it have a high bounce rate?

3. What Search Engines Are NOT Looking For

Search engine spiders only have a certain amount of data storage, so if you're performing shady tactics or trying to trick them, chances are you're going to hurt yourself in the long run. Items the search engines don't want are:
  • Keyword Stuffing: Overuse of keywords on your pages.
  • Purchased Links: Buying links will get you nowhere when it comes to SEO, so be warned.
  • Poor User Experience: Make it easy for the user to get around. Too many ads and making it too difficult for people to find content they're looking for will only increase your bounce rate. If you know your bounce rate it will help determine other information about your site. For example, if it's 80 percent or higher and you have content on your website, chances are something is wrong.
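There's no official stuffing threshold, but as a back-of-the-envelope check (a hypothetical sketch of my own, not a Google metric) you can measure what share of a page's words a single keyword occupies; a density in the double digits usually reads as stuffed:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Rough share of the page's words taken up by one keyword (0.0-1.0)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)
```

For example, `keyword_density("cheap widgets cheap widgets buy cheap widgets now", "cheap")` reports that "cheap" is 3 of 8 words on the page, far above what natural copy produces.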

4. Know Your Business Model

While this is pretty obvious, many people never sit down and focus on what their main goals are. Some questions you need to ask yourself are:
  • What defines a conversion for you?
  • Are you selling eyeballs (impressions) or what people click on?
  • What are your goals?
  • Do you know your assets and liabilities?
5. Don't Forget to Optimize for Multi-Channels

Keyword strategy is not only important to implement on-site, but should extend to other off-site platforms, which is why you should also be thinking about multi-channel optimization. These multi-channel platforms include:
  • Facebook
  • Twitter
  • LinkedIn
  • Email
  • Offline, such as radio and TV ads
Being consistent with keyword phrases within these platforms will not only help your branding efforts, but also train users to use specific phrases you're optimizing for.

6. Be Consistent With Domain Names

Domain naming is so important to your overall foundation, so as a best practice you're better off using sub-directory root domains (example.com/awesome) versus sub-domains (awesome.example.com). Some other best practices with domain names are:
  • Consistent Domains: If you type in www.example.com, but then you type in just example.com and it does not redirect to www.example.com, the search engines are seeing two different sites. This isn't effective for your overall SEO efforts, as it will dilute your inbound links: some external sites will link to www.example.com and others to example.com.
  • Keep it Old School: Old domains are better than new ones, but if you're buying an old domain, make sure that the previous owner didn't do anything shady to cause the domain to get penalized.
  • Keywords in URL: Having keywords you're trying to rank for in your domain will only help your overall efforts.
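For the www/non-www consistency point above, here's one common way to force the redirect: a minimal .htaccess sketch assuming an Apache server with mod_rewrite enabled (nginx and other servers use different directives, and example.com is a placeholder):

```apache
# Send every request for example.com to www.example.com with a
# permanent (301) redirect, so search engines index only one
# canonical version of the site.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```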

7. Optimizing for Different Types of Results

In addition to optimizing for the desktop experience, make sure to focus on mobile and tablet optimization as well as other media.
  • Create rich media content like video, as it's easier to get a video to rank on the first page than it is to get a plain text page to rank.
  • Optimize your non-text content so search engines can see it. If your site uses Flash or PDFs, make sure you read up on the latest best practices so search engines can crawl that content and give your site credit for it.

8. Focus on Your Meta Data Too

Your content on your site should have title tags and meta descriptions.
  • Meta keywords are pretty much ignored by search engines nowadays, but if you still use them, make sure they speak specifically to that page and are also formatted correctly.
  • Your meta description should be unique and also speak to that specific page. Duplicate meta descriptions from page to page will not get you anywhere.
Title tags should also be unique! Think of your title as a 4-8 word ad, so do your best to entice the reader so they want to click and read more.
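To audit titles and descriptions for uniqueness, here's a small Python sketch using the standard library's `html.parser` (class and function names are my own illustration) that pulls both out of a page's HTML:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from one page."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        # Accumulate text only while inside the <title> element.
        if self._in_title:
            self.title += data

def audit(html: str):
    """Return (title, meta description) for a page's HTML source."""
    p = MetaAudit()
    p.feed(html)
    return p.title.strip(), p.description.strip()
```

Running `audit()` over each page of a site and comparing the resulting pairs will quickly surface duplicated titles or descriptions.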

Summary

You should always keep SEO in the forefront of your mind, and always follow best practices. Skipping the basics of SEO will only leave your site's foundation a mess and prevent you from fully maximizing revenue opportunities.

Article reference: http://searchenginewatch.com/article/2259693/SEO-Basics-8-Essentials-When-Optimizing-Your-Site


Friday, March 8, 2013


6 Big Myths About SEO


Your understanding of the way Google works is probably three or four years out of date--and that's an eternity in Web time.


In the world of online marketing, misinformation abounds--and it gets compounded exponentially by an incredibly dynamic and rapidly evolving world. Most of the things you think you know (but don't) about search-engine optimization, or SEO, may have been true a few years ago but have changed; one of the following was always a myth.
Here are some of the myths you need to move beyond to get smarter about SEO.





Myth 1: Metatag Descriptions Help Your Rankings

Not anymore; in fact, metatags are no longer even indexed by Google and Bing. But don't ignore them altogether: Your metatags form the text that is displayed along with your link in the search results--and a more compelling description will compel more users to click on your listing instead of on others.
Here's an example of ours; the metatag is everything below the URL.



Myth 2: The More Inbound Links, the Better

False. In all the recent updates to Google's algorithm, the search giant has made it a core priority to have quality trump quantity. Gone are the days of having thousands of superlow-quality links driving up rankings; in fact, creating those links can look spammy and get your site penalized.

Focus on obtaining links from sites that are relevant to your products, services, or industry--and on having those links be surrounded by relevant text. A blog review about your "blue widget" that links to your site is far more valuable than a rogue link for "blue widget" stuck in the footer or sidebar of some site--even a highly ranked one.

Myth 3: PageRank Still Matters

Google's infamous PageRank (named after Google co-founder and now-CEO Larry Page, mind you) is a 1-to-10 ranking of the overall authority of every website; the bigger the number, the higher the rank. In years past, this seemingly all-powerful number dominated the attention of SEO experts.
But today, Google's algorithm has evolved well beyond any single indicator. PageRank still exists, and all things being equal, a higher PageRank trumps a lower one--but factors such as relevance and context matter, too.
As with inbound links: If you run a dental practice in Los Angeles, it's better to have a link from a site that reviews doctors and dentists in L.A., even if it has a PageRank of 4, than to have a paid link with no context in a huge site with a higher PageRank of 7. 

Myth 4: Google Prefers Keyword-Rich Domains

In years past, Google seemed to put a disproportionate amount of emphasis on keywords in the domain name (what you may think of as the URL). For example, vinylhousesiding.com would almost certainly be ranked first in a search for vinyl house siding.
Not anymore, says Google. If vinylhousesiding.com is in fact the more relevant, authoritative site on the topic, it will probably still rank first--but not because of its domain name alone.

Myth 5: Websites Must Be 'Submitted' to Search Engines

In 2001, yes, this was the case--indeed, this was the first service that my company, Wpromote, ever provided. But in 2012? Not at all. At this point, if there is any connection from any site to yours, your site will be quickly discovered by Google.

Note that being indexed is a far cry from achieving high rankings--but that initial step of submission is no longer needed or helpful.

Myth 6: Good SEO Is Basically About Trickery

False, false, false. Although there are still some SEO experts out there who go about their business trying to "trick Google," this is absolutely not the way to provide good, lasting SEO.
Good SEO is about creating a relevant, informative website, with unique content and a great user experience, and encouraging the sharing and distribution of great content to drive organic publicity and links back to your site.

In the end, this is exactly what Google explicitly wants to reward with high rankings--so it is anything but "tricking" the search engines.


Article reference: http://www.inc.com/michael-mothner/seo-marketing-myths.html
bestmarketingseo.com



Tuesday, December 25, 2012

Google Panda Update vs Google Penguin Updates


The SEO community has been abuzz this past week with the latest update from Google, named Penguin. Penguin came down the pipeline last week, right on the heels of the latest Panda update. Since most of the big updates in the past year have focused on Panda, many site owners are left wondering what the real differences between Panda and Penguin are. Here is a breakdown:

Google Panda Update Overview:

According to Google’s official blog post when Panda launched,

This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.

Basically, Panda updates are designed to target pages that aren’t necessarily spam but aren’t great quality. This was the first ever penalty that went after “thin content,” and the sites that were hit hardest by the first Panda update were content farms (hence why it was originally called the Farmer update), where users could publish dozens of low-quality, keyword stuffed articles that offered little to no real value for the reader. Many publishers would submit the same article to a bunch of these content farms just to get extra links.


Panda is a site-wide penalty, which means that if "enough" (no specific number) pages of your site were flagged for having thin content, your entire site could be penalized. Panda was also intended to stop scrapers (sites that republish other companies' content) from outranking the original author's content.

Here is a breakdown of all the Panda updates and their release dates. If your site's traffic took a major hit around one of these times, there is a good chance it was flagged by Panda:

1. Panda 1.0 (aka the Farmer Update) on February 24th 2011
2. Panda 2.0 on April 11th 2011. (Panda impacts all English speaking countries)
3. Panda 2.1 on May 9th 2011 or so
4. Panda 2.2 on June 18th 2011 or so.
5. Panda 2.3 on around July 22nd 2011.
6. Panda 2.4 in August 2011(Panda goes international)
7. Panda 2.5 on September 28th 2011
8. Panda 2.5.1 on October 9th 2011
9. Panda 2.5.2 on October 13th 2011
10. Panda 2.5.3 on October 19/20th 2011
11. Panda 3.1 on November 18th 2011
12. Panda 3.2 on about January 15th 2012
13. Panda 3.3 on about February 26th 2012
14. Panda 3.4 on March 23rd 2012
15. Panda 3.5 on April 19th 2012

Search Engine Land recently created this great Google Panda update infographic to help walk site owners through the many versions of the Google Panda updates.

Many site owners complained that even after they made changes to their sites in order to be more “Panda friendly,” their sites didn’t automatically recover. Panda updates do not happen at regular intervals, and Google doesn’t re-index every site each time, so some site owners were forced to deal with low traffic for several months until Google got around to re-crawling their website and taking note of any positive changes.

Google Penguin Update Overview:

The Google Penguin Update launched on April 24. According to the Google blog, Penguin is an “important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” Google mentions that typical black hat SEO tactics like keyword stuffing (long considered webspam) would get a site in trouble, but that less obvious tactics (like incorporating irrelevant outgoing links into a page of content) would also cause Penguin to flag your site. Says Google,

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

Site owners should be sure to check their Google Webmaster accounts for any messages from Google warning about past spam activity and a potential penalty. Google says that Penguin has impacted about 3.1% of queries (compared to Panda 1.0’s 12%). If you saw major traffic losses between April 24th and April 25th, chances are Penguin is the culprit, even though Panda 3.5 came out around the same time.

Unfortunately, Google has yet to outline exactly what signals Penguin is picking up on, so many site owners that were negatively impacted are in the dark as to where they went wrong with their onsite SEO. Many in the SEO community have speculated that some contributing factors to Penguin might be things like:

1. Aggressive exact-match anchor text
2. Overuse of exact-match domains
3. Low-quality article marketing & blog spam
4. Keyword stuffing in internal/outbound links
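The first speculated factor, aggressive exact-match anchor text, is at least easy to measure once you export your inbound-link anchors from a backlink tool. A small Python sketch (names are illustrative, and no particular ratio has been confirmed by Google as a threshold):

```python
from collections import Counter

def anchor_text_ratio(anchors: list[str], target: str) -> float:
    """Fraction of inbound-link anchors that exactly match a target phrase.

    A natural link profile mixes branded, bare-URL, and generic anchors;
    a very high share of one exact-match commercial phrase is the kind
    of pattern widely speculated to trip Penguin.
    """
    if not anchors:
        return 0.0
    counts = Counter(a.strip().lower() for a in anchors)
    return counts[target.lower()] / len(anchors)
```

For example, if 8 of your 10 inbound links all use the anchor "cheap blue widgets", the ratio is 0.8, which looks far less natural than a profile dominated by your brand name.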

It’s important to remember that Penguin is an algorithm update, not a manual penalty. A reconsideration request to Google won’t make much of a difference – you’ll have to repair your site and wait for a refresh before your site will recover. As always, do not panic if you are seeing a downturn in traffic; in the past, things have often rebounded after a major Google update like this. If you do think you have some sort of SEO penalty as a result of either the Google Panda or Google Penguin updates, contact your SEO service provider for help or start troubleshooting.

Article reference: brickmarketing.com/blog/panda-penguin-updates.htm






Wednesday, April 25, 2012

What is this "over optimization" that everyone is talking about?


Google’s “over optimization” Penalty 

1.  The most common infraction is keyword stuffing (also called keyword spamming or abuse). When desperate for rankings, people will often resort to repeating keywords as often as they can. You need to focus on proper keyword usage, and the best test for this is to read your content out loud. You'll know if it sounds keyword-stuffed, and you can go back and edit it. One way to avoid keyword stuffing is to use different variations of your words and phrases and look for synonyms you can work in. That eliminates the overuse of one phrase, and it creates more phrases you can possibly be found for.

2.    Lots of low-quality links, all with the same or very similar anchor text. This dates back to the days of the good old fashioned link exchange: "you link to me and I'll link to you, and by the way, let's both have everyone linking to us use the same anchor text and link only to the homepage." Not cool these days. As with most things in SEO, it's about balance and well-rounded diversity. You want quality links to many different pages, all with different anchor text (that is relevant to the page you are linking to). These kinds of links are hard to obtain, but Google doesn't care; it's what they want. Most people want a quick, easy link fix. It was never a good idea, and it's even more important that you avoid it these days.

3.    Too little content on the page but lots of "optimization"

4.    Lots of duplicate content taken from sites that rank well.

5.    There is some theory that too many sites being pointed/redirected to your main site could hurt you. I've heard a lot of talk about this one but not seen a lot of proof.

6.    Link farms/networks. There are two ways this can hurt you. The first is if you are running a link farm, meaning a page with too many links, none with any real text or purpose other than giving out links. The second is having a link farm/network linking to your site (especially if the same ones have multiple links back to your site).
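For infraction 1 above, the "read it out loud" test can be backed up with a rough keyword-density check. A small Python sketch (purely illustrative; Google has never published a safe density threshold, so treat the number as a smell test, not a rule):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Share of the words in `text` accounted for by occurrences of `phrase`.

    A crude companion to reading copy out loud: if one phrase makes up
    a large share of the words on the page, it will sound stuffed.
    """
    words = re.findall(r"[a-z']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    n = len(phrase_words)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return hits * n / len(words)
```

If a single target phrase accounts for a noticeably large share of the page, that's your cue to swap in variations and synonyms, as described above.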

It actually surprises me that everyone is making such a big deal about this - over optimization has always been a bad thing.  It's been called spamming and black hat (which also involves cloaking and various other nefarious things) but the bottom line is Google has been continually ramping up their efforts to weed out bad sites.  Each time it happens and a group of bad sites take a hit, everyone is surprised and rushes to analyze it.  The truth is you do need to optimize but it needs to be in a more "natural" way.  Avoid some pretty basic things and be strong in the areas you should be.
As I said, this really isn't new.  We went through this months ago.  I mean, wasn't Panda aimed at the quality of sites?  You bet it was!

All this really means is Google is getting more aggressive about cleaning up the search results.  I am actually good with that.  Sites that aren't participating in any of the problem areas have nothing to worry about. If you have a quality site that focuses on "evergreen" optimization, you won't have to worry each time there is an algorithm shake up.

Article resource: written by Jennifer Horowitz

Thursday, February 23, 2012

Is Cloud Computing Good For SEO?

 Is Cloud Computing Good For SEO?

Cloud computing is often thought about in terms of data storage, not web development. But as it becomes more popular you’ll start to see many more uses for it. For data storage purposes, cloud computing was thought to be a much needed relief for many companies. But it turns out that even Google has security issues with keeping data secure. So if Google can’t do it, who can?

Someday someone is going to get the bright idea to build a website in the cloud. What will happen?

First, I predict the site will be hacked. If data storage is insecure in the cloud, then how much more insecure will web development be? Hackers can already sneak into WordPress and other CMS systems. If those systems were cloud-based, they'd be that much more insecure. And an insecure server is bad for search engine optimization and for your website.


If a hacker gets into your server, he could really do some damage: link dropping and malicious malware downloads, just to name a couple. Some webmasters on traditional servers have already experienced these issues with WordPress and other CMSs, just by not updating their software. Now put that non-updated software on the cloud. See the issue?

You could lose your search engine rankings for some of these issues. All because of a lack of security in the cloud.
Article source: http://www.searchengineoptimizationjournal.com/2009/07/16/cloud-computing-seo/

The Impact of Cloud Computing on SEO

The Impact of Cloud Computing on SEO


What is Cloud Computing?

Cloud computing has been developing in the last few years and it promises to be the new business model for many companies. With cloud computing there is no need to own the physical infrastructure. Hardware and software capacities are rented through a provider. Users can rent virtual computers on which they can run their own applications, allowing them to access and share data through the cloud (Internet) at any given moment. Most importantly, cloud computing allows sites hosted in the cloud to be accessed rapidly and in the user's local language, as web pages are served through a cluster of servers closer to the user's IP location.

Is Hosting location important for SEO?
Hosting location and country Top Level Domain (TLD) have always been important factors in SEO for targeting search results specific to a country. While the TLD indicates the site's origin (e.g. .com.au for Australia) to search engines (SEs), hosting has been especially important in identifying the geo-location of a website, as SEs look at the IP address of the site to detect the server location. If a site was hosted on a server physically located in a country, then that site would be included in that country's country-specific searches even if it had a generic TLD. The SEO recommendation for a site targeting Australia would have been to choose a hosting provider in Australia rather than in the US. Hosting location especially affects sites using generic TLDs such as .org, .info, .biz, etc. Google also allows webmasters to set the geo-location within Webmaster Tools to help them target a specific market.

Cloud computing is changing the hosting location factor
Cloud computing is going to change the hosting location factor. The reason behind this is that sites can be hosted anywhere in the cloud while the web pages are served locally. So sites hosted in the cloud that have different versions (US, UK, AU, etc.) initially created to target specific countries will all look like local sites to Google and will therefore start competing against each other. The site with the highest authority will eventually outrank the others – e.g. if the US version of a site has the strongest authority, it would eventually outrank the AU version within the Australian market.
As cloud computing becomes more popular, search engines will likely change their algorithms to take this into account. This is going to revolutionise the way that sites operate across different countries and in different languages.


Google Caffeine and Cloud Computing

In June 2010 Google released the new Caffeine update, providing a new search indexing system allowing Google to index web pages on an enormous scale. Page speed became a factor in SEO and it started playing an important role in the rankings of websites in search results (especially for sites aiming to rank in a competitive space). One of the key strengths of cloud computing is the faster delivery of web pages, significantly improving the page loading time. Cloud computing allows the distribution of resources more efficiently and effectively and can have a huge impact on a site’s loading time.

Google and Cloud Computing
Google has been promoting cloud computing for quite some time. Matt Cutts, in his blog post “Why cloud services rock”, mentioned how “hosted services and storing data in the cloud [on someone else’s servers] can be better than doing it yourself”. In his video on “Big changes to search in the next few years”, he also commented that “As more people get comfortable with online computing, more of them will choose to move their data from onsite hard drives and store it in the ‘cloud’. As a result, searching in the cloud for relevant information will become increasingly important”.





Article Source :http://www.bruceclay.com/blog/2011/04/3749/