Friday, March 8, 2013

6 Big Myths About SEO


Your understanding of the way Google works is probably three or four years out of date--and that's an eternity in Web time.


In the world of online marketing, misinformation abounds--and it gets compounded exponentially by an incredibly dynamic and rapidly evolving world. Most of the things you think you know (but don't) about search-engine optimization, or SEO, may have been true a few years ago but have changed; one of the following was always a myth.
Here are some of the myths you need to move beyond to get smarter about SEO.





Myth 1: Metatag Descriptions Help Your Rankings

Not anymore; in fact, meta descriptions no longer factor into Google's or Bing's ranking algorithms. But don't ignore them altogether: your meta description forms the text that is displayed along with your link in the search results--and a more compelling description will compel more users to click on your listing instead of on others. In a search result, the meta description is everything shown below the page title and URL.
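
As a rough illustration, here is a minimal Python sketch (standard library only) that pulls the meta description out of a page's HTML much as a snippet generator might; the URL is a placeholder, and real crawlers are of course far more involved.

    # Minimal sketch: extract <meta name="description" content="...">
    # from a page. https://www.example.com is a placeholder URL.
    from html.parser import HTMLParser
    from urllib.request import urlopen

    class MetaDescriptionParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.description = None

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "meta" and (attrs.get("name") or "").lower() == "description":
                self.description = attrs.get("content")

    html = urlopen("https://www.example.com").read().decode("utf-8", "replace")
    parser = MetaDescriptionParser()
    parser.feed(html)
    print(parser.description)  # the text a results page would display

If what it prints wouldn't make you click, rewrite it.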



Myth 2: The More Inbound Links, the Better

False. In all the recent updates to Google's algorithm, the search giant has made it a core priority to have quality trump quantity. Gone are the days of having thousands of super-low-quality links driving up rankings; in fact, creating those links can look spammy and get your site penalized.
Focus on obtaining links from sites that are relevant to your products, services, or industry--and on having those links be surrounded by relevant text. A blog review about your "blue widget" that links to your site is far more valuable than a rogue link for "blue widget" stuck in the footer or sidebar of some site--even a highly ranked one.

Myth 3: PageRank Still Matters

Google's infamous PageRank (named after Google co-founder and now-CEO Larry Page, mind you) is a 1-to-10 ranking of the overall authority of every website; the bigger the number, the higher the rank. In years past, this seemingly all-powerful number dominated the attention of SEO experts.
But today, Google's algorithm has evolved well beyond any single indicator. PageRank still exists, and all other things being equal, a higher PageRank trumps a lower one--but factors such as relevance and context matter, too.
As with inbound links: if you run a dental practice in Los Angeles, it's better to have a link from a site that reviews doctors and dentists in L.A., even if it has a PageRank of 4, than to have a paid link with no context on a huge site with a PageRank of 7.
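
For readers curious about what that number actually measures, here is a toy Python sketch of the power-iteration idea behind PageRank: a page's score is fed by the scores of the pages that link to it. The three-page link graph and the 0.85 damping factor are standard textbook illustrations, not Google's production system.

    # Toy PageRank by power iteration over an invented link graph.
    links = {
        "home":  ["about", "blog"],
        "about": ["home"],
        "blog":  ["home", "about"],
    }
    damping = 0.85
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}

    for _ in range(50):  # iterate until the scores settle
        rank = {
            p: (1 - damping) / len(pages)
               + damping * sum(rank[q] / len(links[q])
                               for q in pages if p in links[q])
            for p in pages
        }

    print(rank)  # higher score = more link authority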

Myth 4: Google Prefers Keyword-Rich Domains

In years past, Google seemed to put a disproportionate amount of emphasis on keywords in the domain name (what you may think of as the URL). For example, vinylhousesiding.com would almost certainly be ranked first in a search for vinyl house siding.
Not anymore, says Google. If vinylhousesiding.com is in fact the more relevant, authoritative site on the topic, it will probably still rank first--but not because of its domain name alone.

Myth 5: Websites Must Be 'Submitted' to Search Engines

In 2001, yes, this was the case--indeed, this was the first service that my company, Wpromote, ever provided. But in 2012? Not at all. At this point, if there is any connection from any site to yours, your site will be quickly discovered by Google. Note that being indexed is a far cry from achieving high rankings--but that initial step of submission is no longer needed or helpful.

Myth 6: Good SEO Is Basically About Trickery

False, false, false. Although there are still some SEO experts out there who go about their business trying to "trick Google," this is absolutely not the way to provide good, lasting SEO.
Good SEO is about creating a relevant, informative website with unique content and a great user experience, and encouraging the sharing and distribution of great content to drive organic publicity and links back to your site. In the end, this is exactly what Google explicitly wants to reward with high rankings--so it is anything but "tricking" the search engines.


Article reference: http://www.inc.com/michael-mothner/seo-marketing-myths.html
bestmarketingseo.com



Tuesday, December 25, 2012

Google Panda Update vs Google Penguin Updates


The SEO community has been abuzz this past week with the latest update from Google, named Penguin. Penguin came down the pipeline last week, right on the tail of the latest Panda update. Since most of the big updates in the past year have focused on Panda, many site owners are left wondering what the real differences between Panda and Penguin are. Here is a breakdown:

Google Panda Update Overview:

According to Google’s official blog post when Panda launched,

This update is designed to reduce rankings for low-quality sites—sites which are low-value add for users, copy content from other websites or sites that are just not very useful. At the same time, it will provide better rankings for high-quality sites—sites with original content and information such as research, in-depth reports, thoughtful analysis and so on.

Basically, Panda updates are designed to target pages that aren’t necessarily spam but aren’t great quality either. This was the first penalty to go after “thin content,” and the sites hit hardest by the first Panda update were content farms (hence why it was originally called the Farmer update), where users could publish dozens of low-quality, keyword-stuffed articles that offered little to no real value for the reader. Many publishers would submit the same article to a bunch of these content farms just to get extra links.


Panda is a sitewide penalty, which means that if “enough” (no specific number) pages of your site were flagged for having thin content, your entire site could be penalized. Panda was also intended to stop scrapers (sites that republish other sites’ content) from outranking the original author’s content.
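
As a hedged aside, one simple way to spot scraped (near-duplicate) copy is word-shingle Jaccard similarity, sketched below in Python; the sample texts and the 50% threshold are made-up illustrations, not Panda's actual method.

    # Estimate how similar two pages' text is via shingle overlap.
    def shingles(text, k=3):
        words = text.lower().split()
        return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / max(len(sa | sb), 1)

    original = "Our in-depth report covers vinyl siding costs and installation."
    scraped = "Our in-depth report covers vinyl siding costs and repair tips."

    score = jaccard(original, scraped)
    print(f"similarity: {score:.0%}")
    if score > 0.5:
        print("Likely near-duplicate content.")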

Here is a breakdown of all the Panda updates and their release dates. If your site’s traffic took a major hit around one of these times, there is a good chance it was flagged by Panda:

1. Panda 1.0 (aka the Farmer Update) on February 24, 2011
2. Panda 2.0 on April 11, 2011 (Panda impacts all English-speaking countries)
3. Panda 2.1 on or around May 9, 2011
4. Panda 2.2 on or around June 18, 2011
5. Panda 2.3 on or around July 22, 2011
6. Panda 2.4 in August 2011 (Panda goes international)
7. Panda 2.5 on September 28, 2011
8. Panda 2.5.1 on October 9, 2011
9. Panda 2.5.2 on October 13, 2011
10. Panda 2.5.3 on October 19/20, 2011
11. Panda 3.1 on November 18, 2011
12. Panda 3.2 on or around January 15, 2012
13. Panda 3.3 on or around February 26, 2012
14. Panda 3.4 on March 23, 2012
15. Panda 3.5 on April 19, 2012

Search Engine Land recently created this great Google Panda update infographic to help walk site owners through the many versions of the Google Panda updates.

Many site owners complained that even after they made changes to their sites in order to be more “Panda friendly,” their sites didn’t automatically recover. Panda updates do not happen at regular intervals, and Google doesn’t re-index every site each time, so some site owners were forced to deal with low traffic for several months until Google got around to re-crawling their website and taking note of any positive changes.

Google Penguin Update Overview:

The Google Penguin Update launched on April 24. According to the Google blog, Penguin is an “important algorithm change targeted at webspam. The change will decrease rankings for sites that we believe are violating Google’s existing quality guidelines.” Google mentions that typical black-hat SEO tactics like keyword stuffing (long considered webspam) would get a site in trouble, but that less obvious tactics (like incorporating irrelevant outgoing links into a page of content) would also cause Penguin to flag your site. Says Google,

Sites affected by this change might not be easily recognizable as spamming without deep analysis or expertise, but the common thread is that these sites are doing much more than white hat SEO; we believe they are engaging in webspam tactics to manipulate search engine rankings.

Site owners should be sure to check their Google Webmaster Tools accounts for any messages from Google warning about past spam activity and a potential penalty. Google says that Penguin has impacted about 3.1% of queries (compared to Panda 1.0’s 12%). If you saw major traffic losses between April 24th and April 25th, chances are Penguin is the culprit, even though Panda 3.5 came out around the same time.

Unfortunately, Google has yet to outline exactly which signals Penguin picks up on, so many site owners that were negatively impacted are in the dark as to where they went wrong with their onsite SEO. Many in the SEO community have speculated that contributing factors to Penguin might include the following (the first is illustrated in the sketch after this list):

1. Aggressive exact-match anchor text
2. Overuse of exact-match domains
3. Low-quality article marketing & blog spam
4. Keyword stuffing in internal/outbound links
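
Since Google has published none of these signals, any check is guesswork; still, here is a hedged Python sketch that flags over-reliance on a single exact-match anchor in a backlink list exported from whatever tool you use. The sample data and the 30% threshold are illustrative assumptions, not anything Google has confirmed.

    # Count anchor-text distribution in a backlink export and flag
    # anchors that dominate it. Data and threshold are invented.
    from collections import Counter

    backlinks = [
        ("blue widgets", "http://site-a.example/post"),
        ("blue widgets", "http://site-b.example/links"),
        ("blue widgets", "http://site-c.example/footer"),
        ("Acme Co", "http://news.example/review"),
    ]

    counts = Counter(anchor.lower() for anchor, _ in backlinks)
    total = sum(counts.values())
    for anchor, n in counts.most_common():
        share = n / total
        flag = "  <-- looks aggressive" if share > 0.30 else ""
        print(f"{anchor!r}: {share:.0%}{flag}")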

It’s important to remember that Penguin, like Panda, is an algorithmic update, not a manual penalty. A reconsideration request to Google won’t make much of a difference--you’ll have to repair your site and wait for a refresh before it will recover. As always, do not panic if you are seeing a downturn in traffic; in the past, rankings have often rebounded after a major Google update like this. If you do think you have some sort of SEO penalty as a result of either the Google Panda or Google Penguin updates, contact your SEO service provider for help or start troubleshooting.

Article reference: brickmarketing.com/blog/panda-penguin-updates.htm
bestmarketingseo.com






Wednesday, April 25, 2012

What is this "over optimization" that everyone is talking about?


Google’s “over optimization” Penalty 

1.  The most common infraction is keyword stuffing/spamming/abusing, etc. When desperate for rankings, people will often resort to just repeating keywords as often as they can. You need to focus on proper keyword usage, and the best test for this is to read your content out loud: you'll know if it sounds keyword-stuffed, and you can go back and edit it. One way to avoid keyword stuffing is to use different variations of your words and phrases and look for synonyms you can work in. That eliminates the overuse of one phrase, and it creates more phrases you can potentially be found for. (A quick self-check is sketched below.)
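
There is no official stuffing threshold, but a crude density count can back up the read-it-aloud test. Here is a minimal Python sketch; the sample copy and the 3% rule-of-thumb cutoff are purely illustrative assumptions.

    # Rough keyword-density self-check; text and threshold are made up.
    import re

    def keyword_density(text, phrase):
        words = re.findall(r"[a-z']+", text.lower())
        hits = len(re.findall(re.escape(phrase.lower()), text.lower()))
        return (hits * len(phrase.split())) / max(len(words), 1)

    page_copy = """Vinyl house siding is durable. Our vinyl house siding
    comes in many colors, and vinyl house siding installs quickly."""

    density = keyword_density(page_copy, "vinyl house siding")
    print(f"keyword density: {density:.1%}")
    if density > 0.03:
        print("Reads stuffed -- rewrite with variations and synonyms.")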

2.    Lots of low-quality links, all with the same or very similar anchor text. This dates back to the days of the good old-fashioned link exchange: "you link to me and I'll link to you, and by the way, let's both have everyone linking to us use the same anchor text and link only to the homepage." Not cool these days. As with most things in SEO, it's about balance and well-rounded diversity. You want quality links to many different pages, all with different anchor text (relevant to the page being linked to). These kinds of links are hard to obtain, but Google doesn't care--it's what they want. Most people want a quick, easy link fix; it was never a good idea, and it's even more important to avoid it these days.

3.    Too little content on the page but lots of "optimization."

4.    Lots of duplicate content taken from sites that rank well.

5.    There is some theory that too many sites being pointed/redirected to your main site could hurt you. I've heard a lot of talk about this one but haven't seen a lot of proof.

6.    Link farms/networks. There are two ways this can hurt you. The first is running a link farm yourself--a page stuffed with links that have no real text or purpose other than giving out links. The second is having a link farm/network linking to your site (especially if the same ones have multiple links back to your site).

It actually surprises me that everyone is making such a big deal about this--over-optimization has always been a bad thing. It's been called spamming and black hat (which also involves cloaking and various other nefarious things), but the bottom line is that Google has been continually ramping up its efforts to weed out bad sites. Each time it happens and a group of bad sites takes a hit, everyone is surprised and rushes to analyze it. The truth is you do need to optimize, but it needs to be done in a more "natural" way. Avoid some pretty basic mistakes and be strong in the areas you should be.
As I said, this really isn't new. We went through this months ago. I mean, wasn't Panda aimed at the quality of sites? You bet it was!

All this really means is that Google is getting more aggressive about cleaning up the search results. I am actually good with that. Sites that aren't participating in any of the problem areas have nothing to worry about. If you have a quality site that focuses on "evergreen" optimization, you won't have to worry each time there is an algorithm shake-up.

Article reference: written by Jennifer Horowitz

Thursday, February 23, 2012

Is Cloud Computing Good For SEO?


Cloud computing is often thought about in terms of data storage, not web development. But as it becomes more popular, you'll start to see many more uses for it. For data storage purposes, cloud computing was thought to be a much-needed relief for many companies. But it turns out that even Google has had issues keeping data in the cloud secure. So if Google can't do it, who can?

Someday someone is going to get the bright idea to build a website in the cloud. What will happen?

First, I predict the site will be hacked. If data storage in the cloud is insecure, how much more insecure will web development be? Hackers can already sneak into WordPress and other CMS systems. If those systems were cloud-based, they'd be that much more exposed. And an insecure server is bad for search engine optimization and for your website.


If a hacker gets into your server, he could do real damage: link dropping and malware downloads, just to name a couple. Some webmasters on traditional servers have already experienced these issues with WordPress and other CMSs, simply by not updating their software. Now put that outdated software on the cloud. See the issue?

You could lose your search engine rankings for some of these issues. All because of a lack of security in the cloud.
Article reference: http://www.searchengineoptimizationjournal.com/2009/07/16/cloud-computing-seo/

The Impact of Cloud Computing on SEO


What is Cloud Computing?

Cloud computing has been developing over the last few years, and it promises to be the new business model for many companies. With cloud computing there is no need to own the physical infrastructure: hardware and software capacity is rented through a provider. Users can rent virtual computers on which they run their own applications, accessing and sharing data through the cloud (the Internet) at any given moment. Most importantly for SEO, cloud computing allows sites hosted in the cloud to be served rapidly and in the user's local language, as web pages are delivered through a cluster of servers closer to the user's IP location.

Is Hosting location important for SEO?
Hosting location and country-code top-level domain (TLD) have always been important factors for SEO when targeting search results specific to a country. While the TLD indicates the site's origin to search engines (e.g., .com.au for Australia), hosting has been significant for identifying the geo-location of a website, as search engines look at the IP address of the site to detect the server location. If a site was hosted on a server physically located in a country, that site would be included in the country-specific searches even if it had a generic TLD. The SEO recommendation for a site targeting Australia would therefore have been to choose a hosting provider in Australia rather than the US. Hosting location especially affected sites using generic TLDs such as .org, .info, .biz, etc. Google also allows webmasters to set the geographic target within Webmaster Tools to help them target a specific market.
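
As a rough sketch of the mechanics, the snippet below resolves a domain to its server IP, which is the signal search engines have historically mapped to a country. The GeoIP step is stubbed out because a real lookup needs a GeoIP database (MaxMind's, for example) or a web service; the domain is a placeholder.

    # Resolve a domain to its hosting IP; a search engine would then
    # map that IP to a country via a GeoIP database (stubbed here).
    import socket

    def server_ip(domain):
        return socket.gethostbyname(domain)

    def country_for_ip(ip):
        # Placeholder: plug a real GeoIP lookup in here.
        return "unknown (GeoIP lookup not wired up)"

    ip = server_ip("example.com")
    print(ip, "->", country_for_ip(ip))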

Cloud computing is changing the hosting location factor
Cloud computing is going to change the hosting location factor. The reason is that sites can be hosted anywhere in the cloud while their web pages are served locally. So versions of a site (US, UK, AU, etc.) initially created to target specific countries will all look like local sites to Google and will therefore start competing against each other. The version with the highest authority will eventually outrank the others--e.g., if the US version of a site has the strongest authority, it would eventually outrank the AU version within the Australian market.
As cloud computing becomes more popular, search engines will likely change their algorithms to take this into account. This is going to revolutionise the way sites operate across different countries and in different languages.


Google Caffeine and Cloud Computing

In June 2010 Google released the Caffeine update, a new search indexing system that allows Google to index web pages on an enormous scale. Page speed became a factor in SEO and started playing an important role in the ranking of websites in search results (especially for sites aiming to rank in a competitive space). One of the key strengths of cloud computing is the faster delivery of web pages, significantly improving page loading time. Cloud computing allows resources to be distributed more efficiently and effectively, which can have a huge impact on a site's loading time.
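
As a crude way to see where your own site stands, here is a minimal Python sketch that times a full HTML fetch. It measures only server response plus transfer (not rendering), and the URL is a placeholder; proper audits use browser-level tooling.

    # Time a raw HTML fetch as a rough page-speed indicator.
    import time
    from urllib.request import urlopen

    url = "https://www.example.com"  # placeholder
    start = time.perf_counter()
    body = urlopen(url).read()
    elapsed = time.perf_counter() - start
    print(f"Fetched {len(body)} bytes in {elapsed:.2f} s")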

Google and Cloud Computing
Google has been promoting cloud computing for quite some time. Matt Cutts, in his blog post "Why cloud services rock," mentioned how "hosted services and storing data in the cloud [on someone else's servers] can be better than doing it yourself." In his video on big changes to search in the next few years, he also commented that "as more people get comfortable with online computing, more of them will choose to move their data from onsite hard drives and store it in the cloud. As a result, searching in the cloud for relevant information will become increasingly important."





Article reference: http://www.bruceclay.com/blog/2011/04/3749/