If you haven’t already done so, check whether you were penalized by Google’s Panda update. People often think they were hit by Panda when they actually received a penalty for spammy links. Positive that Google’s Panda mauled your traffic? Great! Let’s jump in and see what we can do to tame the beast.
Basically, there are a few factors that can get a website slapped with the Panda penalty:
- Duplicate content between your website and other websites on the web
- Duplicate content within your website
- Poor visitor interaction
Fixing each of these issues is easier said than done. But don’t worry, I’m going to hold your hand every step of the way and together we can slay the Panda.
Remove Duplicate Content & Write Unique Content
First things first, make sure all the content on your website is unique. Copying and pasting content is the fastest way to get pandalized. I hate using general tips like “write unique content” because it is so vague, but for many websites this is the only option.
- Go to CopyScape.com and check for duplicate content – CopyScape is an amazing tool that searches the web for duplicate content. With the free version you can only check 1 URL at a time. If you have a larger site and a larger wallet, then I would recommend checking a batch of URLs with Copyscape Premium. Keep in mind that checking for duplicate content in batches will cost you $0.05 per URL.
- Noindex or rel=canonical all pages with duplicate content – Every page that CopyScape identifies as a duplicate of other pages on the web needs to go. There are many ways to do this, but the 2 easiest are noindexing the page or using the rel=canonical tag. The noindex code tells search engines not to index the page. The rel=canonical tag tells Google that the page is a duplicate of another page. A general rule of thumb is to canonical pages with backlinks and noindex the others. Why? Because the canonical tag passes link juice to whatever page you point it to.
Both the noindex and rel=canonical tags are placed in the <head> of your source code. Once these tags are added to your duplicate pages, it will take anywhere from 1 day to 1 month before Google re-crawls the pages and removes them from the index. How long this takes depends on the size of your website and the number of backlinks. Once this is done, it’s time to sit back and wait for Google to re-run the Panda algorithm. This occurs about once every 4-8 weeks; the next Panda update should occur sometime in early January 2012.
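To make this concrete, here is a minimal sketch of what each tag looks like inside the <head> of a duplicate page. The example.com URL is a placeholder for your own original page, and you would use one tag or the other on a given page, not both:

```html
<!-- Option 1: noindex – tells search engines not to index this duplicate page -->
<meta name="robots" content="noindex, follow">

<!-- Option 2: rel=canonical – points search engines to the original page -->
<!-- (prefer this when the duplicate page has backlinks, so link juice passes on) -->
<link rel="canonical" href="http://www.example.com/original-page/">
```

Following the rule of thumb above: duplicates with backlinks get the canonical tag, everything else gets noindex.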
Implement the Rel=Author Tag to Your Articles
I would highly recommend implementing the rel=author tag if your articles are getting taken by scraper sites. The rel=author tag is a fairly new piece of HTML code that Google now reads. It tells Google who the original author of an article is and sometimes adds a portrait of the author to the search engine results page.
Matt Cutts has hinted that rel=author is, or could become, a ranking factor when you are the original source of an article. And just today, Google announced that it will be providing author stats in Google Webmaster Tools. These are 2 big indicators that rel=author is already being used as a signal by Google and could become even more of a ranking factor in the future.
In order to use the rel=author code you must have a Google+ account. Google uses your Google+ account to confirm that the article is associated with the author and to pull the portrait thumbnail into the SERPs. Google has provided some easy-to-follow steps on implementing rel=author, but essentially it works like this: your article links to your Google+ profile, and your Google+ profile links back to the site you write for.
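As a rough sketch, the on-page half of the setup looks like this (the Google+ profile ID below is a made-up placeholder; you would use your own):

```html
<!-- In the <head> of the article page: link to the author's Google+ profile -->
<link rel="author" href="https://plus.google.com/123456789012345678901">

<!-- Or, as a visible byline link within the article itself -->
<a href="https://plus.google.com/123456789012345678901" rel="author">Jane Doe</a>
```

The other half happens on Google+: add the site to the “Contributor to” section of your profile so the link is verified in both directions.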
Not only will rel=author tell Google that you are the original author of the content, but the profile picture in your search result will increase the number of clicks to your website. You can’t go wrong!
Clean up Internal Duplicate Content
Now that we have addressed duplicate content between different websites, it’s time to focus on internal duplicate content issues. The reason Panda cracked down on internal duplicate content is that many spammy websites were automatically creating hundreds or thousands of duplicate pages while only changing a few keywords. This would allow them to target many keywords with very little effort (emphasis on would).
So what does this mean for you? It’s quite simple: any duplicate content within your website has gotta go. There are 2 main reasons why a website would have internal duplicate content:
- You are automatically creating hundreds or thousands of duplicate pages to target more keywords
- You have pages with multiple URLs
If you belong to the first group and are automatically creating hundreds or thousands of pages with essentially duplicate content, it’s time to stop. Unless you want to get hit by Panda, you need to either rewrite the pages with unique content or noindex/canonical the duplicates.
Having multiple URLs for a single page is a common issue among websites. This can be a problem because in Google’s eyes, every separate URL is a separate page, which means that to Google and other search engines, www.opencart.com and www.opencart.com/index.php are different pages.
For example, suppose the mikesbikeshop.com/red-bikes page appends the ?sort=A-Z parameter whenever someone sorts the page alphabetically. This creates multiple URLs for the same page, and with them duplicate content. Mike should put the canonical tag on the red bikes page; in fact, adding it to every page of his website is an SEO best practice.
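In Mike’s hypothetical case, the fix is a self-referencing canonical tag on the red bikes page. Because the same <head> is served whether or not the sort parameter is present, every parameterized variant points back to the clean URL:

```html
<!-- Served on mikesbikeshop.com/red-bikes AND on variants like /red-bikes?sort=A-Z -->
<!-- All versions declare the clean URL as the one Google should index -->
<link rel="canonical" href="http://mikesbikeshop.com/red-bikes">
```

With this in place, Google consolidates the sorted and unsorted versions into a single indexed page instead of treating each parameter combination as a duplicate.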
If you are using WordPress, I would recommend installing the All in One SEO Pack because it will handle this for you; otherwise you will need to add the tags page by page. If you have a large website, hire a programmer to automate this for you.
Lower Your Bounce Rate
Google knows when someone searches a keyword, clicks on your site, and immediately leaves to click on another search result. These kinds of interactions can signal to Google that your website is low quality and a good candidate for the Panda penalty. There are a few ways you can lower your bounce rate:
- Provide good, relevant content that matches what the searcher is looking for.
- Add a video. Videos can significantly lower the bounce rate of a page and increase the average time on site.
- Don’t target irrelevant keywords just because they have a high number of monthly searches.
- Add a clear call to action to get your visitors to go to another page.
These tips will help lower your bounce rate and reduce your risk of being penalized by Panda. But at the end of the day, if users are consistently bouncing from your pages, it may be a sign of a deeper issue. Put yourself in the searcher’s shoes and figure out what they would want to see for any given search query. If you can do that, you are one step closer to being a successful internet marketer.
For better or worse, Google’s Panda update forever changed SEO. The update judges the quality of a website by looking at duplicate content and usage metrics. Sometimes Google gets it right; other times, not so much. Either way, Panda is just one more way Google is pushing webmasters to provide higher quality content.