
What Google Doesn’t Want You to Know About Panda

Despite all the conversations about what Google looks for when indexing pages, there is still one thing it focuses on heavily: incoming links from other sites. In fact, from what I’ve seen in my own SEO studies, incoming links are still one of the main factors that determine the rank of any site – but not in the way they used to be. I believe that the Panda update everyone is still talking about was really just a revamp of this strategy.

Not all links are the same, and here are some thoughts and ideas I have about what works and what does not. A lot of this comes from testing, but also from conversations with Gail Gardner of Growmap.

Relevancy Matters

While this has been pointed out many times before, there seems to have been a substantial change to Google’s algorithm. Link building strategies built on anchor-text links from random sites don’t have much effect anymore, especially when set against links from relevant content.

What does this mean exactly? If a site is about a topic, let’s say Cats and Dogs, and has a lot of pages about Cats and Dogs, any link coming from that site is worth more than, say, a link from a page about Cats and Dogs on a site about Cars.

Google’s algorithm somehow rates an entire site’s relevancy, so that a link from that site to something similarly relevant carries more juice than any random link. I’ve tested this with link building, finding that roughly 200 random links are worth about one link from a topically similar PR4 site. That seems crazy, but it also means that if you get only a few people linking to you from really good sites with good keywords, you’ll do a lot better than, say, the guy who paid someone offshore to spam blog comments.
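To make that ratio concrete, here is a toy model of relevance-weighted link value. Every number and formula in it is my own assumption, chosen only so the arithmetic matches the rough observation above – this is not Google’s actual algorithm.

```python
# Toy model of relevance-weighted link value -- purely illustrative,
# NOT Google's real ranking function. The weights are assumptions
# tuned to match the rough "200 random links ~ 1 relevant PR4 link"
# observation from my own tests.

def link_value(page_rank: int, relevance: float) -> float:
    """Score a single incoming link.

    page_rank: rough PageRank of the linking site (0-10).
    relevance: topical overlap with the target site, 0.0-1.0.
    """
    base = 2 ** page_rank            # treat PR as roughly logarithmic
    return base * (relevance ** 3)   # off-topic links decay sharply

# One link from an on-topic PR4 site...
relevant_pr4 = link_value(page_rank=4, relevance=1.0)

# ...versus 200 links from low-relevance PR2 pages.
random_links = 200 * link_value(page_rank=2, relevance=0.27)

# Under these assumed weights the two come out roughly equal (~16).
print(relevant_pr4, random_links)
```

The cubic relevance penalty is the key design choice: it means piling up barely related links is a losing game compared with a handful of genuinely on-topic ones.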

First Unique Content Goes First.

I believe this is probably the least understood part of the entire system. There was a lot of talk about article sites losing SERPs because they had so much content. It makes more sense that they lost their rankings not just because their content wasn’t unique, but because it wasn’t the first appearance of that content. Duplicate content isn’t the issue here. I think all those people saying “Don’t allow someone to copy your content” are lost, because you can’t prevent people from stealing or mentioning content. Otherwise you could hurt other people’s SERPs simply by copying their content everywhere.

There is a crazy theory that perhaps the first instance of content actually gains rankings based on the other versions of the content, even when not linked!

I know this doesn’t particularly make sense, but I ran several tests of it myself. I put some content on a friend’s very small blog, and then two weeks later duplicated that exact content on one of the top technology sites in the world. At first the content showed up from the big site on the first page of a search, but then it dropped off, and the original content on the small blog turned up at a higher ranking than the bigger site’s original position. The small blog was originally only on page 4 of the rankings, and suddenly went to page 1 with no links in.

I also tested this several other times with similar content, duplicating it on other major sites I have access to, and found the same results. In every instance, the original content eventually gained ranking – but only when the duplicate content was placed on a secondary, very highly ranked page.

It is my belief that the engineers at Google realize that a lot of content is copied from a variety of places, often without permission, or as part of a feed or news service. Penalizing the original source because of duplication makes no sense whatsoever! Why would it?

In theory, getting links from that duplicate content should enhance your listings too – something which I am going to test more and more in the coming months.

Age of Site &amp; Links Is of Utmost Importance.

Google has gone nuts with the whole refreshing of content – often showing new content with new links on the first page of searches, then having it quickly drop off within a week to page 3 or 4. I believe this is part of Google’s new strategy to change things up a bit, to allow new content to get a “day in the sun,” but also to emphasize that the age of a site brings more relevancy to the table.

I’m pretty sure they’ve created a system in which a site must be around for a certain amount of time to get any juice – and combined with the fact that older sites generally have more content about a given topic, a site’s age has become a significant factor.

Basically, if you have links coming from an older site to your own site that has also been around for a long time, you will see immediate jumps in keyword rankings compared with links from younger sites of any type. New links aren’t as powerful as old links.

It seems to me that even newer content on older sites has less value than older content. It causes a temporary jump, but long term, unless it consistently builds links from other sites, it’s not as valuable.

If I had to find a term for this, it would be a “gravitas” scale of some sort. Basically, Google wants reference and fact-based sites to rank higher than sites that are opinion, news, or promotional. The proof of a site’s authenticity or gravitas is people consistently linking to it over a period of time. People who link to a site or article only over a short period are usually just linking to an interesting news story, a fad, or a link-building scheme.

Links over a long time show that a site is a valuable “resource.”

That’s why Wikipedia scores so high on so many factors (including PR): it’s a stable resource that people link to all the time when making references.

Here’s the basic formula, for those who don’t want to read my entire article:

First + Relevant Content + Age + Links
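One way to read that summary formula is as a simple additive score. The sketch below is purely my own illustration – every field, weight, and threshold is an assumption I made up to capture the four factors, not anything Google has documented.

```python
# A toy reading of "First + Relevant Content + Age + Links".
# All weights and fields are assumptions for illustration only --
# nothing here is Google's documented ranking function.
from dataclasses import dataclass

@dataclass
class Page:
    is_original: bool      # was this the first appearance of the content?
    relevance: float       # topical match of incoming links, 0.0-1.0
    site_age_years: float  # how long the domain has been in the index
    steady_links: int      # links accumulated consistently over time
    burst_links: int       # links gained only in a short window (fads, news)

def toy_score(p: Page) -> float:
    score = 0.0
    if p.is_original:
        score += 10.0                       # first unique content gets the credit
    score += 5.0 * p.relevance              # relevant links outweigh random ones
    score += min(p.site_age_years, 10)      # age helps, with diminishing returns
    score += 2.0 * p.steady_links           # "gravitas": sustained linking
    score += 0.2 * p.burst_links            # short-lived spikes count far less
    return score

# An aged, original resource with steady links vs. a brand-new fad page
# that picked up 200 links in a single week.
old_resource = Page(True, 0.9, 8, 30, 0)
new_fad = Page(False, 0.3, 0.5, 0, 200)
print(toy_score(old_resource) > toy_score(new_fad))  # True
```

The point of the sketch is the shape, not the numbers: original, relevant, aged, and consistently linked content beats a short burst of attention even when the burst brings far more raw links.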

About Pace Lattin


Pesach Lattin

Pesach "Pace" Lattin is one of the top experts in interactive advertising and affiliate marketing. He is known for his dedication to ethics in marketing and his focus on compliance and fraud in the industry, and has written numerous articles for publications including MediaPost, ClickZ, ADOTAS, and his own blogs.


  1. That’s very interesting to hear the results of your duplicate content testing. If Google really has developed an algorithm that can detect and reward the “original” article, then it really does change the game in so many ways…

    1. Have you read about Google+ authorship? I have been told that this actually attaches originality to an article somehow, although I’m not sure what that means. I think first click, first + gets marked “original,” although that’s just a guess.

      1. Makes sense – first plus-clicks would certainly be a pretty good indication that something was there first (assuming everyone gets click-happy with the G+ button, that is)… either way, it seems like they are really working hard to find a solution that will reward original content owners, which gets the thumbs up from me 🙂

        1. Yes – it makes sense that content being copied shows the relevancy of the content, i.e., other people are talking about it. The idea that someone would be penalized for other people’s actions makes no sense.

  2. Hi Pace,

    Thanks for the article. I have a question about domain age. For my domain, and others that I have, I tend to renew my domains each year to spread out the expense. Does Google downgrade the domain rank because of this, or does the spider see that the domain might be 5 years old but is just renewed each year?

    And if annual renewal is a black mark, how detrimental is it to the SEO rank of the website?


    ps cute baby, congrats

    1. That’s an excellent clarification of “domain age” – that is, how long it has been in the Google system. That makes a lot more sense than a domain that is 15 years old but never used. That’s a good gem there.

      Also, it’s a good Panda analysis – it makes perfect sense by all definitions, and I think you might be on the mark about it; it certainly describes the demise of article sites.

      Anonymous posted a comment claiming that with Google+, being “the first to get the click” makes you the original author of the content – that is completely unsubstantiated. Be careful of SEO rumors, I say…

      Nice article,


  3. Hi Again,

    On original content: if I were to write an article, use a quality spinner, and alter it 30% to 50% each time it is submitted to an article directory, is Google going to rank each one as unique?


  4. There’s another way to look at this too, and that’s that Google is giving preference to brands. It makes sense from a big picture standpoint – “brands” have more resources/money to create useful content/services/goods/etc, and because businesses have different standards (compared to a personal blog), it’s in their best interest to be authoritative and have, as you say, gravitas. So what does this mean for smaller sites that aren’t run by businesses? You need to act like a brand – fake it til you make it. It’s about becoming authoritative, and you become authoritative by building trust, and you build trust by putting out accurate, objective and updated information. Ask yourself “what would a brand do” and do it. This can go for your SEO, social media, content strategies – everything.

    1. Yeah, that’s what my friend Gail Gardner at Growmap thinks. However, originally the site in my experiment that got all the listings was actually a nothing site. All the links went to the secondary site.

      Strangely enough, they actually switched places yesterday, with one site going to page 1 and the other to page 3!

