Google doesn't want you to know the secret sauce behind its ranking algorithm. So most SEOs rely on gut feeling when using SEO methods they think should work.
However, there are ingenious SEOs among us who don't mind working their fingers to the bone, digging up hard-won SEO facts in (usually time- and resource-consuming) field experiments.
Warning: After this post, your understanding of SEO will never be the same. Read at your own risk!
Here is what Brian Dean did:
And he got an 11% success rate on it.
The Verdict: Creating outstanding content on proven link-bait topics works.
So, Petrovic and his team decided to check if this was true. They set up 2 nearly identical test sites:
They observed the sites' rankings for a month. Both sites showed up in search at approximately the same positions.
Then, after getting a link from a PR 7 webpage (one with tons of outgoing links), site B catapulted from #70 to #3 in the course of several weeks! Site A got no such link, and its rankings stayed the same.
The Verdict: Even though a high-PR page links out a lot, a link from it is still worth it.
The idea was to see whether the ‘great content promotes itself’ mantra preached by Google’s Matt Cutts was actually true. The only means of attracting visitors Woodward had promised to use were:
The results of the experiment are truly astonishing, plus matthewwoodward.co.uk now has 9,608 backlinks and 655 unique domains linking to it (the stats are from WebMeUp Backlink Tool) – with Woodward never purposefully building a single link to it in his life!
The Verdict: It is possible to drive traffic to a site without link building.
Tweak 1: Sebald points a link (exact-match anchor text) from his second-best page to homepage ->
The site jumps from #11 to #10 on Google.
Tweak 2: Sebald puts a blog-wide link (exact-match anchor text) to homepage ->
The site nosedives from #10 to #12 (this link is subsequently removed).
Tweak 3: Sebald now points a blog-wide link (exact-match anchor text) to his second-best page ->
The site shoots up to #9 in the search results!
The Verdict: Internal links between important pages still have weight.
Site-wide links to homepage have a negative effect.
Have you always been told that your title tag has to be 65 to 70 characters long? But you've probably noticed that sometimes it still gets cut off no matter how closely you follow this recommendation.
What he did:
Hartzer and the Standing Dog agency created a fictitious title tag with non-existent and nonsense words that was 95 characters long and 448 pixels wide. And, the title appeared in Google's search results in its entirety!
According to Bill Hartzer, they've seen titles up to 512 pixels wide in search results. What's your observation?
The Verdict: Maximum page title length is 512 pixels, not 70 characters.
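Since the real cutoff is measured in pixels, a character count alone can't tell you whether a title will be truncated. Below is a minimal sketch of a pixel-width estimator: the per-character widths are rough assumptions (narrow letters like "i" and "l" take fewer pixels than wide ones like "m" and "w"), not Google's actual font metrics, and the 512px threshold is the figure from Hartzer's test.

```python
# Rough estimate of a title tag's rendered pixel width.
# The per-character widths are assumptions for illustration;
# actual truncation depends on the font Google renders with.

NARROW = set("iljtf.,:;'!|")  # assumed ~4px each
WIDE = set("mwMW")            # assumed ~14px each


def estimate_title_width(title: str, narrow: int = 4,
                         regular: int = 8, wide: int = 14) -> int:
    """Return an approximate pixel width for a title string."""
    width = 0
    for ch in title:
        if ch in NARROW:
            width += narrow
        elif ch in WIDE:
            width += wide
        else:
            width += regular
    return width


def likely_truncated(title: str, limit_px: int = 512) -> bool:
    """True if the estimated width exceeds the assumed 512px SERP limit."""
    return estimate_title_width(title) > limit_px
```

A title that passes a 70-character count can still fail the pixel check (all-caps, lots of wide letters), and a narrow 95-character title like Hartzer's can fit.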
Cyrus Shepard of Moz made an extremely bold move, in my opinion. Sometime after Google Penguin 1.0, he went ahead and disavowed all 35,000 links pointing to his SEO consulting site, cyrusshepard.com (whatever was in Webmaster Tools).
According to Cyrus, those were mostly great, natural links the site had earned thanks to its merits.
Come Penguin 2.0, traffic to Cyrus's site dropped by at least half. This prompted him to remove the disavow file from Google Webmaster Tools. Do you think this solved the problem?
When Penguin 2.1 came out, no significant increase in site traffic was observed.
The Verdict: Even if you disavow "good" links, their value is nulled.
And, once disavowed, links never come back.
What happened exactly was Dan Petrovic of Dejan SEO ran a series of tests, copying content from other sites and successfully outranking them in the SERPs.
How come? Turns out, a copycat page can outrank the original page, if:
In general, there is no sure-fire way to ensure your content is never hijacked, but routine checks for duplicates + setting up canonicals, claiming authorship and other measures do help.
The Verdict: Site hijack is real, but there are ways to prevent it.
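One of the preventive measures mentioned above, setting up canonicals, can be spot-checked automatically. Here's a minimal sketch using only Python's standard library that extracts the rel="canonical" URL from a page's HTML; fetching the HTML itself (e.g. with urllib) is left out, and any URLs shown are examples, not real pages.

```python
# Extract the rel="canonical" URL from an HTML document, if one is declared.
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of the first <link rel="canonical"> tag seen."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link" and self.canonical is None:
            d = dict(attrs)
            if (d.get("rel") or "").lower() == "canonical":
                self.canonical = d.get("href")


def find_canonical(html: str):
    """Return the canonical URL declared in the page, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical
```

Run such a check routinely against your key pages: a missing or wrong canonical is exactly the gap a copycat page can exploit.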
SEOs have always debated whether Google uses social signals such as tweets, likes or +1's to rank websites.
What Eric Enge did was he blasted 6 different pages with 800+ Likes (some real, some purchased on Fiverr), which did not even lead to those pages being indexed in Google!
Another part of the experiment involved Enge sharing another set of pages on Facebook. Again, the shares didn't help the pages' discovery in any way whatsoever (they never got indexed by Google).
The Verdict: Facebook likes/shares have NO effect on indexation or rankings.
Once upon a time, Jason Noel of Maven Websites noticed that Google returned a strikingly different number of results for a search term with no capital letters than for a search term in which every word was capitalized.
41,400 results found vs. 15,300 results (for the lower-case term)
The search term was a proper name (for a geographical object), which could have made a difference. Jason argues, however, that it could be because some result pages capitalized the keyword in their URL, while others didn't.
What this ultimately means is – if you’re optimizing for "Javits Center," for example, do capitalize the name in the URL – just in case.
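If you want to apply that tip programmatically, a slug builder just has to avoid the usual lower-casing step. A minimal sketch, using "Javits Center" from the example above (the hyphen-separated slug format is an assumption, not something the post prescribes):

```python
# Build a URL slug that preserves the capitalization of a proper name
# instead of lower-casing everything, per the tip above.
import re


def slugify_preserve_case(phrase: str) -> str:
    """Replace runs of non-alphanumeric characters with hyphens, keeping case."""
    return re.sub(r"[^A-Za-z0-9]+", "-", phrase).strip("-")


# slugify_preserve_case("Javits Center") -> "Javits-Center"
```

Most slug helpers call .lower() as a final step; here that step is simply omitted.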
The Verdict: Google may treat capitalized and non-capitalized proper names as different terms.
Yeah, before you ask. Neil Patel and his team were called in to help TechCrunch do its best in search and on social media. What Neil did:
As a result, search traffic to TechCrunch doubled, and overall traffic grew by 30%!
The Verdict: Never underestimate the power of on-page SEO and social.
OK, these were the 10 eye-opening experiments performed by the industry's distinguished SEOs. Was any of this information of help? Did you learn anything new? Your thoughts and comments are welcome.