Google Is Not A Reliable Source Of Traffic

If I were starting a website today, I would not consider Google traffic anything more than a bonus.

They are arbitrarily penalizing websites and asking webmasters to make ‘good faith’ efforts to get rid of links they never acquired in the first place. Other webmasters can destroy your site’s ability to rank in Google, and there is little you can do about it. There’s no way to ever feel comfortable given how little control you have over the future of your Google traffic.

Try running a stress test for your site to determine what would happen if your Google organic traffic declined by 50% to 90% overnight. Can you survive? If not, it’s probably time to start coming up with some contingency plans.

What is your margin of safety?

Warren Buffett describes Margin of Safety in terms of tolerances:

“When you build a bridge, you insist it can carry 30,000 pounds, but you only drive 10,000-pound trucks across it.”

The same concept should apply to your website.

How much traffic or how many orders do you need to keep the lights on or feel financially secure?

Whatever that number is, make sure you can attain those targets without accounting for Google traffic. This is certainly not easy, but it will make your website and business stronger over the long run.
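
To make the stress test concrete, here’s a minimal sketch of the arithmetic. Every number in it is a hypothetical placeholder; plug in your own traffic mix, revenue per visit, and fixed costs.

```python
# Hypothetical stress test: what happens if Google organic traffic drops?
# All numbers below are made-up placeholders -- substitute your own.

monthly_visits = {
    "google_organic": 40_000,
    "direct": 8_000,
    "email": 5_000,
    "social": 4_000,
    "referral": 3_000,
}
revenue_per_visit = 0.50      # blended revenue per visit (hypothetical)
monthly_fixed_costs = 12_000  # what it takes to "keep the lights on"

for drop in (0.0, 0.5, 0.9, 1.0):
    visits = dict(monthly_visits)
    visits["google_organic"] *= (1 - drop)
    revenue = sum(visits.values()) * revenue_per_visit
    status = "OK" if revenue >= monthly_fixed_costs else "UNDERWATER"
    print(f"Google traffic -{drop:.0%}: revenue ${revenue:,.0f} -> {status}")
```

If the 90% scenario leaves you underwater, the non-Google channels are where the work needs to happen first.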

LONG LIVE LINKS! LONG LIVE SEO!

I’ve been reading a lot lately about how Google is changing, how the semantic web is coming, how links with anchor text are dying, and how pretty soon you won’t need to build links at all; you’ll just create content packed with words related to your topic and customers will come flocking.

I disagree.

Links are going nowhere, mainly because links are a fundamental building block of the web and quite frankly, a site that discusses a person, place, or thing without linking to it creates a poor user experience. So even if there is no ‘perceived’ SEO benefit, links will always exist on the web for usability, users will click them, search engines will find them, and therefore links will have SEO benefit.

Also, this whole concept of Google being able to judge the relevancy of a document simply based on the ‘co-occurrence’ of other terms is a fantasy in my opinion. If that’s the case, then cue up the bulk LSI article creators on Fiverr! If Google doesn’t look at links and just looks at co-occurrence, then there will be millions of LSI-optimized (lsioptimization.com is available …) horrible articles about everything under the sun, and Google will have no way to tell which ones to rank.

I’m not ragging on co-occurrence, but writing with related terms isn’t new, and no matter how large a signal it becomes, there will always be an even larger ‘popularity’ score involved, based on how users ‘vote’ for sites.
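
To show why, here’s a deliberately naive co-occurrence scorer. To be clear, this is a strawman of my own invention, not anything Google actually runs, and the related-terms list is made up. Notice how trivially a stuffed article maxes it out:

```python
# A deliberately naive "relevancy via co-occurrence" scorer -- a strawman
# for illustration, not a real search engine signal.

related_terms = {"espresso", "roast", "barista", "arabica", "grind"}

def cooccurrence_score(text: str) -> float:
    """Fraction of the related terms that appear in the text."""
    words = set(text.lower().split())
    return len(words & related_terms) / len(related_terms)

honest_article = "our barista recommends a medium roast arabica blend"
stuffed_article = "espresso roast barista arabica grind " * 20  # keyword stuffing

print(cooccurrence_score(honest_article))   # 0.6
print(cooccurrence_score(stuffed_article))  # 1.0 -- the spam "wins"
```

That’s exactly why a popularity signal has to sit on top of any relevancy signal.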

Before Facebook and Twitter, the only way users could ‘vote’ on the web was by linking to an article from their website. Therefore, links were a proxy for popularity, and the quantity and quality of a page’s links were a proxy for just how popular it was.

Today Google has many other ways to tell how users ‘vote’:

1) What they click on in the search results
2) How long they stay on a site and what they do
3) How many Facebook likes, retweets, Google +’s, etc. a page has
4) How many links a site has
5) How many people search for a subject/person/brand/entity
6 – Infinity) Other stuff

They also have tons of other ways to judge links:

1) How long has it been on the web
2) What is the anchor text
3) How often is it clicked
4) Where is it on the page
5) You get the idea …
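
As a thought experiment, here’s what combining a few of those attributes into a single link score might look like. The attributes, weights, and caps are pure invention on my part; nobody outside Google knows the real formula.

```python
# Toy link scorer combining a few of the attributes above. The weights
# and caps are invented for illustration only.

from dataclasses import dataclass

@dataclass
class Link:
    age_days: int          # how long it has been on the web
    anchor_relevant: bool  # does the anchor text match the target topic?
    monthly_clicks: int    # how often it is clicked
    in_main_content: bool  # where it is on the page: body copy vs. footer

def toy_link_score(link: Link) -> float:
    score = min(link.age_days / 365, 3.0)         # older links earn trust, capped
    score += 2.0 if link.anchor_relevant else 0.0
    score += min(link.monthly_clicks / 100, 2.0)  # clicked links look "real"
    score += 1.0 if link.in_main_content else 0.0
    return score

editorial = Link(age_days=900, anchor_relevant=True, monthly_clicks=250, in_main_content=True)
footer_spam = Link(age_days=30, anchor_relevant=True, monthly_clicks=0, in_main_content=False)

print(round(toy_link_score(editorial), 2))    # 7.47
print(round(toy_link_score(footer_spam), 2))  # 2.08
```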

But keep in mind that users and search engines still arrive at pages via … LINKS! Whether people click a link on Facebook, on Twitter, or in an email, they click links, and the quantity and quality of the links out there will always be a vital factor in how your page ranks in search engines. Sure, over time certain attributes of a link will change in value (the authority of the page/site the link is on, the anchor text, the usage metrics of the URL the link points to, a lot of the stuff mentioned here), but links are here to stay.

For those who hate reading and want the eight-second summary: as long as people use links to surf the web, links will always be a very valuable signal for search engines to determine how to rank sites.

Bing Piggybacking off Google’s Work

Bing appears to have been piggybacking off Google’s results. Considering Bing historically sucked at serving long-tail results, it’s no surprise to see them do whatever they can to return more relevant results. I can’t decide if this is a brilliant use of competitive intelligence, laziness, or both.

Two other non-search perspectives are worth a read: Nate Silver of the wonderful political blog 538 (who notes that Bing’s “it’s one of 1,000+ signals we use” defense is bogus) and Barry Ritholtz of The Big Picture (who says Microsoft has always stolen).

Only Four Scottsdale Internet Marketing Service Providers?

At least that’s what Google would have you think by displaying only four listings in the SERP for ‘internet marketing services scottsdale’. I was obviously part of some experiment, because when I clicked over to page two it showed me ten listings, and when I went back to page one there were ten listings there too. The same top four from my initial search were also the top four once page one showed ten results.

[Screenshot: search result displaying only four listings]

If this were rolled out with any level of consistency, it would make rank tracking pretty difficult. I tend to agree with Patrick Altoft that rankings as we know them are a thing of the past; it’s now more about the range of rankings you’ll have for a keyword at any given time, as pages are constantly being re-ranked by things such as localization and personalization.
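
If rankings really are a range rather than a single number, the fix for rank tracking is to record the spread across repeated checks instead of one snapshot. Here’s a minimal sketch; the observations are made-up sample data:

```python
# Track a ranking *range* instead of a single rank. The observations are
# made-up sample data -- in practice they'd come from checks at different
# times, locations, and personalization states.

from statistics import median

observations = {
    "internet marketing services scottsdale": [3, 4, 3, 7, 5, 4, 11, 4],
}

for keyword, ranks in observations.items():
    print(f"{keyword!r}: best {min(ranks)}, worst {max(ranks)}, "
          f"median {median(ranks)} over {len(ranks)} checks")
```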

Learn How to Build Links in 90 Seconds!


It should take you no more than 90 seconds to read the last four paragraphs of this interview with Debra Mastaler of Alliance Link. If you want to learn how to build links, read this, then read it again. (Copied below for your reading pleasure.)

You have a new product and want to use it to build links. Before going public, you do a soft launch to your customers with an invitation to link and an incentive (sales). You then run a national contest (promotions) and announce it via press release (publicity). You also send a release announcing the winner.

While the first part is going on, you instruct your copywriters to create a humorous piece of link bait, which is launched on Digg and a handful of other social media sites (promotions). The link bait has a tie-in to the contest (promotions). A blogger outreach program begins, offering free samples of the new product and an invitation to review, or sending rate cards with advertising opportunities (promotions). Once the reviews come in, bundle them and send a release showcasing the successful launch (publicity).

Based on the success of the launch, have your sales and copywriting staff write a case study/white paper referencing the process, reviews, and customer feedback. Offer the white paper to key journalists before going public (publicity), and then add the paper to any content source that will take it (promotions).

And so on. By the time this link cycle is complete, you will have touched on almost every facet of marketing without having to purchase a single link or having any one of the tactics stand out. Balance is key.

Where is Dave?

As you can tell, this blog has been idle for a very long time.  During that time I’ve been working on some independent side projects, blogging over at our company blog and working a ton.  I recently ported this blog over to Thesis as I’m trying to become proficient so I can use it for my other sites.  Will let you know how this goes :)

You can read some recent posts by me:

Designing for Visibility
Google’s Newish Keyword Tool
Be Prepared for When Yahoo Becomes Bing

Why does Yahoo handle redirects differently than other engines?

I was always under the impression that a 301 redirect would tell all bots that a page has permanently moved and now resides on the destination page, so they should transfer all link/search equity over to that page and remove the old page from the index.  I also believed that a 302 redirect was the opposite, and told the engines that the destination page is only a temporary home and should not be indexed.  Apparently Yahoo does not agree:

[Screenshot: Yahoo’s help article on how it handles redirects]

Have they always handled redirects this way? Is this why their index is littered with pages that return 404s, since webmasters who 301 redirect top-level pages down to deeper pages may remove those top-level pages after a while? This article was updated by Yahoo fairly recently (May 2009), so perhaps this is something new.
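
Whatever Yahoo is doing internally, it’s easy to verify what your own redirects return on the wire by issuing a request without following the redirect. Here’s a minimal sketch using Python’s standard library; the host and path are placeholders:

```python
# See what status code and Location header a URL actually returns,
# without following the redirect. Host and path are placeholders.

import http.client

def check_redirect(host: str, path: str) -> None:
    conn = http.client.HTTPConnection(host)
    conn.request("HEAD", path)
    resp = conn.getresponse()
    print(f"{host}{path} -> {resp.status} {resp.reason}")
    if resp.status in (301, 302):
        print("  Location:", resp.getheader("Location"))
    conn.close()

check_redirect("example.com", "/old-page")
```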

4 Best FREE Sources to Learn SEO

SEO is a very complex topic.  It takes many hours of reading, experimenting, and analyzing to become proficient.  The hardest part about learning online is figuring out who to trust.  If I had to pick four free resources from which to learn SEO I would choose the following:

1. Google Webmaster Guidelines – This is a pretty comprehensive document outlining how to do SEO the Google-approved way. If you were to stick to the rules in this guide, your site would be well ahead of much of the competition.

2. Yahoo Search Topics – Particularly the section that discusses their search spider, Slurp, and the section about ranking. It also gives you some information about how to use Site Explorer to look up backlinks, something all SEOs should learn how to do.

3. Bing Webmaster Center – Again, pretty similar to the other two: it’s organized into sections covering technical recommendations, content guidelines, and things to avoid.

If you read through all three, you will spot the common themes and come away with a good understanding of what the search engines deem important.

4. Once you’ve conquered those three help centers, make your way over to Webmaster World’s Google Hot Topics section. This is far more advanced than the other three, but it provides great insight into the intricacies of creating a successful site and ranking well in Google, and it’s well worth the read. It’s also a good place to come if you have questions about your site or how Google may be treating it.

Bonus:

5. Techniques and Failures for Web Content Accessibility Guidelines 2.0 – Digesting all of this content would take months or years. It’s not important to know everything here, but the more items from this list you implement on your site, the better off it will be from both a conversion and an accessibility standpoint.

Search Engine Bot Activity from this weekend

I was checking my server logs this weekend after publishing a new post to see which spiders would crawl it first. To probably nobody’s surprise, Googlebot was quick on the scene, followed by MSNBot, while Slurp didn’t roll through until a good 24 hours after the post went up. Slurp is just not a fan of daveshap.com.

[Chart: search engine bot crawl activity following the new post]

One thing I did find very interesting was that MSNBot always checks robots.txt before crawling any pages, whereas Googlebot only checked it once the whole weekend.  The grand totals were 37 crawls of robots.txt by MSNBot, 8 by Slurp, and 1 by Googlebot.  Seems like Googlebot is a much more efficient crawler than the other two.  For some reason I’m not surprised.
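
For anyone who wants to run the same tally on their own logs, here’s a minimal sketch assuming a standard Apache-style access log. The log path and the user-agent substrings are assumptions; adjust them to your setup.

```python
# Count robots.txt fetches per bot in an Apache-style access log.
# Log path and user-agent substrings are assumptions -- adjust to taste.

from collections import Counter

BOTS = {"Googlebot": "Googlebot", "MSNBot": "msnbot", "Slurp": "Slurp"}

counts = Counter()
with open("/var/log/apache2/access.log") as log:
    for line in log:
        if "robots.txt" not in line:
            continue
        for name, marker in BOTS.items():
            if marker in line:
                counts[name] += 1

for name, total in counts.most_common():
    print(f"{name}: {total} robots.txt fetches")
```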