Bibliographic Information

The Art of SEO

Publisher: O'Reilly Media, Inc.
Pub. Date: March 19, 2012
Print ISBN-13: 978-1-4493-0421-8


Bing Webmaster Tools is also a great asset. It offers a similar capability for downloading a spreadsheet of the links that Bing has in its database for a site.

For quick and dirty link totals, you can use a Firefox plug-in known as SearchStatus. This plug-in provides basic link data on the fly with just a couple of mouse clicks. Figure 10-23 shows the menu you’ll see with regard to backlinks. Notice also in the figure that the SearchStatus plug-in offers an option for highlighting NoFollow links, as well as many other capabilities. It is a great tool that allows you to pull numbers such as these much more quickly than would otherwise be possible. 



There is no such thing as a cookie-cutter SEO plan, and for this, all parties on the SEO bandwagon should rejoice. The ever-changing, dynamic nature of the search marketing industry requires constant diligence. SEO professionals must maintain a research process for analyzing how the search landscape is changing, because search engines strive to continuously evolve to improve their services and monetization. This environment provides search engine marketers with a niche within which the demand for their services is all but guaranteed for an indefinite period of time, and it provides advertisers with the continuous opportunity, either independently or through outside consulting, to achieve top rankings for competitive target searches for their businesses.

Organizations should take many factors into account when pursuing an SEO strategy, including:

  • What the organization is trying to promote

  • Target market

  • Brand

  • Website structure

  • Current site content

  • Ease with which the content and site structure can be modified

  • Any immediately available content

  • Available resources for developing new content

  • Competitive landscape

  • And so on...

Learning about the space the business is in is not sufficient. It may not make sense for two businesses offering the same products on the market to use the same SEO strategy.

For example, if one of the two competitors put its website up four years ago and the other company is just rolling one out now, the second company may need to focus on specific vertical areas where the first company’s website offering is weak.

The first company may have an enormous library of written content that the second company would struggle to replicate and extend, but perhaps the second company is in a position to launch a new killer tool that the market will like.

Do not underestimate the importance of your SEO plan. Skipping over this process or not treating it seriously will only result in a failure to maximize the business results for your company.


Business Factors That Affect the SEO Plan

Here are some examples of business issues that can impact SEO:

Revenue/business model

It makes a difference to the SEO practitioner if the purpose of the site is to sell products, sell advertising, or obtain leads. We will discuss this more in the later sections of this chapter.

Target customers

Who are you trying to reach? This could be as broad as an age or gender group, or as specific as people looking to buy a house within a 25-mile radius of Orlando, FL.

Competitor strategies

The competitive landscape is another big factor in your SEO plan. Competition may be strongly entrenched in one portion of the market online, and it may make sense to focus on a different segment. Or you may be the big dog in your market but you have specific competitors you want to fend off.

Branding goals

There may be terms that it is critical for you to own, for branding reasons.

Budget for content development

An important part of link building (which we will discuss in detail in Chapter 7) is ensuring the quality of your content, as well as your capacity to commit to the ongoing development of high-quality on-page site content.

How your potential customers search for products like yours

Understanding what customers do when they are searching for products or services like yours is one of the most basic functions of SEO (we will discuss it in detail in Chapter 5). This involves mapping the actual search queries your target customers use when they go to a search engine to solve their current problem.

Advanced Methods for Planning and Evaluation

There are many methodologies for business planning. One of the better-known ones is the SWOT (Strengths, Weaknesses, Opportunities, Threats) analysis. There are also methodologies for ensuring that the plan objectives are the right type of objectives, such as the SMART (Specific, Measurable, Achievable, Realistic, Timelined) plan. We will take a look at both of these in the context of SEO.

SWOT analysis

Sometimes you need to get back to the basics and carry out a simple evaluation of where you are in the marketplace, and where you would like to be. A simple SWOT analysis is a great starting point. It creates a grid from which to work and is very simple to execute.


Every company is unique, so naturally their challenges are unique. Even a second SEO initiative within the same company will not be the same as the first initiative. Your initial SEO efforts will have changed things, creating new benchmarks, new expectations, and different objectives. Thus, each SEO project is a new endeavor.
One way to start a new project is to set SMART objectives. Let’s look at how to go about doing that in the world of SEO.
Specific objectives are important. It is easy to get caught up in the details of the plan and lose sight of the broader site objectives. You may think you want to rank #1 for this phrase or that, but in reality what you want is more customers. Perhaps you don’t even need more customers from organic search, but you want higher sales volumes, so in fact having the same number of orders but with a higher average order value would meet your objectives better.
Measurable objectives are essential if one is to manage the performance in meeting them—you can’t manage what you can’t measure. SEO practitioners have to help their clients or organizations come to grips with analytics, and not just the analytics software, but the actual processes of how to gather the data, how to sort it, and, most importantly, how to use it to make informed decisions.
Achievable objectives are ones that can be accomplished with the available resources. You could decide to put a man on Mars next year, for example, but it is just too big an undertaking to be feasible. You can be ambitious, but it is important to pick goals that can be met. You cannot possibly sell to more people than exist in your market. There are limits to markets, and at a certain point the only growth can come from opening new markets, or developing new products for the existing market.
Aside from basic business achievability, there are also limits to what can rank at #1 for a given search query. The search engines want the #1 result to be the one that offers the most value for users, and unless you are close to having the website that offers the most value to users, it may be unreasonable to expect to get to that position, or to maintain it if you succeed in getting there.
Realistic objectives are about context and resources. It may be perfectly achievable to meet a certain objective, but only with greater resources than may be presently available. Even a top ranking on the most competitive terms around is achievable for a relevant product, but it is a realistic goal only if the resources required for such an effort are available.
Time-bound is the final part of a SMART objective. If there is no timeline, no project can ever fail, since it can’t run out of time. SEO generally tends to take longer to implement and gather momentum than a paid advertising campaign. It is important that milestones and deadlines be set so that expectations can be managed and course corrections made.
The PageRank of the home page of the site providing the link. Note that Google does not publish a site’s PageRank, just the PageRank for individual pages. It is common among SEO practitioners to use the home page of a site as a proxy for the site’s overall PageRank, since a site’s home page typically garners the most links. You can also use the domain mozRank, available through SEOmoz’s Open Site Explorer tool, to get a third-party approximation of domain PageRank.
The perceived authority of the site. Although there is a relationship between authority and PageRank, it is not a 1:1 relationship. Authority relates to how the sites in a given market space are linked to by other significant sites in the same market space, whereas PageRank measures aggregate raw link value without regard to the market space.
So, higher-authority sites will tend to have a higher PageRank, but this is not absolutely the case.
The PageRank of the linking page.
The perceived authority of the linking page.
The number of outbound links on the linking page. This is important because the linking page passes some of its PageRank to each page to which it links; each of those pages consumes a portion of the available PageRank, leaving less to be passed on to other pages. This can be expressed mathematically as follows:
For a page with passable PageRank n and with r outbound links:
Passed PageRank = n/r
It is likely that the actual algorithm used by the search engines is different, but the bottom line is that the more outbound links a page has, the less valuable a link from that page will be in comparison to a link from the same page with fewer outbound links.
The relevance of the linking page and the site.
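The simplified pass-through formula above can be sketched in code. This is only the n/r simplification stated in the text, not the engines' actual algorithm, and the function name is ours:

```python
def passed_pagerank(n, r):
    """Per-link PageRank passed by a page with passable PageRank n
    and r outbound links, under the simplified model: n / r."""
    if r == 0:
        return 0.0  # a page with no outbound links passes nothing on
    return n / r

# The more outbound links a page has, the less each link is worth:
# passed_pagerank(10, 5) -> 2.0, while passed_pagerank(10, 20) -> 0.5
```

As the text notes, the real algorithm surely differs, but the inverse relationship between outbound link count and per-link value is the point of the sketch.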


Directories can be an easy way to obtain links. A large number of directories are out there, and they may or may not require a fee to obtain a listing. Table 7-1 lists some examples of quality directories.
Table 7-1. List of quality directories
[The table's rows are not preserved in this extract; it listed directory names alongside category columns such as Arts and Humanities.]
The key to success in link building via directories is to identify the high-quality ones and stay away from the poor-quality ones. A good quality indicator is whether the directory exists for users or for manipulating search engine rankings; if it’s the latter, stay away from it. The search engines view entries in the manipulative directories as being paid links that carry no value, whereas links in the high-quality directories are seen as having editorial value.
Note that DMOZ is particularly hard to get into. It is staffed solely by volunteers who may or may not respond to your submission. It is worth taking the time to submit something, but don't worry if you never hear back. Also, don't resubmit: it is a common belief that sites that resubmit get moved to the back of the line, which only leads to further delays. Submit once, and then forget it.

What search engines want from directories

Here are the essential factors the search engines look for in evaluating directories for quality:
  • Whether the fee is made specifically in payment for an editorial review, and not for a link.
  • Whether editors may, at their whim, change the location, title, and description of the listing.
  • Whether editors may reject the listing altogether.
  • Whether the directory keeps the money even if the submission is edited or rejected (which helps confirm that the fee was paid for the editorial review and that receiving a link in return was not guaranteed).
  • Whether the directory has a track record of rejecting submissions. The inverse of this, which is more measurable, is that the quality of the sites listed in the directory is high.
Ultimately, “anything for a buck” directories do not enforce editorial judgment, and therefore the listings do not convey value to the search engines.
To take a closer look at this, let's examine some of the key statements from Yahoo!'s Directory Submission Terms:
  • I understand that there is no guarantee my site will be added to Yahoo!.
  • I understand that Yahoo! reserves the right to edit and place my site as appropriate.
  • I understand that if my site is added, it will be treated as any other site in Yahoo! and will receive no special consideration.
These statements make it pretty clear that Yahoo! will in fact reject your submission if your site is not a quality site, and it will keep your money. In addition, Yahoo! has a proven history of enforcing these policies, and as a result the quality of the Yahoo! directory’s content remains high.

Classifying directories

You can divide directories into three buckets:
Directories that provide sustainable links
These are directories that comply with the policies outlined earlier. Most likely, these links will continue to pass link juice for the foreseeable future.
Directories that pass link juice that may not be sustainable
These are directories that do not comply with all the policies outlined earlier. The reason such directories exist is that search engines tend to use an “innocent until proven guilty” approach. So, the search engine must proactively make a determination of guilt before a directory’s ability to pass link juice is turned off.
Even so, link juice from these types of directories is probably not going to be passed in the long term, and they do not represent a good investment for publishers.
Directories that do not pass link juice
These are the directories that the search engines have already flagged. They do not pass any value. In fact, submission to a large number of them could be seen as a spam signal. Although it is unlikely that any site would be banned from or penalized by a search engine based on this signal alone, when combined with other signals of manipulation, buying lots of low-value directory links could play a role in a penalty being applied by the search engines.

Detecting directories that pass link juice

The process is relatively simple for directories that pass sustainable links, as defined earlier:
  • Investigate their editorial policies and see whether they conform to what search engines want.
  • Investigate the sites they list. Are they high-quality, valuable resources that do not contain spam or use manipulative SEO tactics?
  • Investigate their track record. Do they enforce their policy for real? This may be a bit subjective, but if there are lots of junky links in the directory, chances are that the policy is just lip service.
  • As another check, search on the directory name and see whether there is any SEO scuttlebutt about it.
The process is a bit harder for directories that do not conform to the policies search engines prefer, but there are still some things the publisher can do:
  • Search on the name of the directory to see whether it shows up in the search engine results. If not, definitely stay away from it.
  • Take a unique phrase from the directory’s home page and see whether it shows up in the search engine results. If not, definitely stay away from it.
  • Does the directory have premium sponsorships for higher-level listings? This is a sure signal that indicates to the search engines that the directory’s editorial policies may be secondary to its monetization strategy.
  • Does the directory promote search engine value instead of traffic? This is another bad signal.
  • Evaluate the directory’s inbound links. If the directory is obviously engaged in shady link-building tactics, it is a good idea to avoid it.
  • Is the directory’s target audience webmasters and SEO practitioners? If so, stay away from it.

Tracking Social Media in Your Web Analytics

In the social media analytics world, there are several key types of metrics we’re interested in tracking:
  • Traffic data—how many visits and visitors did social media drive to our sites?
  • Fan/follower data—how many people are in our various networks, and how are they growing?
  • Social interaction data—how are people interacting with, sharing, and resharing our content on social networks?
  • Social content performance—how is the content we’re producing on social sites performing?
Getting the right metrics to answer these questions requires segmenting by network. Not every question will have direct answers in the data, so you may need to make assumptions or inferences.
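As a minimal sketch of segmenting the first metric (traffic data) by network, visit records tagged with a referring network can be rolled up with a counter. The record layout and sample values here are hypothetical:

```python
from collections import Counter

def visits_by_network(visits):
    """Roll up social-media-referred visits per network."""
    return Counter(v["network"] for v in visits)

# Hypothetical visit records, e.g. exported from an analytics tool:
sample_visits = [
    {"network": "facebook", "page": "/post-1"},
    {"network": "twitter", "page": "/post-1"},
    {"network": "facebook", "page": "/post-2"},
]
# visits_by_network(sample_visits) -> Counter({'facebook': 2, 'twitter': 1})
```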


Facebook offers a relative wealth of data through its built-in product for brand pages, Facebook Insights. For example, you can see the click-through rate on Shared stories (as shown in Figure 8-23) or the demographics of your Fans (as shown in Figure 8-24).


LinkedIn functions like a hybrid of Twitter and Facebook. Connections require acceptance from both sides, but public entities (like company pages) and groups can be “followed.” LinkedIn tends to be a great social network for those who are recruiting talent or involved in business-to-business (B2B) sales and marketing. It’s often far less effective as a pure consumer/B2B channel.
Like Facebook, LinkedIn has some built-in analytics (one set for company pages and another for individual profiles) and lots of data points that are useful to track, including:
Company page views and unique visitors
You can track the number of times your company’s LinkedIn profile has been viewed over time and how many unique visitors there have been to the page. An example of this is shown in Figure 8-33.
  • Number of followers
  • Connections
  • Profile views
  • Top keywords
  • Traffic

Tools for Measuring Social Media Metrics

The number of tools available to track social media has grown exponentially over the last three years. Here are some of the best ones:


Great for tracking RSS feed content’s performance in the social world, though with the purchase by Google, Facebook data is now gone (and as Facebook is the largest network by far, the value of the service has taken a big hit).


bitly

Excellent for tracking click-throughs on content from any source on any device or medium. Given the nonreporting of many desktop and mobile clients, bitly's tracking has become a must for those seeking accurate analytics on the pages they share.


Radian6

Probably the best known of the social media monitoring tools, Radian6 is geared toward enterprises and large budgets and has impressive social tracking, sentiment analysis, and reporting capabilities.


Measures an author's authority by tracking activity related to many social accounts, including Twitter, Google+, LinkedIn, and others.


Another fantastic tool for tracking social metrics, which was acquired by Twitter in 2011.

Social Mention

Enables Google Alerts–like updates from social media sources (Twitter in particular), and offers several plug-ins and search functions.

Raven Tools

A toolset that offers both search and social tracking functionality, Raven can help you track many of the basic metrics from Twitter, Facebook, StumbleUpon, YouTube, and blogs, and will likely expand into other networks in the future.


Converseon

A very impressive social and web monitoring tool that, like Radian6, is geared toward enterprises. Converseon offers human-reviewed sentiment classification and analysis—a very powerful tool for those seeking insight into their brand perception on the social web.


PageLever

Specifically focused on tracking Facebook interactions and pages, PageLever provides more depth and data than the native Insights functionality.

Twitter Counter

A phenomenal tool for monitoring the growth of Twitter accounts over time (it tracks latently, offering historical data for many accounts even if you haven’t used it before). If you upgrade to “premium” membership, it provides analytics on mentions and retweets as well.


Followerwonk

Technically more of a social discovery tool, Wonk lets you search and find profiles via a "bio" search function. It also offers very cool analytics on follower overlap and opportunity through a paid credits model.

Social Bakers

Allows you to monitor stats for Twitter and Facebook (and several types of unique Facebook sources, like Places and Apps).


Crowdbooster

More than a raw analytics tool, Crowdbooster focuses on providing tips on factors such as timing and suggested users to engage with to help improve your social reach.

Offers link and content tracking, along with traditional social metrics analytics; it also has a very pretty interface.


Provides an aggregation of metrics and a datastream from Facebook, Twitter, and YouTube.


Offers reporting via Excel exports, including some very cool streams of data.

Most Shared Posts

A plug-in for WordPress from Tom Anthony that enables WordPress users to see the posts that are most shared on Google+, Twitter, and Facebook.


A social media dashboard for Twitter that also provides interesting tracking metrics.

There are a number of other tools in this space. As it is a young and growing area, it is fragmented, and you will need to find the tracking tools that work best for you. Over time you can expect these tools to evolve, and you can also expect to see a lot of consolidation.

Potential User Engagement Signals

Each search engine has at its disposal a rich array of data sources that enable it to measure a wide range of online behavior. Here are some of the many signals they can extract from those sources:
Click-through rate (CTR)
The search engines can measure the click-through rate on links presented in search results, on web pages in URL shorteners, on RSS feed readers, and more.
Clicking on other search results
Once a user completes a search and visits a link, a common behavior indicating a problem with that result is that the user returns to the search results, often quite quickly, and then clicks on another result. This is often referred to as “comparison shopping.”
Generating new searches
Similarly, a user may look at a given search result, then come back to the search engine and modify his search query.
Bounce rate
Bounce rate is a measurement of the percentage of users who visit only one page on a website. Search engines extend that definition to take into account the interaction of the user with the search results. For example, if a user clicks on a search result, then returns to the SERPs and clicks on another result, that could be an indicator that the first result was not a good response for that search query.
Time on page
Search engines can measure the amount of time spent on a given page using their affiliated browsers or toolbars. More time on page might be considered a signal of higher quality.
Time on site
Similarly, time on site could be considered a positive signal if the average user spends more time on your site than on the sites of your competitors. Of course, it could also mean that your site is difficult to navigate or loads very slowly, so this signal needs to be looked at in conjunction with other signals.
Pages per visit
This metric can be easily measured by the browser, a toolbar, or Google Analytics. More page views implies greater user engagement with the site; however, some sites seek to increase page views by paginating their content to generate more ad impressions. As with time on site, a signal like this is hard to evaluate on a standalone basis.
Repeat visits
Do users return to the site? Return visits could also be viewed as a quality signal.
Pages printed
While many pages on the Web do not lend themselves to printing, there are certain classes of pages that do, such as reference articles, recipes, maps, and similar content. If a user decides to print a page, that indicates a higher level of interest in the content.
Pages bookmarked
If a user bookmarks a page so she can return to it later, that is also a positive signal.
Scroll bar usage
Another relatively subtle indicator of engagement is whether or not the user scrolls down to see more of the content on a page.
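Several of the signals above, notably bounce rate and pages per visit, are straightforward to compute once you have session data. A sketch, assuming a hypothetical format where each session is simply the list of pages viewed:

```python
def engagement_metrics(sessions):
    """Bounce rate (share of single-page visits) and average pages
    per visit, computed from a list of per-session page lists."""
    total = len(sessions)
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return {
        "bounce_rate": bounces / total,
        "pages_per_visit": sum(len(pages) for pages in sessions) / total,
    }

sample_sessions = [["/a"], ["/a", "/b", "/c"], ["/a", "/b"]]
# One of three sessions bounced; six page views over three visits.
```

As the text cautions, these numbers are hard to interpret in isolation; pagination, for instance, inflates pages per visit without reflecting real engagement.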

Social Media Blogs

For further reading on social media, you can find good information on these sites:



There are many great social media tools that have not been mentioned previously in this chapter. Some of the best ones are:

Page Speed

In April 2010, Google announced that the page load time for a website is now considered a ranking factor. However, Google has indicated that this affects only a small percentage of results (about 1%), and industry tests, including an investigation published by SEOmoz, appear to confirm this.
The bottom line on page speed is that there are a lot of reasons why you should treat it as important, the biggest of which is the impact it has on conversion rates and bounce rates on your site. As an SEO factor, it is likely to only have an impact on your results if you have a particularly slow site. If that’s the case, the drive to improve your conversion rate should already be compelling enough to lead you to address the issue.
In addition to the title tag, the search engines read the meta keywords tag (the second outline in Figure 2-18). Here, you can specify a list of keywords that you wish to have associated with the page. Spammers (people who attempt to manipulate search engine results in violation of the search engine guidelines) ruined the SEO value of this tag many years ago, so its value is now negligible. Google never used this tag for ranking at all, but Bing seems to make some reference to it. Spending a lot of time on meta keywords is not recommended because of the lack of SEO benefit.
Search engines also read the meta description tag (the third outline in the HTML in Figure 2-18). The meta description tag has no influence on search engine rankings, but it nonetheless plays a key role, as search engines often use it as part or all of the description for your page in search results. A well-written meta description can have a significant influence on how many clicks you get on your search listing, so time spent on meta descriptions is quite valuable. Figure 2-20 uses a search on "trip advisor" to show an example of the meta description being used as a description in the search results.
A fourth element that search engines read is the alt attribute for images. The alt attribute was originally intended to allow something to be rendered when viewing of the image is not possible. There were two basic audiences for this:
  • Vision-impaired people who do not have the option of viewing the images
  • People who turn off images for faster surfing (this is generally an issue only for those who do not have a broadband connection)


Structural blog optimizations

As we have discussed throughout this book, there are many key elements to successful SEO. These include things such as title tags, heading tags, good content, inbound links, and SEO-friendly architecture. Although the various blog publishing platforms are great, they can sometimes also require tweaking to achieve optimal SEO results:
  • Blogs usually offer the ability to categorize each post. Make sure the tag name is used in the title of that tag page.
  • Override default title tags with custom ones. You can do this using plug-ins such as the All in One SEO Plugin. Along with many other SEO features, this plug-in allows you to supply a custom title tag, defined through a custom field in a post or a page.
  • Rewrite your URLs to contain keywords, and to use hyphens (preferred over underscores) as word separators. Do not let the blog platform include the date in the URL. If you are concerned that too many words in the URL can look spammy, consider shortening the post slug with a WordPress plug-in such as Clean Trunks.
  • Make sure you 301-redirect from the non-www version of your blog's URLs to the www version (or vice versa). Note that if your site and your blog are hosted separately, you may need to implement a separate redirect just for the blog. This has to be handled not just for the home page, but also for all internal pages (e.g., permalink pages). Each URL must redirect to the corresponding URL on the www version.
  • If you change from one blog platform to another one, the URL structure of your blog will likely change. If so, make sure you maintain legacy URLs by 301-redirecting from each page’s old location to the new one.
In general, be on the lookout for situations where your blog interferes with basic SEO principles. Working around these limitations can be the difference between mediocre and great results. Figure 9-11 depicts a sample search result that shows the impact of using keywords in blog page titles.
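The www-canonicalization redirect mentioned in the list above must map every URL, not just the home page. A sketch of that mapping follows; the hostnames are placeholders, and in practice this is usually implemented as a server-level 301 rule rather than application code:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_www(url):
    """Map a non-www URL to its www equivalent, preserving the path
    and query so permalink pages redirect to their counterparts."""
    parts = urlsplit(url)
    host = parts.netloc
    if not host.startswith("www."):
        host = "www." + host
    return urlunsplit((parts.scheme, host, parts.path, parts.query, parts.fragment))

# canonical_www("http://example.com/2012/03/post-title")
#   -> "http://www.example.com/2012/03/post-title"
```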

Optimizing your anchor text

Anchor text is just as important in blogging as it is in general SEO. You need to leverage it as much as you can. Here are some specifics:
  • Make the post’s title a link to the permalink page. You do not want your only link to the post to say “Permalink.”
  • Use a tool such as Open Site Explorer or Majestic SEO to see who is linking to your site. Using these tools or tools like them, you can see who is linking to you and what anchor text they have used. Look for opportunities to request revisions to anchor text on inbound links, but before making such a request make sure you are comfortable that your relationship with the linker will not result in their simply removing your link instead of changing it.
  • Internally link back to old, relevant posts within the body of a blog post. Don't use "here" or "previously" or similar words as the anchor text; use something keyword-rich instead.

More blog optimization basics

As we previously emphasized, a lot of basic SEO techniques also apply in optimizing your blog. For example, the use of emphasis tags within posts (bold, strong, em, etc.) can have a significant impact. In addition, use the rel="nofollow" attribute on links where appropriate—for example, in all links in trackbacks, in comments, and in posts where you do not vouch for the site receiving the link.

Links remain critical

Obtaining links and managing your link juice remain critical activities. Blog platforms provide limited ability to manage your internal link juice, so this may require some customization to accomplish. Fortunately, in the WordPress environment some really good plug-ins are available to help you with this.
For starters, you can also create cross-links between related posts using a plug-in such as the Yet Another Related Posts Plugin. This is a great way to get people who just finished reading one post on your blog to consider reading another one.

Other things that are interesting to promote in one fashion or another are your top 10 posts. Also, make sure each post links to the next and previous posts using keyword-rich anchor text.

The Tracking Cycle: Produce, Launch, Measure, Refine

In summary, the basic process usually looks something like this:
  1. Define an SEO campaign and set goals. What are you going to accomplish, and what is the strategy for accomplishing it? How will you measure progress?
  2. Discuss your strategy. The marketing and business development teams are your allies here—you want to ensure that your SEO objectives are based on the overall business and site objectives, both long- and short-term.
  3. Establish a baseline. Now that you are about to start and you have decided how you are going to measure progress, establish a baseline by recording the current stats prior to beginning work. Make sure you don’t get a false baseline due to seasonal factors or some other unusual event. Comparing year-over-year data will usually help you eliminate fluctuation due to seasonality. However, you must also consider how changes in the market, new competition, competitors exiting the market, industry consolidation, and changes in your business strategy may have affected that year-over-year data.
  4. Proceed with your project. Implement the new pages, the site changes, the link-building campaign, or whatever else you may have planned. Put it in place and execute it.
  5. Collect data. Collect the newest data for each metric you decided to focus on. Since SEO can take weeks or even months to show results, make sure you wait long enough for your efforts to have an impact. Many factors could influence the length of time you should wait. Here are some of them:
    • If your site is brand new, it may take longer for your changes to take effect.
    • If the scope of the change is drastic (such as a complete redesign), the time it takes to see results will probably be longer.
    • Sites that get crawled at great depth and frequently will probably yield results faster.
    • Sites seen as authoritative may also show faster results.
  6. Compare the baseline data to the new data. The new data has little meaning unless it is compared to your baseline. This is the time when you can really assess your progress.
  7. Refine your campaign. Now that you have compared your old data with your new data, you can make some decisions. Is the campaign a bust? If so, abandon it and move on to the next one. The old business axiom “fail quickly” applies here. The faster you diagnose a failure and move on to the next thing, the better.
    Your hosting company most likely provides a free web analytics solution, such as AWStats, Webalizer, or something similar. Although these tools provide valuable data, they are very limited in scope, and other tools out there provide significantly more data. Here are some of the best-known options:
    SiteCatalyst (enterprise-level solution)
    IBM Coremetrics (enterprise-level solution)
    IBM NetInsight (enterprise-level solution)
    Webtrends (enterprise-level solution)
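The compare-the-baseline step (steps 3 and 6 above) boils down to a percent-change calculation over whatever metrics you chose. A minimal sketch (the metric names are hypothetical examples, not prescribed KPIs):

```python
def compare_to_baseline(baseline, current):
    """Return the percent change for each metric present in both snapshots."""
    changes = {}
    for metric, old in baseline.items():
        if metric in current and old:
            changes[metric] = round((current[metric] - old) / old * 100, 1)
    return changes

# Hypothetical metrics recorded before and after the campaign
baseline = {"organic_visits": 12000, "indexed_pages": 850, "conversions": 240}
current = {"organic_visits": 15000, "indexed_pages": 1020, "conversions": 264}

print(compare_to_baseline(baseline, current))
```

Remember that, per step 3, any change you compute is only meaningful if the baseline itself was not distorted by seasonality or one-off events.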
Mobile SEO
In the case of the iPhone, you simply need to scan the user-agent string for “iPhone.” BlackBerries are a bit more complex: those that use a WebKit-based browser are considered smartphones, while those that do not should be treated as feature phones. Therefore, phones whose user-agent strings contain either “iPhone” or “WebKit” should be classified as smartphones, and the others treated as feature phones.
For other devices, you can create a small web page that simply echoes the visiting browser’s user-agent string back onto the page.
Once that is in place, visit the page with your mobile device and you will see the user-agent string.
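The classification rules above can be sketched as simple substring checks. This is only illustrative; real user-agent detection is normally done against a maintained device database, and the checks here assume you are classifying traffic already known to come from a phone:

```python
def classify_device(user_agent):
    """Very rough phone classification based on user-agent substrings."""
    ua = user_agent.lower()
    if "iphone" in ua or "webkit" in ua:
        return "smartphone"      # iPhones and WebKit-based BlackBerries
    if "blackberry" in ua:
        return "feature phone"   # BlackBerries without a WebKit browser
    return "feature phone"       # default for other, unrecognized phones

print(classify_device("Mozilla/5.0 (iPhone; CPU iPhone OS 5_0) AppleWebKit/534.46"))
```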
When a user comes to your site, you check to see whether that user’s browser has a mobile user agent. If so, you serve the mobile version of the page; if not, you serve the desktop version. There is also a link tag you can use to inform Googlebot (the desktop version) about your mobile content. This allows Googlebot to be aware of your mobile pages even before Googlebot-Mobile learns about them.
A Google spokesperson told Eric Enge, “This tag is relevant only for feature phone optimized websites. We recommend using it even when you do user agent detection and HTTP redirects because the tag can be seen by normal Googlebot (and other search engines’ crawlers) and can be used to improve the search experience even before our Googlebot-Mobile for feature phones reaches your site.” You can learn more about the tag in Google’s documentation.

Selecting the Right Analytics Package

Logfile tracking and JavaScript tracking are equally valid methods, and each has its own strengths and weaknesses. The biggest advantage of the logfile method is that it allows you to track search engine crawler activity on your site. This is something you cannot do in JavaScript implementations, because search engine crawlers do not execute the JavaScript.
Another major advantage of a logfile-based solution is that you run the software in-house, so no third party has a copy of a logfile with your proprietary traffic data on it. This distinction can be a big win in terms of security for some organizations.
In addition, logfile analysis allows you to track all of the following:
  • Users who don’t have JavaScript enabled (or are using privacy or ad filters)
  • Media types beyond HTML
  • Partial requests (incomplete page loads and load errors)
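As a sketch of the crawler-tracking advantage, a logfile analyzer only needs to examine the user-agent field of each request, something no JavaScript tag can ever see. The log lines below are illustrative examples in Apache combined format:

```python
import re
from collections import Counter

# Simplified bot patterns; production tools match far more crawlers
BOT_PATTERNS = re.compile(r"googlebot|bingbot|slurp", re.IGNORECASE)

def count_crawler_hits(log_lines):
    """Count requests per crawler name found in the user-agent field."""
    hits = Counter()
    for line in log_lines:
        match = BOT_PATTERNS.search(line)
        if match:
            hits[match.group(0).lower()] += 1
    return hits

sample_log = [
    '66.249.66.1 - - [19/Mar/2012:10:00:00] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '157.55.39.1 - - [19/Mar/2012:10:00:02] "GET /about HTTP/1.1" 200 4096 "-" "Mozilla/5.0 (compatible; bingbot/2.0)"',
    '10.0.0.5 - - [19/Mar/2012:10:00:03] "GET / HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 6.1)"',
]
print(count_crawler_hits(sample_log))
```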
Ultimately, though, most companies opt for JavaScript tracking because it offers a much greater level of flexibility than logfile tracking. You can tweak the JavaScript to do custom conversion tracking, or gather pages into logical groupings in a manner that cannot be done as easily in logfile-based applications.
Some other key advantages of JavaScript tracking include the following:
  • Tracks outgoing link requests
  • Tracks events that don’t involve HTML requests, such as playing a video
  • Records visitors’ screen resolutions
Some analytics packages, such as Unica and Webtrends, offer both options, or a combined solution. This kind of approach can give you the flexibility and power of JavaScript while still capturing your search engine robot crawling data.

Number of pages getting search traffic

An indirect way of measuring effective indexation is to keep an eye on how many pages are getting search traffic. This number should represent a subset of the total pages indexed, but it is more valuable because these pages were not just indexed, but ranked highly and were interesting enough that visitors decided to click on the listing.
This is an important metric to track as you work on addressing site architecture issues like duplicate content and bot traps. As you remove search engine spider roadblocks, the bots will find and index more pages. Those pages may rank and get search traffic, so this number should increase. The same happens as you consolidate duplicate content and the indexation of the site improves.
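If your analytics package does not report this number directly, it can be derived from landing-page/referrer pairs. A minimal sketch, in which the visit records and field layout are hypothetical stand-ins for an analytics export:

```python
SEARCH_ENGINES = ("google.", "bing.", "yahoo.")

def pages_with_search_traffic(visits):
    """Count distinct landing pages that received at least one search-referred visit.

    `visits` is an iterable of (landing_page, referrer) pairs -- a simplified,
    hypothetical stand-in for rows exported from an analytics package.
    """
    pages = {page for page, referrer in visits
             if any(engine in referrer for engine in SEARCH_ENGINES)}
    return len(pages)

visits = [
    ("/boots", "http://www.google.com/search?q=dress+boots"),
    ("/boots", "http://www.bing.com/search?q=boots"),
    ("/about", "http://example.com/partners"),   # not a search referral
    ("/sandals", "http://search.yahoo.com/search?p=sandals"),
]
print(pages_with_search_traffic(visits))  # 2 distinct pages: /boots and /sandals
```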

Common considerations for a mobile site

There are some recommendations that hold true regardless of whether you choose a same-URL approach or a mobile-subdomain approach. These are:
  • Create a small, lightweight, fast-loading site (<20 KB per page). Mobile devices have limited bandwidth for communication.
  • Use the XHTML Basic 1.1 DOCTYPE (with XHTML MP 1.2, cHTML, or WML 1.3). This is important to make sure your site renders properly on the majority of mobile devices that may be used to access your site.
  • Use UTF-8 character encoding.
  • Perform on-site keyphrase optimization as usual (with a focus on short titles and small amounts of body copy), and include the word mobile in the title or the heading tag (or both). Focus keywords on location and immediacy searches that enable users to take action.
  • Avoid use of Flash media interfaces/content because these do not render on most phones. Use only JPEG or GIF images.
  • Check that your mobile-friendly URLs’ DTD declarations are in an appropriate mobile format, such as XHTML Mobile or Compact HTML.
  • Set a reasonable Cache-Control value, such as 600 (10 minutes), to tell your browser to keep a local copy instead of requesting a new copy from the server if the local copy hasn’t expired. This saves on download time.
  • Speed up your page load times by reducing DNS lookups. This can be done by combining files (such as script files) or removing unnecessary images.
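Two of the recommendations above, the <20 KB page budget and the Cache-Control header, are easy to check and generate programmatically. A minimal sketch, assuming you already have the rendered page bytes in hand:

```python
MOBILE_PAGE_BUDGET = 20 * 1024  # ~20 KB guideline for feature phone pages

def within_mobile_budget(page_bytes):
    """True if the rendered page fits the suggested mobile size budget."""
    return len(page_bytes) <= MOBILE_PAGE_BUDGET

def mobile_cache_headers(max_age_seconds=600):
    """Response headers telling the browser to reuse its local copy for 10 minutes."""
    return {"Cache-Control": "max-age=%d" % max_age_seconds}

page = b"<html><body>tiny mobile page</body></html>"
print(within_mobile_budget(page), mobile_cache_headers())
```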

Mobile SEO tools for creating mobile-ready sites

There are a growing number of tools that can help you render a mobile-ready version of your existing site. Here are some of the best ones:

Other mobile SEO tools

In addition to creating your mobile site, it is useful to check other aspects of mobile SEO. Here are some tools that let you do that:

Google resources for mobile SEO

For additional reading, here are two resources from Google that cover mobile SEO considerations:

Measuring Content Quality and User Engagement

The search engines also attempt to measure the quality and uniqueness of a website’s content. One method they may use for doing this is evaluating the document itself. For example, if a web page has lots of spelling and grammatical errors, that can be taken as a sign that little editorial effort was put into that page.
The search engines can also analyze the reading level of the document. One popular formula for doing this is the Flesch-Kincaid Grade Level Readability Formula, which considers things like the average word length and the number of words in a sentence to determine the level of education needed to be able to understand the sentence. Imagine a scenario where the products being sold on a page are children’s toys, but the reading level calculated suggests that a grade level of a senior in college is required to read the page. This could be another indicator of poor editorial effort.
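The Flesch-Kincaid grade level can be computed directly from the word, sentence, and syllable counts. The sketch below uses a deliberately crude vowel-run heuristic for syllables, so its scores are only approximations of what a proper implementation (or a search engine) would produce:

```python
import re

def count_syllables(word):
    """Crude syllable estimate: runs of vowels, minimum of one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid grade level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()] or [text]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words)) - 15.59)

print(round(flesch_kincaid_grade("The cat sat on the mat. The dog ran."), 2))
```

Short, monosyllabic sentences score near (or below) zero, while dense, polysyllabic prose scores many grade levels higher, which is exactly the mismatch signal described for the children’s toys example.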
Another method that search engines can use to evaluate the quality of a web page is to measure actual user interaction. For example, if a large number of the users who visit the web page after clicking on a search result immediately return to the search engine and click on the next result, that is a strong indicator of poor quality. Engagement metrics you can track yourself include the following:
Bounce rate
The percentage of visitors who visit only one page on your website.
Time on site
The average amount of time spent by users on the site. Note that Google Analytics only receives information when each page is loaded, so if a visitor views only one page, it does not know how much time is spent on that page. More precisely, then, this metric tells you the average time between the loading of the first page and the loading of the last page, but it does not take into account how long visitors spent on the last page loaded.
Page views per visitor
The average number of pages viewed per visitor on your site.
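These three metrics can be sketched from per-session page-load timestamps. Note how time on site inherits exactly the limitation described above: the viewing time of the last page loaded is unknowable from load events alone. The session structure here is a hypothetical simplification:

```python
def engagement_metrics(sessions):
    """Compute bounce rate, average time on site, and page views per visitor.

    `sessions` maps a visitor id to a list of page-load timestamps (seconds).
    Time on site is last load minus first load, so single-page visits
    contribute zero measurable seconds.
    """
    bounces = sum(1 for loads in sessions.values() if len(loads) == 1)
    total_time = sum(max(loads) - min(loads) for loads in sessions.values())
    total_views = sum(len(loads) for loads in sessions.values())
    n = len(sessions)
    return {
        "bounce_rate": bounces / n,
        "avg_time_on_site": total_time / n,
        "pages_per_visitor": total_views / n,
    }

sessions = {
    "visitor_a": [0, 30, 90],  # three pages; 90 s between first and last load
    "visitor_b": [0],          # one page: a bounce, 0 measurable seconds
}
print(engagement_metrics(sessions))
```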
Top Nine Ranking Factors (according to SEOmoz)
Page Level Link Metrics
This refers to the links as related to the specific page, such as the number of links, the relevance of the links, and the trust and authority of the links received by the page.
Domain Level Link Authority Features
Domain level link authority is based on a cumulative link analysis of all the links to the domain. Factors considered include the number of different domains linking to the site, the trust/authority of those domains, the rate at which new inbound links are added, the relevance of the linking domains, and more.
Page Level Keyword Usage
This describes use of the keyword term/phrase in particular parts of the HTML code on the page (title element, heading tags, alt attributes, etc.).

Domain Level Keyword Usage
This refers to how keywords are used in the root or subdomain name, and how impactful that might be on search engine rankings.
Page Level Social Metrics
Social metrics considered include mentions, links, shares, Likes, and other social media site–based metrics. At the time of the survey, the considered sites were Facebook and Twitter. Since then Google has launched Google+, and Search, plus Your World, which would also be included in this definition.
Domain Level Brand Metrics
This factor includes search volume on the website’s brand name, mentions, whether it has a presence in social media, and other brand-related metrics.
Page Level Keyword Agnostic Features
Factors included here are on-page elements such as the number of links on the page, number of internal links, number of followed links, number of NoFollowed links, and other similar factors.
Page Level Traffic/Query Data
Elements of this factor include the click-through rate (CTR) to the page in the search results, the bounce rate of visitors to the page, and other similar measurements.
Domain Level Keyword Agnostic Features
Major elements of this factor in the survey included the number of hyphens in the domain name, numeric characters in the domain name, and domain name length.

Negative Ranking Factors

The SEOmoz survey also identified a number of negative ranking factors. Some of the most significant ones included:
Malware being hosted on the site
The search engines will act rapidly to penalize sites that contain viruses or trojans.
Cloaking
Search engines want publishers to show the same content to the search engine as is shown to users.
Pages on the site that sell links
Google has a strong policy against paid links, and sites that sell them may be penalized.
Content that advertises paid links on the site
As an extension of the prior negative ranking factor, promoting the sale of paid links may be a negative ranking factor.

Advanced Google Search Operators

Google supports a number of advanced search operators that you can use to help diagnose SEO issues. Table 2-1 gives a brief overview of the queries, how you can use them for SEO purposes, and examples of usage.

Table 2-1. Google advanced search operators


site:
Short description: Domain-restricted search; narrows a search to one or more specific domains/directories.
SEO application: Show approximately how many URLs are indexed by Google, whether for an entire domain (including all subdomains), for a single directory, or for a specific top-level domain (TLD).

inurl: / allinurl:
Short description: URL keyword restricted search; narrows the results to documents containing one or more search terms in the URLs.
SEO application: Find web pages having your keyword in a file path.
Examples: inurl:seo inurl:company; allinurl:seo company

intitle: / allintitle:
Short description: Title keyword restricted search; restricts the results to documents containing one or more search terms in a page title.
SEO application: Find web pages using your keyword in a page title.
Examples: intitle:seo intitle:company; allintitle:seo company

inanchor: / allinanchor:
Short description: Anchor text keyword restricted search; restricts the results to documents containing one or more search terms in the anchor text of backlinks pointing to a page.
SEO application: Find pages having the most backlinks/the most powerful backlinks with the keyword in the anchor text.
Examples: inanchor:seo inanchor:company; allinanchor:seo company

intext: / allintext:
Short description: Body text keyword restricted search; restricts the results to documents containing one or more search terms in the body text of a page.
SEO application: Find pages containing the most relevant/most optimized body text.

ext: / filetype:
Short description: File type restricted search; narrows search results to the pages that end in a particular file extension.
A few possible extensions/file types: PDF (Adobe Portable Document Format), HTML or .htm (Hypertext Markup Language), .xls (Microsoft Excel), .ppt (Microsoft PowerPoint), .doc (Microsoft Word).

* (wildcard)
Short description: Wildcard search; means “insert any word here.”
SEO application: Search for a phrase “partial match.”
Example: seo * directory returns “seo free directory,” “seo friendly directory,” etc.

related:
Short description: Similar URLs search; shows “related” pages by finding pages linking to the site and looking at what else they tend to link to (i.e., “co-citation”); usually 25 to 31 results are shown.
SEO application: Evaluate how relevant the site’s “neighbors” are.

info:
Short description: Information about a URL search; gives information about the given page.
SEO application: Learn whether the page has been indexed by Google; provides links for further URL information, and can also alert you to possible site issues (duplicate content or possible DNS problems). The info: query will show you the page title and description, and invite you to view its related pages, incoming links, and the cached version of the page.

cache:
Short description: Shows Google’s saved copy of the page.
SEO application: See how the crawler perceives the page; Google’s text-only version of the page works the same way as SEO Browser.

~keyword
Short description: Shows keywords Google thinks are related to keyword.
SEO application: Can be very useful in uncovering related words that you should include on your page about keyword.
Example: ~zoo ~trip will show you keywords related to zoo and trip.


When using the site: operator, some indexed URLs might not be displayed (even if you use the “repeat the search with omitted results included” link to see the full list). The site: query is notoriously inaccurate. You can obtain a more accurate count of the pages of your site indexed by Google by appending &start=990&filter=0 to the URL of a Google result set for a site: command.

This tells Google to start with result 990, which is the last page Google will show you, since it limits the results to 1,000. This must take place in two steps: first, enter a basic site: command and get the results; then go up to the address bar and append the &start=990&filter=0 parameters to the end of the URL. Once you’ve done this, you can look at the total pages returned to get a more accurate count. Note that this only works if Google Instant is turned off.
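The URL manipulation described above can be scripted. A minimal sketch (the parameters are the ones described in the text; the result-set URL format is otherwise a simplification):

```python
from urllib.parse import urlencode

def deep_site_count_url(domain):
    """Build a Google results URL that jumps to result 990 with filtering off."""
    params = urlencode({"q": "site:%s" % domain, "start": 990, "filter": 0})
    return "http://www.google.com/search?" + params

print(deep_site_count_url("example.com"))
```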

To see more results, you can also use the following search patterns:

  • site:domain.tld/directory1 + site:domain.tld/directory2, etc. (the “deeper” you dig, the more/more accurate results you get)

  • inurl:keyword1 + inurl:keyword2, etc. (for subdirectory-specific keywords)

  • intitle:keyword1 + intitle:keyword2, etc. (for pages using the keywords in the page title)

To learn more about Google advanced search operators, check out Stephan Spencer’s book Google Power Search (O’Reilly).

Combined Google queries

To get more information from Google advanced search, it helps to learn how to effectively combine search operators. Table 2-2 illustrates which search patterns you can apply to make the most of some important SEO research tasks.

Table 2-2. Combined Google search operators

Competitive analysis
Search for who mentions your competitor; use the date range operator within Google advanced search to find the most recent brand mentions. The following brand-specific search terms can be used: [domain name], [domainname], [site owner name], etc. Add &as_qdr=d (past one day) to the query string; use d3 for three days, m3 for three months, and so on.
Example: a search for seomoz restricted to the past 24 hours.

Keyword research
Evaluate the given keyword’s competition (sites that apply proper SEO to target the term): inanchor:keyword intitle:keyword (e.g., inanchor:seo intitle:seo).
Find more keyword phrases: key * phrase (e.g., free * tools).

SEO site auditing
Learn whether the site has canonicalization problems (e.g., by combining a site: query for the domain with -inurl:www).
Find the site’s most powerful pages: site:domain.tld.
Find the site’s most powerful page related to the keyword: site:domain intitle:keyword (e.g., intitle:seo) or site:domain inanchor:keyword (e.g., inanchor:seo).

Link building
Find sites with high authority offering a backlink opportunity: site:org bookmarks/links/"favorite sites", site:gov bookmarks/links/"favorite sites", site:edu bookmarks/links/"favorite sites", site:org donors.
Find relevant forums and discussion boards where you can participate in discussions and perhaps link back to your site: inurl:forum OR inurl:forums keyword (e.g., inurl:forum OR inurl:forums seo).

Firefox plug-ins for quicker access to Google advanced search queries

You can use a number of plug-ins with Firefox to make accessing these advanced queries easier. For example:

Advanced Dork, for quick access to intitle:, inurl:, site:, and ext: operators for a highlighted word on a page, as shown in Figure 2-25.

Bing Advanced Search Operators

Bing also offers several unique search operators worth looking into, as shown in Table 2-3.

Table 2-3. Bing advanced operators


linkfromdomain:
Short description: Domain outbound links restricted search; finds all pages the given domain links out to.
SEO application: Find the most relevant sites your competitor links out to.
Example: linkfromdomain:domain.com seo

contains:
Short description: File type restricted search; narrows search results to pages linking to a document of the specified file type.
SEO application: Find pages linking to a specific document type containing relevant information.
Example: contains:wma seo

ip:
Short description: IP address restricted search; shows sites sharing one IP address.

inbody:
Short description: Body text keyword restricted search; restricts the results to documents containing query word(s) in the body text of a page.
SEO application: Find pages containing the most relevant/best optimized body text.
Example: inbody:seo (equivalent to Google’s intext:)

loc:
Short description: Location-specific search; narrows search results to a specified location (multiple location options can be found under Bing’s advanced search).
SEO application: Find geospecific documents using your keyword.
Example: seo loc:AU

feed:
Short description: Feed keyword restricted search; narrows search results to terms contained in RSS feeds.
SEO application: Find relevant feeds.

hasfeed:
Short description: Feed keyword restricted search; narrows search results to pages linking to feeds that contain the specified keywords.
SEO application: Find pages linking to relevant feeds.


More Advanced Search Operator Techniques

You can also use more advanced SEO techniques to extract more information.

Determining keyword difficulty

When building a web page, it can be useful to know how competitive the keyword you are going after is. This information can be difficult to obtain. However, there are steps you can take to get some idea as to how difficult it is to rank for a keyword. For example, the intitle: operator (e.g., intitle:"dress boots") shows pages that are more focused on your search term than the pages returned without that operator.
You can use different ratios to give you a sense of how competitive a keyword market is (higher results mean that it is more competitive). For example:
dress boots (108,000,000) versus “dress boots” (2,020,000) versus intitle:"dress boots" (375,000)
Ratio: 108,000/375 = 288:1
Exact phrase ratio: 2,020/375 ≈ 5.4:1
Another significant parameter you can look at is the inanchor: operator (for example, inanchor:"dress boots"). You can use this operator in the preceding equation instead of the intitle: operator.
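The ratio arithmetic above can be wrapped in a small helper. The result counts are the ones quoted in the text and will of course drift over time:

```python
def competitiveness_ratio(broad_results, restricted_results):
    """How many broad-match results exist per restricted (intitle:/inanchor:) result."""
    return broad_results / restricted_results

broad = 108_000_000        # dress boots
exact_phrase = 2_020_000   # "dress boots"
in_title = 375_000         # intitle:"dress boots"

print(round(competitiveness_ratio(broad, in_title)))            # 288
print(round(competitiveness_ratio(exact_phrase, in_title), 1))  # 5.4
```

A higher ratio suggests many pages mention the term but few are focused on it, i.e., a more competitive broad market relative to the optimized pages you would have to outrank.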

Using number ranges

The number range operator can help restrict the results set to a set of model numbers, product numbers, price ranges, and so forth. For example: "product/1700..1750"
Unfortunately, using the number range combined with inurl: is not supported, so the product number must be on the page. The number range operator is also great for copyright year searches (e.g., to find abandoned sites to acquire). Combine it with the intext: operator to improve the signal-to-noise ratio; for example, intext:"copyright 1993..2005" -2008 blog.

Advanced doc type searches

The filetype: operator is useful for looking for needles in haystacks. Here are a couple of examples:
confidential business plan -template filetype:doc
forrester research grapevine filetype:pdf


If you are using Yahoo! India, you should use the originurlextension: operator instead.

Determining listing age

You can label results with dates that give a quick sense of how old (and thus trusted) each listing is; for example, by appending the &as_qdr=m199 parameter to the end of a Google SERP URL, you can restrict results to those published within the past 199 months.

Uncovering subscriber-only or deleted content

You can get to subscriber-only or deleted content from the Cached link in the listing in the SERPs or by using the cache: operator. Don’t want to leave a footprint? Add &strip=1 to the end of the Google cached URL; images on the page won’t load.
If no Cached link is available, use Google Translate to take your English document and “translate” it from Spanish to English; this will reveal the content even though no Cached link is available.

Identifying neighborhoods

The related: operator will look at the sites linking to the specified site (the “Linking Sites”), and then see which other sites are commonly linked to by the Linking Sites. These are commonly referred to as neighborhoods, as there is clearly a strong relationship between sites that share similar link graphs.

Finding Creative Commons (CC) licensed content

Use the as_rights parameter in the URL to find Creative Commons licensed content. Here are some example scenarios for finding CC-licensed material on the Web:
  • Permit commercial use
  • Permit derivative works
  • Permit commercial and derivative use
Make sure you replace KEYWORDS with the keywords that will help you find content relevant to your site. The value of this to SEO is an indirect one: Creative Commons content can potentially be a good source of content for a website.
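As a sketch, the as_rights filter can be appended to an ordinary query URL. The specific license token shown (cc_publicdomain, etc.) is an assumption based on how Google’s advanced search has historically encoded Creative Commons filters, so verify it against the live advanced-search form before relying on it:

```python
from urllib.parse import urlencode

def cc_search_url(keywords, rights):
    """Build a Google query URL restricted by usage rights.

    `rights` is passed through verbatim, e.g. a token such as
    "(cc_publicdomain|cc_attribute|cc_sharealike)" -- an assumed format.
    """
    return "http://www.google.com/search?" + urlencode(
        {"q": keywords, "as_rights": rights})

print(cc_search_url("KEYWORDS", "(cc_publicdomain|cc_attribute|cc_sharealike)"))
```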

The Value of In-House SEO

One of the significant advantages of in-house SEO is the ability to access search expertise whenever you need it. Some additional advantages are:
  • A greater likelihood that SEO best practices will be integrated into the company culture.
  • Greater accountability.
  • Faster ramp-up time (familiarity with the core business).
  • Faster implementation time (greater leverage in the development process).
  • Someone to ensure that SEO requirements remain in scope (and to raise the flag when they are descoped).
  • An SEO expert who can be tapped at any time, with her work reprioritized immediately for quick deliverable turnarounds.

    The Value of Outsourced Solutions

    Although it may seem that in-house SEO is the ideal solution, there are also plenty of reasons to use an outsourced team instead of, or in conjunction with, an in-house team. Here is a summary of some of the best reasons to leverage outsourced talent:
    • Finding expert-level (10+ years) SEO talent to hire is difficult, so it may be smarter to contract with such talent.
    • It allows you to focus on your core business and outsource your noncore activities.
    • Experts generally have a broader reach in the marketplace via industry contacts and as a result of working with many clients.
    • SEO is multidisciplinary, so using external people makes it easier to pull in components when needed.
    • Outsourced teams don’t have the myopic focus that sometimes develops in long-time in-house personnel.
    • Outsourcing brings a fresh perspective and new ideas, and outsourced talent can often identify what in-house staff have missed.
    Many SEO consulting firms, such as those of the authors, are willing to take on contracts that include training in-house personnel. This kind of approach can give you the best of both worlds: the ability to immediately inject a high level of expertise into your organization (albeit on a temporary basis), and a long-term plan to bring the capability in-house, or at least build up the expertise with ongoing or occasional augmentation via outside consulting.

The Impact of Site Complexity on SEO Workload

The amount of time needed to perform effective SEO depends on the size and complexity of the website, as well as the competitiveness of the market space the organization is pursuing. An organization’s size and vertical scope also affect the overall complexity of the SEO process. Here are some of the ways that SEO considerations can affect the complexity of a site:
    Keyword research
    More pages mean more keyword research. Solid keyword research is needed to help drive the site architecture and content plans. The less authoritative the site is, the more difficult it will be to select which keywords to target.
    Page title tags
    Page title tags are an important ranking factor, and an important piece of the search results page itself. This means that you need to take the time to write the best possible copy for generating the highest click-through rates. For very large sites, you may have to design an algorithmic method to choose these for you.
    Page content
    The content must be unique and substantial enough for a search engine to understand what the page is about. This typically requires 200–300 words of page copy. If you sell products and receive content from manufacturers, you need to invest the resources to write unique product descriptions; otherwise, you risk being omitted from the search engine’s indexes or ranking lower because your content is a duplication of other sites’ content.
    Meta descriptions
    Meta descriptions are important because search engines often use an excerpt from your meta description in the SERPs, and the description they provide for a page can influence its click-through rate. While you cannot directly control the description used in the SERPs, you can influence it. For example, by including the keywords that you are targeting with a page within its meta description text, you can make the relevancy of your page clearer to the searcher. The larger the site is, the more writing you will have to do, because search engines want unique meta descriptions for each page on the site. For very large sites, you may have to design an algorithmic method to choose these for you.
    Link-building efforts
    As sites scale, the complexity and need for links grows. You need to research the competitiveness of your targeted keywords, and make plans for link development so that execution neither grossly exceeds nor falls short of the necessary effort. The more websites/domains your company owns, the more link building is required. Likewise, the less authoritative your website is, the more link building work is required.
    Web-based partnerships
    Websites of all sizes engage in partnerships and relationships with other entities (charities, businesses, consultants, clients, etc.). SEO professionals know that all of these partnerships represent opportunities for acquiring links and link relationships, and that when they are properly leveraged they can result in massive value-adds for the organization. For larger organizations, these partnerships can be more complicated in nature. The more complex the organization is, the longer it will take to leverage these opportunities for SEO.
    Development platforms and CMSs
    The development platforms and CMSs used on larger sites can often create a number of limitations regarding SEO implementation, and frequently require extensive, costly, and time-consuming workarounds before optimization efforts can be implemented. If you have a non–search friendly CMS (most are not search engine–friendly), you will have to do more customization work. It is recommended that you work with an expert to understand what is needed so that you develop it right the first time (since you may need to recode things every time you upgrade the CMS).

    Working with Limited Resources/Budget

    Learning SEO and doing it on your own can be a challenging task, for two major reasons:
    • The demanding, ever-changing landscape of search algorithm behavior is often unpredictable and nonintuitive.
    • There are literally thousands of details and tactics to learn, some of which may have little impact on their own but, when used in various combinations with other components, can have a powerful influence on rankings. Herein lies the “art” aspect of mastering SEO.
    Fortunately, many SEO training materials are available via paid subscription from providers such as SEO Book, Instant E-Training, Market Motive, ClickZ Academy, and the SEMPO Institute, among others. If you don’t have the budget for a subscription, you can try public blogs and resources such as the SEOmoz Blog and Guides and Search Engine Land. Also consider the many resources for learning that we discussed in Chapter 11.

    Basic low-budget SEO ideas

    You can do numerous things, at a fairly low cost, to improve your site’s overall optimization, including the following:
    Use the free search engine tools
    Use the free tools provided by the three major search engines. Create accounts in Google Webmaster Central, Bing Webmaster Center, and Yahoo! Site Explorer, and verify yourself as a site owner on all three. This will provide you with access to diagnostic tools, such as robots.txt validators, as well as reports on backlinks, spidering activity, server errors, top search queries, anchor text, and more.
    Find the best keywords to target
    Use the Google AdWords Keyword Tool to find keywords with high search volumes. Then use the SEOmoz Keyword Difficulty Tool to get an estimate of how hard it would be to rank for the terms you have identified.
    Also, look at the step-by-step keyword research process designed for small businesses in the Yahoo! Style Guide.
    Check out your competitors
    Assess your site and those of your competitors for SEO success factors such as keyword-rich URLs, title tags, heading tags, keyword prominence, and so on. To survey your and your competitors’ title tags across a large number of pages, use the search engines’ site: operators and set (in the preferences) the number of results returned per page to 100.

    Optimize your title tags
    You want each title tag across your site to be unique and focused on a relevant keyword theme. Make each title tag count, because of all the elements on the page, it’s what the search engines give the most weight—and it heavily influences the searcher’s click decision from among the search results.
    Optimize other critical elements
    Analyze the text, HTML, inbound links, internal links, anchor text, and so on to determine your ideal configuration for success. Include a dose of your own critical thinking.
    Measure, test, measure, and refine
    Test your assumptions and the assertions of others—particularly SEO bloggers (not every piece of advice you find will be accurate or applicable). Measure key performance indicators (KPIs) and treat SEO like a series of experiments. Make iterative improvements to your URLs, title tags, heading tags, internal linking structure, anchor text, page copy, link acquisition efforts, and so on.

    What sorts of KPIs should you measure and aim to improve? At a minimum, consider checking rankings, traffic, and conversion metrics. However, you can also check other metrics, such as the number of different search terms used to find your site, the number of different landing pages where search visitors arrive, the growth of inbound links and the addition of any notable inbound links, and so forth.
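A minimal sketch of tracking such KPIs month over month might look like the following; all figures are invented:

```python
# Sketch: track month-over-month movement in a few SEO KPIs. The
# figures are invented; in practice they would come from your
# analytics package and rank-tracking tools.
kpis = {
    # metric: (last_month, this_month)
    "search visits": (12000, 13800),
    "distinct search terms": (950, 1100),
    "distinct landing pages": (310, 340),
    "inbound linking domains": (85, 92),
}

for metric, (before, after) in kpis.items():
    change = (after - before) / before * 100
    print("%s: %+.1f%%" % (metric, change))
```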


    A free resource on key performance indicators is the ebook The Big Book of KPIs, offered by Web Analytics Demystified. They also offer an ebook titled Web Analytics Demystified. Other great books include Web Analytics: An Hour a Day by Avinash Kaushik (Sybex) and Mastering Search Analytics by Brent Chaters (O’Reilly).
    Test different ideas
    Get one great idea for a tool, service, resource, or page, and bounce it off some folks in the forums (or privately through email to an SEO expert whom you trust). Hire a developer who can help you build it; use Craigslist or other online classifieds and post with a budget you can afford.
    Many in-house SEOs have had success finding copywriters via Craigslist to write articles for nominal fees ranging from $10–$50 per article. Few companies and in-house resources can compete with these rates.
    Leverage low-cost tools
    Consider using one of the following tools:
    Web development tools such as Dreamweaver are popular with dynamic and static website/application developers. Most of the time they are used to build web pages, but they also offer a range of reporting tools. With Dreamweaver, it is possible to do all of the following:
    Analyze website accessibility (WAI and other standards)
    Analyze website coding practices
    Find empty or missing title tags
    Find empty or missing image alt attributes
    Analyze link structure (orphaned files, external links, and broken links)
    Store everything in nice reports anyone can view in HTML format
    You can fix most of the noticed errors/warnings automatically, either directly or indirectly. And even better, you can extend the list of available default tools with scripting and extensions.
    Xenu is a simple link-based crawler. Web developers use Xenu to check for broken links on a regular basis, but for SEO purposes the best value comes in the form of simple internal link analysis. By ordering the Xenu Sitemap based on “links in” and “page level,” it is easy to detect possible internal linking abnormalities that may interrupt PageRank flow or decrease anchor text value; and of course, you can save all this information as a report. Xenu gathers loads of information, and it is a very useful tool, even for in-depth SEO purposes.
    This is a small desktop program you can install on your PC or Mac that spiders websites’ links, images, CSS, scripts, and apps from an SEO perspective. It fetches key on-site page elements for SEO, presents them in tabs by type, and allows you to filter for common SEO issues (or slice and dice the data how you see fit by exporting it into Excel). You can view, analyze, and filter the crawl data as it’s gathered and updated continuously in the program’s user interface.
    Although it may seem to be an unconventional tool for a web developer/SEO practitioner, Microsoft Word is undeniably one of the best copywriting and publishing tools that practically all users are familiar with. It has several built-in features that help to produce high-quality content, analyze existing content, fix and locate essential grammar errata, and above all, easily automate and synchronize all features and changes with other users and publishing tools. For more tech-savvy folks, there is always the scripting option for fine-tuning.
    As with most SEO tools, the beauty is in the eye of the beholder. If you use the preceding tools properly, they can be very helpful, but if you lack experience or try to use them for the wrong kind of task, they can cause pain and misery.
    Making proper on-page optimization decisions usually takes days. Even for a relatively small site, it is possible to cut that down to fewer than two hours by using the tools and methods we just described. Of course, there is a difference in the quality of work and the documentation you’ll get with these tools compared to what you’d get with more conventionally priced SEO services, but they do have their place.
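The internal “links in” analysis described above for Xenu can be approximated in a few lines once you have crawl output; the site graph here is hypothetical:

```python
# Sketch: count "links in" per page from crawl output to surface
# orphans and under-linked pages that interrupt PageRank flow.
# The site graph below is a made-up example.
from collections import Counter

# page -> pages it links to (hypothetical crawl output)
links = {
    "/": ["/products", "/about"],
    "/products": ["/products/widget", "/"],
    "/products/widget": ["/products"],
    "/about": ["/"],
    "/old-landing-page": [],  # nothing links here: an orphan
}

links_in = Counter()
for source, targets in links.items():
    for target in targets:
        links_in[target] += 1

for page in links:
    count = links_in.get(page, 0)
    flag = "  <-- possible orphan" if count == 0 else ""
    print("%-22s links in: %d%s" % (page, count, flag))
```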
    SEO webinars
    A plethora of free SEO webinars are available to expand your knowledge. Great sources include Instant E-Training, SEOmoz, Search Marketing Now, and SEMPO.
    Limited cash options
    If you’re on a tight budget, create a blog with great content and promote it via social media (Twitter, Facebook, and Google+). Execute the SEO action items identified in this book, and leverage the free tools and guides we’ve mentioned. Attend free webinars. Do what you can for SEO. Then, invest in an hour or two with a consultant to do an ad hoc site review. You can also check with SEO firms to see which ones are offering special introductory packages. Many firms offer special pricing just to get started with a client.
    These are just examples of things you can do, and countless other options are also available. Whatever you do, don’t short-circuit the process of developing in-house expertise, even if you hire outside help. That in-house knowledge will be invaluable over time, and if you have hired a quality outsourcing partner they will be delighted as it will make their job easier. They may even help train you or your team.

SEO as an Enduring Art Form

    Today, SEO can be fairly easily categorized as having five major objectives:
    Make content accessible to search engine crawlers.
    Find the keywords that searchers employ (i.e., understand your target audience) and make your site speak their language.
    Build content that users will find useful, valuable, and worthy of sharing. Ensure that they’ll have a good experience on your site to improve the likelihood that you’ll earn links and references.
    Earn votes for your content in the form of editorial links and social media mentions from good sources by building inviting, shareable content and applying classic marketing techniques to the online world.
    Create web pages that allow users to find what they want extremely quickly, ideally in the blink of an eye.
    Note, though, that the tactics an SEO practitioner might use to get links from editorial sources have been subject to rapid evolution, and will continue to be in the future. In addition, mastery of social media environments is now required of most SEO professionals.
    One thing that you can be sure about in the world of search is change, as forces from all over the Web are impacting search in a dramatic way.
    To be an artist, the SEO practitioner needs to see the landscape of possibilities for her website, and pick the best possible path to success. The requirements currently include social media optimization expertise, local search expertise, video optimization expertise, an understanding of what is coming in mobile search, and more. Such a well-rounded individual is a far cry from the backroom geek of the late 1990s.
    No one can really predict what the future will bring and what will be needed to successfully market businesses and other organizations on the Web in 2 years, let alone 5 or 10. However, you can be certain that websites are here to stay, and also that websites are never finished and, just like any other direct marketing channel, need continuous optimization. SEO expertise will be needed for a long time to come—and no one is better suited to map the changing environment and lead companies to success in this new, ever-evolving landscape than today’s SEO practitioner.

    Referring Sites

    Referring site reports are useful for a number of reasons, but one of the more interesting SEO reasons to look at these reports is to spot when you receive new links. You can often see those new links in these reports first, even before the search engines report them.

    Open Site Explorer

    Open Site Explorer was developed based on crawl data obtained by SEOmoz, plus a variety of parties engaged by SEOmoz. You can see a complete list of Open Site Explorer’s sources at
    Open Site Explorer is designed to be the tool for SEO practitioners to use when mapping links across the Web. It finds as many links as it can and then lets you extract them all into a spreadsheet.

    Majestic SEO

    Majestic SEO offers a comprehensive backlink analysis for your domains. Majestic assembles its database of link information based on its own crawl of the Web. Outside of the search engines, it probably has the largest database of link information available. Majestic SEO competes directly with Open Site Explorer and offers a wide variety of reports and tools to mine valuable backlink information.
    Majestic uses a simplified metric for valuing a link known as ACRank. It is deliberately not designed to map directly to the PageRank model; instead, it uses a linear scale from 1 to 15.

    Link Research Tools

    A European-based company offers a link-building toolkit called Link Research Tools. Link Research Tools includes a number of tools for different use cases. The Backlink Profiler and Competitive Landscape Analyzer help you analyze the competition. The Link Juice Thief, Missing Links Tool, and SERP Research Tool help you identify new link opportunities. The Link Alerts Tool monitors link growth for your or your competitors’ sites. The Common Backlinks Tool is a hub finder that helps to find sites that power your competitors.
    The Strongest Subpages Tool identifies a domain’s strongest subpages—which are quite often not the home page. The Link Juice Recovery Tool helps a webmaster identify missing/broken pages with external links. The JUICE Tool evaluates the value of a link using mozRank, ACRank, Google cache date, page speed, social votes, and content quality.

    Raven Tools

    Raven provides another comprehensive toolset for link building. One of Raven’s tools is the Link Manager, which tracks links that you plan on acquiring, have requested, or have successfully built. The Link Manager includes conversion and ROI tracking, as well as a Firefox toolbar that makes it easier to add websites to your queue of link acquisition targets. The Link Manager will go out and automatically check the identified pages to see whether you have acquired a link on them, and it will tell you the results—thus automating the process of identifying targets and tracking the progress of your campaign on an ongoing basis.

    Link Builder and Link Galaxy

    One commercial alternative for link analytics (geared more toward enterprise-class clients) is Covario’s Link Builder™ solution, powered by Covario’s Link Galaxy link collection vehicle.
    Link quality is much more important than link quantity, and Link Builder provides multiple unique views allowing you to analyze which link hubs are the most critical to obtain and prioritize them in order of importance.

    Third-party link-building tools

    A variety of third-party link-building tools are also available.


    Developed by Bruce Clay, Inc., LinkMaps allows you to map the backlinks of any website, including your competitors’. LinkMaps gathers the initial data from the search engines and then adds some advanced filtering, including:
    Removing pages that return 404 errors
    Removing pages that do not have a link
    Limiting results to no more than four pages per domain
    Filtering out guest books
    Identifying possible link farms
    LinkMaps also shows you which search engines found the link and which did not. In addition, LinkMaps will build a page that contains links to the pages that link to you. The page is NoIndexed, so it will not show up in the search engines, but you can use it to help the search engines discover the links to your site.

    Conductor Searchlight

    This platform provides a rich toolset for link building for enterprise marketers. It provides deep insight into both your and your competitors’ backlink profiles in easy-to-understand charts that automatically update as the landscape changes. It also helps you discover link-development opportunities by determining where your competitors have links in places that you do not. Conductor Searchlight makes use of the Open Site Explorer data set from SEOmoz.

    Stone Temple Consulting LinkFocus

    Stone Temple Consulting LinkFocus is a link-building research tool that uses a proprietary algorithm to identify the most important links to a given website. The raw link data is pulled from third-party sources such as Open Site Explorer and Majestic SEO, but is then put through a prioritization process to help users rapidly identify the highest-priority link targets.

    Google Blog Search

    It is well known that the link: command works poorly in Google Web Search. For whatever reason, Google has decided to return only a small sampling of data when people use that command. However, interestingly enough, you can get more data on your backlinks using Google Blog Search—and it reports more than just blog links.

    Measuring the value of a link

    One of the big questions that people ask is what the value of a particular inbound link is. There is no simple way to answer that question, but there are some metrics you can look at that can give you a feeling for how important a link might be. Here are some of the most important elements in determining a link’s value:
    Where does the linking page rank for the term/phrase you want to rank for?
    If the page is ranking #1 at Google for sliced bread and you want to be #1 at Google for sliced bread, guess what? That’s the #1 most valuable link you can get. Keep going down the list to about positions 25 to 30, and you’re still getting solid gold in link value.
    Where does the linking page rank for one to two important, competitive terms in its title tag?
    This will give you a very solid idea about how much overall link juice and respect the search engines are giving the page. It is also a good way to identify the global link value that could be provided by a link from this page.
    Where does content on the linking domain generally rank for competitive terms in its pages’ respective title tags?
    As in the preceding list item, we’re trying to identify how positively the engines view pages on the domain. If the pages generally rank in the top 20 results, you can rest assured that search engines think the domain’s value is pretty high, and that links from that domain will pass significant value.
    How many keyword phrases do the linking domain and page rank in the top 20 results for?
    Sites that rank for a very small set of keywords may be overoptimized, causing their links to be of lower value, whereas sites that rank for a larger set of relevant keywords provide your site with more relevant links.
    Does the linking site carry any brokered sets of links?
    Sites that sell links may lose their ability to pass link juice. This really applies to any type of low-quality, manipulative linking. If you can see it, chances are that Google might see it someday, too. In addition, Google may penalize a site retroactively for repeated bad behavior, even if that behavior has been stopped.
    What is the relevance of the linking page/site to your target page?
    Answering this question requires you to think critically about the visitors to both the potential linking page and the domain. If the relevance of the subject matter to your site is high, the link will provide more semantic and topic-specific value.
    When was the last time Google crawled the page?
    A fresh timestamp is a great indicator of the importance of the page. The older it is, the less likely it is to be relevant to a modern search query. You can check a page’s timestamp by looking at its cached version in Google.
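To triage a long list of link prospects, you could fold the factors above into a crude score. The weights below are entirely arbitrary illustrations, not a published formula:

```python
# Sketch: a crude scoring function that combines the signals above into
# a single number for triaging link prospects. The weights and inputs
# are arbitrary illustrations, not a published formula.
def link_value_score(ranks_for_your_term, page_rank_position,
                     domain_pages_in_top20, topical_relevance,
                     days_since_crawl):
    score = 0.0
    if ranks_for_your_term:                       # page ranks for your target term
        score += 40
    if page_rank_position and page_rank_position <= 20:
        score += 20                               # page ranks for its own title terms
    score += min(domain_pages_in_top20, 10) * 2   # cap the domain-strength signal
    score += topical_relevance * 20               # 0.0 to 1.0 relevance judgment
    if days_since_crawl <= 30:                    # fresh Google cache timestamp
        score += 10
    return score

# A page ranking for your target term, on a strong, relevant domain,
# crawled recently:
print(link_value_score(True, 5, 12, 0.9, 7))
```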
    The following are secondary indicators of link value:
    Pages that link to high-ranking competitors
    Although this isn’t always an indication of direct value, it can be a good signal. Your competitors are obviously ranking based on the strength of their links, so researching the sources of those links can provide insight into where they derive that value.
    PageRank of the domain
    Google does not publish a value for the PageRank of a domain (the PageRank values you can get from the Google Toolbar are for individual pages), but many believe that Google does calculate a Domain PageRank value. Most SEOs choose to use the PageRank of the website’s home page as an estimate of Domain PageRank. You can look at the PageRank of the home page to get an idea of whether the site is penalized and to get a crude sense of the potential value of a link from the site. You can also get an assessment of the link strength of a domain using Domain-level MozRank (DmR) from SEOmoz’s Open Site Explorer tool, which we discussed earlier in this chapter.
    A PageRank 6 (or higher) home page clearly has some link love and respect, while a PageRank 2 page obviously has less; a gray bar can be a good red flag, and seeing a PageRank of 0 tells you that the domain is either new or completely invisible. However, as link-building expert Eric Ward points out, avoiding lower-PageRank domains simply because they are PageRank 3 or 4 is not necessarily a good thing, as you could be missing out on very relevant links that, in volume, contribute to your relevance.
    PageRank of the page
    Since pages that are relatively new (three to four months old) are shown on the Google Toolbar as having a PageRank of 0 (it can take Google that long to update the PageRank it shows on the Toolbar), and since so many valuable pages may have PageRanks that are only in the 1–3 range, it seems unwise to get caught up in the PageRank of a specific page. It is better to look at the domain and the attention it gives your page. However, PageRank can be valuable if the page has been around for a while.
    Inlinks to the page
    It can be useful to look at the links to the specific page you want to get a link from (or perhaps that already links to you). You want to know whether the domain links into this individual page heavily, or whether it is practically an orphan. You should also consider whether it is a page that other sites reference. Both of these factors can help illuminate potential value.
    Total inlinks to the domain
    This is a pretty indirect measurement, but it’s not completely useless. However, since the number often takes into account lots of links from a single domain, it can be misleading.
    Number of external links on the page
    Pages generally pass some of their link juice to each page they link to. Therefore, if you have a link from a page with 10 outbound links, it is far better than having a link from an equivalent page with 100 outbound links, so this is a parameter worth knowing about.
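Under the simplified assumption that a page splits the value it passes evenly among its outbound links, the dilution is easy to quantify:

```python
# Sketch: the dilution effect described above, using the simplified
# assumption that a page splits the value it passes evenly among its
# outbound links (real engines are far more nuanced).
def juice_per_link(page_value, outbound_links):
    return page_value / outbound_links

# The same notional page value, split 10 ways versus 100 ways:
print(juice_per_link(1.0, 10))   # 0.1 per link
print(juice_per_link(1.0, 100))  # 0.01 per link
```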
    Bing will provide data on 404 pages, pages blocked by the Robots Exclusion Protocol (REP), dynamic URLs that might be troublesome for Bing, malware-infected pages, and unsupported content types. 

Key Performance Indicators for Long-Tail SEO

    As we have discussed throughout this book, the long tail is an important part of SEO. Metrics are available for diagnosing the health of your long-tail search traffic. Here are some that were developed by Brian Klais of Netconcepts:
    Branded-to-nonbranded ratio
    This is the percentage of your natural search traffic that comes from branded versus nonbranded keywords. If the ratio is high and most of your traffic is coming from searches for your brand, this signals that your SEO is fundamentally broken. The lower the ratio, the more of the long tail of natural search you likely are capturing. This metric is an excellent gauge of the success of your optimization initiatives.
    Unique crawled URLs
    This is the number of unique (nonduplicate) web pages on your site that are crawled by search engine spiders such as Googlebot. Your website is like your virtual sales force, bringing in prospects from the search engines. Think of each unique page as one of your virtual salespeople. The more unique pages you have, the more opportunities you have to sell through the search engines.
    Search visitors per contributing page
    This is the percentage of unique pages that yield search-delivered traffic in a given month. This ratio essentially is a key driver of the length of your long tail of natural search. The more pages you have yielding traffic from search engines, the healthier your SEO program is. If only a small portion of your website is delivering searchers to your door, most of your pages—your virtual salespeople—are warming the bench instead of working hard for you. You can think of these nonperforming pages as “freeloaders.”
    Keywords per page
    This is the average number of keywords each page (minus the freeloaders) yields in a given month. Put another way, it is the ratio of keywords to pages yielding search traffic.
    The higher your keyword yield, the more of the long tail of natural search your site will capture. In other words, the more keywords each yielding page attracts or targets, the longer your tail is. So, an average of eight search terms per page indicates pages with much broader appeal to the engines than, say, an average of three search terms per page. According to a Netconcepts study on the long tail of natural search, the average online retailer had 2.4 keywords per page.
    Search visitors per keyword
    This is the ratio of search engine–delivered visitors to search terms. This metric indicates how much traffic each keyword drives and is a function of your rankings in the SERPs. Put another way, this metric determines the height or thickness of your long tail.
    The average online retailer in the aforementioned Netconcepts study obtained 1.9 visitors per keyword.
    Index-to-crawl ratio
    This is the ratio of pages indexed to unique crawled pages. Just because Googlebot crawls a page doesn’t guarantee it will show up in Google’s index. A low ratio can mean your site doesn’t carry much weight in Google’s eyes.
    Search visitors per crawled page
    Calculated for each search engine separately, this is a measure of how much traffic the engine delivers for every page it crawls. Each search engine has a different audience size. This metric helps you fairly compare the referral traffic you get from the different engines. The abovementioned Netconcepts study found that Bing and Yahoo! tended to crawl significantly more pages, but the yield per crawled page from Google was typically significantly higher.
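These KPIs are all simple ratios, so they are easy to compute from summary numbers; the figures below are invented for illustration:

```python
# Sketch: computing the long-tail KPIs above from summary numbers. All
# figures are invented; yours would come from analytics and the
# engines' webmaster tools.
branded_visits = 4000
nonbranded_visits = 16000
unique_crawled = 5000
indexed = 3500
pages_yielding_traffic = 1200
search_terms = 2900
search_visits = 5500

branded_ratio = branded_visits / (branded_visits + nonbranded_visits)
contributing_share = pages_yielding_traffic / unique_crawled
keywords_per_page = search_terms / pages_yielding_traffic
visitors_per_keyword = search_visits / search_terms
index_to_crawl = indexed / unique_crawled

print("branded-to-nonbranded ratio: %.0f%%" % (branded_ratio * 100))
print("pages contributing traffic:  %.0f%%" % (contributing_share * 100))
print("keywords per yielding page:  %.1f" % keywords_per_page)
print("visitors per keyword:        %.1f" % visitors_per_keyword)
print("index-to-crawl ratio:        %.0f%%" % (index_to_crawl * 100))
```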
    As you optimize your site through multiple iterations, watch the aforementioned key performance indicators (KPIs) to ensure that you’re heading in the right direction. Those who are not privy to these metrics will have a much harder time capturing the long tail of SEO.

    Tracking Duplicate Content

    Having duplicate content—i.e., one or more pages with multiple unique URLs—on your site will, in effect, dilute your chances of ranking for long-tail terms. You only want one canonical URL for each page of content, and you don’t want that content to be repeated verbatim on other pages of your or other sites.
    When you find duplicate content, figure out which page and URL you want to be the canonical resource (typically the one with the highest PageRank), and 301-redirect the others to it.
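One low-tech way to find verbatim duplicates is to fingerprint each URL's body text and group URLs by fingerprint, then pick the highest-PageRank URL in each group as the canonical. A sketch with hypothetical pages:

```python
# Sketch: flag duplicate content by fingerprinting each URL's body
# text, then picking a canonical URL for each duplicate group. The
# pages and PageRank values are hypothetical.
import hashlib
from collections import defaultdict

pages = {
    "/widgets": ("Our widget lineup ...", 5),        # (body text, toolbar PR)
    "/widgets?sort=price": ("Our widget lineup ...", 2),
    "/gadgets": ("Gadgets for every need ...", 4),
}

groups = defaultdict(list)
for url, (body, pagerank) in pages.items():
    fingerprint = hashlib.sha1(body.encode("utf-8")).hexdigest()
    groups[fingerprint].append((pagerank, url))

for dupes in groups.values():
    if len(dupes) > 1:
        dupes.sort(reverse=True)               # highest PageRank first
        canonical = dupes[0][1]
        for _, url in dupes[1:]:
            print("301 redirect %s -> %s" % (url, canonical))
```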


    Some analytics packages and independent SEO tools, such as dedicated duplicate content checkers, can track duplicate content for you.

    Mapping Content Moves

    The first stage of dealing with a site redesign is to figure out which content will be moved where and which content will be removed altogether. You will need this to tell you which URLs you will need to redirect and to which new locations.
    To do this, you must start by getting a complete map of your URLs. For many websites this is not as simple as it sounds. Fortunately, tools are available to make the job easier. Here are some ways to tackle this problem:
    Extract a list of URLs from your web server’s logfiles and site architecture documentation.
    Pull the list from your XML Sitemap file, provided you believe it is reasonably complete.
    Use a free crawling tool such as Xenu or GSiteCrawler.
    Use Google Webmaster Tools to pull a list of the external links to your site, and make sure all pages that have received links on your site are included.
    These tools should help you assemble a decent list of all your URLs. You must then map out the pages that you want to redirect them to.
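Extracting a de-duplicated URL list from a server logfile (the first option above) can be as simple as the following; the log lines are fabricated and assume the common log format:

```python
# Sketch: pull a de-duplicated URL list out of a web server logfile
# (common/combined log format assumed; only GET requests are kept).
# The log lines below are fabricated.
import re

log_lines = [
    '1.2.3.4 - - [19/Mar/2012:10:00:00] "GET /products HTTP/1.1" 200 512',
    '1.2.3.4 - - [19/Mar/2012:10:00:05] "GET /about HTTP/1.1" 200 220',
    '5.6.7.8 - - [19/Mar/2012:10:01:00] "GET /products HTTP/1.1" 200 512',
    '5.6.7.8 - - [19/Mar/2012:10:01:09] "POST /contact HTTP/1.1" 200 90',
]

request = re.compile(r'"GET (\S+) HTTP/[\d.]+"')
urls = set()
for line in log_lines:
    match = request.search(line)
    if match:
        urls.add(match.group(1))

for url in sorted(urls):
    print(url)
```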

Maintaining Search Engine Visibility During and After a Site Redesign

    Companies may decide to launch a site redesign as part of a rebranding of their business, a shift in their product lines, or a marketing makeover, or for a variety of other business reasons. During a site redesign, any number of things may change on the site. For example:
    Content may move to new URLs.
    Content might be eliminated.
    Content may be changed.
    Content could be moved behind a login.
    New sections may be added.
    New functionality may be added.
    Navigation/internal linking structure may be changed significantly.
    Of course, the move may involve moving everything to a new domain as well, but we will cover that in the next section, “Maintaining Search Engine Visibility During and After Domain Name Changes.” Here are some best practices for handling a site redesign:
    Create 301 redirects for all URLs from the original version of the site pointing to the new URLs on the redesigned site. This should cover scenarios such as any remapping of locations of content and any content that has been eliminated. Use a spreadsheet similar to the one we outlined at the beginning of this chapter to map out the moves to make sure you cover all of them.
    Review your analytics for the top 100 or so domains sending traffic to the moved and/or eliminated pages, and contact as many of these webmasters as possible about changing their links. This can help the search engines understand the new layout of your site more rapidly and also provides better branding and a better user experience.
    Review a backlink report (using your favorite backlinking tool) for your site and repeat the process in the preceding bulleted item with the top 200 to 300 or so results returned. Consider using more advanced tools, such as Open Site Explorer or Majestic SEO, which allow you to filter your links to more easily identify the most important ones.
    Make sure you update your Sitemap and submit it to Google Webmaster Central and to Bing Webmaster Tools.
    Monitor your rankings for the content, comparing old to new over time—if the rankings fall, post in the Google Groups Webmaster Central Forum information regarding what you did, what happened, and any information that might help someone help you. Google employees do monitor these forums and sometimes do comment in situations where they think they can help. Don’t use the forums to complain; state what has happened and ask for help, as this gives you the best chance of getting feedback.
    Monitor your Webmaster Central account and your analytics for 404 errors and to see how well Google is doing with your 301s. When you see some 404s pop up, make sure you have a properly implemented 301 redirect in place. If not, fix it. Don’t limit this checking just to 404 errors. Also be on the lookout for HTTP status codes such as 500, 302, and others.
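A sketch of that status-code check against a post-launch logfile (fabricated lines, common log format assumed):

```python
# Sketch: scan a post-launch logfile for the status codes mentioned
# above (404s, 500s, 302s) on the old URLs you redirected. The log
# lines are fabricated; a real run would read your server's access log.
import re
from collections import defaultdict

log_lines = [
    '"GET /old-page HTTP/1.1" 301',
    '"GET /old-page-two HTTP/1.1" 404',
    '"GET /old-page-three HTTP/1.1" 302',
    '"GET /new-page HTTP/1.1" 200',
]

pattern = re.compile(r'"GET (\S+) HTTP/[\d.]+" (\d{3})')
by_status = defaultdict(list)
for line in log_lines:
    match = pattern.search(line)
    if match:
        by_status[match.group(2)].append(match.group(1))

for status in ("404", "500", "302"):
    for url in by_status.get(status, []):
        print("check %s: returned %s, expected a 301" % (url, status))
```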

    Hidden Content That May Be Viewed as Spam

    Hidden text is one of the challenges that webmasters and search engines still face. Spammers continue to use hidden text to stuff keywords into their pages for the purpose of artificially boosting their rankings. Search engines seek to figure out when spammers are doing this and then take appropriate action. There are many ways to create hidden text unintentionally, though, and no one wants to be penalized for something they did not intend to do. To understand this better, it is useful to examine Google’s Webmaster Guidelines for hidden text to see the bottom line:
    If your site is perceived to contain hidden text and links that are deceptive in intent, your site may be removed from the Google index, and will not appear in search results pages.
    Of course, as with many techniques, there are shades of gray between “this is clearly deceptive and wrong” and “this is perfectly acceptable.” Matt [Cutts, head of Google’s webspam team] did say that hiding text moves you a step further towards the gray area. But if you’re running a perfectly legitimate site, you don’t need to worry about it. If, on the other hand, your site already exhibits a bunch of other semi-shady techniques, hidden text starts to look like one more item on that list. It is like how one grain of sand isn’t noticeable, but many grains together start to look like a beach.
    Related to this is a posting by Matt Cutts on Threadwatch:
    If you’re straight-out using CSS to hide text, don’t be surprised if that is called spam. I’m not saying that mouseovers or DHTML text or have-a-logo-but-also-have-text is spam; I answered that last one at a conference when I said “imagine how it would look to a visitor, a competitor, or someone checking out a spam report. If you show your company’s name and it is Expo Markers instead of an Expo Markers logo, you should be fine. If the text you decide to show is ‘Expo Markers cheap online discount buy online Expo Markers sale...’ then I would be more cautious, because that can look bad.”
    Obviously, this is a fate you want to avoid. Note the use of the word perceived in the Google Webmaster Guidelines snippet. It doesn’t sound like a simple black-and-white problem, does it? In fact, it is not, as there are many ways to create hidden text.

    Unintentionally creating hidden text

    There are a few ways to create hidden text without intending to do so. One of the most common ways is via your CMS, which has some CSS-based methods built into it. For example, many CMSs use the display:none technique to implement drop-down menus or other widgets that the user clicks on that then “expand” to display more text. Tab folders are a great example of this. Sometimes the display:none technique is used in user-generated content systems where the page normally shows the number of comments on a post, but chooses to suppress the text “0 Comments” in the event that no comments have been made yet.
    Another common way that people create hidden text occurs when they provide enhancements for the visually impaired. For example, you may have a Flash object on your web page and want to provide users with a text description of the content. You may not want to place the text on the page, as it might make the page look cluttered to a user with normal vision. The solution some people use to serve both audiences is to hide the text from the sighted users.
    Many of these scenarios have no SEO value, even when manipulated by spammers. These types of techniques generally do not carry a risk of being penalized, because there is no reason to suspect negative intent.
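A naive scan for inline display:none styles can surface candidates for review; a real audit needs full DOM and CSS rendering, and this sample HTML is made up:

```python
# Sketch: a naive scan for text hidden with display:none, the pattern
# discussed above. A real audit needs a full DOM/CSS renderer; this
# only catches inline styles, and the HTML sample is made up.
import re

html = """
<div style="display:none">cheap widgets buy widgets discount widgets</div>
<div id="comments" style="display: none">0 Comments</div>
<p>Visible page copy about widgets.</p>
"""

hidden = re.findall(r'<[^>]*style="[^"]*display:\s*none[^"]*"[^>]*>(.*?)</',
                    html, re.IGNORECASE | re.DOTALL)
for text in hidden:
    print("hidden text found: %r" % text.strip())
```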

Spam Filtering and Penalties

    Over time, it has become a lot more difficult to “game” the search engines and a lot easier to fall victim to a search engine penalty or outright ban. It is hard to recover from these.
    Consequences can include ranking penalties, removal of the site’s “voting” power (i.e., ability to pass PageRank), incomplete indexation (i.e., a partial site ban), or, worst of all, a total site ban.
    Not even the largest corporations spending big dollars on Google AdWords are immune. For example, BMW had its entire site banned from Google for a period of time because it created doorway pages—pages full of keyword-rich copy created solely for the search engine spiders and never for human viewing. To add insult to injury, Google engineer Matt Cutts publicly outed BMW on his blog. He made an example of BMW, and all of the SEO community became aware of the carmaker’s indiscretions.
    Search engines rely primarily on automated means for detecting spam, with some auxiliary assistance from paid evaluators, spam vigilantes, and even your competitors. Search engineers at Google and Microsoft write sophisticated algorithms to look for abnormalities in inbound and outbound linking, in sentence structure, in HTML coding, and so on.
    As far as the search engines are concerned, SEO has an acceptable side and an unacceptable side. In general terms, all types of actions intended to boost a site’s search engine ranking without improving the true value of a page can be considered spamming.
    Each search engine has different published guidelines. Google’s Webmaster Guidelines are available through Google Webmaster Central, and Bing offers tips for publishers through Bing Webmaster Tools.
    The search engines have varying degrees of tolerance for various SEO tactics. Anything that violates these guidelines, pollutes the search results with irrelevant or useless information, or would embarrass you if your Google AdWords or Bing rep discovered it is unsustainable and should be avoided.
    There’s a big difference between “search engine–friendly” and crossing the line into spam territory. “Search engine–friendly” can mean, for example, that the site is easily accessible to spiders, even if it is database-driven; that HTML code is streamlined to minimize the amount of superfluous code; that important headings, such as product names, are set apart from the rest of the text (e.g., with heading tags such as <h1>) and contain relevant keywords; or that link text is contextual, instead of comprising just “click here” or “more info” references.

    Contrast these basic SEO practices with the following manipulative search engine spam tactics:
    Serving pages to the search engines that are useless, incomprehensible, unsuitable for human viewing, or otherwise devoid of valuable content—such as doorway pages, which SEO vendors may refer to by more innocuous names, including “gateway pages,” “bridge pages,” “jump pages,” “attraction pages,” “advertising pages,” “channel pages,” “directory information pages,” “search engine entry pages,” “satellite sites,” “mini sites,” “magnet sites,” or “shadow domains.” Whatever you call them, by definition they are created for the sole purpose of boosting search engine rankings.
    Creating sites with little useful unique content. There are many techniques for doing this, including:
    Duplicating pages with minimal or no changes and exposing them to the same search engines under new URLs or domains.
    Machine-generating content to chosen keyword densities (e.g., using a technique such as Markov chains, which we do not recommend).
    Incorporating keyword-rich but nonsensical gibberish (also known as spamglish) into site content.
    Creating a low-value site solely for affiliate marketing purposes (see the upcoming subsection on duplicate content for a more complete definition of thin affiliate).
    Repeating the same keyword phrase in the title tag, the <h1> tag, the first alt tag on the page, the meta description, the first sentence of body copy, and the anchor text in links pointing to the page.

    Targeting obviously irrelevant keywords.
    Concealing or obscuring keyword-rich text or links within the HTML of a page so that it is not visible to or accessible by human users (i.e., by placing it within comment tags, noscript tags, or noframe tags; or by using colored text on a similarly colored background, tiny font sizes, layers, or links that don’t show as links to users because they are not highlighted in some manner, such as with an underline).
    Hijacking or stealing content from other sites and using it as fodder for search engines. This is a practice normally implemented using scrapers. Related to this is a practice known as splogging: creating blogs and posting stolen or machine-generated content to them.
    Purchasing links for the purpose of influencing search rankings.
    Participating in link farms (which can be distinguished from directories in that they are less organized and have more links per page) or reciprocal linking schemes (link exchanges) with irrelevant sites for the purpose of artificially boosting your site’s importance.
    Peppering websites’ guest books, blogs, or forums in bulk with keyword-rich text links for the purpose of artificially boosting your site’s importance.
    Conducting sneaky redirects (immediately redirecting searchers entering your site from a keyword-rich page that ranks in the search engine to some other page that would not rank as well).
    Cloaking, or detecting search engine spiders when they visit and modifying the page content specifically for the spiders to improve rankings.
    Buying expired domains with high PageRank, or snapping up domain names when they expire with the hope of laying claim to the previous site’s inbound links.
    Google bowling, or submitting your competitors to link farms and so on so that they will be penalized.
    These tactics are questionable in terms of effectiveness and dubious in the eyes of the search engines, often resulting in the offending site being penalized by or banned from the search engines—a risk that’s only going to increase as the engines become more aggressive and sophisticated at identifying and removing offenders from their indexes. We do not advocate these tactics for anyone interested in achieving the long-term benefits of SEO.
    The search engines detect these tactics not just by automated means, through sophisticated spam-catching algorithms, but also through spam reports submitted by searchers—and yes, by your competitors. Speaking of which, you too can turn in search engine spammers, using the spam report forms each engine provides.
    A lot of times marketers don’t even know they’re in the wrong. For example, search engines scrutinize much more heavily pages that show signs of potential deception, such as no-archive tags, noscript tags, noframe tags, and cloaking, even though all of these can be used ethically.
    There is a popular myth that SEO is an ongoing chess game between SEO practitioners and the search engines. One moves, the other changes the rules or the algorithm, then the next move is made with the new rules in mind, and so on. Supposedly, if you don’t partake in this continual progression of tactic versus tactic, you will not get the rankings lift you want.
    For ethical SEO professionals, this generally does not apply. Search engines evolve their algorithms to thwart spammers, and if you achieve high rankings through SEO tactics within the search engines’ guidelines, you’re likely to achieve sustainable results.
    Seeing SEO as a chess game between yourself and the search engines is a shortsighted view, as the goal of the search engines is to provide relevant search results to their users. Trying to fool the search engines and take unfair advantage using the latest tricks isn’t a sustainable approach for anybody—the company, its SEO vendor, the search engine, or the search engine’s users.

    Recognizing Low-Quality Domains and Spam Sites

    You can spot a poor-quality site in many ways, not the least of which is the “common sense” check: would you hire a company whose domain name is an obvious keyword-stuffed throwaway? The domain, of course, is only one signal, and search engines rely on a wide range of signals as indicators of quality. Some of the most obvious signals are site owners who are actively spamming the search engines with their offsite activities—for example, if the site is actively buying links, or text link–spamming blog posts, forums, and article comments.
    However, there are also less obvious signals. Many such signals mean nothing by themselves and gain significance only when they are combined with a variety of other signals. When a number of these factors appear in combination on a site, the likelihood of it being seen as a low-quality or spam site increases.
    Here is a long list of some of these types of signals:
    Short registration period (one year, maybe two)
    High ratio of ad blocks to content
    JavaScript redirects from initial landing pages
    Use of common, high-commercial-value spam keywords such as mortgage, poker, texas hold ’em, porn, student credit cards, and related terms
    Many links to other low-quality spam sites
    Few links to high-quality, trusted sites
    High keyword frequencies and keyword densities
    Small amounts of unique content
    Very few direct visits
    Registered to people/entities previously associated with untrusted sites
    Not registered with services such as Google Webmaster Central or Bing Webmaster Tools
    Rarely have short, high-value domain names
    Often contain many keyword-stuffed subdomains
    More likely to have longer domain names (as above)
    More likely to contain multiple hyphens in the domain name
    Less likely to have links from trusted sources
    Less likely to have SSL security certificates
    Less likely to be in high-quality directories such as DMOZ, Yahoo!, IPL2, and so forth
    Unlikely to have any significant quantity of branded searches
    Unlikely to be bookmarked in services such as My Yahoo!, Delicious, etc.
    Unlikely to get featured in social voting sites such as Digg, Reddit, Yahoo!, StumbleUpon, and so forth
    Unlikely to have channels on YouTube, communities on Google+, Facebook, or links from Wikipedia
    Unlikely to be mentioned on major news sites (either with or without link attribution)
    Unlikely to register with Google/Yahoo!/MSN Local Services
    Unlikely to have a legitimate physical address/phone number on the website
    Likely to have the domain associated with emails on blacklists
    Often contain a large number of snippets of “duplicate” content found elsewhere on the Web
    Frequently feature commercially focused content
    Many levels of links away from highly trusted websites
    Rarely contain privacy policy and copyright notice pages
    Rarely listed in the Better Business Bureau’s Online Directory
    Rarely contain high-grade-level text content (as measured by metrics such as the Flesch-Kincaid Reading Level)
    Rarely have small snippets of text quoted on other websites and pages
    Cloaking based on user agent or IP address is common
    Rarely contain paid analytics tracking software
    Rarely have online or offline marketing campaigns
    Rarely have affiliate link programs pointing to them
    Less likely to have .com or .org extensions; more likely to use .info, .cc, .us, and other cheap, easily obtained top-level domains (TLDs)
    Almost never have .mil, .edu, or .gov extensions
    Rarely have links from domains with .mil or .gov extensions
    May have links to a significant portion of the sites and pages that link to them
    Extremely unlikely to be mentioned or linked to in scientific research papers
    Unlikely to use expensive web technologies (Microsoft Server and coding products that require a licensing fee)
    May be registered by parties who own a very large number of domains
    More likely to contain malware, viruses, or spyware (or any automated downloads)
    Likely to have privacy protection on the Whois information for their domain
    It is important to note that while many of the signals above can be viewed negatively in aggregate, having one, such as a private domain registration, in and of itself is not going to be interpreted as a spam signal. Many legitimate sites will have one or more of these signals associated with them. For example, there are many good sites with an .info TLD.
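    To make the keyword frequency and density signals in the list above concrete, here is a rough way to compute them for a block of copy. The density formula (phrase occurrences relative to total words) is only a proxy for illustration; the search engines’ actual scoring is not public, and the sample copy is invented.

```python
import re
from collections import Counter

def keyword_density(text, phrase):
    """Return (count, density %) of a keyword phrase in a block of copy.

    Density here = words consumed by the phrase / total words, as a
    rough stand-in for the "high keyword density" signal; it is not
    any search engine's actual formula.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    count = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    return count, 100.0 * count * n / max(len(words), 1)

copy = "Cheap mortgage deals. Our mortgage site has mortgage rates for every mortgage."
count, density = keyword_density(copy, "mortgage")
print(count, round(density, 1))  # → 4 33.3
```

    A density of a third of all words, as in this exaggerated sample, is the kind of abnormality an algorithm can flag when it appears alongside other signals from the list.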
    There are also some signals that can be picked up that require data from a web analytics tool (which Google may be able to obtain from Google Analytics):
    Rarely receive high quantities of monthly visits
    Rarely have visits lasting longer than 30 seconds
    Rarely have visitors bookmarking their domains in the browser
    Unlikely to buy significant quantities of PPC ad traffic
    Rarely have banner ad media buys
    Unlikely to attract significant return traffic
    There can be legitimate reasons for the occurrence of many (possibly even most) of the signals mentioned above. For instance:
    Not every site needs an SSL certificate.
    Businesses outside the United States will not be in the Better Business Bureau directory.
    The site may not be relevant to scientific research papers.
    The publisher may not be aware of Google Webmaster Tools or Bing Webmaster Tools.
    Very few people are eligible for an .edu, .gov, or .mil TLD.
    These are just a few examples meant to illustrate that these signals all need to be put into proper context. If a site conducts ecommerce sales and does not have an SSL certificate, that makes that a stronger signal. If a site says it is a university but does not have an .edu TLD, that is also a strong signal that it is a poor-quality site.
    Doing one, two, or three of these things is not normally going to be a problem. However, sites that do 10, 20, or more of them may well have a problem.

    Competitors Can Report You

    Search engines supplement their internal spam fighting efforts by allowing users to submit spam reports. Google, for example, provides a spam report form through its webmaster tools. A poll at Search Engine Roundtable in May 2008 showed that 31% of respondents had reported their competitors as spammers in Google. The bottom line is that having your competitor report you is a real risk.
    In addition, the search engines can and do make use of human reviewers who conduct quality reviews. In fact, in 2007 a confidential Google document called the “Spam Recognition Guide for Raters” was leaked to the public (and it can still be found online). The guide delineates some of the criteria for recognizing search engine spam, such as whether the site is a thin affiliate.

    Basic Rules for Spam-Free SEO

    Especially if you are a novice SEO practitioner/publisher, the first and most important rule is to be familiar with the guidelines of the search engines (see the beginning of this section for the locations of the guidelines on the search engine sites).
    Second, it is essential that you learn to apply a basic personal filter to your SEO activities. Generally speaking, if you engage in an activity for the sole purpose of influencing search rankings, you are putting your site’s rankings at risk.
    For example, if you start buying keyword-rich text links from a bunch of sites across the Web and you do not expect to get significant traffic from these links (enough to justify placing the advertisements), you are headed for trouble.
    Search engines do not want publishers/SEO experts to buy links for that purpose. In fact, Google has strongly campaigned against this practice and has invested heavily in paid link detection, as Matt Cutts, the head of Google’s webspam team, has elaborated in a number of posts on his blog.
    Of course, there are more ways you can get into trouble than through purchasing or aggressively spamming other websites with keyword-rich text links. Earlier in this chapter, we listed a number of ways that spammers may behave. Most publishers/SEO practitioners won’t run into the majority of these as they involve extremely manipulative behavior, as we outlined previously in this section. However, novice SEO practitioners do tend to make certain mistakes. Here are some of the more common ones:
    Stuffing keywords into your web page so it is unnaturally rich in those words.
    Overoptimizing internal links (links on your website to other pages on your website). Generally speaking, this may occur by overstuffing keywords into the anchor text of those links.
    Cloaking, or showing different content to the search engine crawlers than you show to users.
    Creating websites with lots of very thin content pages, such as the thin affiliate sites we discussed previously.
    Implementing pages that have search engine–readable text that is invisible to users (hidden text).
    Participating in link schemes, link farms, or using other tactics to artificially boost link popularity.
    As we discussed previously, there are many other ways to end up in the crosshairs of the search engines, but most of those are the domain of highly manipulative SEO practitioners (sometimes referred to in the industry as black hat SEO practitioners).
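    Of the mistakes listed above, cloaking is one you can roughly check for yourself: fetch the same URL once with a crawler-style User-Agent and once with a browser User-Agent, then compare the visible text of the two responses. The sketch below compares two already-fetched bodies (no network calls); the 0.8 similarity threshold is an arbitrary assumption, and differences can also come from ads or personalization, so a flag is a prompt to investigate, not proof of cloaking.

```python
import difflib
import re

def text_only(html):
    """Crudely strip tags and collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"<[^>]+>", " ", html)).strip()

def cloaking_suspected(body_for_bot, body_for_user, threshold=0.8):
    """Flag a page whose crawler-served and user-served text differ a lot.

    Heuristic only: dynamic ads, A/B tests, and personalization also
    produce differences, so treat a low ratio as a reason to look
    closer rather than as proof of cloaking.
    """
    ratio = difflib.SequenceMatcher(
        None, text_only(body_for_bot), text_only(body_for_user)
    ).ratio()
    return ratio < threshold

same = "<p>Our product catalog</p>"
stuffed = "<p>mortgage poker texas hold em student credit cards</p>"
print(cloaking_suspected(same, same))     # → False
print(cloaking_suspected(stuffed, same))  # → True
```

    Running a check like this across your own templates can catch accidental cloaking introduced by a CMS or CDN configuration before a search engine notices it.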
    Ultimately, you want to look at your intent in pursuing a particular SEO practice. Is it something you would have done if the search engines did not exist? Would it have been part of a publishing and promotional plan for your website in such a world?
    This notion of intent is something that the search engines look at very closely when evaluating a site to see whether it is engaging in spamming. In fact, search engine representatives speak about intent (why you did it) and extent (how much did you do it) as being key things they evaluate when conducting a human review of a website.
    A perceived high degree of intent and pursuing something to a significant extent can lead to the more severe penalties being applied. But even if your intent is pure and you don’t pursue a proscribed practice to a significant degree, the search engines will want to discount the possible benefits of such behavior from the rankings equation. For example, if you buy a few dozen links and the search engines figure out that the links were paid for, they will discount those links. This behavior may not be egregious enough for them to penalize you, but they don’t want you to benefit from it either.

    Identifying Search Engine Penalties

    The reality is that penalties are imposed on websites. When this happens, it is useful to have an idea how to fix it. So, let’s take a look at how it works.
    The flowchart in Figure 11-4 shows a series of basic checks that you can use to find out whether you have a problem. It is also important to remember that search engines are constantly tuning their algorithms (Google makes algorithm changes every day). Simple movement up and down in the rankings does not necessarily constitute a penalty.

SEO Resources

  • One of the easiest ways to research what is happening in the world of SEO is to study the websites and periodicals that cover SEO in detail, but ongoing testing of SEO hypotheses should also play a role.


    A large number of online sites cover the search marketing space. Here is a short list of some of the better-known ones:
    Search Engine Land, owned and operated by Third Door Media
    Search Engine Watch, owned and operated by Incisive Media
    SEOmoz, owned and operated by SEOmoz
    Each of these sites publishes columns on a daily basis, with Search Engine Land and Search Engine Watch publishing multiple posts every weekday. The columns are typically written by industry experts who have been chosen for their ability to communicate information of value to their reader bases. SEOmoz also provides a wide range of tools and resources for SEO practitioners.

    Commentary from search engine employees

    Search engine representatives sometimes actively participate in forums, or publish blog posts and/or videos designed for webmasters. At the time of this writing, each of the major search engines maintains an official blog aimed at webmasters.
    The search engines use these blogs to communicate official policy, announce new features, and provide webmasters with useful tips. You can reach Google personnel via the Google Webmaster Help group in Google Groups. Members of the Google webspam team are active in this group, answering questions and even starting their own new threads from time to time.
    You can also interact with search engine representatives in various forums, such as WebmasterWorld and Search Engine Roundtable. Sometimes they use nicknames such as “googleguy” or “msndude,” so watch for those. You can also watch for search engine personnel who leave comments in popular SEO blogs. We will discuss the value of forums in more detail in The SEO Industry on the Web.

    Interpreting commentary

    Search engine reps are “managed” by their corporate communications departments. Some aren’t allowed to go on the record; some need approval before doing so, and/or need their comments edited before publication. A rare few have free rein (e.g., Matt Cutts). Often they can’t be very specific, or they can’t answer questions at all: the algorithms the search engines use are highly proprietary and must be kept secret.
    This means there are certain types of questions they won’t answer, such as “What do I have to do to move from position 3 to position 1 on a particular search?,” or “How come this spammy site ranks so much higher than mine?”
    In addition, they have their own motives and goals. They will want to reduce the amount of spam in their search indexes and on the Web overall (which is a good thing), but this may lead them to take positions on certain topics based on those goals.
    As an example, Google does not talk about its capability for detecting paid links, but it suggests that its ability to detect them is greater than the general webmaster community believes. Taking this position is, in itself, a spam-fighting tactic, since it may discourage people who otherwise might have chosen to do so from buying links (as we indicated in Chapter 7, we do not recommend purchasing links; this example is simply meant to illustrate how a policy might affect communications).
    In spite of these limitations, you can gather a lot of useful data from interacting with search engine representatives.

    SEO Testing

    SEO is both an art and a science. As with any scientific discipline, it requires rigorous testing of hypotheses. The results need to be reproducible, and you have to take an experimental approach so as not to modify too many variables at once. Otherwise, you will not be able to tell which changes were responsible for specific results.
    And although you can glean a tremendous amount of knowledge of SEO best practices, latest trends, and tactics from SEO blogs, forums, and ebooks, it is hard to separate the wheat from the chaff and to know with any degree of certainty that an SEO-related claim will hold true. That’s where the testing of your SEO efforts comes in: in proving what works and what doesn’t.
    Unlike multivariate testing for optimizing conversion rates, where many experiments can be run in parallel, SEO testing requires a serial approach. Everything must filter through the search engines before the impact can be gauged. This is made more difficult by the fact that there’s a lag between when you make the changes and when the revised pages are spidered, and another lag before the spidered content makes it into the index and onto the search engine results pages (SERPs). On top of that, the results delivered depend on the user’s search history, the Google data center accessed, and other variables that you cannot hold constant.

    Sample experimental approach

    Let’s imagine you have a product page with a particular ranking in Google for a specific search term, and you want to improve the ranking and resultant traffic. Rather than applying a number of different SEO tactics at once, start varying things one at a time:
    Tweak just the title tag and see what happens.
    Continue making further revisions to the title tag in multiple iterations until your search engine results show that the tag truly is optimal.
    Move on to your heading tag, tweaking that and nothing else.
    Watch what happens. Optimize it in multiple iterations.
    Move on to the intro copy, then the breadcrumb navigation, and so on.
    You can test many different elements in this scenario, such as:
    Title tag
    Heading tags (<h1>, <h2>, etc.)

    Placement of body copy in the HTML
    Presence of keywords in the body copy
    Keyword prominence
    Keyword repetition
    Anchor text of internal links to that page
    Anchor text of inbound links to that page from sites over which you have influence
    Testing should be iterative and ongoing, not just a “one-off” in which you give it your best shot and that’s it. If you’re testing title tags, continue trying different things to see what works best. Shorten it; lengthen it; move words around; substitute words with synonyms. If all else fails, you can always put it back to the way it was.
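    The one-element-at-a-time, iterative process described above lends itself to a simple structured log, so each observed ranking can be tied back to exactly one change. The sketch below is a minimal illustration; the page titles and rank numbers are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Iteration:
    element: str       # the single on-page element varied, e.g. "title tag"
    value: str         # the variant tested in this iteration
    rank: int          # position observed once the change was reindexed

def best_variant(iterations, element):
    """Best-ranking variant tried so far for one element."""
    tried = [it for it in iterations if it.element == element]
    return min(tried, key=lambda it: it.rank)

# Hypothetical serial test: finish iterating on the title tag,
# then move on to the heading tag, never varying both at once.
log = [
    Iteration("title tag", "Espresso Machines | Acme", 9),
    Iteration("title tag", "Best Espresso Machines - Acme Kitchen", 6),
    Iteration("h1", "Espresso Machines", 6),
]
print(best_variant(log, "title tag").value)  # → Best Espresso Machines - Acme Kitchen
```

    Even a log this simple makes it easy to revert to the best-performing variant if a later tweak makes things worse.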
    When doing iterative testing, it’s good to do what you can to speed up the spidering and indexation so that you don’t have to wait so long between iterations to see the impact.
    You can do this by flowing more link juice to the pages you want to test. That means linking to them from higher in the site tree (e.g., from the home page). Be sure to do this a while before forming your baseline, though, because you will want the impact of changing the internal links to show in the search engines before you initiate your test (to prevent the two changes from interacting).
    Alternatively, you can use the Google Sitemaps protocol to set a priority for each page, from 0.0 to 1.0. Dial up the priority to 1.0 to increase the frequency with which your test pages will be spidered.


    Don’t make the mistake of setting all your pages to 1.0; if you do, none of your pages will be differentiated from each other in priority, and thus none will get preferential treatment from Googlebot.
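    As a sketch of what differentiated priorities look like in practice, the following snippet generates a small Sitemaps-protocol file with the test page boosted above the rest of the site. The URLs are hypothetical; the namespace and the 0.0–1.0 priority range come from the Sitemaps protocol itself.

```python
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(pages):
    """pages: iterable of (url, priority) tuples, priority in 0.0-1.0.

    Give the pages under test a higher priority than the rest of the
    site; setting everything to 1.0 differentiates nothing.
    """
    ET.register_namespace("", NS)
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc, priority in pages:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
        ET.SubElement(url, f"{{{NS}}}priority").text = f"{priority:.1f}"
    return ET.tostring(urlset, encoding="unicode")

xml = build_sitemap([
    ("http://www.example.com/test-page/", 1.0),  # page under test
    ("http://www.example.com/archive/", 0.3),
])
print(xml)
```

    Note that priority is a hint, not a command: it influences relative crawl attention within your own site rather than guaranteeing faster indexation.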
    Since geolocation and personalization mean that not everyone is seeing the same search results, you shouldn’t rely on rankings as your only bellwether regarding what worked or didn’t work.

    Other useful SEO metrics

    As we discussed in Chapter 9, many other meaningful SEO metrics exist, including:
    Traffic to the page
    Spider activity
    Search terms driving traffic per page
    Number and percentage of pages yielding search traffic
    Searchers delivered per search term
    Ratio of branded to nonbranded search terms
    Unique pages spidered
    Unique pages indexed
    Ratio of pages spidered to pages indexed
    Conversion rate
    And many others
    But just having better metrics isn’t enough. An effective testing regimen also requires a platform that is conducive to performing rapid-fire iterative tests, in which each test can be associated with reporting based on these new metrics. Such a platform comes in very handy with experiments that are difficult to conduct under normal circumstances.
    Testing a category name revision applied sitewide is harder than, say, testing a title tag revision applied to a single page. Specifically, consider a scenario where you’re asked to make a business case for changing the category name “kitchen electrics” to the more search engine–optimal “kitchen small appliances” or “small kitchen appliances.” Conducting the test to quantify the value would require applying the change to every occurrence of “kitchen electrics” across the website—a tall order indeed, unless you can conduct the test as a simple search-and-replace operation, which you can do by applying it through a proxy server platform.
    By acting as a middleman between the web server and the spider, a proxy server can facilitate useful tests that normally would be invasive on the ecommerce platform and time-intensive for the IT team to implement.


    During the proxying process, you can replace not only words, but also HTML, site navigation elements, Flash, JavaScript, frames, and even HTTP headers—almost anything, in fact. You also can do some worthwhile side-by-side comparison tests: a champion/challenger sort of model that compares the proxy site to the native website.
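    The core of the category-rename test described above is a sitewide search-and-replace applied to every response body. Here is a minimal sketch of that replace step; a real proxy layer would also handle navigation, headers, and encoding, and the capitalization handling here is a simplifying assumption.

```python
import re

def rewrite_category(html, old="kitchen electrics", new="small kitchen appliances"):
    """Sitewide search-and-replace as a proxy layer might apply it.

    Case-insensitive so headings, breadcrumbs, and body copy are all
    caught; leading capitalization is preserved so headings still
    read naturally.
    """
    def swap(match):
        return new.capitalize() if match.group(0)[0].isupper() else new
    return re.sub(re.escape(old), swap, html, flags=re.IGNORECASE)

page = "<h1>Kitchen electrics</h1><p>Browse our kitchen electrics range.</p>"
print(rewrite_category(page))
# → <h1>Small kitchen appliances</h1><p>Browse our small kitchen appliances range.</p>
```

    Because the rewrite is a pure function of the response body, the same change can be applied or rolled back instantly on the proxy without touching the ecommerce platform.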

    Start with a hypothesis

    A sound experiment always starts with a hypothesis. For example, if a page isn’t performing well in the SERPs and it’s an important product category for you, you might hypothesize that it’s underperforming because it’s not well linked-to from within your site. Or you may conclude that the page isn’t ranking well because it is targeting unpopular keywords, or because it doesn’t have enough copy.
    Once you have your hypothesis, you can set up a test to gauge its truth. For example, in the case of the first hypothesis, you could try these steps:
    Add a link to that page on your site’s home page.
    Measure the effect, waiting at least a few weeks for the impact of the test to be reflected in the rankings.
    If the rankings don’t improve, formulate another hypothesis and conduct another test.
    Granted, this can be a slow process if you have to wait a month for the impact of each test to be revealed, but in SEO, patience is a virtue. Reacting too soon to changes you see (or don’t see) in the SERPs can lead you to false conclusions. You need to give the search engines time to fully process what you have done so that you can improve the chances that you are drawing the right conclusions based on your tests. You also need to remember that the search engines may be making changes in their algorithms at the same time.

    Analysis of Top-Ranking Sites and Pages

    There are many reasons for wanting to analyze top-ranking sites, and particularly those that rank at the top in your market space. They may be your competitors’ sites—which is reason enough to explore what they are doing—but even if they are not, it can be very helpful to understand the types of things they are doing and how those things may have helped them get their top rankings. With this information in hand you will be better informed as you decide how to put together the strategy for your site.
    Let’s start by reviewing a number of metrics of interest and how to get them:
    Start with a simple business analysis to see how a particular company’s business overlaps with yours and with other top-ranked businesses in your market space. It is good to know who is competing directly and who is competing only indirectly.
    Find out when the website was launched. This can be helpful in evaluating the site’s momentum. Determining the domain age is easy; you can do it by checking the domain’s Whois records. Obtaining the age of the site is trickier. However, you can use the Internet Archive’s Wayback Machine to get an idea of when a site was launched (or at least when it had enough exposure for the archive to start tracking it).
    Determine the number of Google results for a search for the site’s domain name (including the extension) for the past six months, excluding the domain itself. To get this information, search in Google for the domain name while excluding results from the domain itself. Then append &as_qdr=m6 to the end of the results page URL and reload the page.
    Determine the number of Google results for a search for the site’s domain name (including the extension) for the past three months, excluding the domain itself. This time, modify the results page URL by adding &as_qdr=m3 to the end of it.
    Perform a query on Google Blog Search for the domain name, excluding the domain itself, on the default settings (no particular timeline).
    Find out from Google Blog Search how many posts have appeared about the site in the past month. To do this, search for the domain in Google Blog Search, then append &as_qdr=m1 to the end of the results page URL and reload the page.
    Obtain the PageRank of the domain’s home page as reported by the Google Toolbar.
    Analyze the site’s backlink profile using an industrial-strength tool such as SEOmoz’s Open Site Explorer or Majestic SEO. These tools provide a rich set of link data based on their own crawls of the Web, including additional critical details such as the anchor text of the links.
    If you are able to access a paid service such as Hitwise or comScore, you can pull a rich set of additional data, breaking out the site’s traffic by source (e.g., organic versus paid versus direct traffic versus other referrers). You can also pull information on their highest-volume search terms for both paid and organic search.
    Determine the number of indexed pages in each of the two major search engines, using a site: query (e.g., site:theirdomain.com) in both Google and Bing.
    If relevant, obtain Technorati’s authority number for the site, which derives from the number of individual, unique blogs that have linked to a site in the past 90 days.
    If relevant, get Google’s feed subscriber numbers for the site, which you can find by searching for domains inside Google Reader.
    If relevant, determine Bloglines’s subscription numbers for the site, which derive from searches performed inside Bloglines.
    Search on the company brand name at Google, restricted to the past six months (by appending &as_qdr=m6 to the results page URL, as outlined earlier).
    Repeat the preceding step, but for only the past three months (using &as_qdr=m3).
    Perform a Google Blog Search for the brand name using the default settings (no time frame).
    Repeat the preceding step, but limit it to blog posts from the past month (using &as_qdr=m1).
    Of course, this is a pretty extensive analysis to perform, but it’s certainly worthwhile for the few sites that are the most important ones in your space. You might want to pick a subset of other related sites as well.
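    Several of the steps above rely on the same URL trick: appending &as_qdr=mN to a Google results page URL to restrict results to the past N months. A minimal Python sketch of building such URLs (the helper name and example domain are illustrative, not from the book):

```python
from urllib.parse import quote_plus

def google_results_url(query, qdr_months=None):
    """Build a Google results-page URL for a query, optionally restricted
    to the past N months via the as_qdr parameter (as_qdr=m6 for six
    months, as_qdr=m3 for three, as_qdr=m1 for one)."""
    url = "https://www.google.com/search?q=" + quote_plus(query)
    if qdr_months:
        url += f"&as_qdr=m{qdr_months}"
    return url

# Mentions of a domain, excluding the domain's own pages, past six months:
six_month_url = google_results_url("example.com -site:example.com", 6)
```

The same helper covers the brand-name searches as well: pass the brand name as the query with qdr_months set to 6, 3, or 1.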


    As valuable as website metrics are, brand names can sometimes provide even more insight. After all, not everyone is going to use the domain name when talking about a particular brand, nor will they all link. Thus, looking at brand mentions over the past few months can provide valuable data.

    Understand Your Market Opportunity

    Most SEOs probably cannot say with certainty what ranking in the top search positions for their keywords is (or would be) worth to them in terms of traffic and revenue. We would never set out to develop a new product or launch a new service without first understanding the business case/market opportunity, and natural search should be no different. Yet, partially because of its roots in the tech community, and partially because SEO is still maturing as a discipline, many SEOs jump right into optimizing keywords without completing a full market opportunity assessment. Without that assessment you have no North Star to steer by, and it becomes difficult to show true progress toward a goal as you move forward with your natural search efforts.
    Understanding the market opportunity prior to tweaking even the first title tag is important because it will help you to:

    Understand what SEO is worth to your organization. An assessment will give you an idea of what natural search visibility is worth and answer questions like “How many visitors would I gain if I were in position 1 for all my keywords?” Answers to these questions will guide the SEO process, including budget, headcount, and infrastructure investments, and will help you make the SEO business case to the rest of the organization.
    Track progress as you go. Building a market opportunity benchmark prior to embarking on SEO will enable you to measure progress as you go and demonstrate the ROI of SEO in real, quantifiable metrics that will resonate with the C-suite.
    A market opportunity assessment in natural search shows the current visitors captured for your keywords, versus the potential opportunity were you to rank in a top position. There are various tools available on the Internet that can help simplify the process.
    To conduct a market opportunity assessment:
    Isolate your top group of nonbranded keywords. Determine your target keywords based on their perceived conversion potential. The number of keywords may vary based on factors such as your SEO maturity or vertical market space, ranging from around 50 for those just starting out up to several thousand for those farther along in natural search.
    Gather search volumes and rankings for your keywords. Find out what your current search volumes and rankings are for your targeted keywords.
    Use a click-through rate (CTR) curve to identify the potential for new visitors. A CTR curve shows the expected percentage of clicks based on the search position of the keyword. Plugging your current search volume and rank into a CTR curve will give you an idea of the number of new visitors you can expect to get by moving up in the search rankings. You can get estimated CTR data by position from Figure 1-12 (back in Chapter 1), which shows the average CTR by ranking position.
    Scale that across your total traffic. The first three steps of the analysis give you a sense of the traffic gain you can get for your top keywords, but it is likely that you have many other keywords driving traffic to your site. If you have an opportunity to double the traffic on your top keywords, you likely also have the opportunity to double your overall nonbranded organic search traffic volume.
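    Steps 2 and 3 reduce to a small calculation. The CTR-by-position values below are illustrative placeholders, not the actual figures from Figure 1-12; substitute the curve from that figure or CTR data from your own analytics:

```python
# Hypothetical CTR-by-position curve -- replace with the values from
# Figure 1-12 or your own measured click-through data.
CTR_BY_POSITION = {1: 0.42, 2: 0.12, 3: 0.085, 4: 0.06, 5: 0.05,
                   6: 0.04, 7: 0.03, 8: 0.03, 9: 0.03, 10: 0.03}

def visitor_opportunity(monthly_searches, current_pos, target_pos=1):
    """Estimated additional monthly visitors from moving one keyword
    from current_pos to target_pos, per the CTR curve above."""
    current_ctr = CTR_BY_POSITION.get(current_pos, 0.0)  # off page 1 ~ 0% CTR
    target_ctr = CTR_BY_POSITION.get(target_pos, 0.0)
    return round(monthly_searches * (target_ctr - current_ctr))

# A keyword with 10,000 searches/month, currently ranked #8, aiming for #1:
gain = visitor_opportunity(10_000, 8, 1)  # 10,000 * (0.42 - 0.03) = 3,900
```

Summing this estimate over your top keywords gives the benchmark for step 4; scaling that ratio across your total nonbranded organic traffic gives the overall opportunity.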

Get Buy-in Across the Organization

    Now that you’ve done an opportunity assessment and know what you will be working toward in natural search, it’s time to publicize the opportunity internally and obtain buy-in across the organization. For SEO to be successful, active participation is needed from many parts of the business, across nearly every department. From C-level executives approving budget, to the IT department implementing web and infrastructure changes, to content writers using SEO best practices, organizational buy-in is as critical to successful SEO as backlinks and title tags.


    To obtain organizational buy-in:
    Get people excited—then scare the heck out of them. Show people the opportunity in natural search that has come out of your opportunity assessment. Make them aware of what’s being missed out on by not ranking in search. Then, make them aware of where your company stands relative to the competition, and identify competitors who are currently cashing in on the opportunities you are missing out on.
    Keep the opportunity foremost in people’s minds. Make posters outlining the market opportunity and hang them on the walls. Speak about it often, and generally socialize it in the organization. The more you can keep the opportunity front and center in people’s minds and keep them excited about it, the better.
    Create a service-level agreement (SLA) with stakeholders. To formalize the working relationship with organizational departments and increase their buy-in to SEO, sign an SLA with them that defines how quickly they will respond to requests and the metrics that will be used to measure success.

    Progress Through the Stages of SEO Maturity

    Now that you have organizational buy-in and appropriate resources to execute, how do you go about scaling your keywords and continuing to drive natural search traffic to your website?
    Our observations over the years have shown that organizations move through five distinct stages when it comes to SEO. Each stage is characterized by investments in budget, strategy, and metrics appropriate to the stage’s maturity level. Organizations should continuously be striving to move into the next maturity stage until they are at the “Compete” level and are actively competing to best the top firms in the search rankings. The stages are:
    Try. In the first stage of SEO maturity, the organization dips its toe in the water, focusing on low-hanging fruit and optimizing for a small set of initial keywords. At the same time, it begins efforts to move its top target keywords from page 7+ of the SERPs into striking-distance positions on page 4 or better, so they are primed to move into top-visibility positions in later phases. In the “Try” phase, the majority of the budget is spent on personnel, with a portion allocated to engineering/production. A minimal investment is made in SEO technology, and one employee is assigned to SEO part-time as part of his or her regular duties.
    Invest. In this stage, organizations begin to expand their coverage, moving into more competitive keywords with higher search volumes. Average ranks begin to improve as keywords optimized in the previous stage begin to move into page 2, 3, or 4 of the search rankings. Organizations now begin to invest in some link-building activities and content creation and start to expand their investment in SEO technology. One or two employees are assigned full-time to SEO.
    Measure. In this stage, to maintain investment in SEO, it becomes critical for organizations to begin showing ROI and tying SEO metrics such as rankings and traffic to business metrics such as conversions and revenue. In the “Measure” stage, rankings continue to increase as the organization becomes increasingly SEO-savvy, beginning to invest in continuing education. Several full-time employees are assigned to SEO, along with specialized outside consultants.
    Scale. In this stage the organization organizes and prioritizes, beginning to look at natural search from a holistic perspective. A natural search pipeline develops as SEO starts to be treated as a sales campaign organizationally. Many more keywords move into top-visibility positions as investments in link building and content creation are increased. Several employees remain assigned full-time to SEO along with outside consultants and specialists.
    The organization is now ready to move on to stage 5.

These are notes I made after reading this book.

This page was last updated on Wednesday, April 17, 2024.