Conversion Optimization

Get the training you need to DOMINATE your competition. Stop losing ground and join SEM Mastery.

The Current Algorithm Update
A reminder: I release this information every month to the members of the SEO Revolution to show you the trends and possible upcoming changes in the search engine algorithms of Google, Yahoo! and Bing/MSN.

This issue is the best part of the subscription according to my long-term subscribers, and it is what made me "famous", so to speak. No one else releases this type of information, to my knowledge. It gives you a great view of what is really happening with the major search engines, with real results from real domains. Note: THESE RESULTS ARE NOT TAKEN FROM TEST DOMAINS. The results are taken from the top ten results in the SERPs (Search Engine Results Pages). No more wading through the forums looking for answers, and I guarantee you will never find as much detailed information in forums or other publications as you will find here.

Each update, I sit surrounded by a ton of printouts, log files, pieces of scratch paper, sticky notes, and whiteboards that look like battle plans. My staff and I have compiled all of it, verified it, and put it into understandable terms so you can take it and be more profitable on the web. A few months ago, a subscriber commented that the algorithm update cleared up dozens of misconceptions they had about SEO. After making the recommended changes, they saw positive results within just two weeks.

I hope each of you is having similar, if not better, results.

Testing Basics
It should be noted that for the algorithm update, I take the top ten listed sites for competitive keyword phrases and analyze them. This is all done by hand; I don't automate the process. Why? Automation often fails to detect redirects and gathers information from pages that are not the top-ranking pages, which throws off the numbers. Working manually, I can spot problems and errors, and I can move to another keyword set to get reliable data. I take great pride and care in gathering this information for you.

As for our testing methodology, you might ask, "How do you test?" The following will give you an outline of how I do my testing. For competitive reasons, I will not give exact specifics, but this will give you a good overview:

Our Testing Setup

  • I have five test servers. Note: I dumped all of our Windows servers due to market conditions and constant hack attempts.
  • Each server is located in a different part of the country in the United States, and I have one in the UK.
  • Each test server has at least 16 test domains (there are over 365 total test domains).
  • Domains are matched in pairs, for A/B testing.
  • All domains are "dot com", no testing is done with other extensions for the algorithms. This will not change in the near future. The reason is there are just too many extensions and it would not only cloud the testing model, but it would increase the time and expense without providing enough relevant data to justify the expense.
  • Testing is done with at least 8 pairs of domains, configured as follows: 3-7 pages, 8-24 pages, 25-49 pages, 50-99 pages, 100-149 pages, 150-299 pages, 300-499 pages, and 500+ pages. All the pages are real content, as are the sites. There is no scraped or dummy content. The sites do "naturally" grow with new content, and when a site grows from the 3-7 page tier to 8 pages, a new test domain is created.
  • When performing testing, one of the domains in the pair is tested while the other remains constant. This is A/B testing and allows me to pinpoint the issue being tested. Once I see something unusual, I verify it, not just on another testing pair, but across at least a dozen testing pairs, and sometimes, across all 160+ testing pairs. Unlike other testing models on the web, I never trust the results from just one test. Never.
  • Due to varying issues within the algorithms, it can take up to six weeks to see consistent numbers in order to formulate accurate conclusions. This is why you rarely read "breaking news" from me on a certain topic. Yes, I have heard about it, and I go right to work in testing it. For breaking news, please refer to the SEO Revolution Blog.
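The verification rule described above can be sketched as a simple decision function. This is a hypothetical illustration, not the actual test harness; the pair representation and the use of the twelve-pair threshold are taken from the description above.

```python
# Hypothetical sketch of the A/B verification rule: an observed ranking
# effect only counts once it reproduces across at least a dozen pairs.
# The data format and threshold are illustrative, not the real harness.

def effect_confirmed(pairs, min_pairs=12):
    """pairs: list of (variant_rank_change, control_rank_change) tuples.
    The effect shows on a pair when the variant moved but the control
    (the unchanged domain) did not."""
    showing = [v for v, c in pairs if v != 0 and c == 0]
    return len(showing) >= min_pairs

# Example: 14 of 16 pairs show movement on the variant only.
pairs = [(-3, 0)] * 14 + [(0, 0)] * 2
print(effect_confirmed(pairs))  # True
```

The point of the rule is exactly what the bullet says: a single pair moving is noise; consistent movement across many independent pairs is a signal.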

Algorithm Update
Just as a reminder so that there is no confusion: you should not use these numbers as guidelines for your site. If you aren't aware already, each industry is a little different in terms of the criteria that are used. However, these figures will give you an understanding of how the engines currently differ and how the numbers compare. We have added another column to the test results, so you can now see the changes over eight months, which should let you see the "evolution" of each engine more clearly. And since these are real test results, you can draw accurate conclusions instead of reading the latest "tips" and "hot advice" on the forums, which often take you down the wrong path.

To obtain these figures, we assigned twelve of our employees five keyword phrases each and had them run tests on the top ten results for each keyword phrase in Yahoo!, Google, and MSN. The keyword phrases are highly competitive and span different industries. According to WordTracker, these keywords have at least 1,100 daily queries (the "Money Phrases"). Also, since the major engines are placing less and less emphasis on Alt tags, we have included link text in the update. We take the figures and drop the highest and lowest figure per test to remove "exceptions".
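Dropping the highest and lowest figure before averaging is a basic trimmed mean. A minimal sketch (the sample numbers are made up for illustration):

```python
def trimmed_average(values):
    """Average after dropping the single highest and lowest figure,
    as described above for removing 'exceptions'."""
    if len(values) <= 2:
        raise ValueError("need more than two figures to trim")
    trimmed = sorted(values)[1:-1]
    return sum(trimmed) / len(trimmed)

# Ten hypothetical title-word counts from one SERP; 2 and 31 are dropped
# as the low and high outliers before averaging.
counts = [2, 6, 7, 7, 8, 8, 9, 9, 10, 31]
print(trimmed_average(counts))  # 8.0
```

Notice how the outliers (a near-empty title and a keyword-stuffed one) would have dragged the plain average away from what the typical top-ten page actually looks like.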

One of the areas I like to emphasize is consistency. You can tell a good search engine by its consistency: it does not radically change its algorithm. As you will see from the numbers below, Google is very consistent, while Yahoo! and MSN, still trying to figure out what to do, make wild changes that often produce results that do not meet user needs.

With this update, we have simplified the results and kept the areas that matter most. We will be adding more "off page" factoring next update as it becomes more and more important.

<Table Removed - Member Access Only>

Keyword in Title - How many times the keyword phrase appears as an exact phrase in the Title. 0.7 means that 70% of the top listings have the keyword phrase in the Title.
Total Words in Title - How many words are in the Title tag. 7.0 means that seven words appear on average in the top listed pages.
Keyword in Meta Description - How many times the keyword phrase appears in the Meta Description. 0.4 means that 40% of the top listings have the keyword phrase in the Meta Description.
Total Words in Meta Description - How many words are in the Meta Description tag. 14.3 means that, on average, 14 words appear in the Meta Description of the top listed pages.
Keyword in Meta Keywords - How many times the keyword phrase appears in the Meta Keywords. 0.5 means that 50% of the top listings have the keyword phrase in the Meta Keywords.
Total Words in Meta Keywords - How many words are in the Meta Keywords tag. 19.5 means that, on average, 20 (rounded up) words appear in the Meta Keywords tag of the top listed pages.
Keyword in Content - How many times the keyword phrase appears in the content. 2.6 means that the keyword phrase appears an average of 3 times (rounded up) on a top-ranking page. This is the visible content of the page; it does NOT take into account comment tags, alt tags, etc.
Total Words in Content - How many words of content are on the page. 571 refers to the actual number of words (including stop words). The major engines do not index stop words, but they are counted and figured into the density numbers.
Keyword Density in Content - The density percentage of the keyword phrase in the content. 1.0 refers to 1%, or one instance of the keyword phrase per 100 words of text.
Keyword in Link Text - How many times the keyword phrase appears in the Link/Anchor Text. 0.7 means that 70% of the top sites contain the keyword phrase in anchor text on the page.
Backlinks - How many backlinks the page has. 1,761 refers to the number of qualified pages that Google recognizes as linking to the page.
Google PageRank - The Google PageRank of the page. 7.0 means that the average PageRank of the top sites is 7.
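As a rough illustration of how the per-page figures above could be gathered, here is a sketch that uses Python's standard html.parser to pull the Title and Meta Description from a page and compute four of the metrics. The sample page and phrase are hypothetical; averaging these dictionaries across the top ten results for a phrase would produce figures like those in the table.

```python
from html.parser import HTMLParser

class HeadMetrics(HTMLParser):
    """Collects the Title text and Meta Description of one page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.description = ""

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self.in_title = True
        elif tag == "meta":
            a = dict(attrs)
            if a.get("name", "").lower() == "description":
                self.description = a.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

def metrics(html, phrase):
    """Four of the per-page figures described above, for one page."""
    p = HeadMetrics()
    p.feed(html)
    t, d, k = p.title.lower(), p.description.lower(), phrase.lower()
    return {
        "keyword_in_title": t.count(k),
        "title_words": len(p.title.split()),
        "keyword_in_description": d.count(k),
        "description_words": len(p.description.split()),
    }

# Hypothetical page for the phrase "used cars".
page = ('<html><head><title>Used Cars for Sale</title>'
        '<meta name="description" content="Find quality used cars near you.">'
        '</head><body></body></html>')
print(metrics(page, "used cars"))
# {'keyword_in_title': 1, 'title_words': 4, 'keyword_in_description': 1, 'description_words': 6}
```

Keep in mind the caution from the testing notes above: a script like this analyzes whatever HTML it is handed, so redirects and wrong pages must still be caught by hand.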

While stemming continues to be present in Google, its effectiveness continues to decline. I am still seeing a strong presence of LSI, and this is vital, as an understanding of LSI is not widespread among webmasters, let alone SEOs.

Title Tag - Make sure you have read through the Title Tag article before proceeding. Because of "off page" factors, none of the major engines any longer require the keyword to be in the Title in order to get a top ten ranking. While sites with the keyword phrase in the Title now make up about half of the top ten results, it still isn't a vital piece. Examples: "radio stations" (4/10), "job search" (5/10) and the best example, "search engine" (3/10), where only three of the top ten have the keyword phrase in the Title. There are also exceptions, such as "used cars", which has the keyword phrase present in all top ten results, but for the most part, about half of the top results do not have the keyword phrase in the Title.

So, does that mean you should no longer use the keyword phrase in the Title? No, of course not. This is to get you to think more "outside the box," as too many SEOs have the false belief that you can throw a phrase in the Title, buy a couple of links with the phrase in the anchor text, and be done. Hardly.

Keyword proximity (the keyword phrase appearing at the beginning of the tag) continues to show strength in MSN and Yahoo!. As I mentioned with stemming above, all things being equal, your site can rank in the top ten in Google for a keyword phrase you are not targeting (in the Title tag) as long as the keyword can be stemmed. Go through your top five keywords and look at the stemmed versions. This is one of the reasons Google's results are more accurate than those of the other engines.

Example of Stemming: Please remember that these are "money phrases" and are highly competitive, and this may not be true with lesser competitive phrases.

To Validate the Stemming in Google, do searches for the following keyword phrases. Note: Stemming occurs when the keyword is NOT fully contained in the Title: (number of daily queries predicted from WordTracker [numbers from two months ago]) credit score (1257 [1642]); current mortgage rates (705 [893]); weight loss pills (552 [512]); web marketing (302 [456]). Quite interesting, isn't it? And if you look up the same keywords in Yahoo! and MSN, you will see a difference in quality, in fact, you will see a lot more spam with MSN.

Need an example to understand stemming? Stemming isn't just taking the word "credit" and including "credits", "crediting", "creditor", etc. It goes far beyond that. It allows the words of the phrase to be included, but not necessarily together. For example, in the "credit score" example, notice the first result in the SERPs is "credit scores", the plural version of the phrase. The third result is "credit scoring", a naturally stemmed version. But look at the fifth result, "Free Credit Report, Score Check." Notice how "Credit" and "Score" are not together. This is an example of what I like to call Google's Power Stemming. Neither Yahoo! nor MSN uses it, nor do the other engines use LSI, which is why I feel Google will continue to dominate the landscape. Their results are just better. Better technology just makes for better results.
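A loose matcher along these lines can illustrate the behavior: it treats a title as covering a phrase when every stemmed word of the phrase appears somewhere in the title, together or not. The suffix-stripping stemmer here is deliberately naive and purely for illustration; Google's actual stemming (and LSI) is far more sophisticated.

```python
# Naive suffix-stripping stemmer, for illustration only -- real engines
# use much more sophisticated stemming than this.
def stem(word):
    for suffix in ("ing", "es", "s", "e"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def phrase_covered(phrase, title):
    """True when every stemmed word of the phrase appears somewhere in
    the title, together or not -- the loose matching described above."""
    title_stems = {stem(w) for w in title.lower().split()}
    return all(stem(w) in title_stems for w in phrase.lower().split())

# The fifth-result example above: "Credit" and "Score" are not adjacent,
# yet the phrase is still covered.
print(phrase_covered("credit score", "Free Credit Report, Score Check"))  # True
print(phrase_covered("credit score", "Credit Scoring Basics"))            # True
```

The second call shows the "credit scoring" case: both "score" and "scoring" reduce to the same stem, so the stemmed phrase still matches.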

Total Words In Title - Eight words or less is the general rule here. I highly recommend keeping the Title at 8 words or less; don't go over, as testing shows that click-through rates suffer for sites with long Titles. Why? People scan the results looking for what they want. Longer Titles make scanning more difficult and are often overlooked. Make sure the Title is READABLE and COMPELLING. Don't keyword-stuff the Title so it isn't readable; that is one of the biggest mistakes made by webmasters and SEOs. You don't just want a top listing in the SERPs, you also want click-throughs. YOU CAN'T MAKE A SALE WITHOUT A CLICK-THROUGH! Click-throughs only happen when the searcher is compelled by what they have read, and that begins with the Title. A top position is worthless if it isn't clicked. Master this art and you will always have an advantage over your competition. Always!

Proof: When you search, do you always click the number one listing in the SERPs? Of course not. You click on the result whose Title and/or Description matches what you are looking for. Why? Because you don't want to waste your time. That is your key: "give the user what they want and they will click." Remember the three-step process to successful SEO: ranking, click-through and conversion.

Recommendation: You need to read our article on the Title tag as mentioned above and put the suggestions into use across the board on your site(s). By implementing the information contained in that article you will see MASSIVE improvements in your traffic and conversion levels within 30 days. A compelling Title has more strength than most web marketers realize. Let me emphasize that, as it isn't the Title itself, but a COMPELLING Title. This continues to show strong results on our test domains. Key Tip: While the Title itself is not as effective in getting you well-ranked, it is VERY effective at gaining the click through to your site.

Note: Google has been going back and forth between showing the site's own Title and showing the Title from DMOZ. To prevent this, you can add the following meta tag (the NOODP directive) to your page:

<meta name="robots" content="noodp">

The above disallows the use of the ODP's Description in any search engine that supports it. Currently, the Big Three (Google, Yahoo! and MSN) all do. If you just want to do this for Google, you can use this meta tag:

<meta name="googlebot" content="noodp">

If you want more information on this, you can read the page on Google's site all about the Title and Description.


Meta Description - Google does show results in the SERPs of unique text found in the Meta Description, but, according to our testing, it is still not factored in the algorithm. Yahoo! and MSN have devalued this tag in their algorithm, and the devaluation continues.

Total Words in Meta Description - 20 words or less, highly focused and compelling. Your first order of business is to convert the "looker" into a "click." Keyword proximity, meaning how close it is to the front of the description, is not as important as it once was. Yahoo! is looking for shorter descriptions, and this trend continues.

Recommendation: The same as with the Title tag: change your Description tags to shorter, more compelling descriptions, and your click-through rates will increase.


Meta Keyword - Google continues to ignore this tag completely in its algorithm; the "spot check" of the keyword density of keywords listed in the Keyword Meta Tag that Google had been doing for nearly the last year seems to be over. I have dumped a ton of spam into the keyword tag for some sites that rank very well in Google, and the rankings have remained. This wasn't the case last year when I first reported this issue. If you want to know more about Meta Tags and Google, I have a few articles, including my updated Meta Tags Uncovered article, which are free to access. MSN and Yahoo! continue to spider and include this tag in their algorithms. The keyword tag spam that MSN was allowing last time has disappeared again. This is another example of how unstable their engine continues to be.

According to testing, the meta keyword tag is less significant for competitive keywords in Yahoo!. However, for "low hanging fruit" keywords it is gaining significant strength, and continues to do so. In my testing, when I removed the keyword tag completely from a site ranked in the top ten for a competitive keyword phrase, the ranking did not change in Yahoo!, but it dropped out of the top ten in MSN. For a "low hanging fruit" keyword phrase, however, the listing drops in the SERPs in both engines. This trend continues.

Total Words in Meta Keyword - Generally, 20-40 words is common. Be careful not to put your "money key phrases" here, as competitors often check your meta keywords and hijack keywords they were not previously targeting. More and more software is hitting the market that "hijacks" the meta keywords across an industry to compile fast keyword lists. Don't let your competition steal your hard-fought research. The push you get from this tag is now so small that leaving your "money phrase" out of it will not hurt you severely.

Recommendation: The keyword tag now has less of an effect in Yahoo!, but it is making a comeback in MSN. Will it last? So far, it has.


Keywords in Content - This is an area that separates the "on page" factor engines from the "off page" factor engines.

Content - Spamming in MSN is reaching new heights. There was one example where a site gained a top ten listing with a 42% keyword density ... yes, 42%. There were numerous others in the high 30% range. LSI still plays a huge role with Google, as MSN and Yahoo! both average double the number of occurrences of the keyword on the page.

Content Density - There are many "free tools" out there that claim to check keyword density on the fly, but they do not compute it properly. This is a "warning" if you use one of those tools, and it is a main reason why we still do things manually. Keyword density has to do with the density of the content - what you can see on the screen. It does not include the Title, Description, Keywords, Comment tags, Alt tags, etc. Check the #1 listing for the keyword term "search engine optimization" with one of these tools and it comes back with a density of 15% for the site. However, if I go to that site, enter "search engine optimization" in the search box on the Google Toolbar, and click the Highlighter button, it highlights each occurrence of "search", "engine" and "optimization" in a different color. There are four instances of the keyword phrase in the actual body of the document, which does NOT equal a 15% keyword density. So be careful which tools you use and rely on.
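For reference, here is a sketch of computing density the way described above: only visible body text counts, with the Title, scripts, comments, and tag attributes (such as alt text) excluded. This is an illustrative script with a made-up sample page, not the tool we use.

```python
from html.parser import HTMLParser
import re

class VisibleText(HTMLParser):
    """Extracts only visible body text. Title, script and style content
    are skipped; comments and tag attributes (alt text, etc.) never
    reach handle_data, so they are excluded automatically."""
    SKIP = {"title", "script", "style"}

    def __init__(self):
        super().__init__()
        self.skip_depth = 0
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skip_depth:
            self.skip_depth -= 1

    def handle_data(self, data):
        if not self.skip_depth:
            self.chunks.append(data)

def keyword_density(html, phrase):
    """Exact-phrase occurrences per 100 words of visible text."""
    p = VisibleText()
    p.feed(html)
    words = re.findall(r"[a-z0-9']+", " ".join(p.chunks).lower())
    total = len(words)
    target = phrase.lower().split()
    n = len(target)
    hits = sum(1 for i in range(total - n + 1) if words[i:i + n] == target)
    return 100.0 * hits / total if total else 0.0

# Hypothetical 16-word body with one exact-phrase occurrence: 1/16 = 6.25%.
doc = ("<html><head><title>search engine optimization</title></head><body>"
       "<p>Our search engine optimization guide covers search engine basics "
       "and ninety more words of plain content.</p></body></html>")
print(keyword_density(doc, "search engine optimization"))  # 6.25
```

Note that the phrase in the Title contributes nothing to the figure, and the second, partial "search engine" match doesn't count either: only the exact phrase in visible body text does.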

There are other tools you can use that are more accurate, but nothing is perfect. For our testing, we do all of our checks manually to verify accuracy. For example, we discovered massive amounts of redirects with and without JavaScript (browser-side cloaking). Automated tools often analyze the wrong page, thus throwing off the data. There are also many times we throw out data because it is so far off that including it would taint the results. And we can't have tainted data, can we? Many criticize us for throwing out data, but as testers, that is what you do. You are looking for accurate data, and when a single result is far away from the averages of hundreds of samples, there is a definite error. By removing the bad data, the overall figures are more accurate and are not affected by the "spike".

The "words on the page" do not include the left and right "gutters", or the top and bottom navigation. According to our testing, these areas are NOT figured into the density. What we count as "words on the page" are the same words that are counted in Google's algorithm. Also understand that Google can read page design better than you think it can, and the actual content area is graded at a higher level than the content contained in the gutters. That is important to remember when optimizing your pages. It is why many webmasters who have relied on achieving top rankings with keyword-rich anchor text in the navigational areas and the footer are starting to struggle for highly competitive keyword phrases.

Using the highlighter tool with the Google Toolbar is often a great way to see the on-page factors at work. I highly recommend it.

If there has been one aspect of our advice that has not changed since we began in 1996, it is that content, good content, is vital. As you can see, high ranking pages in Google, MSN and Yahoo! average over 400 words of content. This is a key factor to realize. Along with good content comes fresh content. Google has always rewarded sites that have fresh content.

Based on this information, the following are recommendations we are giving at this time. Please note that these recommendations are based on the averages, and your industry may be different:

  • keep your Title tag short (8 words or less) and ensure it is compelling
  • use the keyword you are targeting only once in the Title
  • only use one Title tag
  • use a Meta Description tag and write it so it is compelling and keep it 10-20 words in length
  • use the keyword phrase just once in the Meta Description
  • use the Keyword Meta Tag, but be generic with the keywords that you use so you do not tip your niche to your competition
  • list five or fewer keyword phrases in the Keyword Meta Tag
  • avoid unnecessary meta tags
  • build up your viewable content to at least 450 words per page that you want search engine traffic to come to consistently
  • use the keywords in the content naturally, so the text flows - do not keyword stuff your content, as you will turn off your visitor
  • rely on LSI to help build the "theme" of the page
  • use the keyword phrase once in an H1, H2 OR H3 tag, but not all three.
  • use of alt tags is recommended to describe the image, but not for SEO purposes
  • use of the comment tag for SEO purposes should be discontinued
  • appearance of the keyword in the URL (i.e. domain name, sub folder, file name) gives only a slight increase in Google, but can be used to spam in both Yahoo! and MSN
  • headings, anchor text, bold, etc. all give slight increases in the SERPs and should be used in moderation
  • use of the Meta Title tag only gives a "slight" boost for SEO.
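Several of the checklist items above can be turned into a quick automated audit. This is a hypothetical sketch using crude regex extraction (fine for a sanity check, not production HTML parsing); the thresholds come straight from the recommendations, and the sample page is made up.

```python
import re

def audit(html, keyword):
    """Check a page against a few of the checklist items above.
    Regex extraction is a crude sketch, not robust HTML parsing."""
    m = re.search(r"<title>(.*?)</title>", html, re.I | re.S)
    title = m.group(1) if m else ""
    m = re.search(r'<meta\s+name="description"\s+content="(.*?)"', html, re.I)
    desc = m.group(1) if m else ""
    # Strip the head section, then all remaining tags, to count body words.
    body = re.sub(r"<head>.*?</head>", " ", html, flags=re.I | re.S)
    body = re.sub(r"<[^>]+>", " ", body)
    k = keyword.lower()
    return {
        "title <= 8 words": len(title.split()) <= 8,
        "keyword once in title": title.lower().count(k) == 1,
        "description 10-20 words": 10 <= len(desc.split()) <= 20,
        "keyword once in description": desc.lower().count(k) == 1,
        "content >= 450 words": len(body.split()) >= 450,
    }

# Hypothetical page: head checks pass, but the body is far too thin.
page = ('<html><head><title>Affordable Used Cars in Denver</title>'
        '<meta name="description" content="Browse hundreds of quality used cars '
        'with free delivery across the Denver metro area."></head>'
        '<body><p>Short demo body.</p></body></html>')
for check, ok in audit(page, "used cars").items():
    print(check, ok)
```

Items like "use LSI to build the theme" or "write a compelling description" can't be checked mechanically, which is exactly why the checklist leads with readability rather than counts.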

Backlinks (Link Popularity and Link Reputation) continue to rise. A solid link campaign is vital to your success online. Yahoo! still has an index full of duplicate pages, despite their efforts to clean it manually. MSN has grown the number of backlinks, and this has been consistent over the last few updates. Don't get too excited over the Yahoo! numbers, as these are due to the duplication that is plaguing Yahoo!, and it should be corrected shortly. But remember, a link campaign is not just for your home page. I have been stressing for years that link campaigns for category pages, articles, and press releases are an excellent idea ... and they will help funnel PageRank back to your home page if you use the dynamic linking strategies from Leslie Rohde.

Odds and Ends

ALT Tags: Using ALT tags to boost your positioning is officially a thing of the past. I recommend them only for aesthetics, to help the user navigate your site better. More than half of top-ranking pages do not use their keyword in any ALT tag on the page. This has been consistent for over a year and is the reason we stopped tracking it. There has been some talk in the forums that ALT tag text is coming back. However, according to my testing, the only thing it works for is "low hanging fruit", and almost anything works for those keyword phrases.

Comment Tags: Comment tags continue to not count in the algorithms of the major engines. They are sometimes indexed, but so inconsistently that I must place them in the "not indexed" category. If you use comment tags, use them as they were intended: to leave yourself, or the webmaster, notes about the construction of the website. Don't use them for SEO purposes.

Domain Name: Using a keyword-rich domain name is still very overrated. Branding with a good domain is still the better option. In testing, the boost from a keyword-rich domain is so slight that it is barely measurable for the "money keyword phrases". Sub-domains continue to thrive, and sub-domain spam still works very well in MSN. Remember, Google treats sub-domains as separate domains.

Internal / External Links: Link text from quality sites continues to bolster sites in Google; however, "spam" and "link farm" sites continue to be ignored and penalized.

PageRank: What is interesting here is that while PageRank doesn't have the power it once did, the top ranking pages in Google generally also have good PageRank - usually 6 or better. This time around, check out MSN, with nearly a PR7 average. This is a clear indication that MSN is still trying to mimic Google while figuring out exactly what to do with its algorithm. We are also seeing fewer and fewer "PageRank 0/10" pages in the top ten.

Impressed with the information? Of course you are. Get this and other high quality releases about Google, Yahoo! and Bing/MSN to not just overtake but dominate your competition!

Join through the link below and gain access to the training you need and the monthly updates through the SEO Revolution membership.

SEM Mastery