MVP Summit SEO Workshop a Success, Part 2

The first MVP Summit SEO workshop article covered how a custom CMS can wreak havoc on your site’s SEO strategy. This article covers the other two common problems discovered during an SEO workshop: misunderstanding of keyword use, and improper use of robots.txt and sitemap.xml files.

Keyword Use

When I review a website during an SEO workshop I usually uncover a glaring misunderstanding of how to use keywords. Site owners love to target as many keywords as possible on one page, and they love keyword stuffing tactics. Neither approach will result in success, and oftentimes both will end up hurting your site. Here are two examples of what we saw at the workshop:

Title Tag = “Kirkland Flooring, Seattle Flooring, Bellevue Flooring, Washington Flooring, Washington State Flooring, Redmond Flooring”


Keyword Tag = “Kirkland Flooring, Seattle Flooring, Bellevue Flooring, Washington Flooring, Washington State Flooring, Redmond Flooring, hardwood flooring, laminate flooring, stone flooring, tile flooring”

Clearly, that site owner wants to target every kind of “flooring” keyword possible on one page. He is going after specific keywords relating to flooring, such as “hardwood flooring,” and he is attempting to target regional or local keywords, such as “Kirkland flooring.” You will not be successful targeting so many keywords on one page. Further, lacing your Title Tag or Keyword Tag with the same word multiple times (flooring) is going to trigger keyword stuffing penalties.

Here are the best practices:

  1. Target one keyword per page. Create a separate page for stone flooring, laminate flooring, etc. Do not try to target every possible flooring option on one page.
  2. Never use the same word more than two times in your Title Tag. Place your targeted keyword at the beginning of the Title Tag.
  3. Never use the Keyword Tag.
  4. Use your targeted keyword in the body content of the page. Try placing that word in the first sentence and then use that term or closely related terms throughout the content. Avoid using it over and over again so you do not appear spammy or annoy the user.
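Applying those best practices, a page targeting just one keyword might use markup like this (the keyword and business name here are illustrative, not from the workshop site):

```html
<head>
  <!-- One targeted keyword, placed at the front of the Title Tag;
       "flooring" appears no more than twice -->
  <title>Hardwood Flooring in Kirkland | Example Floor Co.</title>
  <!-- No Keyword Tag at all, per best practice #3 -->
  <meta name="description"
        content="Hardwood flooring installation and refinishing in Kirkland, WA.">
</head>
```

A separate page, with its own title, would target “laminate flooring,” and so on.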


Robots.txt and Sitemap.xml Files

I am stunned to see improper use of the robots.txt file. Most people do not even know about this file, so when it actually exists on a site I expect it to be written correctly. However, a majority of the time it is not, and that is very bad. This file assists the search engine crawlers because the site owner uses it to create rules: you specify which sections of your site you want or do not want crawled. This is also the place to tell the crawler the location of your sitemap.xml file (we will cover the sitemap.xml file next). If you want the crawler to view your entire site, your robots.txt file would appear like this:

User-agent: *
Disallow:


Many times a site owner will include the location of a sitemap.xml file, but they do not actually have one. Or they have a sitemap.xml file, but the robots.txt file points to the wrong location. Worse yet, I have seen robots.txt files that instruct the search engine crawler to crawl NOTHING. When you do that you will quickly see zero traffic from the search engines. Make sure your site has a properly written robots.txt file and that it includes the location of your sitemap.xml file.
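A quick way to sanity-check your rules is Python’s standard-library `urllib.robotparser`: it applies a robots.txt the same way a well-behaved crawler would. This is only a sketch, and the URLs and paths here are illustrative:

```python
# Parse a robots.txt and confirm the general crawler ("*") is not
# blocked from the whole site, and that the sitemap location is found.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# parse() accepts the file's lines directly, so no network fetch is needed
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
    "Sitemap: http://www.example.com/sitemap.xml",
])

# Public pages may be fetched; the disallowed section may not
print(rp.can_fetch("*", "http://www.example.com/index.html"))        # True
print(rp.can_fetch("*", "http://www.example.com/private/page.html")) # False

# site_maps() (Python 3.8+) returns any Sitemap locations it found
print(rp.site_maps())  # ['http://www.example.com/sitemap.xml']
```

If `can_fetch` returns False for your homepage, your robots.txt is telling the engines to crawl nothing.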


Most site owners at my SEO workshops do not have a sitemap.xml file. The sitemap.xml file is a list of the URLs you want crawled and indexed by the search engines. It is a quick way for the crawlers to locate every page you want them to return on their search engine results pages. This does not mean all of those pages will be indexed or used by the search engine, but it will at least get the crawler to view those pages. The official sitemap protocol site explains the proper sitemap.xml format, but I personally remove some of its recommendations. There you will find this piece of example code:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2005-01-01</lastmod>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>

I remove the <lastmod>, <changefreq> and <priority> elements, so my sitemap.xml files appear like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
  </url>
</urlset>

The engines determine the importance of my pages and do not care how valuable I think a page is. They also crawl my sites daily and can figure out when a page was last updated. I keep my sitemap.xml files as clean as possible so they are as quick as possible for the crawlers to scan. I recommend stripping out everything but the <loc> element.
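A stripped-down sitemap like that is easy to generate. Here is a minimal sketch using Python’s standard-library ElementTree; the URLs are illustrative, and you would list your own pages instead:

```python
# Build a sitemap.xml containing only bare <url>/<loc> entries.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return sitemap XML listing each URL in a bare <url>/<loc> pair."""
    ET.register_namespace("", SITEMAP_NS)  # emit a default xmlns, no prefix
    urlset = ET.Element("{%s}urlset" % SITEMAP_NS)
    for page in urls:
        url_el = ET.SubElement(urlset, "{%s}url" % SITEMAP_NS)
        ET.SubElement(url_el, "{%s}loc" % SITEMAP_NS).text = page
    # xml_declaration=True requires Python 3.8+
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

print(build_sitemap([
    "http://www.example.com/",
    "http://www.example.com/hardwood-flooring/",
]))
```

Because nothing but `<loc>` is emitted, the output stays clean and quick for the crawlers to scan.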

Finally, place the robots.txt file and sitemap.xml file in the root directory of your website.

I am sure I will continue to see these common problems in future SEO workshops, but hope this article sparks further research and discussion for site owners that have these issues.


Author: Garth O'Brien provides SEO, Social Media and Community Management consulting services. He can help boost the online presence of a small local business or a global enterprise in both Google and Bing.
