One more reason to run IIS 7 - thanks, IIS team, for the SEO Toolkit!
The IIS Search Engine Optimization (SEO) Toolkit provides a set of tools that can be used to improve a Web site's relevance in search results by making the site's content more search engine-friendly. The IIS SEO Toolkit includes the Site Analysis module, the Robots Exclusion module, and the Sitemaps and Sitemap Indexes module. These modules let you perform detailed analysis of a site's structure and content, and they offer recommendations and editing tools for managing your robots.txt and sitemap files.
Features
The toolkit includes a lot of useful features that will help webmasters and web developers make their sites search engine-friendly.
· Site Analysis is a tool that analyzes your web site for compliance with SEO best practices and provides comprehensive site intelligence data. It has the following key features:
o Fully featured site crawling engine - in order to perform a detailed analysis of the site's structure and content, the Site Analysis tool uses a built-in web crawler, called "iisbot", to download and cache all publicly available web site content. The crawler is fully compliant with the robots exclusion protocol.
o Report summary dashboard - the results of site analysis are presented in an easy-to-use dashboard page that serves as a start page for various types of analysis. In addition, this page includes a large set of pre-built queries for the most common reports.
o Query builder - the Site Analysis tool includes a powerful and flexible query builder interface that lets you create custom queries to run against the cached web site content.
o Detailed URL information - you can obtain detailed information about every URL in your web site, such as response headers and content, the pages that link to that URL, and all the URLs it references.
o Detailed violation descriptions - each content or SEO violation found on the web site has a detailed description as well as a recommended corrective action.
o Word Analysis - any web page can be analyzed for the most commonly used words and phrases within its content. The results can be used to select the keywords that most accurately describe the page (a rough sketch of the idea appears after this feature list).
o Route Analysis - the unique routes to any page can be displayed in a separate report. This information helps you understand how search engines and site visitors reach a particular page on your web site.
· Robots Exclusion is a tool for managing the content of the robots.txt file for your site (a sample file appears after this feature list). Its key features include:
o User interface for editing the robots.txt file - the content of the robots exclusion file, robots.txt, can be edited using the IIS Manager GUI.
o Selecting URL paths from the physical view of the web site - the paths specified in the "Allow" and "Disallow" directives of the robots.txt file can be selected from the physical file system layout of your web site.
o Selecting URL paths from the virtual view of the web site - the paths specified in the "Allow" and "Disallow" directives of the robots.txt file can be selected from the logical view of your web site obtained from the results of site analysis.
· Sitemaps and Sitemap Indexes is a tool for managing the sitemap and sitemap index files for your site (a minimal example appears after this feature list). It includes:
o User interface for managing sitemap and sitemap index files - the content of the sitemap and sitemap index files can be edited using the IIS Manager GUI.
o Selecting URLs from the physical view of the web site - the URLs specified within a sitemap can be selected from the physical file system layout of your web site.
o Selecting URLs from the virtual view of the web site - the URLs specified within a sitemap can be selected from the logical view of your web site obtained from the results of site analysis.
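To give a feel for what the Word Analysis feature reports, here is a rough sketch of the idea in Python. This is not the toolkit's code - just a plain frequency count of words and two-word phrases over a page's text, with a made-up sample string:

    # Rough illustration only - not the SEO Toolkit's implementation.
    # Count the most common words and two-word phrases in a page's text.
    import re
    from collections import Counter

    def word_analysis(text, top=10):
        words = re.findall(r"[a-z']+", text.lower())
        phrases = [" ".join(pair) for pair in zip(words, words[1:])]
        return Counter(words).most_common(top), Counter(phrases).most_common(top)

    top_words, top_phrases = word_analysis(
        "IIS SEO Toolkit helps make your IIS web site search engine friendly")
    print(top_words)    # e.g. [('iis', 2), ('seo', 1), ...]
    print(top_phrases)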
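For reference, this is the kind of robots.txt file the Robots Exclusion tool manages. The host name and paths below are placeholders for illustration only; the "iisbot" section simply shows that the Site Analysis crawler honors the same directives as any other compliant crawler:

    # Placeholder example - adjust the paths to your own site.
    User-agent: *
    Disallow: /admin/
    Disallow: /scripts/
    Allow: /

    # The Site Analysis crawler ("iisbot") obeys these rules too,
    # and can be given its own section if needed.
    User-agent: iisbot
    Disallow: /search/

    Sitemap: http://www.example.com/sitemap.xml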
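And here is a minimal sitemap of the kind the Sitemaps and Sitemap Indexes tool maintains, in the standard sitemaps.org format; the URLs and dates are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2009-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/products/</loc>
        <lastmod>2009-05-15</lastmod>
      </url>
    </urlset>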
Cheers,
Steve Schofield
Microsoft MVP - IIS