Wednesday, May 2, 2012
Optimizing Robots.txt file and Drupal SEO
Optimizing the robots.txt file plays an important role in Drupal SEO. robots.txt is a file that tells search engine crawlers whether a page should be crawled or not. A Drupal site ships with a default robots.txt file in its root folder, but as your Drupal site grows, the robots.txt file has to be optimized to keep the site search engine friendly. Optimizing robots.txt is mainly required when you use modules such as Views or Panels, because these modules can create duplicate content on your Drupal site, which is harmful for SEO.
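As a sketch of how a crawler interprets robots.txt rules, Python's standard `urllib.robotparser` can be used to test a rule set before deploying it. The `Disallow` paths below are hypothetical examples of duplicate-content paths that a Views or Panels display might generate; they are not Drupal's shipped defaults.

```python
# Sketch: checking robots.txt rules the way a crawler would.
# The paths here are hypothetical duplicate-content examples.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /search/
Disallow: /print/
Disallow: /node/
""".splitlines()

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules)

# A normal page is crawlable; a duplicate "print" copy is blocked.
print(rp.can_fetch("*", "http://example.com/about"))        # True
print(rp.can_fetch("*", "http://example.com/print/about"))  # False
```

Testing rules this way helps confirm you are blocking only the duplicate paths and not the pages you actually want indexed.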
Tuesday, May 1, 2012
Drupal XML Sitemap - Search Engine Optimization / SEO
An XML sitemap is simply a page that lists all the URLs of your site along with meta data such as when each URL was last updated, how often it changes, and its priority (how important it is). Although search engine crawlers like Googlebot are quite intelligent, they may not crawl every URL of your site. For example, if your site uses JavaScript, a Superfish menu, or any kind of Flash menu, Googlebot may not index your site properly. This is where the sitemap.xml file comes in: since the sitemap lists all the URLs, none of your site's URLs will disappear from the search results.
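A minimal sketch of what such a sitemap entry looks like, built with Python's standard `xml.etree.ElementTree`. The URL, dates, and values here are hypothetical examples, not output of any Drupal module; the tag names follow the sitemaps.org protocol.

```python
# Minimal sketch: build a sitemap.xml fragment with the meta data
# described above (last update, change frequency, priority).
# All URL values below are hypothetical examples.
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(entries):
    """entries: list of dicts with loc, lastmod, changefreq, priority keys."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for entry in entries:
        url = ET.SubElement(urlset, "url")
        for tag in ("loc", "lastmod", "changefreq", "priority"):
            ET.SubElement(url, tag).text = entry[tag]
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([{
    "loc": "http://example.com/node/1",
    "lastmod": "2012-05-01",
    "changefreq": "weekly",
    "priority": "0.8",
}])
print(sitemap)
```

In practice a Drupal module generates this file for you; the point of the sketch is just to show the structure the crawler reads.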