Robots and Sitemap


The robots.txt is a text file used to control how search engines crawl the pages of a website. It can be permissive or restrictive: owners can specify directories or files that they don't want indexed (stored, and thus displayed in search results). Generally, search engines respect these restrictions.

The BayView CMS generates the robots.txt file for each website automatically:
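A generated robots.txt might look something like the following (the paths and URL here are illustrative placeholders, not the actual output of the BayView CMS):

```
# Apply to all crawlers
User-agent: *
# Keep the (hypothetical) admin area out of search results
Disallow: /admin/
# Point crawlers at the generated sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The Sitemap line ties the two generated files together, letting crawlers discover the sitemap without any manual submission.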


Sitemap.xml is a file containing a list of the pages of a website, ensuring that search engines index the pages correctly.

As well as listing the site's pages, it contains additional information such as how often a page is updated (e.g. daily, weekly or monthly) and its priority within the overall site. This helps search engines (e.g. Google) understand your site, improving how it is indexed.

The BayView CMS generates the sitemap.xml file for each website automatically:
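A minimal sitemap.xml, following the standard sitemaps.org schema, might look like this (the URL and values are illustrative placeholders, not actual BayView CMS output):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

The changefreq and priority elements carry the update-frequency and site-priority information described above; one url entry appears for each page of the site.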


Included as Standard:

  • Website Hosting
  • CMS Website Management
  • Support

Website Features:

  • Pages & Menus
  • News and Events*
  • Visitor Comments with moderation*
  • File Management
  • Categories, Tags and Tag-groups
  • Social Network integration
  • Easy Google Analytics integration
  • SEO optimised
  • Automatic Robots.txt & Sitemap.xml
  • Automatic RSS feed for news*
  • Webpage Redirection

*Package dependent

CMS Features:

  • Easy online signup & website setup (coming soon)
  • Manage multiple websites
  • Online help, walkthroughs & support tickets
  • Feedback service (tell us what you think)
  • Automatic billing & online payment option
  • User roles and permissions
  • Integrated template editor (coming soon)