Edit the Robots.txt file


A bot or spider crawls your site to index its pages and update a search engine's results. The most common bot is Googlebot, Google's web-crawling bot (sometimes also called a "spider"). Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index.

The Robots Exclusion Protocol, or "robots.txt", is the universal way of giving instructions to these bots. Customizing that file allows you to prevent bots from crawling specific pages or areas of your site.
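
A robots.txt file is plain text: a User-agent line names a bot, and the directives beneath it apply to that bot. A minimal illustrative example (not your site's actual settings; the folder name is hypothetical):

```
# Rules for every bot
User-agent: *
# Keep bots out of one folder
Disallow: /admin/
```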

Your robots.txt file is always served from the root of your site. You can view it by adding /robots.txt to your domain, for example:

https://www.yourdomain.com/robots.txt
The robots.txt file can be updated through your control panel, letting you quickly and easily set which areas of your site Googlebot and other bots are allowed or disallowed to access.

Manage the bots that visit your site

Here's how to quickly and easily manage the bots that come to your Membergate site:


1. From under 'Tools' choose 'Robots.txt Generator'

Log in to your control panel and, from under the 'Tools' menu, choose 'Robots.txt Generator'.

2. In the 'Permitted User Agents' field add specific bot names that are allowed to crawl

In the 'Permitted User Agents' field add specific bot names that are allowed to crawl and access your pages to add to their search engine index.
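
Each bot listed under 'Permitted User Agents' typically becomes its own User-agent block in the generated file. A sketch assuming you permitted Googlebot and Bingbot (illustrative bot names; your generated layout may differ):

```
User-agent: Googlebot
Allow: /

User-agent: Bingbot
Allow: /
```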


3. In the 'Disallowed Folders' field, add directory or folder paths that you would like to prevent bots from crawling

In the 'Disallowed Folders' field, add directory or folder paths that you would like to prevent bots from crawling. Remember to include the / (forward slash) in front of each folder and in between each subdirectory. For example, the images folder inside the members folder would be entered like this:

/members/images/



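In the generated robots.txt, each entry from 'Disallowed Folders' becomes a Disallow line. A sketch using the members/images example from this step plus a second hypothetical folder:

```
User-agent: *
# Block the images folder inside the members folder
Disallow: /members/images/
# A second hypothetical blocked folder
Disallow: /private/
```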
4. In the 'Allowed Folders' field, add specific directory or folder paths that are okay for bots to crawl

In the 'Allowed Folders' field, add specific directory or folder paths that are okay for bots to crawl. The robots.txt file usually defaults to allowing a bot to crawl all of your folders (except certain folders necessary for some software functions).
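
Allowed folders become Allow lines in the file. They are most useful for re-permitting a subfolder inside an area you have disallowed, as in this sketch (hypothetical paths):

```
User-agent: *
# Block the members area...
Disallow: /members/
# ...but re-allow one public subfolder inside it
Allow: /members/public/
```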


5. Add a number in seconds in the Crawl Delay box

'Crawl Delay' - Add a number of seconds that controls how frequently bots may request pages from your site. We suggest 5, but the number must be greater than 2.
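
The value is written into the file as a Crawl-delay directive. A sketch with the suggested value of 5 (note that Googlebot ignores Crawl-delay; Google's crawl rate is managed through Search Console instead):

```
User-agent: *
# Ask bots to wait at least 5 seconds between requests
Crawl-delay: 5
```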


6. Select Allow or Disallow for the Default Bot Action

'Default Bot Action' - Select Allow or Disallow from the drop-down box. This sets the permission for any bot not listed under 'Permitted User Agents'. We recommend setting this to 'Disallow'.
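
With the default action set to Disallow, the generated file typically ends with a catch-all block so that any bot not named under 'Permitted User Agents' is blocked entirely. A sketch:

```
# Any bot not matched by an earlier block
User-agent: *
Disallow: /
```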


7. Click the 'Save Robot.txt Settings' button at the bottom of the page.

Once you are done, click the 'Save Robot.txt Settings' button at the bottom of the page to publish your changes.
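
After saving, you can sanity-check your rules with Python's standard urllib.robotparser module. This sketch parses an example rule set directly from text (the bot names and paths are illustrative; RobotFileParser can also fetch your live file via set_url() and read()):

```python
from urllib.robotparser import RobotFileParser

# Example rules: Googlebot is permitted, everything else is blocked.
rules = """
User-agent: Googlebot
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot may fetch a members page...
print(parser.can_fetch("Googlebot", "/members/index.html"))   # True
# ...but an unlisted bot is stopped by the catch-all Disallow.
print(parser.can_fetch("SomeOtherBot", "/members/index.html"))  # False
```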