There are a few ways to create a robots.txt file for the web robots that will want to crawl your site.
Search bots look through your site to learn what is on it, so that the related search engine can come back and find the right page to answer a search.
Open Notepad and type this exactly if you want no robots to look at your site. Keep in mind that robots can ignore your request, and bad robots usually do:
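User-agent: *
Disallow: /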
Save the file as robots.txt and upload it to the root directory of your server, along with your web pages, so that it is reachable at http://yourwebsite.com/robots.txt.
Or, if you want robots to see everything, type this:
User-agent: *
Disallow:
To exclude some sections of the site, you can type something like this:
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
In the example above, each Disallow line names a directory that robots should skip; replace /cgi-bin/ and /images/ with the sections of your own site that you want to keep out of search engines.
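If you want to sanity-check rules like these before uploading them, Python's standard urllib.robotparser module reads them the same way a well-behaved crawler would. A minimal sketch, using the example directories from above:

from urllib.robotparser import RobotFileParser

# The example rules from above, as one string.
rules = """User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# The excluded directories are reported as off-limits to any robot...
print(parser.can_fetch("*", "/cgi-bin/script.cgi"))  # False
print(parser.can_fetch("*", "/images/logo.png"))     # False
# ...while the rest of the site stays open to crawlers.
print(parser.can_fetch("*", "/index.html"))          # True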
To allow robots to look around your site, but also to direct them to your XML sitemap specifically, type this:
User-agent: *
Allow: /
Sitemap: http://yourwebsite.com/sitemap.xml
Obviously you would replace yourwebsite.com with the actual address of your site.
This helps crawlers find your sitemap even if you have not submitted it to each search engine directly.
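The same standard-library parser can confirm that the Sitemap line is being picked up. On Python 3.8 or later, site_maps() returns any sitemap URLs the file declares (yourwebsite.com is still just the placeholder address):

from urllib.robotparser import RobotFileParser

# The allow-everything file from above, with the sitemap line included.
rules = """User-agent: *
Allow: /
Sitemap: http://yourwebsite.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())
print(parser.site_maps())  # ['http://yourwebsite.com/sitemap.xml']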
Thanks for taking the time to read all of this. I hope it gives you a good overview of the subject of promoting your website or blog. Good luck in the future.
Roger Chartier