So I've been reading a little about the robots.txt file, which can be used to promote your own website or restrict which pages get crawled.
Is there a simple way to say "search everything"? I read that having the file in your root directory gives the all clear for spiders to go through the website. I just want to know what the search-everything rule is.
Thanks for any advice.
Just don't make one.
If you must, this allows every bot to crawl every page:
User-agent: *
Disallow:
This will ban bots from every page:
User-agent: *
Disallow: /
Oh, I don't want to ban bots, I want to invite them. Hmm... I thought there was an allow-all command.
Well, I guess maybe I'll forget about it. Thanks, coalman.
Originally Posted by yoda313
The * is a wildcard that matches every bot, and since no files or directories are listed after the Disallow, every bot can visit every page.
In the second example, with the slash added, every bot is disallowed from every page. The / is equivalent to your HTML root directory.
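If you want to sanity-check the two rule sets yourself, Python's standard-library urllib.robotparser can parse them and answer "may this bot fetch this URL?" (a minimal sketch; example.com and the "MyBot" user agent are placeholders):

```python
from urllib.robotparser import RobotFileParser

# Allow-all: an empty Disallow value blocks nothing.
allow_all = RobotFileParser()
allow_all.parse(["User-agent: *", "Disallow:"])

# Deny-all: "Disallow: /" blocks the entire site.
deny_all = RobotFileParser()
deny_all.parse(["User-agent: *", "Disallow: /"])

print(allow_all.can_fetch("MyBot", "http://example.com/page.html"))  # True
print(deny_all.can_fetch("MyBot", "http://example.com/page.html"))   # False
```

Well-behaved crawlers run exactly this kind of check before fetching a page, which is why simply having no robots.txt at all behaves the same as the allow-all version.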
Thanks coalman