1 crazychad Sep 16, 2014 03:10
3 crazychad Sep 19, 2014 16:16
This must be the correct answer. Thanks.
How do I prevent robots scanning my site?
The quick way to prevent robots from visiting your site is to put these two lines into the /robots.txt file at the root of your server:
User-agent: *
Disallow: /
But this only helps with well-behaved robots.
4 ednong Sep 22, 2014 00:26
Sure,
the robots.txt will only help if the search engine is well-behaved.
If you don't want anyone to reach your blog without a password, you can
- set users and passwords (and groups) in your blog preferences
or
- you can create a .htaccess file on your server.
With the first option, only people with an account and password can see the posts for that user/group. With the second, only people who know that one password (or username and password) can see the blog.
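As a rough sketch, an .htaccess file for the second option could look like this (the path and realm name here are just example placeholders, not anything specific to b2evolution):

```apacheconf
# .htaccess - require a login for everything in this directory
AuthType Basic
AuthName "Private blog"
# Full server path to the password file (example path - adjust for your server)
AuthUserFile /home/example/.htpasswd
Require valid-user
```

You would then create the password file with Apache's htpasswd tool, e.g. `htpasswd -c /home/example/.htpasswd someuser`, and keep that file outside the web root so it cannot be downloaded.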
5 crazychad Sep 22, 2014 15:29
Thank you very much.
Make a robots.txt file. This is not specific to b2evolution. Google for details.