fplanque Sep 17, 2014 22:00

This must be the correct answer. Thanks.
How do I prevent robots scanning my site?
The quick way to prevent robots from visiting your site is to put these two lines into the /robots.txt file at the root of your server:
User-agent: *
Disallow: /
but this only helps with well-behaved robots.
Sure, robots.txt will only help if the search engine's crawler is well-behaved.
If you don't want anyone to be able to reach your blog without a password, you can
- set up users, passwords, and groups in your blog preferences, or
- create a .htaccess file on your server.
With the first option, only people with an account and password can see the posts allowed for that user/group. With the second, only people who know the single password (or username and password) can see the blog.
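As a rough sketch of the .htaccess approach, assuming an Apache server with basic authentication available (the paths and realm name below are placeholders, not values from this thread):

```apache
# .htaccess placed in the blog's directory -- requires a login for everything below it
AuthType Basic
AuthName "Private blog"
# Example path only; keep the password file outside the web root
AuthUserFile /home/example/.htpasswd
Require valid-user
```

The password file itself can be created with `htpasswd -c /home/example/.htpasswd username`; consult your host's documentation, since not all shared hosts allow .htaccess overrides.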
Thank you very much.
make a robots.txt file. This is not specific to b2evolution. Google for details.