1 chris_bbg888 Jan 04, 2008 14:27
3 sam2kb Jan 04, 2008 15:05
If you're using 2.x
Go to Blog settings -> Features -> and check "Use advanced perms" -> Save
Now you can see permissions. Go to Group perms and check "Not Member" for Basic Users. I guess that's what you need.
4 chris_bbg888 Jan 04, 2008 17:37
using 1.10 :(
Many thanks for your help.
In the folder of the blog is just the index.php file.
Is all the content of the whole blog now protected from being crawled by Google?
Everything I wrote into the blog is in an SQL database and not in this protected folder; there is just the index.php. That's why I have to ask again. :lol:
5 sam2kb Jan 04, 2008 17:50
If all requests go through index.php and you see something like this in the browser address bar
www.yoursite.com/somepath/index.php?..........
and you put Disallow: /somepath/index.php in robots.txt
I believe it will work, but let's wait for dev's reply.
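For reference, a minimal robots.txt along those lines (assuming the blog really is served from /somepath/index.php, as in the example URL above) would be:

User-agent: *
Disallow: /somepath/index.php

robots.txt rules are prefix matches, so this also covers index.php followed by any query string. Keep in mind it's only a request that well-behaved crawlers honor, not an access control.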
6 chris_bbg888 Jan 04, 2008 19:08
Thank you.
It looks like : http://blog.weiterweg.org/04_bildblog/
Let's wait for dev :roll:
7 edb Jan 04, 2008 20:12
Hi chris_bbg888! I think I went to school with your brother bob_nnt444 ;) I have some questions about what you're trying to do. If I understand correctly, you have a blog in your multiblog installation that you want the search engines to NOT crawl and NOT index.
But what about people? Do you want regular people out there to be able to see posts in this blog, or is this a private blog that only registered members get to see stuff in?
The first thing - anyone can visit but search engines can't - probably can be done but will be tricky. The second thing - keep most people and all search engines out but let registered people in - is easy as pie.
Oh and I would NOT use robots.txt for this. Even if your URLs look like folders they're really not. All your content is in your database. The folders you have in your installation are almost never part of a URL for your site.
8 chris_bbg888 Jan 04, 2008 20:31
Hey,
yes indeed it's my brother ;)
""If I understand correctly you have a blog in your multiblog installation that you want the search engines to NOT crawl and NOT index. ""
YES exactly. It's a private blog and just a few people with the same loginname and password (all using the same) can enter.
I already managed to give other people no access.
That's a very comfortable solution because a window pops up and no one has to register.
The challenge is to protect the blog from Google & other bots.
Maybe that helps you in helping me :lol:
9 edb Jan 04, 2008 20:42
Yes that works. I'm no expert on stuff like that, but I'm pretty sure search engines are not going to access that site because they don't have the username and password to log in with. Basically they're just clicking links and remembering all the pages they see right? So if they don't get in they have nothing to cache.
By the way you could have done the same within b2evolution. One method would be to post "protected". That way each user who is logged in will see stuff, but no one else (including search engines) will get in. Another method within b2evolution would be http://forums.b2evolution.net/viewtopic.php?p=63757#63757 but I'm not sure how to do that for ONE blog on a multiblog installation.
10 yabba Jan 04, 2008 20:42
Assuming that your partner blog is 7 *edit* removed link. point made ;)
¥
11 edb Jan 04, 2008 20:44
Oops! I guess it won't work!
So Yabba how can someone use your "login required true" trick for one of many blogs on a system?
12 yabba Jan 04, 2008 20:56
Damn, now that's a tough one, because there are so many ways to call up a blog/post.
The best solution is obviously to make all posts protected; you can also enable "redirect_to_post_blog" ( conf/_advanced.php I *think* ), which would 302 to the blog folder and then trigger the .htaccess password.
For super certainty I'd add this to the very top of the skin that the blog uses :
<?php
// Force anyone who is not a logged-in member of this blog (ID 7) to the login page
if( $Blog->ID == 7 && ( ! is_logged_in() || ! $current_User->check_perms( 'blog_ismember', 'any', false, $Blog ) ) )
{ // not a logged-in member
	header_redirect( $htsrv_url.'login.php' );
}
?>
¥
13 chris_bbg888 Jan 04, 2008 21:28
first of all thanks for all your ideas!
maybe you guys can explain how the content of the blog is being accessed; I'm not sure I understand it completely.
What I want to do is run a kind of secret blog for partners, which I don't want indexed by the Google bot, so the content shall only be accessible to partners with a login.
Right now the partner blog is redirected to a subfolder, which is protected with an .htaccess file. But does this really prevent the content of the blog from being indexed by bots? I mean, the content is still in the same database as the other blogs' content, isn't it? In the subfolder there is just a simple PHP file that redirects the browser to the database.
So how could this prevent the database from being accessed?
Thanks for your help guys, really, I've got a damn big question mark on my forehead right now... ;)
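For what it's worth, the .htaccess password protection described here is standard Apache Basic auth; a typical setup looks something like this (the AuthName and the .htpasswd path are placeholders, not the actual values):

AuthType Basic
AuthName "Partner blog"
AuthUserFile /path/to/.htpasswd
Require valid-user

Apache asks for credentials before serving anything from that folder, so a crawler without the password just gets a 401 response and has nothing to index. It only protects URLs that actually go through the protected folder, though, which is exactly the open question in this thread.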
14 edb Jan 04, 2008 22:00
Search bots do NOT access your database. They can only follow links, right?
If you make a post "protected" then no one will see it unless the visitor is a registered member AND is logged in. Search engines will never be logged in, so they won't see protected posts. For example http://wonderwinds.com/weblog.php/2008/01/you-can-t-see-this is a real blog post, but YOU can't see it because you're not a blogger on my blog.
Hey it's also a bug!
Forcing a login requirement for a blog is possible, as Yabba-the-Merciless has shown, but the absolute easiest is to post "protected" instead of "published".
Hi
To prevent a folder from being crawled, just put a robots.txt in the root folder (usually /www or /public_html) with this content:
User-agent: *
Disallow: /your/folder/path/
Disallow: /another/folder/
Disallow: /another/file.php
And if you set special permissions for the blog you can protect it from everyone (including Google).