
1 Jun 28, 2006 22:05    

My provider took my site offline because I got too many Google requests.
About 25000 a day.
He blames my site for being too popular...
I know this is the world upside down, but do I have to take the same measures as someone who gets slashdotted?

2 Jun 29, 2006 16:26

Are there too many requests from the Google indexing bot, or referrals from actual readers clicking through from Google?

If it's the first, then you could use robots.txt to only allow Google to index your front page.
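
Something like this in robots.txt would do it - just a sketch, and it assumes your front page is served at the site root ("/"); Googlebot understands the non-standard Allow directive and the $ end-of-URL marker:

User-agent: Googlebot
Allow: /$
Disallow: /

Keep in mind this block only applies to Googlebot; other crawlers would need their own User-agent section.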

3 Jun 29, 2006 16:34

My blog is not really a blog but a daily updated database with all sorts of information:
movie reviews, song lyrics, recipes, poems...
So indeed all those separate articles are in Google - and I do want them to stay there, of course.
In my stats I see a lot - a lot - of articles that are found through Google.
But not all 28000. It's about 1 or 2 per minute at most.

So indexing only my front page is not at all a solution that I want.

4 Jun 29, 2006 19:20

Might be because Google follows all your links - all your categories, calendars, etc. - so every piece of content gets indexed several times, which is a common problem. Have a look at some of the Search Engine Optimisation posts here in the forum. I've put this code in the header of the _main.php in my template:

<meta name="robots" content="<?php
	// Prints "no" (giving "noindex,follow") for everything except the plain main page and single posts:
	// paged listings, category views, date archives and searches all get "noindex,follow".
	if( ( $disp != 'posts' && $disp != 'single' )
		|| ( $disp == 'posts' && ( $paged > 1 || $cat != '' || $m != 0
			|| ( is_array( $catsel ) && count( $catsel ) > 0 ) || $w >= 0 || $s != '' ) ) )
	{
		echo( 'no' );
	}
?>index,follow"/>

It helps a bit, as only the main page and the individual posts are allowed to be indexed.

5 Jul 02, 2006 14:28

It's indeed the indexing robots that are 'misbehaving'...

Can I ban them (the Yahoo Slurp robot! grrr)?
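
A rough sketch of what that could look like in robots.txt - assuming Slurp honours it, which it generally does. To ban it completely:

User-agent: Slurp
Disallow: /

Or, if slowing it down is enough, Slurp also understands the non-standard Crawl-delay directive (seconds between requests):

User-agent: Slurp
Crawl-delay: 10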

