
1 Oct 23, 2009 16:13    

My b2evolution Version: 3.3.1

Should I be posting my blog link when asking for help? Anyway, when I use the Facebook system to import my blog into notes, it comes back with the error:

"Import Failed
We couldn't find a feed using the URL you provided."

I've used my normal blog address, as well as the tempskin=_rss2, _atom, and _rss feed URLs, and they all bring back the same error.

Additionally, the template skins do not completely pass feed validation, so I duplicated the _rss2 tempskin, renamed it _facebook, and did some minor editing that made the skin pass the feed validation test. Still, I get the error from Facebook.

I have contacted Facebook without a response. Has anybody else had this problem? Do you know of a solution, besides replacing me with someone who knows what they are doing? :oops:

I just read the Sticky post. My URL is:

http://blog.bulbmeister.com/

2 Oct 23, 2009 17:48

Still no solution, but, since I learned my feed skins were not actually installed, I went and installed them. This solved the feed validation problem, as _atom, _rss2, and _rss feeds all pass now.

Facebook still won't process the feed, though. The feed shows all posts from 3 different blogs, an adaptation of the "View All" option that existed before version 2, I think.

Somebody please help, either with showing me a feature I need to enable or disable, or by just letting me know it's Facebook's problem. Thanks.

3 Oct 23, 2009 19:31

As far as I remember, the last time I imported a blog into Facebook notes I just used the blog URL (not the feed URL) and it worked fine with no changes to the feeds at all.

L

4 Oct 23, 2009 19:37

Thanks, lturner, but none of my links have worked. I have used the web address and the feeds without success.

5 Oct 23, 2009 19:41

Have you validated your feed? My feed imported without any problem. Maybe there's an invalid tag somewhere.
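A quick way to catch the "invalid tag" case is to run the feed through an XML parser. Here is a minimal sketch using Python's standard library; the feed content is inlined for illustration (a hypothetical stand-in, not the poster's actual feed), but in practice you would fetch the blog's ?tempskin=_rss2 URL and parse that instead:

```python
import xml.etree.ElementTree as ET

# Minimal stand-in for the blog's RSS output; in practice, fetch
# http://blog.bulbmeister.com/?tempskin=_rss2 and parse that instead.
sample_feed = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Bulbmeister Blog</title>
    <link>http://blog.bulbmeister.com/</link>
    <description>Sample channel</description>
    <item><title>First post</title></item>
  </channel>
</rss>"""

try:
    root = ET.fromstring(sample_feed)
    print("well-formed:", root.tag, root.attrib.get("version"))
except ET.ParseError as err:
    # An unclosed or invalid tag lands here -- the kind of markup
    # problem that can make a feed importer reject the whole feed.
    print("parse error:", err)
```

This only checks well-formedness; a full validation service (like the W3C feed validator mentioned in this thread) also checks RSS/Atom semantics.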

7 Oct 23, 2009 19:51

Strange... my blog imports without any problem...

8 Oct 23, 2009 19:54

Walter, what b2evolution blog version are you running?

9 Oct 23, 2009 19:55

Hum... almost 4.0 ;) (developer, using the whissip branch)

10 Oct 23, 2009 20:11

I am using the latest stable b2evo on my blog and that imports into facebook no problem. It is strange as your feeds seem to be ok when loaded into a feed reader.

L

11 Oct 23, 2009 20:21

I just checked my server's error log, and it's spitting out the following error in relation to the import attempt:

"client denied by server configuration..."

I'm guessing my server is being too strict, or I need to add something to a robots.txt file or .htaccess file. I don't know, though, as I'm not adept at this kind of thing.

12 Oct 23, 2009 20:23

Hum... strange. Maybe the antispam machinery in b2evolution?

13 Oct 23, 2009 20:52

I'm feeling pretty stupid. I had a block of rules in my blog's .htaccess file designed to keep away bad robots. After removing that block, I finally got the blogs to import into notes on Facebook. Images did not transfer, though. Is that normal?

That said, does anyone know how I could keep the following rules in my .htaccess file and, with a proper edit, stop denying the Facebook robot?


# BEGIN found at http://www.unflux.net/forum/ftopic244.html

SetEnvIfNoCase User-Agent "Download Ninja 2.0" bad_bot
SetEnvIfNoCase User-Agent "Fetch API Request" bad_bot
SetEnvIfNoCase User-Agent "HTTrack" bad_bot
SetEnvIfNoCase User-Agent "ia_archiver" bad_bot
SetEnvIfNoCase User-Agent "JBH Agent 2.0" bad_bot
SetEnvIfNoCase User-Agent "QuepasaCreep" bad_bot
SetEnvIfNoCase User-Agent "Program Shareware 1.0.0" bad_bot
SetEnvIfNoCase User-Agent "TestBED.6.3" bad_bot
SetEnvIfNoCase User-Agent "WebAuto" bad_bot
SetEnvIfNoCase User-Agent "WebCopier" bad_bot
SetEnvIfNoCase User-Agent "Wget/1.8.2" bad_bot
SetEnvIfNoCase User-Agent "Offline Explorer" bad_bot
SetEnvIfNoCase User-Agent "Franklin Locator" bad_bot
SetEnvIfNoCase User-Agent "LWP::Simple" bad_bot
SetEnvIfNoCase User-Agent "Larbin" bad_bot
SetEnvIfNoCase User-Agent "AA" bad_bot
SetEnvIfNoCase User-Agent "Rufus Web Miner" bad_bot
SetEnvIfNoCase User-Agent "Port Huron Labs" bad_bot
SetEnvIfNoCase User-Agent "Sphider" bad_bot
SetEnvIfNoCase User-Agent "voyager/1.0" bad_bot
SetEnvIfNoCase User-Agent "DynaWeb" bad_bot

SetEnvIfNoCase User-Agent "EmailCollector/1.0" spam_bot
SetEnvIfNoCase User-Agent "EmailSiphon" spam_bot
SetEnvIfNoCase User-Agent "EmailWolf 1.00" spam_bot
SetEnvIfNoCase User-Agent "ExtractorPro" spam_bot
SetEnvIfNoCase User-Agent "Crescent Internet ToolPak" spam_bot
SetEnvIfNoCase User-Agent "CherryPicker/1.0" spam_bot
SetEnvIfNoCase User-Agent "CherryPickerSE/1.0" spam_bot
SetEnvIfNoCase User-Agent "CherryPickerElite/1.0" spam_bot
SetEnvIfNoCase User-Agent "NICErsPRO" spam_bot
SetEnvIfNoCase User-Agent "WebBandit/2.1" spam_bot
SetEnvIfNoCase User-Agent "WebBandit/3.50" spam_bot
SetEnvIfNoCase User-Agent "webbandit/4.00.0" spam_bot
SetEnvIfNoCase User-Agent "WebEMailExtractor/1.0B" spam_bot
SetEnvIfNoCase User-Agent "autoemailspider" spam_bot

<Limit GET POST HEAD>
Order Allow,Deny
Allow from all
Deny from env=bad_bot
Deny from env=spam_bot
Deny from 111.11.11.11
Deny from 111.11.11.12
</Limit>

# END found at http://www.unflux.net/forum/ftopic244.html
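One possible edit (an untested sketch, which assumes Facebook's fetcher sends a User-Agent containing "facebookexternalhit" — check your access log for the exact string before relying on it): SetEnvIf accepts a leading `!` to unset a variable, so a whitelist line placed after all the bad-bot rules would clear the flags for Facebook's crawler while keeping the rest of the blocklist intact:

```apache
# Hypothetical whitelist: place AFTER all the bad_bot/spam_bot lines.
# Assumes Facebook's importer identifies itself as "facebookexternalhit";
# verify the exact User-Agent in your access log first.
SetEnvIfNoCase User-Agent "facebookexternalhit" !bad_bot !spam_bot
```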

14 Oct 23, 2009 21:20

Finally, I figured out why the images weren't showing in my notes: I had to adjust hotlink protection to include "feed://...". Thanks for all your help. Something about this conversation sent me to look at my error log, which got me going.
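For anyone making that hotlink adjustment by hand in .htaccess rather than through a hosting control panel, a typical referer-based hotlink rule looks roughly like this (a hypothetical sketch, not the poster's actual configuration; the blog domain is taken from this thread, and the facebook.com and feed:// lines are the added exceptions):

```apache
# Sketch of referer-based hotlink protection (mod_rewrite).
# Allow empty referers, the blog's own pages, Facebook, and feed readers
# that send a "feed://" referer; block image hotlinking from elsewhere.
RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://([a-z0-9-]+\.)?bulbmeister\.com/ [NC]
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?facebook\.com/ [NC]
RewriteCond %{HTTP_REFERER} !^feed:// [NC]
RewriteRule \.(gif|jpe?g|png)$ - [F,NC]
```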

If you have an answer for my robot-denial rules, I would appreciate the help, but I can probably figure this one out eventually.

Have a great weekend!

15 Oct 23, 2009 23:00

Glad you got it working. Sorry, can't really help on the other bits.

L
