1 austriaco Jun 20, 2008 09:22
3 austriaco Jun 20, 2008 13:00
Well, that sounds pretty complicated, as I was expecting. Although from what I've read so far, mysqldump has a --where option to select what to dump and what to leave out.
Maybe it is easier in this case to simply create a different DB and completely separate the blogs, users, settings, etc.
I'm pretty much a neophyte with mysql, but let's see:
To select all categories belonging to a given blog I do the following:
select cat_ID,cat_parent_ID,cat_blog_ID from evo_categories WHERE cat_blog_ID = 11 ;
In this case, for example, blog #11 has four categories: #79, 80, 82 & 84
Now to select all posts belonging to blog #11, I simply select all posts which belong to any of those four categories:
select post_ID,post_main_cat_ID,post_urltitle FROM evo_items__item WHERE post_main_cat_ID = 79 OR post_main_cat_ID = 80 OR post_main_cat_ID = 82 OR post_main_cat_ID = 84 ;
This is fine if there are only a handful of categories. But what if our user who wants to export his/her database has 50+ categories or something of the sort?
How do you combine SQL #1 & #2 above to extract all posts belonging to a given blog?
OK, that's the part dealing with posts. And that's the most important task: to extract the content and the categories, which can be many and can be a pain in the neck to recreate. For the rest of the info, I guess our fictitious user would have to make some sacrifices... :)
4 austriaco Jun 20, 2008 13:22
Austriaco wrote:
How do you combine SQL #1 & #2 above to extract all posts belonging to a given blog
The answer to that question is something like:
select post_ID, post_main_cat_ID, post_urltitle FROM evo_items__item WHERE post_main_cat_ID IN (SELECT cat_ID from evo_categories WHERE cat_blog_ID = 11)
Now the question is whether mysqldump accepts a subquery in the "--where=" option...
And the answer is no:
$ mysqldump -u lnieves -p --where="evo_items__item.post_main_cat_ID IN (SELECT cat_ID from evo_categories WHERE cat_blog_ID = 11)" libenie2_bvlt1 evo_items__item
Enter password:
mysqldump: Couldn't execute 'SELECT /*!40001 SQL_NO_CACHE */ * FROM `evo_categories` WHERE evo_items__item.post_main_cat_ID IN (SELECT cat_ID from evo_categories WHERE cat_blog_ID = 11)': Table 'evo_categories' was not locked with LOCK TABLES (1100)
5 yabba Jun 20, 2008 13:46
Reverse the process, copy the whole database and delete the bits you don't want ;)
¥
6 austriaco Jun 20, 2008 13:52
¥åßßå wrote:
Reverse the process, copy the whole database and delete the bits you don't want ;)
¥
You're joking, right? :)
Well, I arrived at this:
mysqldump --lock-all-tables -u lnieves -p --where="evo_items__item.post_main_cat_ID IN (SELECT cat_ID from evo_categories WHERE cat_blog_ID = 11)" libenie2_bvlt1 evo_items__item
That will dump all posts belonging to Blog #11 :)
Now I would need to do something similar for comments. OK, now I have an idea of where I'm going.
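For comments it should be possible to do the same kind of dump, since each comment row points back at its post. Probably something along these lines, assuming comments live in evo_comments keyed by comment_post_ID (untested, so verify the table and column names first):

# assumes evo_comments / comment_post_ID -- check your schema before running
mysqldump --lock-all-tables -u lnieves -p --where="comment_post_ID IN (SELECT post_ID FROM evo_items__item WHERE post_main_cat_ID IN (SELECT cat_ID FROM evo_categories WHERE cat_blog_ID = 11))" libenie2_bvlt1 evo_comments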
7 slamp Jun 20, 2008 14:52
¥åßßå wrote:
Reverse the process, copy the whole database and delete the bits you don't want ;)
¥
I would do the same. Dump all, restore on localhost, delete all blogs you don't want. Dump the DB on localhost.
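Roughly like this (a sketch only -- the user and database names are placeholders, and the DELETEs assume posts hang off categories via post_main_cat_ID as shown earlier in the thread):

# dump everything and restore into a scratch database (names are made up)
mysqldump -u user -p libenie2_bvlt1 > full.sql
mysql -u user -p -e "CREATE DATABASE b2evo_copy"
mysql -u user -p b2evo_copy < full.sql
# delete everything that does not belong to blog #11
mysql -u user -p b2evo_copy -e "DELETE FROM evo_items__item WHERE post_main_cat_ID NOT IN (SELECT cat_ID FROM evo_categories WHERE cat_blog_ID = 11); DELETE FROM evo_categories WHERE cat_blog_ID <> 11"
# dump the trimmed copy
mysqldump -u user -p b2evo_copy > blog11_only.sql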
8 edb Jun 22, 2008 13:24
I think if you were trying to offer a blogger "back up YOUR blog" then the idea of reversing the process wouldn't be exactly what was in mind. Like, suppose I have 10 bloggers who each have a blog. I want to let each one back up THEIR content, for example if they want to move it elsewhere or whatever. In that case I pretty much wouldn't want them to get all the content with the expectation that they would filter out their stuff later.
Yeah, dumping everything and deleting what you don't want later is actually easier, but it should be easy to envision situations where the actual installation owner would want a method for any given individual to grab his/her content.
9 austriaco Jun 22, 2008 14:29
Yeah, that was what I was thinking at the beginning. I'm offering a friend the opportunity to blog from a sub-domain of a b2evo installation. I thought I could save myself the time involved in creating a new database and installing a full b2evo just for him by creating a blog in the existing installation instead, so that when he decides to abandon me :( he can take his data with him.
But I also think the solution to this problem would have a broader field of application, for instance for people who use the AutoBlog plugin.
Wow this would be cool.
So here are some "top of my head" thoughts, hoping they spark some discussion and/or hacks. First, you need to query the database a lot to make this happen. For example: what categories (and of course sub-categories) are assigned to this blog? Posts are really assigned to categories, so to get the posts you have to get the cats first. Also, if you are allowing "cross post across multiple blogs" then you will really have to dig deep to find all the posts that MIGHT be attached to that blog.
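For instance, the extra category links for cross-posted items live in a separate link table (evo_postcats on my install, with postcat_post_ID / postcat_cat_ID columns -- double-check yours), so the post query would have to go through that table instead of relying on post_main_cat_ID alone:

-- assumes evo_postcats(postcat_post_ID, postcat_cat_ID); verify against your schema
SELECT DISTINCT post_ID, post_urltitle
FROM evo_items__item
JOIN evo_postcats ON postcat_post_ID = post_ID
JOIN evo_categories ON cat_ID = postcat_cat_ID
WHERE cat_blog_ID = 11;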
Database backups typically have everything. To get only one blog would, to me, imply that you only want info related to bloggers (users in table-land) who can post in that blog, so that is another thing you would have to extract carefully and accurately. Like: from the generic evo_users table you would pull only the users who can post in that blog. Further, you would probably want to pull only the evo_settings that apply to each of those users. Hmmm... it might be that the user-specific settings are stored in the user profile. It is worth investigating, though, to make sure you are pulling ONLY the info appropriate to the bloggers who can post in the blog you are dumping.
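Something like this might pull the right users, assuming the per-blog permissions sit in a table along the lines of evo_blogusers (I'm guessing at the table and column names, so treat it as a sketch):

-- evo_blogusers / bloguser_* are my guess at the permissions table; check the real schema
SELECT user_ID, user_login, user_email
FROM evo_users
WHERE user_ID IN (SELECT bloguser_user_ID FROM evo_blogusers WHERE bloguser_blog_ID = 11);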
Other areas of interest will be widgets and plugins. It should be no problem to pull the widgets associated with a blog, but I can't even guess at how to make sure you have the appropriate plugin info in the event that a plugin has created a widget that the blog you are backing up is using.
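The widget half at least looks simple if they are keyed by collection ID -- on my copy the table is evo_widget with a wi_coll_ID column (again, verify before relying on it):

-- assumes evo_widget has wi_coll_ID / wi_code / wi_params columns; check your install
SELECT wi_ID, wi_code, wi_params FROM evo_widget WHERE wi_coll_ID = 11;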
Wow this is an awesome question! Never, way back when I did the "give a blogger a blog" hack, did I envision the blogger wanting a backup of their blog. I have this funny feeling that neither the obsolete AutoBlog nor the current UserBlog plugin thought of this either. Great place to go, though. It will be exciting, on a digital level, to see what can be done regarding a complete answer to this question.