I was wondering about the possibility of zipping and tarring up the `_data` directory to populate sbds instead of using mysqldumps. For example, instead of just offering latest.tar for download on your website, would it also work to offer latest_data.tar? There is probably a reason why mysqldumps are what people typically do, but I'm wondering if this is a special case where the `_data` directory might be the faster option: the database restore took around 8 hours on a machine with 6 virtual cores and 16GB of RAM, and downloading and overwriting the `_data` directory could be much faster.
Haha, the `.tar` download itself was a whole other battle for me.
The short answer is the `ibdata...` and `ib_logfile...` files are required as well as the `steem/` folder, but those files contain data about all the databases on that server. Even so, this approach should work if you back up the entire `_data` folder, but nothing short of that. That would almost assuredly go faster than dealing with the dump.
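A minimal sketch of that whole-directory snapshot, exercised here on a throwaway directory standing in for the real datadir (the paths and the stop/start steps are assumptions; a stock install usually keeps the datadir at /var/lib/mysql):

```shell
# Snapshot the whole data directory. The server must be stopped first,
# or ibdata1/ib_logfile* will be inconsistent inside the archive.
snapshot_datadir() {
  datadir="$1"; archive="$2"
  # Archive everything: steem/ alone is not enough, because InnoDB keeps
  # shared state in ibdata1 and the ib_logfile* redo logs.
  tar -cf "$archive" -C "$(dirname "$datadir")" "$(basename "$datadir")"
}

# Demo on a throwaway directory standing in for the real _data folder:
demo="$(mktemp -d)"
mkdir -p "$demo/_data/steem"
: > "$demo/_data/ibdata1"
: > "$demo/_data/ib_logfile0"
snapshot_datadir "$demo/_data" "$demo/latest_data.tar"
tar -tf "$demo/latest_data.tar"   # lists _data/ with steem/, ibdata1, ib_logfile0
```

Restoring would be the reverse: stop MySQL on the target box, extract the archive over the datadir, `chown -R mysql:mysql` it, and start the server again.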
Creating my `.tar` is no walk in the park either. I'm looping through the tables and doing a `mysqldump` one at a time and appending each of them to the `.tar` file. This database is so huge that a standard `mysqldump` of the entire thing never finished once in my testing.
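That loop looks roughly like this sketch, with the `mysqldump` step stubbed out by an `echo` so the appending mechanics stand on their own (the table names and the stub are assumptions; the real call would be the commented mysqldump line):

```shell
# Dump one table at a time and append each .sql file to the tar as we go,
# so no single mysqldump has to survive the entire database at once.
append_dump() {
  table="$1"; archive="$2"; work="$3"
  # Real version: mysqldump steem "$table" > "$work/$table.sql"
  echo "-- dump of $table" > "$work/$table.sql"  # stub standing in for mysqldump
  tar -rf "$archive" -C "$work" "$table.sql"     # -r appends to the archive
  rm "$work/$table.sql"
}

work="$(mktemp -d)"
archive="$work/latest.tar"
# Real version would read the list from: mysql -N -e 'SHOW TABLES FROM steem'
for t in sbds_core_blocks sbds_tx_comments; do
  append_dump "$t" "$archive" "$work"
done
tar -tf "$archive"
```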
It is difficult and expensive to work with such large databases, which got me thinking about a way to potentially reduce the size of the database by more than 50% without sacrificing too much. I noticed that the `sbds_core_blocks` table accounts for around two-thirds of the entire database size, and I'm guessing the `mediumtext` column named `raw` (presumably raw blockchain data) accounts for the vast majority of that space. That got me wondering if it might be possible to simply replace the `raw` entries with a single character or something to save space. For example, I think doing this might let the entire database fit comfortably on a single $40/month droplet with room to spare.
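A hedged sketch of that idea — the SQL is commented and untested, and the sizes below are assumed round numbers for illustration, not measurements from this thread:

```shell
# The actual space-saving step (table/column names from the thread; untested):
#   UPDATE sbds_core_blocks SET raw = 'x';
#   OPTIMIZE TABLE sbds_core_blocks;  -- rebuild so InnoDB returns the freed pages
#
# Back-of-envelope with an assumed 300 GB total: if sbds_core_blocks is
# two-thirds of that and raw is nearly all of it, blanking raw drops ~200 GB.
total_gb=300                                 # assumed, not measured
remaining=$(( total_gb - total_gb * 2 / 3 ))
echo "roughly ${remaining} GB left after blanking raw"
```

Note that the `UPDATE` alone won't shrink anything on disk — InnoDB only gives the space back to the OS after the table is rebuilt.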
I've been asking myself this exact same question as well, and your approach just may work. I'll definitely post if I find some answers.
Another thing that may be worth considering is that the comments table is the second biggest in storage space, and it may be worthwhile to save space by removing the older edited versions of the `body` mediumtext entries. For example, I sometimes edit a big long post a few times to correct a few typos. This type of editing may be fairly common, so pruning the old versions could reduce the database size a lot.
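The pruning idea in miniature — the commented `DELETE` is only a guess at what it would look like against the real table (all the column names are assumptions; check the sbds schema first), and the pipeline below just demonstrates "keep only the newest version per post" on stand-in rows:

```shell
# Real version might look like (names are guesses, verify against the schema):
#   DELETE old FROM sbds_tx_comments AS old
#     JOIN sbds_tx_comments AS new
#       ON new.author = old.author AND new.permlink = old.permlink
#      AND new.id > old.id;   -- keep only the newest body per post
#
# Miniature: rows are "id author permlink body"; keep the last edit per post.
latest=$(printf '%s\n' \
  '1 alice my-post v1' \
  '2 alice my-post v2-typo-fix' \
  '3 bob other-post v1' \
  '4 alice my-post v3-final' |
  sort -k1,1nr | awk '!seen[$2,$3]++' | sort -k1,1n)
echo "$latest"   # only "3 bob other-post v1" and "4 alice my-post v3-final" remain
```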
Yeah, great point! The `body` is fully duplicated on edits, so that really could save some significant space.