I’ve probably mentioned before how much of a PHP/MySQL/WordPress nerd I am. If I haven’t, I’m saying so now.
One of the sites I operate requires numerous tables with about five million rows altogether. Automated backups are easy with any number of WordPress plugins. But what do you do when you’ve got a HUGE SQL file, really small upload limits in phpMyAdmin, and short server timeouts that prevent a single script from restoring said huge file?
The brilliant scheme I came up with (I’m being sarcastic here) was to take those large files and split them up into smaller files of about 3MB each. This approach was problematic for a number of reasons. It required me to download a huge file and slice it into small files along MySQL insert boundaries. Copying and pasting 25,000 rows at a time was giving my poor laptop fits. It also meant I had to manipulate the full plaintext MySQL file, rather than a GZipped version. On top of all of that, I was paranoid that I would accidentally omit a single row and have to start all over again. To overcome the problem of server timeouts, I wrote a script that would load one small file, finish, then call itself again, incrementing to the next file to be loaded: lather, rinse, repeat.
There was nothing good about this approach: it was extremely time- and labor-intensive, and highly error-prone.
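For the curious, the general shape of that old self-calling loader looked something like the sketch below. This is reconstructed from memory; the file naming, credentials, and variable names are illustrative placeholders, not the original code.

```php
<?php
// chunk_loader.php — illustrative sketch of the old serial-file-upload approach.
// Each request loads one small SQL chunk, then redirects to itself with the next
// index, so no single request runs long enough to hit the server timeout.

$index    = isset($_GET['i']) ? (int) $_GET['i'] : 1;
$chunkDir = __DIR__ . '/chunks';                    // e.g. dump_0001.sql, dump_0002.sql, ...
$file     = sprintf('%s/dump_%04d.sql', $chunkDir, $index);

if (!file_exists($file)) {
    exit("Done: no chunk #$index found.");
}

$mysqli = new mysqli('localhost', 'db_user', 'db_pass', 'db_name');
if ($mysqli->connect_error) {
    exit('Connection failed: ' . $mysqli->connect_error);
}

// Run every statement in this chunk (each chunk was cut on a statement boundary).
if (!$mysqli->multi_query(file_get_contents($file))) {
    exit("Error in chunk #$index: " . $mysqli->error);
}
while ($mysqli->more_results() && $mysqli->next_result()) {
    // flush any remaining result sets
}
$mysqli->close();

// Call ourselves again with the next chunk number.
header('Location: chunk_loader.php?i=' . ($index + 1));
```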
I wrote the original serial-file-upload scheme described above about a year and a half ago. My PHP/MySQL kung fu is now much stronger. Back then I never used scripts written by others, foolishly thinking that it would take me longer to understand how to integrate their scripts than it would to build a serviceable script from scratch. I went so far as to write my own AJAX library. It was functional, but nowhere near the quality and reliability of jQuery.
Much humility and less hubris later, I’m more confident in my ability to read, understand, and interface with scripts written by others. This now allows me to focus on only writing those things that are truly unique and critical to my projects, without having to re-invent the wheel.
What’s the name of this magnificent script I’m using for bulk uploads of really large MySQL databases? Well, this is where the “tragically terrible name” bit from the subject line of this post comes in. The script is called <wince cringe>“BigDump.”</wince cringe> This script just automated a HUGE chunk of work for me. Instead of struggling with more than 4 million lines of MySQL, I uploaded a SQL file, changed a few variables in the BigDump script, and ran it.
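For anyone trying the same thing, the setup boils down to editing a handful of variables at the top of bigdump.php, roughly like the excerpt below. The values are placeholders, and the exact variable names may differ slightly between BigDump versions, so check your own copy of the script.

```php
<?php
// Excerpt-style sketch of the settings edited at the top of bigdump.php.
// Placeholder values only; verify names against your copy of the script.

$db_server   = 'localhost';        // MySQL host
$db_name     = 'my_wordpress_db';  // target database
$db_username = 'db_user';
$db_password = 'db_pass';

// The dump file uploaded alongside bigdump.php (plain .sql or gzipped .sql.gz).
$filename = 'huge_backup.sql.gz';

// How many lines to import per session before the script re-calls itself,
// which is what keeps each request safely under the server timeout.
$linespersession = 3000;
```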
The total upload time is probably about the same as with the script I was using before. The benefit is in not having to handle a ton of code before getting around to running the script.