Phabricator offers a bin/storage shell command. It lets you run the mysql client with the options from the application configuration. That’s useful in a modern distributed environment, where it’s not always straightforward to know which server holds which databases and which credentials to use. In a Docker container, for example, that could be a linked […]
Follow-up: a BASH script to split a MySQL dump by database
In a previous post, we saw how to split a large MySQL dump by database. I’ve since been asked for a script to automate the process. Here it is. Note: on FreeBSD, replace AWK=awk with AWK=gawk and install the lang/gawk port, so the script uses GNU awk.
#!/usr/bin/env bash

AWK=awk
# Single quotes here: in double quotes, the backticks would be
# executed as a command substitution instead of matched literally
REGEX_NAME='Current Database: `(.*)`'

# Checks argument and prints usage if needed
if [ "$#" -lt "1" ]
then
    echo "Usage: $0 <dump.sql>"
    exit 1
fi

# Splits dump into temporary files, one per database section
$AWK '/Current Database: .*/{g++} { print $0 > g".tmpsql" }' "$1"

# Renames files or appends to existing one (to handle views)
for f in *.tmpsql
do
    DATABASE_LINE=$(head -n1 "$f")
    [[ $DATABASE_LINE =~ $REGEX_NAME ]]
    TARGET_FILE="${BASH_REMATCH[1]}.sql"
    if [ -f "$TARGET_FILE" ]; then
        cat "$f" >> "$TARGET_FILE"
        rm "$f"
    else
        mv "$f" "$TARGET_FILE"
    fi
done
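To see the core awk step in isolation, here is a minimal sketch run against a hypothetical two-database dump (the database names db1 and db2 and the table definitions are made up for the example; real mysqldump output has more surrounding comments):

```shell
# Miniature dump with two hypothetical databases, db1 and db2,
# using the "-- Current Database:" markers mysqldump emits
cat > dump.sql <<'EOF'
-- Current Database: `db1`
CREATE TABLE t1 (id INT);
-- Current Database: `db2`
CREATE TABLE t2 (id INT);
EOF

# The awk step from the script: g counts database headers, and
# every line is written to the file numbered after the last header
awk '/Current Database: .*/{g++} { print $0 > g".tmpsql" }' dump.sql

ls *.tmpsql   # 1.tmpsql  2.tmpsql
```

Each .tmpsql file then starts with its own "Current Database" line, which is what the rename loop relies on to extract the final file name.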
Split a large SQL dump by database
You created a MySQL backup of a large server installation with dozens of databases and wish to get the schema and data for one of them. You now have to deal with a file of hundreds of MB in a text editor. How convenient. Split a dump into several files You can quickly split this […]
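As a one-off, without any scripting, the same cut can be sketched with GNU csplit, which splits a file before each line matching a pattern (the dump content below is a made-up two-database example; csplit names the pieces xx00, xx01, … by default):

```shell
# Hypothetical two-database dump to split
cat > large-dump.sql <<'EOF'
-- Current Database: `db1`
CREATE TABLE t1 (id INT);
-- Current Database: `db2`
CREATE TABLE t2 (id INT);
EOF

# Split before each "Current Database:" marker into xx00, xx01, ...
# '{*}' repeats the pattern to end of file (GNU csplit);
# -z drops the empty piece before the first marker
csplit -z large-dump.sql '/Current Database: /' '{*}'
```

Unlike the script, this leaves you with numbered files rather than per-database names, so it suits a quick manual extraction rather than automation.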