How to import and export large datasets in MySQL

originally posted at http://www.goodfeelingplace.com/how-to-import-and-export-large-datasets-in-mysql
I've seen many questions about importing and exporting large tables and/or databases in MySQL, so I'd like to share how I've learned to manage those processes.  Using phpMyAdmin for these tasks isn't suitable because web servers are not designed for running long scripts or uploading/downloading large files.  You can change the limits in phpMyAdmin, but there are several other limitations in that environment that you may run into.

For exporting large databases you can use the mysqldump command from a prompt (unix shell or dos prompt).  For exporting large tables, I log in to mysql and use a query with the syntax "SELECT column1name, column2name, column3name, ... FROM myDatabaseName.myTableName INTO OUTFILE '/path/to/outfilename.csv' FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';".  This gives you a csv file in the location you specify, with the fields surrounded by single quotes, separated by commas, and escaped by a backslash.  In this context the characters that get escaped inside a field value are the enclosing single quote and the backslash escape character itself.
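
For example, here is a minimal sketch of both commands using placeholder names (myuser, mydb, the orders table, its columns, and the file paths are all just examples).  From the shell, a whole-database dump looks like this:

mysqldump -u myuser -p mydb > /path/to/mydb_dump.sql

And from the mysql prompt, a single-table export might look like this:

SELECT order_id, customer_name, order_total
FROM mydb.orders
INTO OUTFILE '/path/to/orders.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';

Keep in mind that INTO OUTFILE writes the file on the machine where the mysql server is running, and the account you are logged in as needs the FILE privilege.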

For importing large databases created from mysqldump or phpMyAdmin, I save the sql dump file into a local directory, cd to that directory (in a unix shell or the windows command prompt), log in to mysql from the command line, then type the following command: source dbdump.sql (or whatever the dump file name is with your db data).
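
A minimal sketch of that sequence, assuming the dump file is named mydb_dump.sql and the target database mydb already exists:

cd /path/to/dumps
mysql -u myuser -p

Then, at the mysql prompt:

USE mydb;
SOURCE mydb_dump.sql;

If the dump file doesn't already contain its own CREATE DATABASE/USE statements (single-database mysqldump output usually doesn't), you need to select the target database with USE first, as shown above.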

For importing large tables that were exported using the syntax I mentioned above, I log in to mysql and then type something like "LOAD DATA INFILE '/path/to/outfile.csv' INTO TABLE myDatabaseName.myTableName FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';"
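
Continuing the placeholder example from above, loading the exported csv back into the orders table might look like this (the FIELDS clause has to match whatever options you used on export):

LOAD DATA INFILE '/path/to/orders.csv'
INTO TABLE mydb.orders
FIELDS TERMINATED BY ',' ENCLOSED BY '\'' ESCAPED BY '\\';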
