Importing large MySQL tables from command line

As trivial as it may sound, importing a large MySQL dump from the command line actually isn't. Try importing a 100GB .sql file with the default MySQL configuration and you will fail. Recently I had some issues importing really large databases/tables, so I decided to share some of the settings that helped me import faster and with fewer errors like:

Lost connection to MySQL server

Modification on server side

Here are my additional settings in my.cnf:

delay_key_write = ALL
bulk_insert_buffer_size = 512M
max_allowed_packet = 1G
innodb_flush_log_at_trx_commit = 0
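If you can't restart the server, most of these settings can also be applied at runtime as global variables. A minimal sketch, assuming you have the SUPER privilege (or SYSTEM_VARIABLES_ADMIN on MySQL 8.0+); note that runtime changes revert when the server restarts:

```sql
-- Runtime equivalents of the my.cnf settings above (revert on server restart)
SET GLOBAL delay_key_write = 'ALL';                  -- MyISAM: delay key buffer flushes
SET GLOBAL bulk_insert_buffer_size = 512 * 1024 * 1024;
SET GLOBAL max_allowed_packet = 1024 * 1024 * 1024;  -- applies to new connections
SET GLOBAL innodb_flush_log_at_trx_commit = 0;       -- flush redo log roughly once a second
```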

Modification of .sql file

If you don't need your queries pushed into the binary log, you can also add SET sql_log_bin = 0; at the top of your .sql file and save some I/O.
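With that in mind, the top and bottom of the dump file might look like the sketch below. The sql_log_bin line is the one from above; unique_checks, foreign_key_checks and autocommit are additional, commonly toggled session flags I'm assuming you can safely disable for a trusted, consistent dump:

```sql
-- Top of mydatabase.sql
SET sql_log_bin = 0;         -- keep the import out of the binary log
SET unique_checks = 0;       -- assumption: dump data is already unique
SET foreign_key_checks = 0;  -- assumption: dump data is already consistent
SET autocommit = 0;          -- commit once at the end instead of per statement

-- ... the dump's original CREATE TABLE / INSERT statements ...

-- Bottom of mydatabase.sql
COMMIT;
SET unique_checks = 1;
SET foreign_key_checks = 1;
```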

Modification of the CLI command

I have also executed the import command with an additional parameter:

mysql --max_allowed_packet=1G my_database < mydatabase.sql
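If you prefer to stay inside the mysql client instead of using shell redirection, the same import can be run with the client's SOURCE command. A sketch, assuming the my.cnf settings above are already in place:

```sql
-- Inside the mysql client; sql_log_bin applies to this session only
SET sql_log_bin = 0;
USE my_database;
SOURCE mydatabase.sql;
```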

Try these settings out and let me know if they work for you. Any additional tweaks that improve InnoDB or MyISAM imports are welcome in the comments.
