21 Mar 2013
Importing large MySQL tables from the command line
You might think this is a trivial task, but it actually isn't. Try importing a 100GB .sql file with the default MySQL configuration and you will fail. Recently I had some issues importing really large databases/tables, so I decided to share some of the settings that helped me import faster and with fewer errors like:
Lost connection to MySQL server
Modification on server side
Here are my additional settings in my.cnf:
delay_key_write=ALL
bulk_insert_buffer_size=512M
max_allowed_packet = 1G
innodb_flush_log_at_trx_commit = 0
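To double-check that the new values are actually in effect after restarting mysqld (just my own sanity check, not strictly required), you can query them from the MySQL client:

SHOW VARIABLES LIKE 'max_allowed_packet';
SHOW VARIABLES LIKE 'bulk_insert_buffer_size';
SHOW VARIABLES LIKE 'innodb_flush_log_at_trx_commit';
SHOW VARIABLES LIKE 'delay_key_write';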
Modification of .sql file
If you don’t need these queries written to the binary log, you should also add SET sql_log_bin = 0; to your .sql file and save some I/O.
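The statement only affects what comes after it in the session, so the simplest place for it is the very top of the dump. For illustration, the file would then start roughly like this (my_table and its data are made-up placeholders, not from a real dump):

SET sql_log_bin = 0;

-- original dump statements follow unchanged
DROP TABLE IF EXISTS `my_table`;
CREATE TABLE `my_table` (
  `id` INT NOT NULL AUTO_INCREMENT,
  `payload` TEXT,
  PRIMARY KEY (`id`)
) ENGINE=InnoDB;

INSERT INTO `my_table` VALUES (1,'foo'),(2,'bar');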
Modification of CLI command
I also executed the import command itself with an additional parameter:
mysql --max_allowed_packet=1G my_database < mydatabase.sql
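If editing a 100GB dump just to add the sql_log_bin line is impractical, one alternative sketch (untested here; mydatabase.sql and my_database are the same placeholder names as above) is to prepend the statement on the fly from the shell:

{ echo "SET sql_log_bin = 0;"; cat mydatabase.sql; } | mysql --max_allowed_packet=1G my_database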
Try these settings out and keep me posted on whether they work for you. Any additional modifications that would improve InnoDB or MyISAM imports are welcome in the comments.