Moving a big MySQL database from another VPS to Linode
I am trying to move my site to Linode and I have a big database. The dump .sql file is around 20 GB. I have copied this to my Linode server and tried to import it into MySQL, but it fails. It exits suddenly and many of the tables are empty, but I don't get any error messages or notifications. At the same point I am also logged out of my SSH session.
I am using
mysqldump -u root -p<password> -Q --add-drop-table -O add-locks=FALSE -O lock-tables=FALSE favorious_production | gzip -c > production_dump.sql.gz
scp production_dump.sql.gz firstname.lastname@example.org:~
on the other VPS.
Then on my Linode:
gunzip production_dump.sql.gz
mysql -u root -p favorious_production < production_dump.sql
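One thing worth trying regardless of the root cause: pipe gunzip straight into mysql, so the 20 GB uncompressed copy never has to exist on disk and the import needs roughly half the space. A minimal sketch, with the real command shown in a comment and a small stand-in file used to demonstrate the pipe (all file names here are hypothetical):

```shell
# In practice, something like:
#   gunzip -c production_dump.sql.gz | mysql -u root -p favorious_production
# The same streaming pipe demonstrated with stand-in data:
printf 'INSERT INTO t VALUES (1);\n' > demo.sql
gzip -f demo.sql                   # writes demo.sql.gz and removes demo.sql
gunzip -c demo.sql.gz | wc -l      # decompress as a stream; no file written back
```

`gunzip -c` (or `zcat`) writes to stdout instead of replacing the file, which is what makes the streaming import possible.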
I tried this with a smaller database dump file and it worked, but when I try it with the production one it imports data for about 2 hours and then exits with no error message, and not all of the data is in.
Has anyone had this problem before? I am trialling Linode and I really like it so far, but I cannot move the site here unless I manage to move the DB. I will also try moving the MySQL data files, but I am told that this is less likely to work.
If it does and you still can't successfully load the DB file, perhaps try replicating it from the old DB server to the new DB server. Then, once it's up and running on the new DB server, remove the replication relationship.
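For reference, a rough sketch of what that replication setup looks like with MySQL-5.x-era master/slave replication. All host names, user names, passwords, and binlog coordinates below are placeholders; the real `File` and `Position` come from `SHOW MASTER STATUS` on the old server:

```ini
# my.cnf on the old VPS (the master); the Linode gets server-id = 2
[mysqld]
server-id = 1
log-bin   = mysql-bin
```

```sql
-- On the master: create a replication user and note the binlog position
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' IDENTIFIED BY 'secret';
SHOW MASTER STATUS;   -- note the File and Position values

-- On the Linode: point it at the master and start replicating
CHANGE MASTER TO
  MASTER_HOST='old-vps.example.org',
  MASTER_USER='repl',
  MASTER_PASSWORD='secret',
  MASTER_LOG_FILE='mysql-bin.000001',
  MASTER_LOG_POS=0;
START SLAVE;
-- Once it has caught up and you have cut over:
-- STOP SLAVE; RESET SLAVE;
```

You still need an initial snapshot on the slave (e.g. a dump taken at that binlog position), but after that the slave catches up incrementally, which is far more forgiving than one monolithic 20 GB import.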
Also try checking /var/log/
Are you running out of disk space? (With 2880 you only have 128GB.) Is the size of the database 20GB before gzipping or after? What is the size of the live database?
I have checked the MD5 sum and all is fine.
I also checked what's happening during the import. CPU load is around 1 and no swap is used.
I tried importing a version of the file that was created using --skip-extended-insert. This means it has a separate INSERT statement for every row.
The SQL data file is 20Gb unzipped.
I guess all I have left is the replication. I will read more about it and try to set it up.
> At the same point I am also logged out from my ssh.
Have you tried running the job under nohup or screen? If it's your net connection dropping or your laptop going to sleep, then nohup or screen will save the day.
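For the import itself that might look like the sketch below, with a harmless `sleep` standing in for the real `mysql ... < production_dump.sql` command:

```shell
# nohup detaches the job from the terminal so an SSH disconnect
# doesn't kill it, and logs its output to a file. The sleep/echo is
# a stand-in for the actual mysql import command.
nohup sh -c 'sleep 1; echo import finished' > import.log 2>&1 &
wait $!            # in real life: log out, come back later, then...
cat import.log     # ...check the log for progress or errors
```

One caveat: `mysql -p` with no password attached prompts interactively, which won't work detached; either supply the password non-interactively (e.g. via `~/.my.cnf`) or use `screen`, which keeps an interactive terminal you can re-attach to.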
@obs - thanks for the maatkit tip. I didn't use it for this but it looks like a good tool for various tasks so I will give it a go.
Also, which file system are you using? Any limits on file sizes (though if the MD5 is ok, then presumably all's well).
The import hasn't finished yet. It's been a day and a half and I am estimating it needs another 2-3 days. The total size of the main mysql data file will be around 40Gb.