Tutorial · 4 min read

How to Split a mysqldump File Into Smaller Parts

Your mysqldump is too big to handle. Here's how to break it into importable chunks without breaking the SQL.

mysqldump creates one big file. That's great for backups, not so great when you need to actually restore it somewhere with size limits.

The naive approach (and why it breaks)

You might think to just use the Unix split command:

split -l 10000 dump.sql chunk_

This splits by line count. The problem? SQL statements can span multiple lines. You'll end up with one file that has half an INSERT statement at the end and the next file starting with the other half. Neither will import.

What proper splitting needs to do

  • Keep multi-line statements together
  • Preserve the header (SET statements, charset settings)
  • Include the header in each chunk so they're independently importable
  • Not break in the middle of a transaction
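The core of those requirements can be sketched in a few lines of Python. This is a simplified illustration, not a production tool: it assumes statements end with `;` at the end of a line (which ignores semicolons inside string literals), and it treats leading comment and SET lines as the header to repeat in every chunk. The function name `split_sql` is made up for this sketch.

```python
def split_sql(dump_text, max_lines=10000):
    """Split a SQL dump into chunks at statement boundaries.

    Simplifying assumption: a statement ends with ';' at the end of
    a line. This ignores ';' inside string literals.
    """
    lines = dump_text.splitlines(keepends=True)

    # Treat leading comment/SET lines as the header to repeat per chunk,
    # so each chunk is independently importable.
    header = []
    i = 0
    while i < len(lines) and (
        lines[i].startswith(("--", "/*", "SET")) or lines[i].strip() == ""
    ):
        header.append(lines[i])
        i += 1

    chunks, current = [], []
    for line in lines[i:]:
        current.append(line)
        # Only cut once the chunk is big enough AND the current
        # statement is complete -- never mid-INSERT.
        if len(current) >= max_lines and line.rstrip().endswith(";"):
            chunks.append("".join(header + current))
            current = []
    if current:
        chunks.append("".join(header + current))
    return chunks
```

Note the order of the checks: size is only a trigger, and the actual cut waits for the next statement boundary, so chunks can run slightly over `max_lines`.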

Method 1: Use a SQL-aware splitter

Tools that understand SQL structure will split at statement boundaries, not arbitrary line counts.

SQLSplit, for example, splits at SQL statement boundaries and keeps the headers intact in each chunk.

Method 2: Export tables separately

If you still have access to the source database, export each table as a separate file:

# Export just the schema
mysqldump --no-data database_name > schema.sql

# Export each table's data separately
mysqldump --no-create-info database_name table1 > table1_data.sql
mysqldump --no-create-info database_name table2 > table2_data.sql
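With many tables, those commands are easier to generate than to type. Here's a small hypothetical helper that builds one command per table; in real use you'd get the table list from `SHOW TABLES` rather than passing it in.

```python
def per_table_dump_commands(database, tables):
    """Build a schema-only dump command plus one data-only
    mysqldump command per table.

    `tables` is passed in to keep the sketch self-contained;
    in practice it would come from `SHOW TABLES`.
    """
    cmds = [f"mysqldump --no-data {database} > schema.sql"]  # schema first
    for t in tables:
        cmds.append(
            f"mysqldump --no-create-info {database} {t} > {t}_data.sql"
        )
    return cmds
```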

Import schema first, then data files in any order.

Method 3: Use --where for really big tables

If one table is the problem (like a huge logs table), split it by rows:

mysqldump database_name big_table --where="id < 1000000" > big_table_1.sql
mysqldump database_name big_table --where="id >= 1000000 AND id < 2000000" > big_table_2.sql
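Writing those range predicates by hand gets error-prone past a few chunks. A hypothetical `ranged_dump_commands` helper can generate them, assuming the table has an integer primary key named `id`:

```python
def ranged_dump_commands(database, table, max_id, step=1_000_000):
    """Generate one mysqldump command per id range.

    Assumes an integer primary key named `id`; adjust the column
    name and `step` to your schema and size limits.
    """
    cmds = []
    for n, lo in enumerate(range(0, max_id, step), start=1):
        where = f"id >= {lo} AND id < {lo + step}"
        cmds.append(
            f'mysqldump {database} {table} --where="{where}" > {table}_{n}.sql'
        )
    return cmds
```

Half-open ranges (`>=` low, `<` high) ensure every row lands in exactly one chunk, with no duplicates at the boundaries.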

What about gzip compressed dumps?

If your dump is compressed (.sql.gz), you need to either:

  • Decompress first: gunzip dump.sql.gz
  • Use a tool that handles .gz files directly

SQLSplit handles .gz files — drop them in and it decompresses automatically before splitting.
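If you're scripting this yourself, the decompress-first step is a one-liner with Python's standard gzip module. This sketch sniffs the gzip magic bytes rather than trusting the file extension, since dumps are sometimes misnamed:

```python
import gzip


def read_sql_dump(path):
    """Read a dump as text, transparently handling gzip compression.

    Checks the gzip magic bytes (1f 8b) instead of trusting the
    file extension.
    """
    with open(path, "rb") as f:
        magic = f.read(2)
    if magic == b"\x1f\x8b":  # gzip magic number
        with gzip.open(path, "rt", encoding="utf-8") as f:
            return f.read()
    with open(path, "r", encoding="utf-8") as f:
        return f.read()
```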

Choosing split size

Rule of thumb:

  • phpMyAdmin: 10MB or 10,000 lines per chunk
  • Command line import on shared host: 50MB per chunk
  • Command line on dedicated server: often no need to split at all

When in doubt, go smaller. More chunks means more imports, but each import is more likely to succeed.

Frequently Asked Questions

Can I use the Linux split command for SQL files?

Not reliably. The split command divides by lines or bytes without understanding SQL syntax. This often breaks statements in the middle, creating invalid SQL files.

How do I split a mysqldump by table?

Use mysqldump with specific table names: mysqldump database_name table1 table2 > partial.sql. Or export with --no-create-info for data-only dumps of each table.

What's the best way to split a multi-gigabyte mysqldump?

For very large dumps, export tables separately from the source if possible. If you only have the dump file, use a SQL-aware splitter that preserves statement boundaries and includes headers in each chunk.
