CSV File Size Limits (Excel, Sheets, Databases)
What is the maximum size for a CSV file? Learn the limits of Excel, Google Sheets, and databases, and how to handle files that are too big.
"File too large." "Not all data loaded." "Application not responding."
We've all been there. You try to open a CSV, and your computer chokes.
Technically, CSV files have no size limit. A CSV is just text. It can be 1KB or 1TB. The limit comes from the software you use to open it.
Here are the hard limits for the most common tools.
1. Microsoft Excel
The Limit: 1,048,576 rows. The Behavior: If you open a CSV with 2 million rows, Excel loads the first 1,048,576 and silently drops the rest. Your original file on disk is untouched, but anything you save from Excel will be truncated. It usually shows a warning: "Text file contains more data than will fit on one worksheet."
Column Limit: 16,384 columns (Column XFD).
Workaround:
- Split the CSV into multiple files.
- Use Excel's "Power Query" (Data > Get Data) to link to the file without loading it all into the grid.
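The splitting workaround above can be scripted. Here is a minimal Python sketch (file names like `part_1.csv` are just this example's convention) that writes Excel-safe chunks, repeating the header in each part:

```python
import csv

MAX_ROWS = 1_048_575  # Excel's sheet limit, minus one row for the header

def split_csv(path, max_rows=MAX_ROWS):
    """Split a CSV into Excel-safe parts, repeating the header in each."""
    with open(path, newline='', encoding='utf-8') as src:
        reader = csv.reader(src)
        header = next(reader)
        part, out, writer, rows = 0, None, None, 0
        for row in reader:
            if writer is None or rows >= max_rows:
                if out:
                    out.close()
                part += 1
                out = open(f'part_{part}.csv', 'w', newline='', encoding='utf-8')
                writer = csv.writer(out)
                writer.writerow(header)
                rows = 0
            writer.writerow(row)
            rows += 1
        if out:
            out.close()
    return part  # number of files written
```

Because it streams row by row, this works even on files far larger than your RAM.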
2. Google Sheets
The Limit: 10 million cells. The Behavior: The limit counts cells (rows × columns), not just rows, so wider files hit it sooner.
- If you have 1 column, you can have 10 million rows.
- If you have 10 columns, you can have 1 million rows.
- If you have 100 columns, you can have 100,000 rows.
File Size Limit: 100 MB for import.
Workaround: None within Sheets itself. You must split the data (or move it to a database).
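You can check whether a file will fit before uploading by counting rows and columns against the 10-million-cell ceiling. A rough sketch:

```python
import csv

SHEETS_CELL_LIMIT = 10_000_000  # Google Sheets' documented cell cap

def fits_in_sheets(path):
    """Return (rows, max_cols, fits) for a CSV against the Sheets cell limit."""
    rows, cols = 0, 0
    with open(path, newline='', encoding='utf-8') as f:
        for record in csv.reader(f):
            rows += 1
            cols = max(cols, len(record))  # Sheets pads every row to the widest
    return rows, cols, rows * cols <= SHEETS_CELL_LIMIT
```

Note this only checks the cell limit, not the separate 100 MB import cap.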
3. Text Editors (Notepad, VS Code)
- Notepad (Windows): Struggles with files over 500 MB.
- Notepad++: Can handle larger files (up to 2 GB on 64-bit builds), but gets slow.
- VS Code: Optimized for code, not huge data; will warn you on large files.
- Vim / Sublime Text: Generally handle large files better than standard editors.
The Limit: Usually limited by your computer's RAM. If you have 16GB RAM, you can't open a 20GB text file in a standard editor because it tries to load it all into memory.
Solution: Use "Large File Viewer" tools (like LTFViewer) that stream data from disk.
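Often you don't need a viewer at all — just a peek at the first few lines. Any language can stream lines from disk without loading the whole file; a minimal Python sketch:

```python
def head(path, n=10):
    """Yield the first n lines of a file without reading it all into memory."""
    with open(path, encoding='utf-8', errors='replace') as f:
        for i, line in enumerate(f):
            if i >= n:
                break
            yield line.rstrip('\n')
```

This opens a 20 GB file instantly on a 16 GB machine, because only one line is ever in memory.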
4. Databases (MySQL, PostgreSQL)
The Limit: Effectively unlimited.
The Behavior: Databases are designed for this. A table can hold billions of rows.
Import Limit: MySQL's LOAD DATA INFILE and PostgreSQL's COPY have no inherent size limit, though server settings (such as MySQL's max_allowed_packet) might need tweaking.
Solution: This is the correct place for large data. Import your huge CSV into a database, then query just the slice you need.
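MySQL and PostgreSQL each have their own bulk loaders, but to illustrate the import-then-query workflow in something self-contained, here is the same idea with Python's built-in sqlite3 module (the file and table names are just this example's):

```python
import csv
import sqlite3

def csv_to_table(csv_path, db_path, table):
    """Bulk-load a CSV into a database table; all columns stored as text."""
    with open(csv_path, newline='', encoding='utf-8') as f:
        reader = csv.reader(f)
        header = next(reader)
        cols = ', '.join(f'"{c}"' for c in header)
        placeholders = ', '.join('?' for _ in header)
        con = sqlite3.connect(db_path)
        con.execute(f'CREATE TABLE IF NOT EXISTS "{table}" ({cols})')
        con.executemany(f'INSERT INTO "{table}" VALUES ({placeholders})', reader)
        con.commit()
        con.close()
```

Once the data is in, `SELECT ... WHERE ...` pulls back only the slice you need, no matter how big the table is.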
5. Python (Pandas)
The Limit: Your RAM.
The Behavior: pd.read_csv() loads the entire file into memory. A 1GB CSV might take 2-3GB of RAM to process. If you run out of RAM, Python crashes (MemoryError).
Solution: Process in chunks, so only one slice is in memory at a time.

```python
import pandas as pd

# Stream the file 10,000 rows at a time instead of loading it all
for chunk in pd.read_csv('huge_file.csv', chunksize=10000):
    process(chunk)  # replace process() with your own per-chunk logic
```
Summary Table
| Tool | Row Limit | File Size Limit (Approx) |
|---|---|---|
| Excel | 1,048,576 | ~100-200MB (practical) |
| Google Sheets | 10M cells | 100MB import |
| Notepad | N/A | ~500MB |
| MySQL/Postgres | Unlimited | Unlimited |
| Python (Pandas) | RAM dependent | RAM dependent |
What to Do with a "Too Big" CSV
- Split It: Break it into chunks of 1 million rows. -> Split Tool
- Filter It: Use a command line tool or script to extract only the rows you need (e.g., "Only rows from 2024").
- Database It: Import to SQL and use queries.
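Option 2 (filtering) is only a few lines of Python. This hypothetical example streams the file and keeps just the rows you care about, such as "only rows from 2024":

```python
import csv

def filter_csv(src_path, dst_path, keep):
    """Stream src to dst, keeping the header plus rows where keep(row) is True."""
    with open(src_path, newline='', encoding='utf-8') as src, \
         open(dst_path, 'w', newline='', encoding='utf-8') as dst:
        reader, writer = csv.reader(src), csv.writer(dst)
        writer.writerow(next(reader))  # always keep the header
        writer.writerows(row for row in reader if keep(row))

# e.g. keep only 2024 rows, assuming the date sits in the first column:
# filter_csv('huge.csv', '2024_only.csv', lambda r: r[0].startswith('2024'))
```

Like the split approach, this never holds more than one row in memory.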
File too big for Excel? HappyCSV's Split Tool can chop your massive CSV into Excel-safe chunks in seconds.