Import CSV to MySQL/PostgreSQL (Step-by-Step Guide)
Got a CSV file that needs to go into a database? Whether you're using MySQL or PostgreSQL, I'll show you exactly how to import your data with both command-line and GUI methods, plus how to avoid the common errors that waste hours of troubleshooting.
Before You Import: Prepare Your CSV
Don't just throw a raw CSV at your database and hope it works. Clean it up first.
1. Clean Column Headers
Bad headers:
First Name,Email Address,Phone #,Date Of Birth
Good headers:
first_name,email,phone,date_of_birth
Rules:
- Lowercase only
- No spaces (use underscores)
- No special characters (#, -, /, etc.)
- Start with letter (not number)
- Match your database column names exactly
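If you'd rather fix the header without opening an editor, here's a quick sed sketch (assumes GNU sed; on macOS use sed -i '' instead, and file.csv is a placeholder name):
# Replace line 1 with clean column names (must match your column count)
sed -i '1s/.*/first_name,email,phone,date_of_birth/' file.csv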
2. Handle NULL Values
CSV with empty cells:
John,john@email.com,
Sarah,,555-5678
What databases see:
- Empty string: ""
- NULL: no value at all
Decision: Does empty mean NULL or empty string? Most of the time, NULL is what you want.
How to specify NULL in CSV:
John,john@email.com,\N
Sarah,\N,555-5678
\N is the default NULL marker for MySQL's LOAD DATA INFILE, and you can tell PostgreSQL's COPY to treat it as NULL too (shown in the COPY section below).
3. Fix Data Types
Text with quotes: If your text has commas or quotes, wrap in double quotes:
"Smith, Inc","123 Main St, Apt 5"
Dates:
Use standard format: YYYY-MM-DD
2024-12-05 (not 12/5/2024 or Dec 5, 2024)
Booleans:
true/false or 1/0 or t/f
Pick one format and be consistent.
4. Encoding Must Be UTF-8
Most databases expect UTF-8. If your CSV is in another encoding (Windows-1252, ISO-8859-1), convert it first.
Check encoding:
- Open in text editor
- Look for garbled characters
- If you see Ã© instead of é, the encoding is wrong
Fix: Save as UTF-8 in your text editor before importing.
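Or convert from the command line with iconv (assuming the source is Windows-1252; swap in whatever encoding you actually have):
# Convert Windows-1252 to UTF-8
iconv -f WINDOWS-1252 -t UTF-8 input.csv > input_utf8.csv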
Method 1: MySQL Command Line
The fastest way for MySQL imports.
Create Table First
Before importing, create the table structure:
CREATE TABLE customers (
id INT AUTO_INCREMENT PRIMARY KEY,
first_name VARCHAR(100),
last_name VARCHAR(100),
email VARCHAR(255) UNIQUE,
phone VARCHAR(20),
created_at DATE
);
Match CSV columns: Your table should have columns for every field in your CSV (or at least the ones you want to import).
Load Data Command
LOAD DATA INFILE '/path/to/your/file.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(first_name, last_name, email, phone, created_at);
Breakdown:
- LOAD DATA INFILE: MySQL import command
- /path/to/your/file.csv: Full absolute path to CSV
- INTO TABLE customers: Target table name
- FIELDS TERMINATED BY ',': Comma-separated
- ENCLOSED BY '"': Fields wrapped in quotes
- LINES TERMINATED BY '\n': Unix line endings (use '\r\n' for files saved on Windows)
- IGNORE 1 ROWS: Skip header row
- (column list): Map CSV columns to table columns
Common MySQL Errors
Error: "File not found"
Why: MySQL can't access the file path.
Fix:
- Use an absolute path: /Users/you/data/file.csv (not ~/data/file.csv)
- Check file permissions (the MySQL user needs read access)
- On some systems, move the file to MySQL's secure-file-priv directory
Find secure directory:
SHOW VARIABLES LIKE 'secure_file_priv';
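If the file lives on your client machine rather than the database server, LOAD DATA LOCAL INFILE reads it client-side and sidesteps secure-file-priv entirely (requires local_infile enabled on the server and --local-infile=1 on the client):
-- Reads the file from the client machine, not the server
LOAD DATA LOCAL INFILE '/Users/you/data/file.csv'
INTO TABLE customers
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'
IGNORE 1 ROWS
(first_name, last_name, email, phone, created_at);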
Error: "Access denied"
Why: MySQL user doesn't have FILE privilege.
Fix:
GRANT FILE ON *.* TO 'your_user'@'localhost';
FLUSH PRIVILEGES;
Error: "Data truncated for column"
Why: Your CSV data is longer than the column width.
Fix: Increase column size:
ALTER TABLE customers MODIFY COLUMN first_name VARCHAR(200);
Method 2: MySQL Workbench (GUI)
Prefer clicking buttons? MySQL Workbench has a visual import wizard.
Steps:
- Right-click on table → Table Data Import Wizard
- Browse to your CSV file
- Click Next
- Map CSV columns to table columns
- Review and click Next
- Import executes
Pros: Visual, easier for beginners
Cons: Slower than command line, less control
Method 3: PostgreSQL COPY Command
PostgreSQL's equivalent of LOAD DATA INFILE, and typically the fastest way to bulk-load a table.
Create Table First
CREATE TABLE customers (
id SERIAL PRIMARY KEY,
first_name VARCHAR(100),
last_name VARCHAR(100),
email VARCHAR(255) UNIQUE,
phone VARCHAR(20),
created_at DATE
);
COPY Command
COPY customers(first_name, last_name, email, phone, created_at)
FROM '/path/to/file.csv'
DELIMITER ','
CSV HEADER;
Breakdown:
- COPY customers: Target table
- (column list): Which columns to import
- FROM '/path/to/file.csv': File path
- DELIMITER ',': Comma-separated
- CSV: File is CSV format
- HEADER: First row is headers (skip it)
For files that use the \N marker for NULLs (in CSV mode, COPY otherwise treats unquoted empty fields as NULL by default):
COPY customers(first_name, last_name, email, phone, created_at)
FROM '/path/to/file.csv'
DELIMITER ','
CSV HEADER
NULL '\N';
Common PostgreSQL Errors
Error: "Permission denied"
Why: PostgreSQL user can't read the file.
Fix:
- Move the file to the /tmp directory (usually readable)
- Or grant read permission: chmod 644 /path/to/file.csv
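Alternatively, run the import through psql's \copy metacommand, which reads the file as your own user instead of the postgres server process (same options as COPY, but on one line):
\copy customers(first_name, last_name, email, phone, created_at) FROM '/path/to/file.csv' DELIMITER ',' CSV HEADER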
Error: "Invalid byte sequence for encoding UTF8"
Why: File encoding doesn't match database encoding.
Fix: Convert CSV to UTF-8 first.
Error: "Extra data after last expected column"
Why: CSV has more columns than specified in COPY command.
Fix: Either:
- Add missing columns to your COPY statement
- Or remove extra columns from CSV
Method 4: pgAdmin (PostgreSQL GUI)
PostgreSQL's equivalent of MySQL Workbench.
Steps:
- Right-click table → Import/Export
- Select Import
- Choose file
- Set format to CSV
- Configure delimiter, header, NULL string
- Execute
Columns tab: Map CSV columns to table columns if names don't match exactly.
Handling Special Cases
Case 1: CSV Columns Don't Match Table
Your CSV:
fullname,email_address,phone_number
Your table:
first_name,last_name,email,phone
Solution: Either:
- Modify CSV headers to match table
- Or use column mapping in import wizard
- Or import to staging table, then transform with SQL
Staging table approach:
-- Import to temp table
CREATE TABLE temp_import (fullname TEXT, email_address TEXT, phone_number TEXT);
COPY temp_import FROM 'file.csv' CSV HEADER;
-- Transform and insert
INSERT INTO customers (first_name, last_name, email, phone)
SELECT
SPLIT_PART(fullname, ' ', 1),
SPLIT_PART(fullname, ' ', 2),
email_address,
phone_number
FROM temp_import;
-- Clean up
DROP TABLE temp_import;
Case 2: Data Type Mismatches
CSV has: 12/5/2024
Table expects: DATE in YYYY-MM-DD format
Fix: Import as TEXT first, then convert:
-- Import to staging table with TEXT columns
CREATE TABLE staging (date_text TEXT, amount_text TEXT);
COPY staging FROM 'file.csv' CSV HEADER;
-- Convert and insert
INSERT INTO real_table (date_col, amount_col)
SELECT
TO_DATE(date_text, 'MM/DD/YYYY'),
CAST(REPLACE(amount_text, '$', '') AS DECIMAL(10,2))
FROM staging;
Case 3: Really Large CSVs
Problem: CSV has 100M rows, import takes forever or times out.
Solution: Batch import.
Split the CSV first, stripping the header row so every chunk imports the same way (note that split produces files named chunk_aa, chunk_ab, and so on, with no .csv extension):
tail -n +2 large.csv | split -l 1000000 - chunk_
Import each chunk (no IGNORE 1 ROWS needed, since the header is gone):
LOAD DATA INFILE 'chunk_aa' INTO TABLE ...;
LOAD DATA INFILE 'chunk_ab' INTO TABLE ...;
Or use a script to loop through the chunks, as shown below.
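A minimal bash sketch for that loop, assuming local_infile is enabled and the customers table from earlier (your_user and your_db are placeholders):
# Load each chunk client-side; -p prompts for the password each run
# (put credentials in ~/.my.cnf to avoid repeated prompts)
for f in chunk_*; do
  mysql --local-infile=1 -u your_user -p your_db -e \
    "LOAD DATA LOCAL INFILE '$f' INTO TABLE customers
     FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
     (first_name, last_name, email, phone, created_at)"
done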
Data Type Matching Guide
Text → VARCHAR/TEXT:
name VARCHAR(100)
description TEXT
Numbers:
- Integers: INT, BIGINT
- Decimals: DECIMAL(10,2), NUMERIC(10,2)
- Money: DECIMAL(15,2) (avoid FLOAT for currency)
Dates/Times:
- Date only: DATE (YYYY-MM-DD)
- Date + time: DATETIME (MySQL) or TIMESTAMP (PostgreSQL)
- Time only: TIME
Booleans:
- MySQL: TINYINT(1) or BOOLEAN
- PostgreSQL: BOOLEAN
NULL values:
- Allow NULL: don't specify NOT NULL
- Disallow NULL: add a NOT NULL constraint
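Putting it together, a hypothetical orders table using these types (MySQL syntax; swap in BOOLEAN and TIMESTAMP for PostgreSQL):
CREATE TABLE orders (
  id INT AUTO_INCREMENT PRIMARY KEY,
  description TEXT,
  quantity INT,
  unit_price DECIMAL(10,2),  -- exact decimal, safe for money
  ordered_at DATETIME,
  is_paid TINYINT(1) NOT NULL DEFAULT 0
);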
After Import: Verify Data
Don't assume it worked. Check:
Row count matches:
SELECT COUNT(*) FROM customers;
Should match your CSV row count (minus header row).
Spot-check data:
SELECT * FROM customers LIMIT 10;
Look for:
- Data in correct columns
- No garbled text (encoding issues)
- Dates formatted correctly
- NULLs where expected
Check for duplicates (if relevant):
SELECT email, COUNT(*)
FROM customers
GROUP BY email
HAVING COUNT(*) > 1;
Validate data ranges:
-- Any negative ages?
SELECT * FROM customers WHERE age < 0;
-- Any future birthdates?
SELECT * FROM customers WHERE birth_date > CURRENT_DATE;
Best Practices
1. Always Backup Before Importing
-- Create backup
CREATE TABLE customers_backup AS SELECT * FROM customers;
-- If import goes wrong
DROP TABLE customers;
ALTER TABLE customers_backup RENAME TO customers;
Note: CREATE TABLE ... AS SELECT copies the data but not indexes or constraints, so recreate those if you have to fall back to the backup.
2. Use Transactions
Wrap imports in a transaction so you can roll back if something goes wrong. PostgreSQL's COPY is fully transactional; in MySQL this works with transactional engines like InnoDB:
BEGIN;
LOAD DATA INFILE 'file.csv' INTO TABLE customers ...;
-- Check if it looks right
SELECT COUNT(*) FROM customers;
-- If good:
COMMIT;
-- If bad:
ROLLBACK;
3. Test with Sample First
Import first 100 rows to a test table:
- Create test table (same structure)
- Import sample CSV (first 100 rows)
- Verify data looks correct
- Then import full dataset
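To pull that sample on a Unix shell (sample.csv is just a placeholder name):
# 101 lines = 1 header row + 100 data rows
head -n 101 file.csv > sample.csv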
4. Index After Import, Not Before
Adding indexes before importing = slower import.
Better approach:
- Drop indexes
- Import data
- Recreate indexes
-- Before import
DROP INDEX idx_email ON customers;
-- Import data
LOAD DATA...
-- After import
CREATE INDEX idx_email ON customers(email);
5. Disable Foreign Key Checks (If Needed)
If you have foreign key constraints, temporarily disable them during import. Be aware that re-enabling checks doesn't validate rows loaded while they were off:
MySQL:
SET FOREIGN_KEY_CHECKS=0;
LOAD DATA...
SET FOREIGN_KEY_CHECKS=1;
PostgreSQL:
ALTER TABLE customers DISABLE TRIGGER ALL;
COPY...
ALTER TABLE customers ENABLE TRIGGER ALL;
Note: DISABLE TRIGGER ALL needs superuser rights on PostgreSQL, since foreign keys are enforced by system triggers.
The Bottom Line
CSV to database import process:
- Clean CSV (headers, encoding, format)
- Create table with proper schema
- Use LOAD DATA (MySQL) or COPY (PostgreSQL)
- Verify data integrity after import
Common gotchas:
- File permissions
- NULL handling
- Data type mismatches
- Encoding issues
Pro tip: When in doubt, import to staging table first. Transform data with SQL, then insert into real table. Gives you more control and easier error recovery.
Prepare CSVs for database import: HappyCSV's tools help generate SQL INSERT statements and format your CSV for database compatibility.