JSONL Database Import & Export: PostgreSQL, MongoDB & More

A comprehensive guide to importing JSONL files into databases and exporting database records as JSONL. Covers PostgreSQL COPY commands, MongoDB CLI tools, MySQL JSON_TABLE, SQLite scripts, and production-ready bulk import patterns.

Last updated: February 2026

Why Use JSONL for Database Import & Export?

JSONL (JSON Lines) has become a preferred format for moving data between databases and external systems. Each line is an independent, self-contained JSON object, which means you can stream records one at a time without loading the entire dataset into memory. This is critical when you are migrating millions of rows between PostgreSQL instances, syncing MongoDB collections to a data warehouse, or feeding database exports into machine learning pipelines.

Compared to CSV, JSONL preserves data types, supports nested objects and arrays, and eliminates the ambiguity of delimiter escaping. A JSONL file can carry a PostgreSQL jsonb column exactly as-is, while CSV would require you to flatten or escape the nested structure. Compared to a full JSON array, JSONL is streamable: you can process, validate, or transform each record the moment it is read, rather than waiting for the entire file to parse.
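To make the streaming point concrete, here is a minimal Python sketch (the data is illustrative) that processes a JSONL payload one record at a time, so memory use stays constant no matter how large the input is:

```python
import json
import io

# Two records, one JSON object per line; this could just as easily be
# an open file handle over a multi-gigabyte export.
jsonl_data = io.StringIO(
    '{"name": "Alice", "tags": ["admin", "staff"]}\n'
    '{"name": "Bob", "tags": []}\n'
)

# Each line parses independently -- no other line needs to be in memory.
names = []
for line in jsonl_data:
    record = json.loads(line)
    names.append(record["name"])
print(names)  # -> ['Alice', 'Bob']
```

The same loop works unchanged on a real file object, which is exactly why JSONL suits pipelines that cannot afford to parse a full JSON array up front.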

Every major database system now has tooling for JSONL. PostgreSQL can COPY rows through jsonb casting, MongoDB ships with mongoimport and mongoexport that default to JSONL, MySQL 8.0 added JSON_TABLE for structured extraction, and SQLite has a JSON1 extension. In the following sections, you will learn the exact commands, scripts, and best practices for each database.

PostgreSQL: Import & Export JSONL

PostgreSQL offers several approaches for working with JSONL. The COPY command provides the fastest bulk import path, while the jsonb data type lets you store and query semi-structured data natively. For structured tables, you can parse JSONL fields into typed columns during import.

The fastest way to import JSONL into PostgreSQL is to load each line as a jsonb value using the COPY command. Create a table with a single jsonb column, then use COPY FROM to stream the file directly into the database. One caveat: COPY's default text format treats backslashes as escape characters, so JSONL lines containing backslash escapes (such as \" or \u sequences inside strings) can be altered on load; a common workaround is COPY with FORMAT csv and quote/delimiter characters that cannot appear in the data.

Import JSONL into a jsonb Column
-- Create a staging table with a single jsonb column
CREATE TABLE staging_import (data jsonb);
-- Import the JSONL file (each line becomes one row)
\COPY staging_import (data) FROM 'users.jsonl';
-- Verify the import
SELECT count(*) FROM staging_import;
SELECT data->>'name' AS name, data->>'email' AS email
FROM staging_import
LIMIT 5;
-- Move data into a structured table
INSERT INTO users (name, email, age)
SELECT
  data->>'name',
  data->>'email',
  (data->>'age')::int
FROM staging_import;

To export data as JSONL, use row_to_json() or to_jsonb() to convert each row into a JSON object, then COPY the result to a file. Each row becomes one line in the output file.

Export PostgreSQL Rows as JSONL
-- Export entire table as JSONL
-- (psql requires the entire \COPY meta-command on a single line)
\COPY (SELECT row_to_json(t) FROM (SELECT id, name, email, created_at FROM users) t) TO 'users_export.jsonl';
-- Export with filtering and transformation
\COPY (SELECT row_to_json(t) FROM (SELECT id, name, email, created_at::text AS created_at FROM users WHERE active = true ORDER BY id) t) TO 'active_users.jsonl';
-- Verify: each line is valid JSON
-- {"id":1,"name":"Alice","email":"alice@example.com",...}

For production workloads, use a Python script with psycopg2 to read a JSONL file and insert records into typed columns. This gives you full control over validation, error handling, and batch sizing.

Python Script for Structured PostgreSQL Import
import json
import psycopg2
from psycopg2.extras import execute_values

def import_jsonl_to_postgres(file_path, conn_string):
    conn = psycopg2.connect(conn_string)
    cur = conn.cursor()
    batch = []
    batch_size = 1000
    total = 0
    with open(file_path, 'r') as f:
        for line_num, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue
            try:
                record = json.loads(line)
                batch.append((
                    record['name'],
                    record['email'],
                    record.get('age'),
                ))
            except (json.JSONDecodeError, KeyError) as e:
                print(f"Skipping line {line_num}: {e}")
                continue
            if len(batch) >= batch_size:
                execute_values(
                    cur,
                    "INSERT INTO users (name, email, age) "
                    "VALUES %s ON CONFLICT (email) DO NOTHING",
                    batch
                )
                total += len(batch)
                batch = []
                print(f"Imported {total} records...")
    if batch:
        execute_values(
            cur,
            "INSERT INTO users (name, email, age) "
            "VALUES %s ON CONFLICT (email) DO NOTHING",
            batch
        )
        total += len(batch)
    conn.commit()
    cur.close()
    conn.close()
    print(f"Done. Imported {total} records.")

# Usage
import_jsonl_to_postgres('users.jsonl', 'postgresql://localhost/mydb')

MongoDB: Native JSONL Support

MongoDB has first-class JSONL support through its command-line Database Tools. The mongoimport and mongoexport utilities default to JSONL format, making MongoDB one of the easiest databases to work with for JSONL data exchange. Each line in a JSONL file maps directly to a MongoDB document.

mongoimport reads a JSONL file and inserts each line as a document into the specified collection. It supports upsert mode, field type coercion, and parallel insertion for high throughput.

Import JSONL with mongoimport
# Basic import: each line becomes a document
mongoimport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --file=users.jsonl

# Upsert mode: update existing documents matched on the email field
mongoimport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --file=users.jsonl \
  --mode=upsert \
  --upsertFields=email

# Drop collection before import (clean slate)
mongoimport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --file=users.jsonl \
  --drop

# Import with parallel workers for large files
mongoimport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --file=large_users.jsonl \
  --numInsertionWorkers=4

mongoexport writes each document in a collection as a single JSON line. You can filter, project, and sort the output. The result is a valid JSONL file ready for processing by any downstream system.

Export MongoDB to JSONL with mongoexport
# Export entire collection as JSONL (default format)
mongoexport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --out=users_export.jsonl

# Export with query filter (quote the JSON so the shell passes it intact)
mongoexport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --query='{"active": true}' \
  --out=active_users.jsonl

# Export specific fields only
mongoexport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --fields=name,email,created_at \
  --out=users_partial.jsonl

# Export with sorting
mongoexport \
  --uri="mongodb://localhost:27017/mydb" \
  --collection=users \
  --sort='{"created_at": -1}' \
  --out=users_sorted.jsonl

MySQL: JSONL with JSON_TABLE & Scripts

MySQL does not have a built-in JSONL import command like MongoDB, but MySQL 8.0 introduced JSON_TABLE which lets you extract structured data from JSON strings. For bulk JSONL import, the most reliable approach combines LOAD DATA INFILE for raw loading with JSON_TABLE for extraction, or uses a scripting language like Python.

Load the JSONL file as raw text into a staging table, then use JSON_TABLE to extract fields into a structured table. This two-step approach works entirely within MySQL.

Import JSONL Using a Staging Table & JSON_TABLE
-- Step 1: Create a staging table for raw lines
CREATE TABLE jsonl_staging (
  id INT AUTO_INCREMENT PRIMARY KEY,
  raw_line TEXT NOT NULL
);

-- Step 2: Load the JSONL file as raw text
-- (disable backslash escaping so JSON escape sequences survive intact)
LOAD DATA INFILE '/var/lib/mysql-files/users.jsonl'
INTO TABLE jsonl_staging
FIELDS ESCAPED BY ''
LINES TERMINATED BY '\n'
(raw_line);

-- Step 3: Extract structured data with JSON_TABLE
INSERT INTO users (name, email, age)
SELECT jt.name, jt.email, jt.age
FROM jsonl_staging,
     JSON_TABLE(
       raw_line, '$'
       COLUMNS (
         name  VARCHAR(255) PATH '$.name',
         email VARCHAR(255) PATH '$.email',
         age   INT PATH '$.age'
       )
     ) AS jt;
-- Step 4: Clean up staging table
DROP TABLE jsonl_staging;
-- Verify the import
SELECT * FROM users LIMIT 5;

For more flexibility, use a Python script with mysql-connector-python to read a JSONL file and batch-insert records. This approach handles large files efficiently and provides detailed error reporting.

Python Script for MySQL JSONL Import
import json
import mysql.connector

def import_jsonl_to_mysql(file_path, config):
    conn = mysql.connector.connect(**config)
    cursor = conn.cursor()
    batch = []
    batch_size = 1000
    total = 0
    insert_sql = (
        "INSERT INTO users (name, email, age) "
        "VALUES (%s, %s, %s) "
        "ON DUPLICATE KEY UPDATE name=VALUES(name)"
    )
    with open(file_path, 'r') as f:
        for line_num, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue
            try:
                record = json.loads(line)
                batch.append((
                    record['name'],
                    record['email'],
                    record.get('age'),
                ))
            except (json.JSONDecodeError, KeyError) as e:
                print(f"Skipping line {line_num}: {e}")
                continue
            if len(batch) >= batch_size:
                cursor.executemany(insert_sql, batch)
                conn.commit()
                total += len(batch)
                batch = []
                print(f"Imported {total} records...")
    if batch:
        cursor.executemany(insert_sql, batch)
        conn.commit()
        total += len(batch)
    cursor.close()
    conn.close()
    print(f"Done. Imported {total} records.")

# Usage
import_jsonl_to_mysql('users.jsonl', {
    'host': 'localhost',
    'user': 'root',
    'password': 'secret',
    'database': 'mydb'
})

SQLite: Lightweight JSONL Import

SQLite is an excellent choice for local development and smaller datasets. While SQLite does not have a native JSONL import command, its JSON1 extension provides json_extract() for working with JSON data, and a simple Python script can import JSONL files efficiently using SQLite's built-in transaction support.

Use Python's built-in sqlite3 module to read a JSONL file and insert records into an SQLite database. Wrapping all inserts in a single transaction dramatically improves performance for large files.

Python Script for SQLite JSONL Import
import json
import sqlite3

def import_jsonl_to_sqlite(jsonl_path, db_path):
    conn = sqlite3.connect(db_path)
    cursor = conn.cursor()
    # Create table if it doesn't exist
    cursor.execute("""
        CREATE TABLE IF NOT EXISTS users (
            id INTEGER PRIMARY KEY AUTOINCREMENT,
            name TEXT NOT NULL,
            email TEXT UNIQUE,
            age INTEGER,
            raw_json TEXT
        )
    """)
    total = 0
    errors = 0
    with open(jsonl_path, 'r') as f:
        # Use a single transaction for bulk insert performance
        conn.execute('BEGIN TRANSACTION')
        for line_num, line in enumerate(f, 1):
            line = line.strip()
            if not line:
                continue
            try:
                record = json.loads(line)
                cursor.execute(
                    "INSERT OR IGNORE INTO users "
                    "(name, email, age, raw_json) "
                    "VALUES (?, ?, ?, ?)",
                    (
                        record['name'],
                        record.get('email'),
                        record.get('age'),
                        line,  # Store original JSON
                    )
                )
                total += 1
            except (json.JSONDecodeError, KeyError) as e:
                errors += 1
                print(f"Line {line_num}: {e}")
    conn.commit()
    conn.close()
    print(f"Imported {total} records, {errors} errors")

# Usage
import_jsonl_to_sqlite('users.jsonl', 'app.db')

# Query with json_extract (SQLite JSON1 extension)
# SELECT json_extract(raw_json, '$.name') FROM users;
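The json_extract() query in the comment above can be run from Python as well. A self-contained sketch against an in-memory database (table and data are illustrative, and a SQLite build with the JSON1 functions is assumed, which is the default in modern builds):

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE users (raw_json TEXT)")
conn.executemany(
    "INSERT INTO users (raw_json) VALUES (?)",
    [(json.dumps({"name": "Alice", "age": 30}),),
     (json.dumps({"name": "Bob", "age": 25}),)]
)
# json_extract pulls a field out of the stored JSON text
rows = conn.execute(
    "SELECT json_extract(raw_json, '$.name') FROM users ORDER BY 1"
).fetchall()
print(rows)  # -> [('Alice',), ('Bob',)]
conn.close()
```

Storing the raw JSON alongside typed columns, as the import script does, keeps this kind of ad hoc querying available even for fields you did not extract at import time.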

Bulk Import Best Practices

Importing large JSONL files into a database requires careful attention to transactions, error handling, and resource management. These best practices apply across all database systems and will help you build reliable, production-ready import pipelines.

Batch Transactions

Performance

Never insert one row per transaction. Group inserts into batches of 500-5000 rows per transaction. This reduces disk I/O and lock overhead by 10-100x. Most databases see the biggest gains at batch sizes of 1000. Going larger offers diminishing returns and increases the cost of a rollback if something fails.
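As a sketch of the batching idea (sizes are illustrative), a small helper can group any iterable of records into fixed-size chunks so that each chunk becomes one transaction:

```python
def batched(records, batch_size=1000):
    """Yield lists of up to batch_size records from any iterable."""
    batch = []
    for record in records:
        batch.append(record)
        if len(batch) >= batch_size:
            yield batch
            batch = []
    if batch:
        yield batch  # final partial batch

# 2500 records split into transactions of at most 1000
chunks = list(batched(range(2500), batch_size=1000))
print([len(c) for c in chunks])  # -> [1000, 1000, 500]
```

Because the helper is a generator, it never materializes more than one batch at a time, which matches the streaming nature of JSONL input.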

Line-Level Error Handling

Reliability

Parse and validate each JSONL line individually. Log the line number and error message for any line that fails validation or insertion. Use ON CONFLICT / ON DUPLICATE KEY to handle unique constraint violations gracefully instead of aborting the entire import. Save failed lines to a separate error file for later review.

Progress Tracking

Observability

For files with millions of lines, log progress every N records (for example, every 10,000). Track total lines read, successful inserts, skipped duplicates, and errors. Calculate and display the import rate (records/second) to identify performance bottlenecks. This feedback is essential for monitoring long-running imports.

This generic pattern combines batching, error handling, and progress tracking. Adapt it to any database by swapping the insert function.

Production-Ready Batch Import Pattern
import json
import time

def batch_import_jsonl(file_path, insert_fn, batch_size=1000):
    """Generic JSONL batch importer with error handling."""
    batch = []
    stats = {
        'total_lines': 0,
        'imported': 0,
        'skipped': 0,
        'errors': 0
    }
    start_time = time.time()
    with open(file_path, 'r') as f, \
         open(file_path + '.errors', 'w') as error_file:
        for line_num, line in enumerate(f, 1):
            stats['total_lines'] = line_num
            line = line.strip()
            if not line:
                stats['skipped'] += 1
                continue
            try:
                record = json.loads(line)
                batch.append(record)
            except json.JSONDecodeError as e:
                stats['errors'] += 1
                error_file.write(f"{line_num}\t{e}\t{line}\n")
                continue
            if len(batch) >= batch_size:
                inserted = insert_fn(batch)
                stats['imported'] += inserted
                batch = []
            if line_num % 10000 == 0:
                elapsed = time.time() - start_time
                rate = stats['imported'] / elapsed
                print(
                    f"Progress: {line_num:,} lines | "
                    f"{stats['imported']:,} imported | "
                    f"{rate:,.0f} records/sec"
                )
        # Flush remaining batch
        if batch:
            inserted = insert_fn(batch)
            stats['imported'] += inserted
    elapsed = time.time() - start_time
    print(f"\nComplete in {elapsed:.1f}s")
    print(f"  Lines:    {stats['total_lines']:,}")
    print(f"  Imported: {stats['imported']:,}")
    print(f"  Errors:   {stats['errors']:,}")
    return stats
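As a usage sketch, the pattern can be wired to SQLite by supplying an insert function (the table name and schema here are illustrative); the same shape works for PostgreSQL or MySQL by swapping the driver calls:

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute("CREATE TABLE events (payload TEXT)")

def insert_batch(batch):
    """Insert one batch in a single transaction; return the row count."""
    with conn:  # commits on success, rolls back on error
        conn.executemany(
            "INSERT INTO events (payload) VALUES (?)",
            [(json.dumps(record),) for record in batch]
        )
    return len(batch)

# Passed as the insert_fn argument to the importer above:
#   batch_import_jsonl('events.jsonl', insert_batch)
```

Returning the row count from the insert function is what lets the importer keep its imported total accurate without knowing anything about the target database.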

Prepare Your Data with Free JSONL Tools

Before importing JSONL into your database, validate the file format and convert between formats. Our free online tools process everything locally in your browser, so your data never leaves your machine.

