JSON vs JSONL: What's the Difference?
A complete guide to understanding JSON and JSONL (JSON Lines) formats, their differences, and when to use each one.
Last updated: February 2026
Quick Comparison
| Feature | JSON | JSONL (JSON Lines) |
|---|---|---|
| Structure | Single object or array | One JSON object per line |
| File Extension | .json | .jsonl or .ndjson |
| Parsing | Must parse entire file | Can parse line by line |
| Streaming | Difficult | Native support |
| Large File Handling | Memory intensive | Memory efficient |
| Human Readable | Formatted with indentation | Compact, one record per line |
| Schema Validation | JSON Schema supported | Per-line JSON Schema |
| Appending Data | Must rewrite entire file | Simply append a new line |
What is JSON?
JSON (JavaScript Object Notation) is a lightweight data interchange format that is easy for humans to read and write, and easy for machines to parse and generate. It is based on a subset of JavaScript and has become the de facto standard for data exchange on the web.
A JSON file contains a single JSON value - typically an object or an array. The entire file must be parsed as one unit to access any data within it.
```json
[
  {"name": "Alice", "age": 30, "city": "New York"},
  {"name": "Bob", "age": 25, "city": "London"},
  {"name": "Charlie", "age": 35, "city": "Tokyo"}
]
```
What is JSONL (JSON Lines)?
JSONL (JSON Lines), also known as Newline Delimited JSON (NDJSON), is a text format where each line is a valid JSON object. Lines are separated by newline characters (\n). Unlike standard JSON, there is no wrapping array or object around the data.
Each line in a JSONL file is self-contained and can be parsed independently. This makes JSONL ideal for streaming data, log files, and processing large datasets without loading everything into memory.
```jsonl
{"name": "Alice", "age": 30, "city": "New York"}
{"name": "Bob", "age": 25, "city": "London"}
{"name": "Charlie", "age": 35, "city": "Tokyo"}
```
Key Differences Between JSON and JSONL
1. File Format & Structure
JSON wraps all data in a single object or array. The file must be syntactically valid as a whole - a missing bracket at the end invalidates the entire file.
JSONL stores one JSON object per line. Each line is independent - if one line has an error, the other lines can still be parsed.
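As a sketch of this fault isolation, a line-by-line parser can skip a malformed record and keep the rest. The `parseJsonlTolerant` helper below is illustrative, not part of any library:

```javascript
// Parse JSONL text line by line, collecting valid records and
// reporting (rather than failing on) malformed lines.
function parseJsonlTolerant(text) {
  const records = [];
  const errors = [];
  text.split('\n').forEach((line, i) => {
    if (!line.trim()) return; // skip blank lines
    try {
      records.push(JSON.parse(line));
    } catch (err) {
      errors.push({ line: i + 1, message: err.message });
    }
  });
  return { records, errors };
}

const input = [
  '{"name": "Alice", "age": 30}',
  '{"name": "Bob", "age":',        // malformed: truncated line
  '{"name": "Charlie", "age": 35}'
].join('\n');

const { records, errors } = parseJsonlTolerant(input);
// records holds the two valid objects; errors reports line 2
```

With a standard JSON array, the same truncated element would make `JSON.parse` throw and lose all three records.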
2. Parsing & Memory Usage
JSON requires loading and parsing the entire file at once. Parsing a 1GB JSON file typically needs well over 1GB of memory, since the in-memory object tree is larger than the raw text.
JSONL can be read and parsed line by line (streaming). Even a 10GB JSONL file can be processed with minimal memory by reading one line at a time.
3. Streaming & Real-time Data
JSON is not designed for streaming. You cannot start processing data until the entire file or response is received and parsed.
JSONL is ideal for streaming. Each line can be processed as soon as it's received, making it perfect for real-time data feeds, logs, and server-sent events.
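A minimal sketch of incremental NDJSON parsing, assuming the input arrives as an async iterable of text chunks (the `ndjsonRecords` generator is a hypothetical helper; chunk boundaries rarely line up with newlines, so partial lines must be buffered):

```javascript
// Emit parsed records as soon as each complete line arrives,
// buffering any trailing partial line between chunks.
async function* ndjsonRecords(chunks) {
  let buffer = '';
  for await (const chunk of chunks) {
    buffer += chunk;
    let newline;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) yield JSON.parse(line);
    }
  }
  if (buffer.trim()) yield JSON.parse(buffer); // final line without \n
}
```

With `fetch`, the response body is a byte stream; decoding each chunk with `TextDecoder` and feeding the resulting strings to this generator yields records as they arrive, before the response has finished downloading.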
4. Appending & Writing Data
Appending data to a JSON array requires reading the entire file, parsing it, adding the new element, and rewriting the complete file.
Appending to a JSONL file is as simple as adding a new line at the end. No need to read or modify existing data. This is O(1) vs O(n) for JSON.
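A minimal sketch of the constant-time append in Node.js (the `appendRecord` helper and the `events.jsonl` file name are illustrative):

```javascript
import fs from 'fs';

// Append one event to a JSONL log: serialize, add a newline, write.
// The existing file is never read, parsed, or rewritten.
function appendRecord(path, record) {
  fs.appendFileSync(path, JSON.stringify(record) + '\n');
}

appendRecord('events.jsonl', { event: 'login', user: 'alice' });
appendRecord('events.jsonl', { event: 'logout', user: 'alice' });
// events.jsonl now ends with two new lines, one object each
```

The equivalent JSON-array append would have to parse the whole file, push the new element, and serialize everything back out.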
5. Human Readability
JSON supports pretty-printing with indentation, making it very readable for configuration files and API responses.
JSONL is compact by design - one record per line. It's optimized for machine processing rather than human reading, though individual lines can be pretty-printed.
When to Use JSON vs JSONL
Use JSON when:
- Working with configuration files
- Building REST API request/response bodies
- Storing small, structured data (under 10MB)
- Data needs to be human-readable and editable
- Working with nested, hierarchical data structures
- Browser-side data exchange
Use JSONL when:
- Processing large datasets (100MB+)
- Streaming data in real-time
- Writing log files and event data
- Training machine learning models (OpenAI, Hugging Face)
- ETL pipelines and data processing
- Appending data frequently
- Processing data with Unix tools (grep, awk, sed)
Common Use Cases
Machine Learning & AI
JSONL is the standard format for ML training data. OpenAI uses JSONL for fine-tuning datasets, and many ML frameworks expect JSONL input. Each line represents one training example.
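As a sketch, training examples are serialized one per line; the `{ messages: [...] }` shape below mirrors OpenAI's chat fine-tuning format, but check your framework's documentation for the exact schema it expects:

```javascript
// Two hypothetical training examples, each a self-contained object.
const examples = [
  {
    messages: [
      { role: 'user', content: 'What is 2 + 2?' },
      { role: 'assistant', content: '4' }
    ]
  },
  {
    messages: [
      { role: 'user', content: 'Capital of France?' },
      { role: 'assistant', content: 'Paris' }
    ]
  }
];

// One JSON object per line - the JSONL file the trainer consumes.
const jsonl = examples.map(e => JSON.stringify(e)).join('\n');
```

Because each line stands alone, a dataset can be shuffled, sampled, or split with ordinary line-oriented tools.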
Application Logging
Server logs, application events, and audit trails are naturally suited for JSONL. Each event is written as a single line, making it easy to append, search with grep, and process with streaming tools.
Web APIs & Configuration
REST APIs typically use JSON for request and response bodies. Configuration files (.json) use JSON for readable, structured settings with nested objects.
Big Data & Analytics
Data pipelines, ETL processes, and analytics tools use JSONL for processing large volumes of data. Each record can be processed independently, enabling parallel processing and MapReduce patterns.
Code Examples
```javascript
// Reading JSON - must load entire file
import fs from 'fs';

const data = JSON.parse(fs.readFileSync('data.json', 'utf8'));

// Access data
data.forEach(item => {
  console.log(item.name);
});
```
```javascript
// Reading JSONL - stream line by line
import { createReadStream } from 'fs';
import { createInterface } from 'readline';

const rl = createInterface({
  input: createReadStream('data.jsonl'),
  crlfDelay: Infinity
});

for await (const line of rl) {
  const item = JSON.parse(line);
  console.log(item.name);
}
```
```javascript
// JSON array to JSONL
import fs from 'fs';

const jsonArray = JSON.parse(fs.readFileSync('data.json', 'utf8'));
const jsonl = jsonArray
  .map(item => JSON.stringify(item))
  .join('\n');
fs.writeFileSync('data.jsonl', jsonl + '\n');
```
```javascript
// JSONL to JSON array
import fs from 'fs';

const lines = fs.readFileSync('data.jsonl', 'utf8')
  .split('\n')
  .filter(line => line.trim());
const jsonArray = lines.map(line => JSON.parse(line));
fs.writeFileSync('data.json', JSON.stringify(jsonArray, null, 2));
```