JSONL in JavaScript: Read, Write & Parse
A complete guide to working with JSONL (JSON Lines) files in JavaScript. Learn to read, write, parse, and stream JSONL data in both Node.js and the browser using built-in APIs and popular libraries.
Last updated: February 2026
Why JavaScript for JSONL?
JavaScript is a natural fit for working with JSONL files. JSON itself was born from JavaScript, so the language has first-class parsing support through the built-in JSON.parse() and JSON.stringify() methods. Whether you are building a Node.js data pipeline, processing log files on a server, or letting users upload JSONL in the browser, JavaScript gives you the tools to handle every step.
JSONL (JSON Lines) stores one JSON object per line, making it ideal for streaming, append-only logging, and processing large datasets line by line. Node.js streams align perfectly with this format, letting you read and transform millions of records without running out of memory. On the client side, the browser FileReader and Streams APIs enable JSONL processing entirely on the user's device. In this guide, you will learn how to read JSONL in both Node.js and the browser, write JSONL files, build streaming transform pipelines, and pick the right npm library for your project.
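To make the one-object-per-line idea concrete, here is a minimal (hypothetical) three-record JSONL document, parsed with nothing but the built-in methods:

```javascript
// A JSONL document: one complete JSON object per line
const jsonl = `{"id":1,"event":"login"}
{"id":2,"event":"purchase"}
{"id":3,"event":"logout"}`;

// Each line is an independent JSON document, so each parses on its own
const records = jsonl.split('\n').map(line => JSON.parse(line));
console.log(records.length);   // 3
console.log(records[1].event); // "purchase"
```

Because every line stands alone, a reader can start at any line boundary and a writer can append without touching earlier records.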
Reading JSONL in Node.js
Node.js provides the readline module to efficiently read files line by line. Combined with fs.createReadStream, this is the recommended way to process JSONL files on the server because it streams the data instead of loading the entire file into memory.
Use fs.createReadStream piped into readline.createInterface to read a JSONL file one line at a time. This approach uses minimal memory regardless of file size.
import { createReadStream } from 'node:fs';
import { createInterface } from 'node:readline';

async function readJsonl(filePath) {
  const records = [];
  const rl = createInterface({
    input: createReadStream(filePath, 'utf-8'),
    crlfDelay: Infinity,
  });
  for await (const line of rl) {
    const trimmed = line.trim();
    if (trimmed) {
      records.push(JSON.parse(trimmed));
    }
  }
  return records;
}

// Usage
const data = await readJsonl('data.jsonl');
console.log(`Loaded ${data.length} records`);
console.log(data[0]);
For small JSONL files that fit comfortably in memory, you can read the entire file at once and split by newline. This trades streaming efficiency for simplicity.
import { readFileSync } from 'node:fs';

const records = readFileSync('data.jsonl', 'utf-8')
  .split('\n')
  .filter(line => line.trim())
  .map(line => JSON.parse(line));

console.log(`Loaded ${records.length} records`);
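Both approaches above throw on the first malformed line. If the file may contain bad records, a slightly more defensive variant collects parse errors with their line numbers instead of aborting. This is an illustrative sketch; the function and field names are not from any library:

```javascript
// Parse JSONL leniently: keep good records, report bad lines by number
function parseJsonlLenient(text) {
  const records = [];
  const errors = [];
  text.split('\n').forEach((line, index) => {
    const trimmed = line.trim();
    if (!trimmed) return; // skip blank lines
    try {
      records.push(JSON.parse(trimmed));
    } catch (err) {
      errors.push({ line: index + 1, message: err.message });
    }
  });
  return { records, errors };
}

const { records, errors } = parseJsonlLenient('{"ok":1}\nnot json\n{"ok":2}\n');
console.log(records.length); // 2
console.log(errors[0].line); // 2
```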
Reading JSONL in the Browser
In the browser, users can upload JSONL files through a file input. You can read the file using the FileReader API or the modern Streams API, all without sending data to a server.
Use FileReader to read a file selected by the user, then split the text content into lines and parse each one. This keeps all data client-side for privacy.
function parseJsonlFile(file) {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => {
      const text = reader.result;
      const records = text
        .split('\n')
        .filter(line => line.trim())
        .map(line => JSON.parse(line));
      resolve(records);
    };
    reader.onerror = () => reject(reader.error);
    reader.readAsText(file, 'utf-8');
  });
}

// Usage with an <input type="file"> element
const input = document.querySelector('input[type="file"]');
input.addEventListener('change', async (e) => {
  const file = e.target.files[0];
  const records = await parseJsonlFile(file);
  console.log(`Parsed ${records.length} records`);
});
For large files in the browser, use the Streams API with TextDecoderStream to process the file chunk by chunk. This avoids loading the entire file into memory at once.
async function* streamJsonl(file) {
  const stream = file.stream().pipeThrough(new TextDecoderStream());
  const reader = stream.getReader();
  let buffer = '';
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    buffer += value;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // Keep incomplete last line
    for (const line of lines) {
      const trimmed = line.trim();
      if (trimmed) yield JSON.parse(trimmed);
    }
  }
  // Handle remaining buffer
  if (buffer.trim()) {
    yield JSON.parse(buffer.trim());
  }
}

// Usage
for await (const record of streamJsonl(file)) {
  console.log(record);
}
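The browser can also generate JSONL, not just read it: build the text, wrap it in a Blob, and trigger a client-side download. This is a sketch; note that JSONL has no officially registered MIME type, so `application/x-ndjson` is used here as a common convention:

```javascript
// Serialize records to JSONL text (pure helper, runs anywhere)
function toJsonl(records) {
  return records.map(record => JSON.stringify(record)).join('\n') + '\n';
}

// Browser-only: offer the generated file as a download, entirely client-side
function downloadJsonl(records, filename) {
  const blob = new Blob([toJsonl(records)], { type: 'application/x-ndjson' });
  const url = URL.createObjectURL(blob);
  const a = document.createElement('a');
  a.href = url;
  a.download = filename;
  a.click();
  URL.revokeObjectURL(url);
}

console.log(toJsonl([{ a: 1 }, { a: 2 }])); // '{"a":1}\n{"a":2}\n'
```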
Writing JSONL in Node.js
Writing JSONL files in Node.js is straightforward: serialize each JavaScript object to a JSON string and append a newline character. For best performance with large datasets, use a write stream instead of building the entire string in memory.
Use fs.writeFileSync or fs.createWriteStream to write records to a JSONL file. Each record is a single line of valid JSON followed by a newline.
import { writeFileSync } from 'node:fs';

const records = [
  { id: 1, name: 'Alice', role: 'engineer' },
  { id: 2, name: 'Bob', role: 'designer' },
  { id: 3, name: 'Charlie', role: 'manager' },
];

const jsonl = records.map(record => JSON.stringify(record)).join('\n') + '\n';
writeFileSync('output.jsonl', jsonl, 'utf-8');
console.log(`Wrote ${records.length} records`);
When writing millions of records, use a writable stream to avoid building the entire output string in memory. Watch the return value of stream.write(): when it returns false, the stream's internal buffer is full, so pause and wait for the 'drain' event before writing more.
import { createWriteStream } from 'node:fs';

async function writeJsonl(filePath, records) {
  const stream = createWriteStream(filePath, 'utf-8');
  for (const record of records) {
    const line = JSON.stringify(record) + '\n';
    // Respect backpressure
    if (!stream.write(line)) {
      await new Promise(resolve => stream.once('drain', resolve));
    }
  }
  stream.end();
  await new Promise(resolve => stream.on('finish', resolve));
  console.log('Write complete');
}

// Usage
const data = Array.from({ length: 100000 }, (_, i) => ({
  id: i + 1,
  value: Math.random(),
  timestamp: new Date().toISOString(),
}));
await writeJsonl('large_output.jsonl', data);
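Because every record is a self-contained line, JSONL also supports cheap appends; unlike a JSON array, there is no closing bracket to rewrite. A minimal sketch using fs.appendFileSync (the events.jsonl path is just an example):

```javascript
import { appendFileSync, readFileSync } from 'node:fs';

// Append one record to an existing (or new) JSONL log file.
// No rewriting is needed: the new line simply goes on the end.
function appendRecord(filePath, record) {
  appendFileSync(filePath, JSON.stringify(record) + '\n', 'utf-8');
}

appendRecord('events.jsonl', { event: 'login', at: new Date().toISOString() });

// Verify: the last line of the file is the record we just appended
const lastLine = readFileSync('events.jsonl', 'utf-8').trim().split('\n').pop();
console.log(JSON.parse(lastLine).event); // "login"
```

This append-only property is why JSONL is such a popular format for logs and event streams.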
Transform Streams for JSONL Pipelines
Node.js Transform streams let you build composable data pipelines that read JSONL, process each record, and write the results. This pattern is ideal for ETL jobs, log processing, and data migration.
import { createReadStream, createWriteStream } from 'node:fs';
import { createInterface } from 'node:readline';
import { Transform } from 'node:stream';
import { pipeline } from 'node:stream/promises';

// Transform: parse a JSONL line into an object, process it, stringify back
const transformJsonl = new Transform({
  objectMode: true,
  transform(chunk, encoding, callback) {
    const line = chunk.toString().trim();
    if (!line) return callback();
    try {
      const record = JSON.parse(line);
      // Add your transformation here
      record.processed = true;
      record.processedAt = new Date().toISOString();
      callback(null, JSON.stringify(record) + '\n');
    } catch (err) {
      callback(err);
    }
  },
});

// Build the pipeline: readline is an async iterable of lines, so it can
// feed the transform directly and pipeline() handles backpressure end to end
const rl = createInterface({
  input: createReadStream('input.jsonl', 'utf-8'),
  crlfDelay: Infinity,
});

await pipeline(rl, transformJsonl, createWriteStream('output.jsonl', 'utf-8'));
console.log('Pipeline complete');
This pipeline reads a JSONL file line by line, applies a transformation to each record (adding processed and processedAt fields), and writes the results to a new file. pipeline() propagates backpressure through the whole chain, so memory usage stays roughly constant even for very large files. You can chain multiple transforms for complex ETL workflows.
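As a sketch of chaining, the following hypothetical pipeline runs records through a filtering transform and then an enriching transform, collecting the results in memory for demonstration (in a real ETL job the sink would be a file or network stream):

```javascript
import { Readable, Transform, Writable } from 'node:stream';
import { pipeline } from 'node:stream/promises';

// Transform 1: keep only active records
const onlyActive = new Transform({
  objectMode: true,
  transform(record, _enc, callback) {
    if (record.active) this.push(record);
    callback();
  },
});

// Transform 2: add a derived field to each surviving record
const addLabel = new Transform({
  objectMode: true,
  transform(record, _enc, callback) {
    callback(null, { ...record, label: `user-${record.id}` });
  },
});

// Sink: collect results in memory for demonstration
const results = [];
const sink = new Writable({
  objectMode: true,
  write(record, _enc, callback) {
    results.push(record);
    callback();
  },
});

await pipeline(
  Readable.from([
    { id: 1, active: true },
    { id: 2, active: false },
    { id: 3, active: true },
  ]),
  onlyActive,
  addLabel,
  sink,
);

console.log(results.length);   // 2
console.log(results[1].label); // "user-3"
```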
JavaScript Libraries for JSONL
While built-in JSON.parse handles most cases well, several npm packages provide convenient utilities for JSONL-specific workflows like streaming, validation, and batch processing.
JSON.parse (Built-in)
The built-in JSON.parse() and JSON.stringify() are highly optimized in V8 and sufficient for most JSONL use cases. Combine them with readline for streaming. No dependencies needed, and performance is excellent for files up to several hundred MB.
ndjson
ndjson is a popular npm package that provides streaming JSONL (Newline Delimited JSON) parsers and serializers compatible with Node.js streams. It handles parse errors gracefully and integrates well with existing stream pipelines. Great for quick prototyping.
jsonl-parse-stringify
jsonl-parse-stringify is a lightweight, zero-dependency library that provides simple parse() and stringify() methods for JSONL data. It handles edge cases like trailing newlines and empty lines. Ideal when you want a clean synchronous API without stream setup.
Try Our Free JSONL Tools
Don't want to write code? Use our free online tools to view, convert, and format JSONL files right in your browser. All processing happens locally, so your data stays private.