JSONL to CSV Converter
100% client-side. Your file never leaves your browser. No upload.
Flatten nested JSONL / NDJSON into a flat CSV table. Columns are inferred automatically, using dot notation for nested keys. Open the output in Excel, Google Sheets, BigQuery, or pandas.
Before you start
You need one of the following:
- A .jsonl or .ndjson file (JSON Lines) on your computer, or
- JSONL text you can paste directly into the input pane.
Every line must be a valid JSON object. If a line is a string, a number, or a standalone array, my converter will flag it as an error and skip it. Because CSV is a table format, I need objects so I can map keys to column headers.
I use dot-notation to flatten your data. If you have a nested object like {"user": {"id": 1}}, it becomes a column named user.id. There is no hard file size limit, but since this runs entirely in your browser's memory, 100 MB is usually the "safe" ceiling before things get laggy.
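The flattening described above can be modeled in a few lines. This is an illustrative Python sketch, not the tool's actual JavaScript; the function name `flatten` is hypothetical, but the dot-notation and array-index behaviour matches what the converter produces.

```python
import json

def flatten(obj, prefix=""):
    """Recursively flatten nested dicts (and lists) into dot-notation keys.
    Illustrative sketch; the converter's own logic may differ in details."""
    flat = {}
    for key, value in obj.items():
        name = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, name))
        elif isinstance(value, list):
            # Arrays become indexed columns: tags.0, tags.1, ...
            flat.update(flatten({str(i): v for i, v in enumerate(value)}, name))
        else:
            flat[name] = value
    return flat

record = json.loads('{"user": {"id": 1}, "tags": ["admin", "dev"]}')
print(flatten(record))  # {'user.id': 1, 'tags.0': 'admin', 'tags.1': 'dev'}
```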
How to use it
- Paste your JSONL into the left pane, or drop a .jsonl file onto the drop zone.
- Pick a Delimiter (comma is the default, but semicolon or tab is often better if your data contains lots of long-form text).
- Click Convert. My script will scan every line to build a master list of all possible columns.
- Review the output on the right. If any lines were malformed, I'll list the specific line numbers and errors at the bottom.
- Click Copy or Download .csv to save your flattened table.
Options explained
Delimiter
This sets the character that separates your columns. While comma is the standard, Tab (TSV) is often safer for complex JSON data because it's much less likely to appear inside your actual values, making the resulting file easier for tools like Excel or pandas to parse without quoting headaches.
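To see why the delimiter choice matters, here is a small Python demonstration using the standard csv module (the helper name `render` is made up for this example; the tool itself runs in JavaScript):

```python
import csv
import io

def render(rows, fieldnames, delimiter=","):
    """Serialize rows with the given delimiter; csv quotes a cell only
    when it contains the delimiter, a quote, or a newline."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames, delimiter=delimiter)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

rows = [{"id": "1", "note": "Hello, world"}]   # the value contains a comma
print(render(rows, ["id", "note"], ","))        # comma forces quoting: "Hello, world"
print(render(rows, ["id", "note"], "\t"))       # tab-separated: no quoting needed
```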
Example
Input (JSONL):
{"id": 1, "user": {"name": "Ada"}, "tags": ["admin", "dev"]}
{"id": 2, "user": {"name": "Linus"}, "notes": "Active"}
Output (CSV):
id,user.name,tags.0,tags.1,notes
1,Ada,admin,dev,
2,Linus,,,Active
Notice how the headers are a union of all keys found in the file, and nested arrays are expanded into indexed columns like tags.0.
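The header-union step can be sketched like this, starting from rows that are already flattened. This is a Python illustration of the behaviour, with a hypothetical `union_headers` helper; missing keys come out as empty cells, exactly as in the example above.

```python
import csv
import io

def union_headers(rows):
    """Collect every key seen in any row, preserving first-seen order."""
    headers = []
    for row in rows:
        for key in row:
            if key not in headers:
                headers.append(key)
    return headers

rows = [
    {"id": 1, "user.name": "Ada", "tags.0": "admin", "tags.1": "dev"},
    {"id": 2, "user.name": "Linus", "notes": "Active"},
]
headers = union_headers(rows)
print(headers)  # ['id', 'user.name', 'tags.0', 'tags.1', 'notes']

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=headers)  # missing keys become empty cells
writer.writeheader()
writer.writerows(rows)
print(buf.getvalue())
```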
Tips & common pitfalls
- Column Inference: I have to read the entire input before I can write the first line of CSV. This is because I need to know every possible key that exists in any row to build the header. If you have a 500 MB file, this will be slow.
- Dot-notation for nesting: Nested objects are always flattened: {"a": {"b": 1}} → a.b. If you have deeply recursive structures, your CSV might end up with hundreds of columns.
- Sparse Data: If only one record out of a thousand has a specific field, that column will still be created for every row, but it will be empty for the other 999 records.
- Arrays to Columns: Arrays are flattened by index (0, 1, 2). This works great for short, fixed-length lists but creates "column soup" if you have huge arrays.
Troubleshooting
My browser tab is freezing or crashing.
This happens when the resulting CSV string is too large for the browser to hold in a single <textarea>. Try processing a smaller chunk of the file, or use a command-line tool like jq if you are dealing with multi-gigabyte datasets.
The output columns are in a weird order.
I build the columns in the order they appear in the file. If row 10 has a key that row 1 doesn't, that column is appended to the end of the header list. If you need a specific order, you'll need to re-sort them in Excel or Google Sheets after importing.
It says "Line X is not an object".
CSV requires a key-value structure. If your JSONL contains a line like "just a string" or [1, 2, 3] at the top level, I can't turn that into a table row. Wrap those values in an object (e.g., {"data": [1, 2, 3]}) before converting.
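If you want to pre-process a file with such lines rather than let them be skipped, a small wrapper script can do the object-wrapping for you. This is a hedged Python sketch (the function name `ensure_object` and the "data" key are just the illustration from above, not something the tool does automatically):

```python
import json

def ensure_object(line):
    """Parse a JSONL line; wrap non-object values so they can
    become a table row. Illustrative pre-processing only."""
    value = json.loads(line)
    return value if isinstance(value, dict) else {"data": value}

print(ensure_object('{"id": 1}'))        # {'id': 1}
print(ensure_object('[1, 2, 3]'))        # {'data': [1, 2, 3]}
print(ensure_object('"just a string"'))  # {'data': 'just a string'}
```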
Related tools
See also: if you need to do something adjacent on this site, try JSONL to JSON to convert JSONL into a JSON array (or back), Formatter to pretty-print or minify each JSONL record, or OpenAI Fine Tune Validator to validate an OpenAI fine-tune file against the chat schema.
Frequently asked questions
Does my data get uploaded to a server?
No. I built this tool to run entirely in your browser. The conversion logic is handled by JavaScript on your machine. Your data never leaves your computer, which makes it safe for API logs or sensitive datasets. You can even use this offline.
How are nested objects handled?
I use a recursive flattener. Every level of nesting adds a dot to the column name. This is the standard way to prepare JSON data for tools that expect flat tables, like BigQuery, Amazon S3 Select, or legacy SQL databases.
What about commas and quotes inside my values?
I follow RFC 4180 rules. If a value contains your chosen delimiter, a double quote, or a newline, I wrap the entire cell in double quotes and escape any internal quotes by doubling them. It should "just work" when you open it in Excel.
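The quoting rule is easy to state in code. Here is a minimal Python sketch of RFC 4180-style cell quoting (the helper `quote_cell` is hypothetical; the tool's JavaScript implements the same rule):

```python
def quote_cell(value, delimiter=","):
    """Quote a cell when it contains the delimiter, a double quote,
    or a newline; internal quotes are escaped by doubling them."""
    if any(ch in value for ch in (delimiter, '"', "\n", "\r")):
        return '"' + value.replace('"', '""') + '"'
    return value

print(quote_cell('plain'))      # plain
print(quote_cell('a,b'))        # "a,b"
print(quote_cell('say "hi"'))   # "say ""hi"""
```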
Why are some cells empty?
In JSONL, schemas are often "sparse"—meaning not every object has every field. If a field exists in one line but not another, the CSV will show an empty cell for the missing data to keep the columns aligned.
Is there a limit to how many columns it can create?
Technically no, but Excel has a limit of 16,384 columns. If your JSONL is extremely varied or has deeply nested arrays, you might hit this limit. Try to keep your schema relatively consistent if you plan on using spreadsheet software.
Can I convert a regular JSON array to CSV here?
This tool expects one object per line. If you have a standard JSON file (like [...]), use my JSON ↔ JSONL converter first to turn it into lines, then paste it back here.
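If you'd rather do that conversion in a script, turning a JSON array into JSONL is one line per element. A minimal Python sketch (the function name is illustrative; it mimics what a JSON-to-JSONL converter does):

```python
import json

def array_to_jsonl(text):
    """Serialize each element of a JSON array as one compact line."""
    return "\n".join(json.dumps(item, separators=(",", ":")) for item in json.loads(text))

print(array_to_jsonl('[{"id": 1}, {"id": 2}]'))
# {"id":1}
# {"id":2}
```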