jsonlkit.com
JSONL (JSON Lines) utilities, in the browser

JSONL Dataset Stats

updated 4 May 2026

100% client-side. Your data never leaves the page.

Drop a JSONL file and get the numbers you'd otherwise compute by hand: row counts, duplicate rate, parse-error count, and a per-field breakdown of fill rate, types, distinct value count, and top values. Useful for sanity-checking an export before you ingest it.
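The stats above can be sketched in a few lines. This is a minimal illustration of the idea, not the tool's actual (client-side) code; the `jsonl_stats` name and output shape are made up for the example:

```python
import json
from collections import Counter

def jsonl_stats(text):
    """Row count, parse errors, duplicates, and a per-field breakdown (sketch)."""
    rows, errors, seen = [], 0, Counter()
    for line in text.splitlines():
        if not line.strip():
            continue
        try:
            obj = json.loads(line)
        except json.JSONDecodeError:
            errors += 1
            continue
        rows.append(obj)
        # Canonical form: sorted keys, minimal whitespace, so key order
        # and spacing don't affect duplicate detection.
        seen[json.dumps(obj, sort_keys=True, separators=(",", ":"))] += 1
    duplicates = sum(n - 1 for n in seen.values())
    fields = {}
    for obj in rows:
        if isinstance(obj, dict):
            for k, v in obj.items():
                f = fields.setdefault(k, {"count": 0, "types": Counter(), "values": Counter()})
                f["count"] += 1
                f["types"][type(v).__name__] += 1
                f["values"][json.dumps(v, sort_keys=True)] += 1
    return {
        "rows": len(rows),
        "parse_errors": errors,
        "duplicates": duplicates,
        "fields": {
            k: {
                "fill_rate": f["count"] / len(rows),
                "types": dict(f["types"]),
                "distinct": len(f["values"]),
                "top": f["values"].most_common(3),
            }
            for k, f in fields.items()
        },
    }
```

Fill rate here is "rows where the key is present" divided by total parsed rows, which is the number you want when checking whether an export silently dropped a column.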

What it measures

How to use it

  1. Paste the JSONL into the box, or drop a .jsonl file onto the drop zone.
  2. Click Compute stats.
  3. Read the Overview table for the high-level counts.
  4. Skim the Top-level fields table — fields with low fill rate or surprising types are usually the interesting ones.
  5. If anything failed to parse, the Parse errors list shows the line, column, source, and a suggested fix. Run them through the auto-fixer to clean up.

What it doesn't do

Related tools

Frequently asked questions

Why is the duplicate count higher than I expected?

Duplicates are computed on the canonical (sorted-keys) JSON form, not the raw line text. Two records that differ only in key order or whitespace count as duplicates. If you actually want byte-for-byte duplicates, the deduplicator has a "literal line" mode.
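The canonical form amounts to "parse, then re-serialize with sorted keys and no extra whitespace." A tiny illustration (the `canonical` helper is made up for this example, not the tool's code):

```python
import json

def canonical(line):
    # Key order and spacing vanish after re-serialization,
    # so these two records compare equal.
    return json.dumps(json.loads(line), sort_keys=True, separators=(",", ":"))

a = '{"id": 1, "name": "x"}'
b = '{ "name":"x", "id":1 }'
canonical(a) == canonical(b)  # True, even though a != b byte-for-byte
```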

What about fields that are nested deep?

Use the schema inferrer instead — it walks every level and reports types and required-ness at any depth.