Convert JSON to CSV: Online Tool + Code Examples
JSON is the language of APIs. CSV is the language of spreadsheets and data pipelines. Here is how to convert between them correctly - handling nested objects, special characters, missing fields, and everything else that breaks naive converters.
Why JSON-to-CSV Conversion Is Trickier Than It Looks
At first glance, converting JSON to CSV seems trivial: take an array of objects, use the keys as headers, and the values as rows. But the real world is messier:
- Objects have nested sub-objects that do not map to a single column
- Values may contain commas, quotes, or newlines that break CSV parsing
- Different rows may have different keys (sparse data)
- Arrays within objects need to be stringified or flattened
- Null and undefined values need a consistent representation
A naive converter produces broken CSV files that silently corrupt data. This guide covers how to do it right.
The Ideal Input: A Flat Array of Uniform Objects
JSON-to-CSV conversion is cleanest when your input is an array of objects where every object has the same set of keys and all values are primitives:
[
{ "id": 1, "name": "Alice", "email": "alice@example.com", "active": true },
{ "id": 2, "name": "Bob", "email": "bob@example.com", "active": false },
{ "id": 3, "name": "Carol", "email": "carol@example.com", "active": true }
]
This converts directly to:
id,name,email,active
1,Alice,alice@example.com,true
2,Bob,bob@example.com,false
3,Carol,carol@example.com,true
JavaScript: Production-Ready Converter
Here is a complete, battle-tested function that handles missing fields, special characters, and nested objects:
function jsonToCSV(arr, options = {}) {
  if (!Array.isArray(arr) || arr.length === 0) return '';

  const {
    delimiter = ',',
    flatten = true,
    nullValue = ''
  } = options;

  // Flatten nested objects with dot-notation keys
  function flattenObj(obj, prefix = '') {
    return Object.entries(obj).reduce((acc, [k, v]) => {
      const key = prefix ? `${prefix}.${k}` : k;
      if (flatten && v !== null && typeof v === 'object' && !Array.isArray(v)) {
        Object.assign(acc, flattenObj(v, key));
      } else {
        acc[key] = v;
      }
      return acc;
    }, {});
  }

  // Collect all unique headers across all rows
  const flatRows = arr.map(row => flattenObj(row));
  const headers = [...new Set(flatRows.flatMap(row => Object.keys(row)))];

  // Escape a single cell value
  function escapeCell(value) {
    if (value === null || value === undefined) return nullValue;
    // Arrays - and nested objects when flattening is disabled - become JSON strings;
    // a plain String() would turn objects into "[object Object]"
    const str = typeof value === 'object'
      ? JSON.stringify(value)
      : String(value);
    // Wrap in quotes if it contains the delimiter, a quote, or a line break
    if (str.includes(delimiter) || str.includes('"') || str.includes('\n') || str.includes('\r')) {
      return '"' + str.replace(/"/g, '""') + '"';
    }
    return str;
  }

  const headerRow = headers.map(h => escapeCell(h)).join(delimiter);
  const dataRows = flatRows.map(row =>
    headers.map(h => escapeCell(row[h])).join(delimiter)
  );

  return [headerRow, ...dataRows].join('\n');
}
Usage Examples
// Basic
const csv = jsonToCSV(data);
// Custom delimiter (TSV)
const tsv = jsonToCSV(data, { delimiter: '\t' });
// Disable flattening - nested objects become JSON strings
const noFlatten = jsonToCSV(data, { flatten: false });
// Custom null placeholder
const withNulls = jsonToCSV(data, { nullValue: 'N/A' });
Handling Nested Objects: Flatten vs Stringify
When your JSON contains nested objects, you have two choices:
Option A: Flatten (Dot-Notation Columns)
// Input
{ "user": { "name": "Alice", "city": "NYC" }, "score": 95 }
// Output columns
user.name, user.city, score
Alice, NYC, 95
Best for: analytics, database imports, spreadsheet analysis. Column names are longer but each cell contains a scalar value.
Option B: Stringify (JSON Strings in Cells)
// Input
{ "user": { "name": "Alice", "city": "NYC" }, "score": 95 }
// Output columns
user, score
"{""name"":""Alice"",""city"":""NYC""}", 95
Best for: when you need to preserve the nested structure and will parse it back later. Note the doubled quotes inside the JSON string - that is correct CSV escaping.
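The two strategies can be sketched side by side with minimal helpers (illustrative only, not the full converter above):

```javascript
const row = { user: { name: "Alice", city: "NYC" }, score: 95 };

// Option A: flatten one level of nesting into dot-notation keys
function flattenOneLevel(obj) {
  const out = {};
  for (const [k, v] of Object.entries(obj)) {
    if (v !== null && typeof v === "object" && !Array.isArray(v)) {
      for (const [k2, v2] of Object.entries(v)) out[`${k}.${k2}`] = v2;
    } else {
      out[k] = v;
    }
  }
  return out;
}

// Option B: stringify nested objects, then apply CSV quoting
function stringifyCell(v) {
  const str = typeof v === "object" && v !== null ? JSON.stringify(v) : String(v);
  return /[",\r\n]/.test(str) ? '"' + str.replace(/"/g, '""') + '"' : str;
}

console.log(Object.keys(flattenOneLevel(row))); // [ 'user.name', 'user.city', 'score' ]
console.log(stringifyCell(row.user));           // "{""name"":""Alice"",""city"":""NYC""}"
```

Note how Option B's output already carries the doubled-quote CSV escaping from the example above.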
Convert JSON ↔ CSV Instantly
Paste your JSON array and get a formatted CSV in one click. Handles nested objects, special characters, and missing fields. Free, 100% in-browser, no data sent to any server.
Open CSV/JSON Converter →
Python: Three Approaches
Approach 1: csv.DictWriter (Flat Data)
import csv
import json

with open('data.json', encoding='utf-8') as f:
    data = json.load(f)

if not data:
    raise ValueError("Empty array")

with open('output.csv', 'w', newline='', encoding='utf-8') as f:
    # Collect all field names in first-seen order (handles sparse rows)
    fieldnames = list(dict.fromkeys(k for row in data for k in row))
    writer = csv.DictWriter(f, fieldnames=fieldnames, extrasaction='ignore')
    writer.writeheader()
    writer.writerows(data)
Approach 2: pandas (Nested + Large Data)
import pandas as pd

# Flattens nested dicts into dot-notation columns (all levels by default)
df = pd.json_normalize(data)
df.to_csv('output.csv', index=False, encoding='utf-8')

# For nested record lists: specify record_path and meta
df = pd.json_normalize(
    data,
    record_path=['orders'],
    meta=['id', 'name'],
    errors='ignore'
)
Approach 3: Command Line with jq
# Extract specific fields as CSV
jq -r '["id","name","email"], (.[] | [.id, .name, .email]) | @csv' data.json
# All fields dynamically
jq -r '(.[0] | keys_unsorted) as $keys | $keys, (.[] | [.[$keys[]]]) | @csv' data.json
Step-by-Step: Converting an API Response to CSV
- Fetch the data. Get your JSON from the API. Confirm it is an array of objects - if it is wrapped in a data key, extract that first: const rows = response.data;
- Inspect the structure. Look at 2-3 sample objects. Identify nested fields, arrays, and any keys that differ between rows.
- Decide on flattening strategy. Do you want address.city as a column, or address as a JSON string cell?
- Run the converter. Use the function above, or paste into our tool for instant conversion.
- Verify the output. Open the CSV in a spreadsheet and confirm the row count matches, headers are correct, and no cells are broken across lines due to unescaped newlines.
- Set encoding. Save as UTF-8 with a BOM (\uFEFF) if the file will be opened in Excel on Windows to avoid character encoding issues.
The RFC 4180 CSV Rules Every Converter Must Follow
CSV has a specification - RFC 4180. Most broken converters ignore it. The critical rules:
- Fields containing commas, double-quotes, or newlines must be enclosed in double-quotes
- A double-quote inside a quoted field must be escaped as two double-quotes ("")
- Each record should end with CRLF (\r\n), though \n is widely accepted
- The first record may be a header row with field names
- Each record must have the same number of fields
// Field containing a comma - must be quoted
"New York, NY"
// Field containing a double-quote - must be double-escaped
"He said ""hello"""
// Field containing a newline - must be quoted
"Line 1
Line 2"
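A minimal field-escaping helper implementing these rules might look like this (a sketch of the quoting logic only, not a full writer):

```javascript
// Quote a field if it contains the delimiter, a double-quote, or a line break,
// doubling any embedded quotes per RFC 4180
function escapeField(value, delimiter = ",") {
  const str = String(value);
  if (str.includes(delimiter) || str.includes('"') || /[\r\n]/.test(str)) {
    return '"' + str.replace(/"/g, '""') + '"';
  }
  return str;
}

console.log(escapeField("New York, NY"));    // "New York, NY"
console.log(escapeField('He said "hello"')); // "He said ""hello"""
console.log(escapeField("plain"));           // plain
```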
Frequently Asked Questions
Can I convert JSON with arrays inside objects to CSV?
Yes, but with a trade-off. An array value like "tags": ["admin", "editor"] cannot map to a single CSV cell natively. Your options are: (1) join the array into a string: "admin|editor", (2) stringify as JSON: "[""admin"",""editor""]", or (3) create one column per array index: tags.0, tags.1. Option 1 is most spreadsheet-friendly.
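The three options look like this in practice (a quick sketch on the tags example):

```javascript
const tags = ["admin", "editor"];

// Option 1: join into a delimited string (most spreadsheet-friendly)
const joined = tags.join("|");                 // "admin|editor"

// Option 2: stringify as JSON (can be parsed back later with JSON.parse)
const stringified = JSON.stringify(tags);      // '["admin","editor"]'

// Option 3: one column per array index
const indexed = Object.fromEntries(tags.map((t, i) => [`tags.${i}`, t]));
console.log(indexed);                          // { 'tags.0': 'admin', 'tags.1': 'editor' }
```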
What about JSON objects that are not arrays?
CSV requires tabular data - rows and columns. A single JSON object (not an array) does not map to CSV rows. You can convert it to a two-column CSV: key and value. Or wrap it in an array first: [myObject] to get a single-row CSV.
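The key/value approach is a one-liner for simple objects (this sketch assumes values contain no characters that need CSV quoting):

```javascript
const obj = { name: "Alice", city: "NYC" };

// Two-column key/value CSV from a single object
const kv = ["key,value", ...Object.entries(obj).map(([k, v]) => `${k},${v}`)].join("\n");

console.log(kv);
// key,value
// name,Alice
// city,NYC
```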
How do I handle UTF-8 characters in CSV for Excel?
Excel does not reliably detect UTF-8. Add a BOM (byte order mark) at the start of the file: '\uFEFF' + csvString. This tells Excel to interpret the file as UTF-8 and prevents character corruption with non-ASCII characters like accented letters, Chinese, or Arabic.
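In JavaScript the BOM is just a one-character prefix (the Blob line is a browser-only sketch):

```javascript
// Prepend a UTF-8 BOM so Excel detects the encoding correctly
const csvString = "name,city\nZoë,Kraków";
const withBom = "\uFEFF" + csvString;

console.log(withBom.charCodeAt(0) === 0xfeff); // true

// In a browser, for a download link:
// const blob = new Blob([withBom], { type: "text/csv;charset=utf-8" });
```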
How do I convert back from CSV to JSON?
Use our CSV to JSON Converter. In code: read the header row for keys, then map each subsequent row to an object using those keys. Treat numeric strings as numbers if needed.
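A minimal version of that reverse mapping looks like this (for simple input only - it does not handle quoted fields, so use a real CSV parser for anything escaped):

```javascript
// Naive CSV-to-JSON: header row becomes object keys, each line becomes one object
function csvToJSON(csv) {
  const [headerLine, ...lines] = csv.trim().split("\n");
  const headers = headerLine.split(",");
  return lines.map(line =>
    Object.fromEntries(line.split(",").map((cell, i) => [headers[i], cell]))
  );
}

console.log(csvToJSON("id,name\n1,Alice\n2,Bob"));
// [ { id: '1', name: 'Alice' }, { id: '2', name: 'Bob' } ]
```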
My CSV has too many columns because of deep nesting. What should I do?
Use selective flattening: only flatten one level and stringify deeper objects. Or pre-process the JSON to extract only the fields you need before conversion. In pandas: pd.json_normalize(data, max_level=1) limits flattening depth.
Skip the code entirely for one-off conversions → Use our free online JSON/CSV Converter.
Usman has 10+ years of experience securing enterprise infrastructure, managing high-traffic servers, and building zero-knowledge security tools. Read more about the author.