
jq Cheat Sheet: 30 Real-World JSON Command-Line Patterns

Master jq with 30 battle-tested patterns for filtering, transforming, and extracting JSON on the command line — from kubectl and AWS to log files.


You pipe kubectl get pods -o json into less, and the terminal freezes on a two-megabyte wall of JSON. All you want is the name of every pod in the Running phase. jq does that with a one-line filter, once you know the vocabulary.

This is not a syntax reference. It is 30 patterns you will actually type, grouped by the task you are trying to accomplish: access, filter, transform, aggregate, format, and glue together with real tools like kubectl, aws, and docker.

When to Use jq vs a Browser Formatter vs Code

jq is not always the right answer. The three honest choices look like this:

| Situation | Best tool | Why |
|---|---|---|
| One API response, need syntax highlighting and error line numbers | Browser JSON Formatter | Visual diff, zero setup, private in-browser |
| Shell pipeline, log processing, CI script, remote server | jq | Composable, scriptable, no GUI dependency |
| Business logic, unit tests, complex branching | Language code (JS / Python) | Real debugger, types, libraries |

Pick jq when the task lives inside a shell pipeline — everything else is probably easier somewhere else.

Installation and Your First Pipeline

jq ships as a single binary on every major platform:

# macOS
brew install jq

# Debian / Ubuntu
sudo apt install jq

# Windows (winget)
winget install jqlang.jq

A first pipeline, with the identity filter:

curl -s https://api.github.com/users/octocat | jq .

The . filter takes its input and emits it unchanged, pretty-printed. That alone replaces most “let me open this JSON in an editor” moments.

Five flags cover 90% of real use:

| Flag | Purpose |
|---|---|
| -r | Raw output — strip surrounding quotes from string results |
| -c | Compact — one JSON value per line (NDJSON) |
| -s | Slurp — read all inputs into a single array |
| -R | Raw input — read lines as strings instead of JSON |
| -n | Null input — do not read stdin, use null as input |

The Core Mental Model: Filters and Pipes

A filter takes one JSON value as input and produces zero or more JSON values as output. Filters compose with a pipe |, which sends each output of the left filter as input to the right filter. This is the same mental model as shell pipes, just with JSON values flowing instead of bytes.

# . — identity
echo '{"name":"Alice"}' | jq '.'

# .key — field access
echo '{"name":"Alice"}' | jq '.name'

# .key.sub — deep path
echo '{"user":{"email":"a@x.com"}}' | jq '.user.email'

# .[] — iterate array elements (produces multiple outputs)
echo '[{"id":1},{"id":2}]' | jq '.[] | .id'

# Pipe composition: every output of .items[] feeds into .name
echo '{"items":[{"name":"a"},{"name":"b"}]}' | jq '.items[] | .name'

That is the whole grammar. The 30 patterns below are combinations of these primitives.

30 Patterns You Will Actually Use

Each pattern shows input JSON, the command, and the output. Copy any of them into your terminal.

Access and Extract (Patterns 1–5)

Pattern 1 — Safe access with ?

Access a field that might not exist without crashing:

echo '{"name":"Alice"}' | jq '.address?.city?'
# Output: null

The ? suppresses type errors. Note that plain .address.city already returns null when .address is missing, because indexing a missing key on an object (or on null) yields null; the ? matters when .address holds a non-object value such as a string or number, which would otherwise raise a type error.

Pattern 2 — Deep path access

echo '{"user":{"profile":{"email":"a@x.com"}}}' | jq '.user.profile.email'
# Output: "a@x.com"

Pattern 3 — Array slicing

echo '[10,20,30,40,50]' | jq '.[1:3]'
# Output: [20, 30]

echo '[10,20,30,40,50]' | jq '.[-1]'
# Output: 50

Negative indices count from the end. Slices use half-open intervals, like Python.

Pattern 4 — Recursive descent to find every matching key

echo '{"a":{"name":"x"},"b":[{"name":"y"},{"id":1}]}' | jq '.. | .name? | select(. != null)'
# Output: "x"
#         "y"

.. walks every value in the tree. Combined with .name? and select, it extracts every name field regardless of depth — invaluable for exploring unknown JSON schemas.

Pattern 5 — List all keys of an object

echo '{"zebra":1,"apple":2,"mango":3}' | jq 'keys'
# Output: ["apple", "mango", "zebra"]

echo '{"zebra":1,"apple":2,"mango":3}' | jq 'keys_unsorted'
# Output: ["zebra", "apple", "mango"]

keys sorts alphabetically; keys_unsorted preserves insertion order.

Filter (Patterns 6–10)

Pattern 6 — Filter an array by condition

echo '[{"age":20},{"age":30},{"age":40}]' | jq 'map(select(.age > 25))'
# Output: [{"age":30},{"age":40}]

map(f) applies f to each element; select(cond) keeps only elements where the condition holds.

Pattern 7 — String prefix matching

echo '[{"name":"api-gateway"},{"name":"web-ui"},{"name":"api-auth"}]' \
  | jq '.[] | select(.name | startswith("api"))'
# Output: {"name":"api-gateway"}
#         {"name":"api-auth"}

Also useful: endswith("..."), contains("..."), test("^regex$").
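A quick sketch of test() with an anchored regex, on a trimmed version of the same input:

```shell
echo '[{"name":"api-gateway"},{"name":"web-ui"}]' \
  | jq -c '.[] | select(.name | test("gateway$"))'
# Output: {"name":"api-gateway"}
```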

Pattern 8 — Combined conditions

echo '[{"type":"A","count":5},{"type":"A","count":15},{"type":"B","count":20}]' \
  | jq '.[] | select(.type == "A" and .count > 10)'
# Output: {"type":"A","count":15}

and, or, not work as you would expect.

Pattern 9 — Delete sensitive fields

echo '{"user":"alice","password":"s3cret","token":"abc"}' | jq 'del(.password, .token)'
# Output: {"user":"alice"}

del() accepts multiple paths and is safe if any path is missing.

Pattern 10 — Deduplicate by field

echo '[{"id":1,"v":"a"},{"id":2,"v":"b"},{"id":1,"v":"a2"}]' | jq 'unique_by(.id)'
# Output: [{"id":1,"v":"a"},{"id":2,"v":"b"}]

unique deduplicates whole values; unique_by(f) deduplicates by the result of a filter.

Transform (Patterns 11–15)

Pattern 11 — Rename fields

echo '[{"first_name":"Alice","age":30}]' | jq 'map({name: .first_name, age})'
# Output: [{"name":"Alice","age":30}]

Shorthand {age} is equivalent to {age: .age}.

Pattern 12 — Add a computed field with string interpolation

echo '[{"first":"Alice","last":"Chen"}]' \
  | jq 'map(. + {fullName: "\(.first) \(.last)"})'
# Output: [{"first":"Alice","last":"Chen","fullName":"Alice Chen"}]

\(expr) evaluates expr and interpolates its value into the string.

Pattern 13 — Flatten nested arrays

echo '[{"tags":["a","b"]},{"tags":["c"]}]' | jq '[.[] | .tags[]]'
# Output: ["a","b","c"]

echo '[[1,2],[3,[4,5]]]' | jq 'flatten'
# Output: [1,2,3,4,5]

flatten takes an optional depth argument: flatten(1) only peels one level.
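A quick check of the depth argument on the same nested input:

```shell
echo '[[1,2],[3,[4,5]]]' | jq -c 'flatten(1)'
# Output: [1,2,3,[4,5]]
```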

Pattern 14 — Object to array and back

echo '{"a":1,"b":2}' | jq 'to_entries'
# Output: [{"key":"a","value":1},{"key":"b","value":2}]

echo '[{"key":"a","value":1},{"key":"b","value":2}]' | jq 'from_entries'
# Output: {"a":1,"b":2}

This pair enables transformations that require iterating over object keys — something the dot-path syntax cannot do directly.
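jq also bundles the round trip as with_entries(f), shorthand for to_entries | map(f) | from_entries. A small sketch that drops entries by value:

```shell
echo '{"a":1,"b":2,"c":3}' | jq -c 'with_entries(select(.value > 1))'
# Output: {"b":2,"c":3}
```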

Pattern 15 — Deep-merge two objects

echo '{"a":{"x":1},"b":2}' | jq '. * {a:{y:9}, c:3}'
# Output: {"a":{"x":1,"y":9},"b":2,"c":3}

The * operator deep-merges. For shallow merge, use + (right-hand side wins).

Aggregate (Patterns 16–20)

Pattern 16 — Length of arrays, objects, and strings

echo '[1,2,3,4]' | jq 'length'      # 4
echo '{"a":1,"b":2}' | jq 'length'  # 2
echo '"hello"' | jq 'length'        # 5

Pattern 17 — Sum a field

echo '[{"price":10},{"price":25},{"price":5}]' | jq '[.[].price] | add'
# Output: 40

add sums numbers, concatenates strings, or merges arrays — depending on input type.
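The polymorphism is easy to verify:

```shell
# Strings concatenate
echo '["a","b","c"]' | jq 'add'
# Output: "abc"

# Arrays concatenate
echo '[[1],[2,3]]' | jq -c 'add'
# Output: [1,2,3]
```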

Pattern 18 — Group by field

echo '[{"cat":"A","n":1},{"cat":"B","n":2},{"cat":"A","n":3}]' | jq 'group_by(.cat)'
# Output: [[{"cat":"A","n":1},{"cat":"A","n":3}],[{"cat":"B","n":2}]]

Each group becomes an inner array. Combine with map to aggregate per group.
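For instance, a per-category sum over the same input (a sketch; the object construction inside map is one of several ways to aggregate):

```shell
echo '[{"cat":"A","n":1},{"cat":"B","n":2},{"cat":"A","n":3}]' \
  | jq -c 'group_by(.cat) | map({cat: .[0].cat, total: ([.[].n] | add)})'
# Output: [{"cat":"A","total":4},{"cat":"B","total":2}]
```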

Pattern 19 — Sort descending

echo '[{"date":"2026-01-03"},{"date":"2026-01-01"},{"date":"2026-01-02"}]' \
  | jq 'sort_by(.date) | reverse'
# Output: [{"date":"2026-01-03"},{"date":"2026-01-02"},{"date":"2026-01-01"}]

ISO 8601 date strings sort correctly as strings. For other formats, parse first — the Unix timestamp guide covers epoch seconds, milliseconds, and timezone conversion in depth.

Pattern 20 — Max or min by a field

echo '[{"name":"a","rating":4.1},{"name":"b","rating":4.8},{"name":"c","rating":3.9}]' \
  | jq 'max_by(.rating)'
# Output: {"name":"b","rating":4.8}

min_by, max_by return a single element. For the top N, use sort_by(.rating) | reverse | .[:N].
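A top-N sketch on the same ratings input:

```shell
echo '[{"name":"a","rating":4.1},{"name":"b","rating":4.8},{"name":"c","rating":3.9}]' \
  | jq -c 'sort_by(.rating) | reverse | .[:2]'
# Output: [{"name":"b","rating":4.8},{"name":"a","rating":4.1}]
```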

Format Output (Patterns 21–25)

Pattern 21 — CSV output

echo '[{"name":"Alice","age":30},{"name":"Bob","age":25}]' \
  | jq -r '.[] | [.name, .age] | @csv'
# Output: "Alice",30
#         "Bob",25

@csv quotes strings and escapes quotes inside them. -r removes the outer JSON string quotes so the CSV is directly pipeable. For the full round-trip between CSV and JSON in pipelines, see the CSV to JSON conversion guide.

Pattern 22 — TSV output

echo '[{"id":1,"name":"Alice"},{"id":2,"name":"Bob"}]' \
  | jq -r '.[] | [.id, .name] | @tsv'
# Output: 1	Alice
#         2	Bob

Tab-separated output plays well with cut, awk, and column -t.

Pattern 23 — Raw string output

echo '["alpha","beta"]' | jq -r '.[]'
# Output: alpha
#         beta

Without -r, each line would have surrounding quotes. Raw output is what you feed into xargs, while read, or another shell command.

Pattern 24 — NDJSON / JSON Lines

echo '[{"a":1},{"a":2}]' | jq -c '.[]'
# Output: {"a":1}
#         {"a":2}

Each line is a standalone JSON value — the format used by Kafka, Elasticsearch, and most structured loggers. -c also strips all internal whitespace.

Pattern 25 — String interpolation for formatted output

echo '[{"name":"server-1","cpu":0.73},{"name":"server-2","cpu":0.21}]' \
  | jq -r '.[] | "\(.name): \(.cpu * 100)% CPU"'
# Output: server-1: 73% CPU
#         server-2: 21% CPU

Great for summaries and log lines where raw JSON would be noise.

DevOps in the Wild (Patterns 26–30)

Pattern 26 — kubectl: names of every running pod

kubectl get pods -o json \
  | jq -r '.items[] | select(.status.phase=="Running") | .metadata.name'

Pipeline: iterate pods, keep only Running, emit the name as raw string.

Pattern 27 — AWS EC2: instance IDs with public IPs

aws ec2 describe-instances \
  | jq -r '.Reservations[].Instances[] | [.InstanceId, .PublicIpAddress // "none"] | @tsv'

The // alternative operator supplies a fallback when the field is null — avoiding a literal null in the output column.

Pattern 28 — GitHub API: merge paginated results

for p in 1 2 3; do
  curl -s "https://api.github.com/orgs/myorg/repos?per_page=100&page=$p"
done | jq -s 'add | map(.name)'

-s slurps all responses into one array-of-arrays, add concatenates them, then map(.name) extracts names. A common pattern for any paginated API.

Pattern 29 — Filter structured log files

cat app.log | jq -c 'select(.level=="error")'

Assumes the log file is NDJSON (one JSON object per line). Pair with tail -f for live monitoring:

tail -f app.log | jq -c 'select(.level=="error") | {ts: .timestamp, msg: .message}'

Pattern 30 — Docker: all image names in use

docker inspect $(docker ps -q) | jq -r '.[].Config.Image' | sort -u

Good for quickly checking which image versions are running across a host.

Common Errors and How to Fix Them

Every jq user hits these. Knowing the fix up front saves hours.

Cannot iterate over null (null)

The input field you tried to iterate was null or missing. Two fixes:

# Option A: optional operator
echo '{}' | jq '.items[]?'
# Output: (nothing, no error)

# Option B: alternative operator with default
echo '{}' | jq '(.items // [])[]'
# Output: (nothing, no error)

Use ? when you want silent skip. Use // [] when you want to force a concrete empty array so downstream filters still run.

Cannot index array with "key"

You wrote .foo but the current value is an array. Add [] to iterate:

# Wrong
echo '{"users":[{"name":"Alice"}]}' | jq '.users.name'
# Error: Cannot index array with "name"

# Right
echo '{"users":[{"name":"Alice"}]}' | jq '.users[].name'
# Output: "Alice"

Shell quoting trouble

Use single quotes around the entire jq program, double quotes inside for string literals:

# Works everywhere
jq '.users[] | select(.role == "admin")'

# Fragile — the shell expands $, backticks, and \ inside double quotes,
# and one forgotten backslash breaks the filter
jq ".users[] | select(.role == \"admin\")"

Windows PowerShell edge cases

PowerShell does not treat single quotes the same way bash does. The most reliable approach is a literal here-string (@' and '@ must each sit at the start of their own line, and nothing inside needs escaping), assigned to a variable and passed to jq:

$filter = @'
.users[] | select(.role == "admin")
'@
Get-Content users.json | jq $filter

For anything non-trivial, save the filter to a .jq file and run jq -f filter.jq.

Raw output misuse

-r only affects string results. Giving it an object produces a normal JSON object:

echo '{"a":1}' | jq -r '.'
# Output: {
#   "a": 1
# }
# Unchanged apart from pretty-printing; -r had nothing to strip

If you want a specific field unquoted, select it first: jq -r '.a'.

jq rejects JSON with comments or trailing commas

echo '{"a": 1, /* note */ "b": 2,}' | jq .
# parse error: Invalid numeric literal

jq follows strict RFC 8259 JSON — no comments, no trailing commas, no unquoted keys. If the file is JSON5 or JSONC (common for config files), strip the extensions first. The JSON5 and JSONC formatting guide covers which parsers handle them and how to convert to strict JSON before piping into jq.

jq vs Alternatives: gron, fx, jj, yq

jq is not the only option, and sometimes a different tool is faster:

| Tool | Strength | When to reach for it |
|---|---|---|
| gron | Flattens JSON into grep-able paths | Exploring unknown schemas — you do not know where the key is |
| fx | Interactive TUI explorer with highlighting | Browsing large JSON by hand |
| jj | Much faster than jq, limited syntax | Hot loops processing millions of records |
| yq | Same filter language but for YAML | Kubernetes manifests and CI config |
| Browser JSON Formatter | Syntax highlighting, precise error messages, zero install | Debugging a single response while developing |

For day-to-day shell work, jq wins on composability. For one-off exploration, gron is often faster. For YAML, use yq — do not try to pipe through yq-then-jq.

Pro Tips for Daily Use

A few habits that make jq feel native:

  1. Keep helper functions in ~/.jq. jq automatically loads definitions from a ~/.jq file in your home directory (there is no .jqrc), so they are available in every jq invocation:

    def running: select(.status.phase == "Running");
    def table(f): [f] | @tsv;
  2. Use jqplay.org for complex filters. Paste your JSON on the left, iterate the filter on the right, ship the working version into your script.

  3. Build your own cheat sheet from history. history | grep 'jq ' | sort -u > ~/jq-patterns.txt captures every pattern you have actually used.

  4. Combine with the browser JSON Formatter for unfamiliar schemas. Explore the structure visually first to find the path you need, then write the jq command.

  5. Watch live values: watch -n 5 "curl -s api.example.com/health | jq '.uptime'" refreshes every 5 seconds — a quick ops dashboard with no dependencies.
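The running helper from tip 1 can be prototyped inline before saving it. A sketch against a fabricated pod list (pod names p1 and p2 are made up for the demo):

```shell
echo '{"items":[{"status":{"phase":"Running"},"metadata":{"name":"p1"}},{"status":{"phase":"Pending"},"metadata":{"name":"p2"}}]}' \
  | jq -r 'def running: select(.status.phase == "Running"); .items[] | running | .metadata.name'
# Output: p1
```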

FAQ

What is jq and why do developers use it?

jq is a command-line JSON processor. It extracts, filters, and transforms JSON inside shell pipelines without a Python or Node script — the fastest path from API responses, log files, or kubectl output to the field you actually want.

Is jq available on Windows?

Yes. Install via winget install jqlang.jq, Chocolatey choco install jq, or download the binary from jqlang.org. PowerShell quoting rules differ from bash — when in doubt, save filters to a .jq file and run jq -f filter.jq.

How is jq different from a browser JSON formatter?

A browser JSON Formatter is interactive — paste JSON, see highlighting and errors, copy the result. jq is non-interactive — describe the transformation once, run it across a shell pipeline. Use the browser to debug one response; use jq to automate the same operation across hundreds.

Why does jq say “Cannot iterate over null”?

You tried to iterate (.[]) over a value that is null — usually because the field was missing from the input. Fix with the optional operator .items[]? or supply a default with .items // [] | .[].

Can jq modify files in place?

Not directly — jq writes to stdout. Use a temp file or sponge from moreutils: jq '.version = "2.0"' config.json | sponge config.json. Always back up the original first; a mistyped filter will overwrite the file.
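A portable sketch of the temp-file approach, with a hypothetical config.json created just for the demo (the final mv replaces the original in one step):

```shell
printf '{"version":"1.0"}\n' > config.json   # sample file for the demo
jq '.version = "2.0"' config.json > config.json.tmp \
  && mv config.json.tmp config.json
# config.json now contains {"version": "2.0"}
```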

How do I use jq with curl responses?

Pipe curl -s into jq. The -s flag silences curl’s progress meter so only the JSON body reaches jq:

curl -s https://api.github.com/users/octocat | jq '.name, .blog'

What’s the difference between jq’s | and the shell |?

Shell pipe sends bytes between processes. jq’s pipe sends JSON values between filters inside one jq invocation. A single jq command with many internal pipes runs in one process — cheaper than chaining jq | jq | jq.
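The difference is easy to see. Both pipelines below print 1, but the second spawns two jq processes:

```shell
# One process, internal jq pipe
echo '{"a":{"b":1}}' | jq '.a | .b'
# Output: 1

# Two processes, shell pipe between them
echo '{"a":{"b":1}}' | jq '.a' | jq '.b'
# Output: 1
```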

Can jq handle JSON Lines (NDJSON)?

Yes, natively. jq treats its input as a stream of JSON values, so each line of an NDJSON file is parsed and processed independently. Use -c to emit NDJSON and -s to collect a stream into a single array.
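A minimal sketch of the slurp direction:

```shell
# Two NDJSON lines in, one aggregated value out
printf '{"a":1}\n{"a":2}\n' | jq -s 'map(.a) | add'
# Output: 3
```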

How do I pretty-print JSON without filtering?

Use the identity filter: cat data.json | jq . or just jq . < data.json. It parses, validates, and pretty-prints with two-space indent — no filter required.

Is there a jq alternative with a GUI?

Yes. fx provides an interactive TUI. For a zero-install GUI, the browser-based JSON Formatter covers most explore-and-validate needs. Web tools like jqplay.org offer jq itself with a live preview.

When should I use jq instead of writing a Python script?

Reach for jq when the task is one-shot, fits a shell pipeline, and stays within filter, transform, and extract semantics. Switch to Python when you need unit tests, complex state, third-party libraries, or branching logic beyond what a .jq file keeps readable.

How do I use regular expressions in jq?

jq exposes regex via test("pattern"), match("pattern"), capture("pattern"), and scan("pattern"), all backed by the Oniguruma regex engine, whose syntax is largely PCRE-compatible. Pass flags as a second argument: test("abc"; "i") for case-insensitive matching. match returns offsets and captures; scan emits every non-overlapping match.
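A capture sketch with a named group (the user-42 input string is made up):

```shell
echo '"user-42"' | jq -c 'capture("user-(?<id>[0-9]+)")'
# Output: {"id":"42"}
```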

Key Takeaways

  1. Mental model first: filter in, zero-or-more JSON values out, compose with |. Everything else is syntax.
  2. Learn by task, not by operator: the 30 patterns above cover roughly 95% of daily jq use.
  3. Handle null explicitly: ? for silent skip, // default for concrete fallback. Most Cannot iterate over null fixes are one of these two.
  4. Know when jq is the wrong tool: single responses belong in a browser JSON Formatter; YAML belongs in yq; complex logic belongs in real code.
  5. Pair with your existing stack: jq shines inside curl, kubectl, aws, docker, and log pipelines. Use it as the glue, not as the logic layer.

For related JSON workflows, see the JSON5 and JSONC formatting guide for config-file syntax extensions, and the CSV to JSON conversion guide for data format migrations where jq fits into the pipeline. When your JSON contains timestamps, the Unix timestamp guide covers the gotchas you will hit while transforming date fields.
