
There are a few jq patterns I've adopted:

   echo "$SOME_JSON" | jq '.[]' --raw-output --compact-output | while read -r LINE ; do ... ; done
...lets you process stuff "record by record" pretty consistently. (And `( xxx ; yyy ; zzz ) | jq --slurp '.'` lets you do the reverse, "absorbing" multiple records into an array.)
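Filled out with some hypothetical data (the `SOME_JSON` value and the loop body are just illustrative), the round trip looks like this:

```shell
# Illustrative input: any JSON array works here.
SOME_JSON='[{"id":1},{"id":2},{"id":3}]'

# Stream the array record by record...
echo "$SOME_JSON" | jq --compact-output '.[]' | while read -r LINE ; do
  echo "record: $LINE"
done

# ...and slurp the individual records back into a single array.
ROUNDTRIP="$( echo "$SOME_JSON" | jq --compact-output '.[]' | jq --slurp --compact-output '.' )"
echo "$ROUNDTRIP"
```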

Don't forget `--argjson`

    echo "{}" | jq --argjson FOO "$( cat test.json )" '{ bar: $FOO }'
...lets you "load" json for merging, processing, formatting, etc. The leading "{}" is moderately necessary because `jq` technically _processes_ json, not generates it.
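A slightly fuller sketch of that merging idea (the file path and keys here are made up for the example):

```shell
# Hypothetical input file for the example.
echo '{"name":"widget","price":3}' > /tmp/test.json

# Load the file's JSON into a jq variable, then wrap and merge it.
MERGED="$( echo "{}" | jq --compact-output \
  --argjson FOO "$( cat /tmp/test.json )" \
  '{ bar: $FOO } + { extra: true }' )"
echo "$MERGED"
```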

Finally, it's a huge cheat code for string formatting!!

     $ echo "{}" | jq \
        --arg FOO "hello \$world" \
        --arg BAR "complicated \| chars" \
        --arg ONE 1 \
        --arg TWO 2 \
        '"aaa \( $FOO ) and \( $BAR ) and \( ($ONE | tonumber) + ($TWO | tonumber) ) bbb"'
     "aaa hello $world and complicated \\| chars and 3 bbb"
...optionally with `--raw-output` (un-json-quoted), and even supports some regex substitution in strings via `... | gsub(...)`.
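For instance, `gsub` plus `--raw-output` makes a passable slugifier (the `TITLE`/`SLUG` names and the regex are just one way to sketch it):

```shell
# Lowercase, collapse runs of non-alphanumerics to "-", trim stray dashes.
SLUG="$( jq --null-input --raw-output --arg TITLE "Hello, World!" \
  '$TITLE | ascii_downcase | gsub("[^a-z0-9]+"; "-") | gsub("^-+|-+$"; "")' )"
echo "$SLUG"
```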

Yes, yes... it's overly complicated compared to you and your fancy "programming languages", but with shell stuff, the ability to _CAPTURE_ arbitrary command output (eg: `--arg LS_OUTPUT "$( ls -lart ... )"`) and then use JSON/jq to _safely_ marshal the data into JSON is really helpful! (Note it's `--arg`, not `--argjson`, for raw text like `ls` output.)
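As a concrete sketch of that "capture safely" point (the `OUTPUT`/`JSON` names and the sample text are invented for the example):

```shell
# Arbitrary command output, stand-in for `ls -lart` etc. -- note the
# embedded quotes and newline that would be painful to escape by hand.
OUTPUT="$( printf 'line "one"\nline two\n' )"

# jq's --arg JSON-escapes the value for us: quotes, backslashes, newlines.
JSON="$( jq --null-input --compact-output --arg OUT "$OUTPUT" '{ captured: $OUT }' )"
echo "$JSON"
```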



> The leading "{}" is moderately necessary because `jq` technically _processes_ json, not generates it.

The --null-input/-n option is the "out-of-the-box" way to achieve this, and avoids a pipe (usually not a big deal, but leaves stdin free and sometimes saves a fork).

This lets you rewrite your first "pattern":

    jq -cnr --argjson SOME_JSON "$SOME_JSON" '$SOME_JSON[]' | while read ...
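Filled out with hypothetical data (the `[1,2,3]` input and the summing loop are illustrative), that rewrite runs like this; a heredoc keeps the loop out of a subshell so the variable survives:

```shell
SOME_JSON='[1,2,3]'
SUM=0
# -c: compact, -n: null input, -r: raw output; iterate the bound variable.
while read -r N ; do
  SUM=$(( SUM + N ))
done <<EOF
$( jq -cnr --argjson SOME_JSON "$SOME_JSON" '$SOME_JSON[]' )
EOF
echo "$SUM"
```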
We also have a "useless use of cat": --slurpfile does that job better:

    jq -n --slurpfile FOO test.json '{bar: $FOO[]}'
(Assuming test.json contains exactly one JSON value; --argjson will immediately fail if that's not the case, but with --slurpfile you may need to check that $FOO is a 1-item array.)
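One way to sketch that guard (file contents invented for the example; `error(...)` aborts jq with a nonzero exit if the check fails):

```shell
# Hypothetical single-value input file.
echo '{"name":"widget"}' > /tmp/test.json

# --slurpfile always binds an array; insist on exactly one element.
OUT="$( jq --null-input --compact-output --slurpfile FOO /tmp/test.json \
  'if ($FOO | length) == 1 then { bar: $FOO[0] } else error("expected exactly one JSON value") end' )"
echo "$OUT"
```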

And of course, for exactly the single-file single-object case, you can just write:

    jq '{bar: .}' test.json


Prefer long options in all cases, especially in scripts and when teaching.

Pipelines allow consistent syntax, but thanks for pointing out all the different variations of file support in jq.


gron does 90% of what I need for json processing, it's a great first step and often the only necessary step.

https://github.com/tomnomnom/gron
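For anyone who hasn't seen it: gron flattens JSON into greppable assignment paths (and `gron --ungron` reverses it). A quick illustration, sketched from gron's documented output format rather than run here:

```
$ echo '{"a":{"b":[1,2]}}' | gron
json = {};
json.a = {};
json.a.b = [];
json.a.b[0] = 1;
json.a.b[1] = 2;
```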


I fail to see how your string-formatting example is better than using bash's printf?


You get a json-quoted (or json-quotable) string at the end. In bash, that's sometimes worth its weight in gold.
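A minimal sketch of the difference (the `QUOTED` name and sample string are invented): printf would hand you the raw text, while jq hands you a string that's already safe to splice into a JSON document.

```shell
# Without --raw-output, jq emits the value as a JSON-quoted string literal.
QUOTED="$( jq --null-input --arg S 'he said "hi" and left' '$S' )"
echo "$QUOTED"
```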



