How to use the command 'ajson' (with examples)
`ajson` is a versatile command-line tool for evaluating JSONPath expressions over JSON data. JSONPath is a querying language for JSON, akin to XPath for XML. The utility is especially useful in automation and scripting scenarios where JSON data needs to be parsed, queried, or transformed. Whether the JSON resides in a file, is piped via standard input, is fetched from the web, or feeds an inline arithmetic expression, `ajson` can efficiently process and filter it to extract valuable insights or perform calculations.
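Across the use cases that follow, the invocation takes one general shape: a JSONPath (or arithmetic) expression plus an optional data source, with `stdin` read when the source is omitted:

ajson 'jsonpath-or-expression' [path/to/file.json|URL]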
Use case 1: Read JSON from a file and execute a specified JSONPath expression
Code:
ajson '$..json[?(@.path)]' path/to/file.json
Motivation:
In many scenarios, JSON files contain deeply nested structures, and extracting specific elements without the right tools can be daunting. This example shows how to filter specific fields out of such structured data, replacing error-prone manual parsing with a single expression.
Explanation:
- `$..json[?(@.path)]`: This JSONPath expression queries any `json` nodes whose items have a `path` field. The `$` symbol represents the root, `..` denotes recursive descent, and `json[?(@.path)]` filters the matched `json` nodes down to those elements that contain a `path` member.
- `path/to/file.json`: The path of the JSON file you want to process.
Example Output:
This command outputs the JSON nodes carrying the specified `path` attribute, so the elements you are after are extracted directly rather than located by manual inspection.
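As a concrete illustration, suppose `path/to/file.json` contained the following (hypothetical) document:

{"store": {"json": [{"path": "/a/b", "size": 10}, {"size": 20}]}}

Running `ajson '$..json[?(@.path)]' path/to/file.json` would then return only the element carrying a `path` field, along the lines of:

[{"path": "/a/b", "size": 10}]

(the exact output formatting may vary between `ajson` versions).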
Use case 2: Read JSON from `stdin` and execute a specified JSONPath expression
Code:
cat path/to/file.json | ajson '$..json[?(@.path)]'
Motivation:
Streaming JSON matters when the data comes from a pipeline or from other shell commands. This use case shows how to integrate `ajson` into a data-processing pipeline, allowing dynamic and flexible queries without writing intermediate files.
Explanation:
- `cat path/to/file.json`: Outputs the content of `file.json` and pipes it to `ajson`.
- `$..json[?(@.path)]`: The same JSONPath expression as in the first use case, filtering for nodes with a `path` attribute.
Example Output:
The result, as in the first example, is the filtered data printed to the terminal. Because the input arrives on `stdin`, the same command can be dropped into different pipelines without changing its structure.
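Since `ajson` falls back to `stdin` when no source argument is given, the same result can be achieved without `cat` by using shell input redirection:

ajson '$..json[?(@.path)]' < path/to/file.json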
Use case 3: Read JSON from a URL and evaluate a specified JSONPath expression
Code:
ajson 'avg($..price)' 'https://example.com/api/'
Motivation:
Accessing JSON data directly from APIs is common in web development and data science. Being able to directly query such data without downloading or parsing it manually facilitates rapid development and testing cycles.
Explanation:
- `avg($..price)`: This expression leverages `ajson`'s ability to perform arithmetic operations, calculating the average of all `price` elements found at any level of the JSON structure.
- `https://example.com/api/`: The URL endpoint from which the JSON data is fetched.
Example Output:
Executing the above command would display the average of all `price` values extracted from the response, assuming the API returns a suitable JSON structure.
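Because the expression works the same regardless of where the JSON comes from, you can preview the calculation locally by piping a small hand-made payload (hypothetical data) through `ajson`:

echo '{"items": [{"price": 10}, {"price": 20}, {"price": 30}]}' | ajson 'avg($..price)'

This should print `20`, the average of the three `price` values.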
Use case 4: Read some simple JSON and calculate a value
Code:
echo '3' | ajson '2 * pi * $'
Motivation:
Simple mathematical computations on JSON values can be automated using this feature. This is especially beneficial when integrating with shell scripts to compute values on-the-fly.
Explanation:
- `echo '3'`: Supplies the JSON input, a single number, `3`.
- `2 * pi * $`: Evaluates the circumference of a circle with radius 3 (`2πr`), where `pi` is the built-in mathematical constant and `$` refers to the input value, showcasing `ajson`'s ability to handle arithmetic directly on numeric JSON input.
Example Output:
The output would be approximately `18.85`, the circumference of a circle whose radius is the value in the JSON input.
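The same pattern extends to other formulas built from the constants and operators shown here; for example, the area of the same circle (πr²) should be computable as:

echo '3' | ajson 'pi * $ * $'

which would print approximately `28.27`.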
Conclusion
`ajson` proves to be a powerful tool in the toolkit of anyone who regularly works with JSON data. Across these use cases it lets you efficiently query, manipulate, and analyze JSON, whether it lives in local files, arrives on a stream, or is fetched from the web, streamlining workflows and freeing up time for more complex problem-solving and analysis.