Working with the JSON data format
Downstream Processing
In addition to the tools that output and format JSON data for the user, several tools support downstream processing of JSON as input to other programs (post-processing).
If you receive JSON data via an interface, it is good programming practice to sanity check the received data before further processing. The sanity check includes two stages:
- Syntactic correctness – is the spelling correct? Are all brackets balanced (an equal number of opening and closing brackets), and are all commas and quotation marks in place?
- Correctness of data fields – does the received data structure match the data definition (JSON schema)?
For the first question, it is best to use JSONLint, which is described earlier in this article. For the second stage, you need the JSON schema that describes the data structure. You then compare this description with the received data.
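JSONLint runs in the browser. For a quick local syntax check without leaving the shell, Python's built-in `json.tool` module does the same job (a minimal sketch, assuming a `python3` interpreter is installed):

```shell
# Valid input: json.tool pretty-prints the document and exits with status 0
printf '{"author": "Stephen Fry"}' | python3 -m json.tool

# Invalid input (trailing comma): json.tool reports the error
# position and exits with a non-zero status
printf '{"author": "Stephen Fry",}' | python3 -m json.tool
```

Because the exit status reflects the result, the check can also serve as a guard in scripts before any further processing takes place.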
On json-schema.org, you will find an overview of validators [15], sorted by the various programming languages in which they were developed. For example, consider the validate-json tool implemented in PHP [16]. If you are more into Python, jsonschema [17] serves the same purpose. The call to the two tools is identical.
Defining the JSON Schema
Listing 10 shows the JSON schema with which you define the exact format of your data structure. The schema matches the book inventory used earlier in this article and is stored in the bookinventory-schema.json file in the local directory.
Listing 10
JSON Schema
{
  "$schema": "http://json-schema.org/draft/2019-09/schema",
  "title": "Book",
  "type": "object",
  "required": ["author", "title", "publication"],
  "properties": {
    "author": {
      "type": "string",
      "description": "The author's name"
    },
    "title": {
      "type": "string",
      "description": "The book's title"
    },
    "publication": {
      "type": "number",
      "minimum": 0
    },
    "tags": {
      "type": "array",
      "items": { "type": "string" }
    }
  }
}
The schema definition references the JSON standard used (the draft from September 2019, in this case) in the second line. The definition contains a number of keywords. Table 3 explains these keywords in more detail; a complete list of all supported keywords is available at json-schema.org [18].
Table 3
JSON Keywords
| Keyword | Description |
|---|---|
| $schema | Description of the schema specification |
| title | Title of the schema |
| type | Type of the JSON data |
| properties | Properties of each value (keys and values allowed for the field) |
| required | List of required properties |
| properties.type | Data type of an entry |
| properties.minimum | Minimum value of an entry |
| properties.maximum | Maximum value of an entry |
| properties.minLength | Minimum number of characters for an entry |
| properties.maxLength | Maximum number of characters for an entry |
| properties.pattern | Regular expression matched against the value of an entry |
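As an illustration of the string-related keywords from Table 3 (this fragment is not part of the book schema shown earlier), an author field could be constrained to a non-empty string of at most 100 characters that starts with an uppercase letter:

```json
"author": {
  "type": "string",
  "minLength": 1,
  "maxLength": 100,
  "pattern": "^[A-Z]"
}
```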
The next task is to validate the records by checking whether they match the specified schema. Listing 11 shows a single record from the book inventory in readable form; the compact version of the record squeezes all the braces and fields into a single line.
Listing 11
JSON Record
{
  "author": "Stephen Fry",
  "title": "The Hippopotamus",
  "publication": 1994
}
The validate-json tool expects two parameters in the call: the dataset and the schema (Listing 12). If everything goes well, the call returns without any further feedback (line 2); otherwise, validate-json grumbles (lines 4 and 5). To provoke the error message starting in line 4, we turned the numeric value for the year of publication (1994) into the string "1994", which means that the data type in the record no longer matches the data type stored in the JSON schema. validate-json has every reason to complain.
Listing 12
Calling validate-json
01 $ validate-json record.json bookinventory-schema.json
02 $
03 $ validate-json record.json bookinventory-schema.json
04 JSON does not validate. Violations:
05 [publication] String value found, but a number is required
Some programming languages also offer suitable helper libraries. In Python, for example, you can use jsonschema; for NodeJS, you can use the Express framework.
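With Python's jsonschema library, the same check as in Listing 12 takes only a few lines. The sketch below re-creates the core of the book schema inline (assuming the jsonschema package is installed, e.g. via pip):

```python
import json
from jsonschema import validate, ValidationError  # pip install jsonschema

# Core of the book schema from Listing 10, defined inline
schema = {
    "type": "object",
    "required": ["author", "title", "publication"],
    "properties": {
        "author": {"type": "string"},
        "title": {"type": "string"},
        "publication": {"type": "number", "minimum": 0},
    },
}

# The record from Listing 11: validate() returns silently on success
record = {"author": "Stephen Fry", "title": "The Hippopotamus",
          "publication": 1994}
validate(instance=record, schema=schema)

# Same provoked error as in Listing 12: year as a string instead of a number
bad = dict(record, publication="1994")
try:
    validate(instance=bad, schema=schema)
except ValidationError as err:
    # Reports that a string was found where a number is required
    print(err.message)
```

As with the command-line tool, a passing record produces no output, while a violation raises an exception that carries a readable description of the mismatch.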
Processing JSON
The list of command-line tools and helpers that read, search on, and modify JSON data is quite extensive. We stopped counting after more than 20 entries (see Table 4 for a sample). Developer Ilya Sher maintains a useful, commented overview of options [19].
Table 4
Command-Line Tools
| Tool | Application (selection) |
|---|---|
| faq, Xidel | Convert formats from and to JSON (BSON, Bencode, JSON, TOML, XML, YAML, etc.) |
| fx, gofx, jq, jid | Filter JSON data |
| jello | Filter JSON data with Python syntax |
| jtbl | Output to a table |
| Underscore | Processing via the command line |
Jtbl, for example, takes JSON records and knits a pretty table from them. In Figure 7, you can see how this table looks for the book inventory. Each record is shown in a separate row. Jtbl can only cope with flat JSON structures. It cannot handle nesting so far.
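What jtbl does for flat records can be approximated in a few lines of plain Python (a simplified sketch, not jtbl's actual implementation): collect the keys as column headers, measure the widest value per column, and pad each cell accordingly.

```python
import json

# Two flat records from the book inventory
records = json.loads("""[
  {"author": "Stephen Fry", "title": "The Hippopotamus", "publication": 1994},
  {"author": "Douglas Adams", "title": "Last Chance to See", "publication": 1990}
]""")

# Column names come from the first record (flat structures only,
# mirroring jtbl's limitation regarding nested data)
columns = list(records[0])

# Each column is as wide as its widest value or its header
widths = {c: max(len(c), *(len(str(r[c])) for r in records)) for c in columns}

print("  ".join(c.ljust(widths[c]) for c in columns))
print("  ".join("-" * widths[c] for c in columns))
for r in records:
    print("  ".join(str(r[c]).ljust(widths[c]) for c in columns))
```

As soon as a value is itself an object or an array, this per-cell padding no longer produces a sensible layout, which illustrates why jtbl restricts itself to flat structures.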