Raspberry Pi on the IoT
Herd Animals
The Amazon Web Services command-line interface and the AWS IoT Core and Greengrass services read and merge Raspberry Pi sensor data.
Among the things that can be meaningfully connected to the Internet of Things (IoT) are sensors, including temperature gauges. Raspberry Pi fans like to use them for practical exercises, because they are readily available from online stores for small change.
With the help of an API, you can read data from the GPIO ports with a few lines of Python code. The questions then arise: Where do I store the data? Who is evaluating the data, and where?
Moreover, generating an alert with a few additional lines in the Python script works fine on a single Rasp Pi; however, what if several (possibly hundreds) of Pis collect data that developers want to evaluate centrally in one place? In this article, I use Amazon Web Services (AWS) IoT Core and Greengrass with a Raspberry Pi to send a text alert if a sensor registers a temperature out of bounds.
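To get a feel for how little code such a readout takes, here is a sketch that parses the sysfs file through which Raspberry Pi OS exposes a 1-Wire temperature sensor such as the DS18B20; the device path and serial number are assumptions for illustration, and each sensor has its own serial.

```python
# Sketch: parse the 't=' millidegree field from a 1-Wire sensor's sysfs file.
# The path below is a typical example; the 28-* serial differs per sensor.
from pathlib import Path

SENSOR_FILE = "/sys/bus/w1/devices/28-000005e2fdc3/w1_slave"

def read_temperature(device_file=SENSOR_FILE):
    """Return the temperature in degrees Celsius from a w1_slave file."""
    text = Path(device_file).read_text()
    # The second line of the file ends in something like 't=23187',
    # which is the temperature in millidegrees Celsius.
    millidegrees = int(text.strip().rsplit("t=", 1)[1])
    return millidegrees / 1000.0
```

A few more lines comparing the return value against a limit are all a single-Pi alert needs; the rest of the article is about doing the same thing centrally for many devices.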
Recycling Trade
Anyone who wants to collect and evaluate large volumes of data will tend to use the cloud as a data collection point because it can be reached from anywhere. AWS offers a platform that lets you deliver and evaluate data without having to set up your own virtual machines (VMs) and even offers a suitable client. This setup feeds the cloud solution and makes centralized administration of many Rasp Pis as easy as pie.
One of the standards for transmitting data is the machine-to-machine (M2M) Message Queuing Telemetry Transport protocol (MQTT) [1], which relies on a publish-subscribe mechanism to distribute messages and commands. Participants are allowed to send data on a topic (e.g., living room/lamp/luminosity), and everyone who subscribes to this topic receives the message and can respond as needed. MQTT runs on TCP/IP networks and can also be secured by the Transport Layer Security (TLS) protocol.
The software component that monitors this mechanism is the MQTT server, also known as a broker. Sensors measure and publish their results on an MQTT topic, and subscribers read and use the data.
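The fan-out mechanism can be illustrated with a toy in-process broker in Python. This is a sketch of the publish-subscribe pattern only, not real MQTT: a real broker also handles network transport, wildcards, and quality-of-service levels.

```python
# Minimal sketch of MQTT-style publish/subscribe: every callback subscribed
# to a topic receives each message published on that topic.
from collections import defaultdict

class TinyBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = TinyBroker()
broker.subscribe("living room/lamp/luminosity",
                 lambda topic, payload: print(f"{topic}: {payload}"))
broker.publish("living room/lamp/luminosity", 420)
```

The sensor plays the publisher role, and any number of evaluators can subscribe without the sensor knowing about them.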
MQTT and IoT in the Cloud
If the sensors are widely distributed on the network, it makes sense to operate the MQTT service in the cloud instead of starting your own MQTT broker on a separate VM. The large public clouds – AWS [2], Azure [3], Google Compute Engine (GCE) [4] – offer MQTT as a service. The platforms usually have names like IoT Core, and they provide data collection and retrieval services in the cloud.
In large installations, however, how do the sensors discover their task and how can the software be distributed? AWS, Azure, and GCE also offer a solution to this problem in the form of an MQTT client. Not only does it communicate with the respective cloud, it also receives code with work instructions. In this way, it is possible to control sensor functions centrally. The platforms also take care of security by securing connections with TLS and make the setup as simple as possible.
In the example here, I employ AWS and a Raspberry Pi with a temperature sensor. At the end, I add Amazon's serverless computing variant Lambda [5]. This deployment is meant to evaluate the incoming data in a practical way and send an alert over SMS if the temperature is too high.
AWS and IoT
The two AWS IoT technologies are IoT Core [6] for the MQTT broker and IoT Greengrass [7] for the client. Amazon provides online documentation [8] on getting started that explains how to integrate the first IoT collection point (i.e., the core in AWS speak) over the web interface, how to query the data, and how to install the client on a Raspberry Pi.
Because the documentation describes a simplified setup through the web console, the steps are unfortunately difficult to reproduce. In this article, I describe the setup from the AWS command-line interface (CLI) to help you integrate multiple sensors.
At the command line, many steps are needed to reach the target. Some are interdependent – for example, when a command outputs a return value (e.g., an ID or Amazon Resource Name, ARN) that the next command needs, linking the individual objects together.
AWS Greengrass Core instances connect the outside world to the IoT Core (the MQTT broker). These cores need to register with the central office and use certificates to do so. Either AWS itself generates these (the approach described in this example), or you could upload your own certification authority (CA) and use certificates generated in this way. Amazon documentation [8] presents Greengrass Core as a run time that runs AWS Lambda locally and takes care of messaging, device shadows, and security. Devices send data via the Core to the cloud; thus, the Core acts as an IoT gateway and is itself also a device or "Internet thing."
For the following commands to work, you need to be running an operating system with the AWS command-line client installed. On Debian, Ubuntu, Red Hat, and Gentoo, the package is named awscli. A credentials file with the entries aws_access_key_id and aws_secret_access_key must be present in the ~/.aws directory under the user's home directory for the CLI to work; otherwise, the boto3 library used by the client will not be able to log in to Amazon.
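A minimal credentials file follows the usual INI format; the key values below are the placeholder examples from Amazon's documentation:

```
[default]
aws_access_key_id = AKIAIOSFODNN7EXAMPLE
aws_secret_access_key = wJalrXUtnFEMI/K7MDENG/bPxRfiCYEXAMPLEKEY
```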
The access data can be found in the settings of your own AWS account, and you should create an additional config file in the same directory that defines the AWS region, which can look something like:

[default]
region=eu-central-1

This setting tells Amazon to execute all the commands in the AWS facility located in Frankfurt, Germany. Alternatively, all these values can also be set with environment variables; the command aws help and the Amazon documentation provide more details. If the cited files do not exist yet, a call to aws configure helps you create them interactively.
Most commands designed to create something in the AWS world are answered by the Amazon servers with a JSON block describing the object. In practice, it has proved useful to add | tee <object name>.json to each command, so that IDs and ARNs can be retrieved later.
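The pattern looks like the following sketch, in which a canned JSON string stands in for a real aws iot call (the file name and ARN are placeholders):

```shell
# Capture the JSON reply in a file while still seeing it on screen.
echo '{"certificateArn":"arn:aws:iot:eu-central-1:123456789012:cert/abc123"}' \
    | tee cert.json
# Later, fish the ARN back out of the saved file:
grep -o 'arn:[^"]*' cert.json
```

Saving every reply this way spares you from scrolling back through terminal history when a later command asks for an ARN.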
In the first step, you create a "thing" that represents the Greengrass Core; the aws iot commands without a greengrass subcommand (Listing 1) help here. The thing created here is called LM_Core, which needs a certificate:
Listing 1
Create an Internet Thing
aws iot create-thing --thing-name LM_Core
aws iot create-keys-and-certificate --set-as-active --certificate-pem-outfile gg.cert.pem --public-key-outfile gg.pubkey.pem --private-key-outfile gg.privatekey.pem
As a result, Amazon returns a JSON block containing the ARN, ID, public and private keys, and certificate in PEM format. The other options make sure the certificate and key end up in your own files; otherwise, you would have to extract them from the JSON block.
The next step is to link the certificate with the thing, to allow the Rasp Pi that uses the certificate to log on to the core. The command requires the ARN from the currently generated certificate:
aws iot attach-thing-principal --thing-name LM_Core --principal arn:aws:iot:eu-central-1:566776501337:cert/6c597228cf5e4da63ee7dc4364f5a282e431a1581014fbd3cd558b86878f4fdc
In the Amazon resource name, the series of characters after the slash is the certificate ID.
Nothing works in AWS without permissions, so you need a policy that allows the users of the certificate to send data to the IoT Core and receive data from it. The following command creates a (very liberal) policy:
aws iot create-policy --policy-name LM_Policy --policy-document file://<path/to/>policy.json
Listing 2 then shows the content of the policy.json file. Here, AWS also returns an ARN, which you need to link to the certificate with:

aws iot attach-principal-policy --policy-name LM_Policy --principal arn:aws:iot:eu-central-1:566776501337:cert/6c597228cf5e4da63ee7dc4364f5a282e431a1581014fbd3cd558b86878f4fdc

Listing 2
JSON Policy Definition
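A maximally permissive policy document of the kind described, granting all IoT and Greengrass actions on all resources, could look like the following sketch; a production policy should be much tighter:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["iot:*", "greengrass:*"],
      "Resource": ["*"]
    }
  ]
}
```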
Finally, work begins on the Greengrass components. The Greengrass group (a collection of settings and components) includes:
- The core. The MQTT client that delivers the data to the cloud. It comes with a thing and a certificate.
- A resource. Greengrass defines this as data or devices that it can access. Because the Raspberry Pi needs access to /dev/gpiomem, you need to define and include this before setting up the function, because you also need to mount the resource in the function.
- The function. This component integrates the Lambda service to be run on the core into the group. Although you could have several of these functions, in this example I'm sticking with one.
- The subscription. This component allows cores to send data to the AWS cloud (optionally with a topic filter).
As the last step, you create the group and add all previously created components. This sequence saves you constantly having to expand the group.
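A rough outline of that assembly with the Greengrass V1 CLI might look like this. It is a sketch, not a runnable script: the <...> placeholders stand for the real IDs and ARNs returned by the earlier commands, and the resource, function, and subscription definitions follow the same pattern as the core definition.

```
# Sketch only; replace <...> with the IDs/ARNs from the earlier commands.
aws greengrass create-core-definition --name LM_CoreDef \
    --initial-version '{"Cores":[{"Id":"core1","ThingArn":"<thing-arn>","CertificateArn":"<cert-arn>","SyncShadow":true}]}'
# Resource, function, and subscription definitions are created analogously
# with create-resource-definition, create-function-definition, and
# create-subscription-definition.
aws greengrass create-group --name LM_Group
aws greengrass create-group-version --group-id <group-id> \
    --core-definition-version-arn <core-definition-version-arn>
```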