How MTData built a CVML vehicle telematics and driver monitoring solution with AWS IoT

Introduction

Building an IoT device for an edge Computer Vision and Machine Learning (CVML) solution can be a challenging undertaking. You need to compose your device software, ingest video and images, train your models, deploy them to the edge, and manage your device fleet remotely. All of this needs to be performed at scale, often while facing constraints such as intermittent network connectivity and limited edge computing resources. AWS services such as AWS IoT Greengrass, AWS IoT Core, and Amazon Kinesis Video Streams can help you manage and overcome these challenges and constraints, enabling you to build your solutions faster and accelerating time to market.

MTData, a subsidiary of Telstra, designs and manufactures innovative vehicle telematics and connected fleet management technology and solutions. These solutions help businesses improve operational efficiency, reduce costs, and meet compliance requirements. Its new 7000AI product represents a significant advance in its product portfolio: a single device that combines traditional regulatory telematics functions with new advanced video recording and computer vision features. Video monitoring of drivers enables MTData’s customers to reduce operational risk by measuring driver focus and by identifying driver fatigue and distraction. Together with the MTData “Hawk-Eye” software, MTData’s customers can monitor their vehicle fleet and driver performance, and identify risks and trends.

The 7000AI device is bespoke hardware and software. It monitors drivers by performing CVML at the edge and ingests video to the cloud in response to events such as detecting that the driver is drowsy or distracted. MTData used AWS IoT services to build this advanced telematics and driver monitoring solution.

“By using AWS IoT services, particularly AWS IoT Greengrass and AWS IoT Core, we were able to spend more time on developing our solution, rather than spend time building up the complex services and scaffolding required to deploy and maintain software to edge devices with often intermittent connectivity. We also get security and scalability out of the box, which is critical as we are dealing with potentially sensitive data.

Amazon Kinesis Video Streams has also been an invaluable service, as it allows us to ingest video securely and cost-effectively, and then serve it back to the customer in a very flexible way, without the need to manage the underlying infrastructure.” – Brad Horton, Solution Architect at MTData.

Solution

Architecture Overview

MTData’s solution consists of their 7000AI device, their “Hawk-Eye” application for vehicle location and telemetry data, and their “Event Validation” application to review and assess detected events and associated video clips.


Figure 1: High-level architecture of the 7000AI device and Hawk-Eye solution

Let’s explore the steps in the MTData solution, as shown in Figure 1.

  1. MTData deploys AWS IoT Greengrass on the 7000AI in-vehicle device to perform CVML at the edge.
  2. Telemetry and GPS data from sensors on the vehicle is sent to AWS IoT Core over a cellular network. AWS IoT Core sends the data to downstream applications based on AWS IoT rules.
  3. The Hawk-Eye application processes telemetry data and shows a dashboard of the vehicle’s location and the sensor data.
  4. CVML models deployed at the edge on the 7000AI device are used to continuously analyze a video feed of the driver. When the CVML model detects that the driver is drowsy or distracted, an alert is raised and a video clip of the detected event is sent to Amazon Kinesis Video Streams for further analysis in the AWS cloud.
  5. The Event Validation application allows users to validate and manage detected events. It is built with AWS serverless technologies, and consists of the Event Processor and Event Assessment components, and a web application.
  6. The Event Processor is an AWS Lambda function which receives and processes telemetry data. It writes real-time data to Amazon DynamoDB, writes analytical data to Amazon Simple Storage Service (Amazon S3), and forwards events to the Data Ingestion layer (a minimal sketch of this function appears below).
  7. The Data Ingestion layer consists of services running on Amazon Elastic Container Service (Amazon ECS) using AWS Fargate, which ingest detected events and forward them to the Hawk-Eye application.
  8. The Event Assessment component provides access to the detected event videos via an API, and consists of consumers which read detected event videos from Amazon Kinesis Video Streams.
  9. The front-end web application, hosted in Amazon S3 and delivered via Amazon CloudFront, allows users to review and manage distracted driver events.
  10. Amazon Cognito provides user authentication and authorization for the applications.

Figure 2: An event displayed in the Event Validation application
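
To make step 6 more concrete, the following is a minimal sketch of what such an event-processing Lambda function could look like. The table name, bucket name, and payload field names are illustrative assumptions, not MTData’s actual implementation.

```python
import json
import os

import boto3

# Hypothetical resource names, for illustration only.
TABLE_NAME = os.environ.get("REALTIME_TABLE", "VehicleTelemetryLatest")
BUCKET_NAME = os.environ.get("ANALYTICS_BUCKET", "example-telemetry-analytics")

dynamodb = boto3.resource("dynamodb")
s3 = boto3.client("s3")
table = dynamodb.Table(TABLE_NAME)


def handler(event, context):
    """Process one telemetry message routed from AWS IoT Core by an IoT rule."""
    # Assume the rule forwards the device payload as-is, with a vehicle
    # identifier and an epoch-millisecond timestamp (illustrative field names).
    vehicle_id = event["vehicleId"]
    timestamp = int(event["timestamp"])

    # Real-time view: latest record per vehicle for dashboarding.
    table.put_item(
        Item={"vehicleId": vehicle_id, "timestamp": timestamp, "payload": json.dumps(event)}
    )

    # Analytical copy: raw record in Amazon S3, keyed by vehicle and time.
    s3.put_object(
        Bucket=BUCKET_NAME,
        Key=f"telemetry/{vehicle_id}/{timestamp}.json",
        Body=json.dumps(event).encode(),
    )

    # Forwarding to the Data Ingestion layer is omitted from this sketch.
    return {"status": "ok"}
```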

Device Software Composition

The 7000AI device is a bespoke hardware design running an embedded Linux distribution on NVIDIA Jetson. MTData installs the AWS IoT Greengrass edge runtime on the device, and uses it to compose, deploy, and manage their IoT/CVML application. The application consists of several MTData custom AWS IoT Greengrass components, supplemented by pre-built AWS-provided components. The custom components are Docker containers and native OS processes, delivering functionality such as CVML inference, Digital Video Recording (DVR), telematics, and configuration settings management.


Figure 3: 7000AI device software architecture
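
As an illustration of how a custom component can interact with the rest of the stack, the sketch below shows a native-process component publishing telemetry to AWS IoT Core through the AWS IoT Greengrass interprocess communication (IPC) interface of the AWS IoT Device SDK v2 for Python. The topic name and payload fields are hypothetical and not taken from MTData’s design.

```python
import json
import time

# IPC client from the AWS IoT Device SDK v2 for Python (awsiotsdk), usable
# by Greengrass components that are authorized to publish to AWS IoT Core.
from awsiot.greengrasscoreipc.clientv2 import GreengrassCoreIPCClientV2
from awsiot.greengrasscoreipc.model import QOS

# Hypothetical topic for illustration only.
TELEMETRY_TOPIC = "vehicles/demo-vehicle/telemetry"


def main():
    ipc_client = GreengrassCoreIPCClientV2()
    while True:
        # In a real component, this payload would come from the vehicle's sensors.
        payload = {"speedKph": 72.5, "timestamp": int(time.time() * 1000)}
        ipc_client.publish_to_iot_core(
            topic_name=TELEMETRY_TOPIC,
            qos=QOS.AT_LEAST_ONCE,
            payload=json.dumps(payload).encode(),
        )
        time.sleep(10)


if __name__ == "__main__":
    main()
```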

Device Management

AWS IoT Greengrass deployments are used to update the 7000AI application software. The deployment feature tolerates the intermittent connectivity of the cellular network, pausing a deployment while a device is disconnected and resuming it when the device reconnects. Deployment options such as rollout configuration, timeouts, and failure handling policies help you manage deployments at scale.
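
For reference, revising a deployment for a fleet of devices is a single API call. In the sketch below, the thing group, component names, and versions are placeholders rather than MTData’s real identifiers.

```python
import boto3

greengrassv2 = boto3.client("greengrassv2")

# Hypothetical thing group and component names, for illustration only.
response = greengrassv2.create_deployment(
    targetArn="arn:aws:iot:ap-southeast-2:123456789012:thinggroup/Fleet7000AI",
    deploymentName="7000AI-application-v1.2.0",
    components={
        "com.example.CvmlInference": {"componentVersion": "1.2.0"},
        "com.example.Telematics": {"componentVersion": "1.1.3"},
    },
    deploymentPolicies={
        # Roll devices back to their previous configuration if the deployment fails.
        "failureHandlingPolicy": "ROLLBACK",
    },
)
print(response["deploymentId"])
```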

Operating system image updates

There can be complications and risks associated with updating an embedded Linux device package by package. Dependency conflicts and piecemeal rollbacks need to be handled to avoid “bricking” a remote, hard-to-access device. Consequently, to reduce risk, updates to the embedded Linux operating system (OS) of the 7000AI device are instead performed as image updates of the entire OS.

OS image updates are handled by a custom Greengrass component. When MTData releases a new OS image version, they publish a new version of the component and revise the AWS IoT Greengrass deployment to roll out the change. The component downloads the OS image file, applies it, and reboots the device to swap the active and inactive memory banks and run the new version. AWS IoT Greengrass configuration and credentials are held in a separate partition so that they are unaltered by the update.
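
The exact mechanics depend on the device’s bootloader and partition layout, but conceptually the component’s update logic could resemble the following sketch. The artifact path, checksum, partition device, and bank-switching command are purely illustrative assumptions and not MTData’s implementation.

```python
import hashlib
import subprocess

# Illustrative values only; the real component uses MTData's own artifact
# location, checksum source, and bank-switching mechanism.
IMAGE_PATH = "/greengrass/v2/work/com.example.OsImageUpdater/rootfs.img"
EXPECTED_SHA256 = "0000000000000000000000000000000000000000000000000000000000000000"
INACTIVE_PARTITION = "/dev/mmcblk0p2"


def sha256(path):
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()


def main():
    # Verify the downloaded image before touching the inactive bank.
    if sha256(IMAGE_PATH) != EXPECTED_SHA256:
        raise RuntimeError("OS image checksum mismatch; aborting update")

    # Write the new image to the inactive partition (requires root).
    subprocess.run(
        ["dd", f"if={IMAGE_PATH}", f"of={INACTIVE_PARTITION}", "bs=4M"],
        check=True,
    )

    # Mark the inactive bank as the next boot target; this command is a
    # hypothetical stand-in for the device's real bank-switching tool.
    subprocess.run(["switch-boot-bank", "--next"], check=True)

    # Reboot to swap the active and inactive banks and run the new version.
    subprocess.run(["reboot"], check=True)


if __name__ == "__main__":
    main()
```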

Edge CVML Inference

CVML inference is performed at regular intervals on images of the vehicle driver. MTData has developed advanced CVML models for detecting events in which the driver appears to be drowsy or distracted.


Figure 4: Annotated video capture of a distracted driver event
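
The models themselves are proprietary, but the surrounding control loop can be sketched as follows: sample a frame at a regular interval, run inference, and raise an event only after several consecutive high-confidence detections. The sampling rate, threshold, and placeholder detect_drowsiness function are illustrative assumptions.

```python
import time

import cv2  # OpenCV, assumed available on the device image

INFERENCE_INTERVAL_S = 0.5   # illustrative sampling rate
CONSECUTIVE_HITS = 4         # debounce before raising an event


def detect_drowsiness(frame) -> float:
    """Placeholder for the proprietary CVML model; returns a confidence score."""
    return 0.0


def main():
    camera = cv2.VideoCapture(0)  # driver-facing camera; device index assumed
    hits = 0
    while True:
        ok, frame = camera.read()
        if not ok:
            time.sleep(INFERENCE_INTERVAL_S)
            continue

        confidence = detect_drowsiness(frame)
        hits = hits + 1 if confidence > 0.8 else 0
        if hits >= CONSECUTIVE_HITS:
            # On the real device this would raise an alert and trigger
            # video ingestion to Amazon Kinesis Video Streams.
            print("Drowsy driver event detected")
            hits = 0

        time.sleep(INFERENCE_INTERVAL_S)


if __name__ == "__main__":
    main()
```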

Video Ingestion

The device software includes the Amazon Kinesis Video Streams C++ Producer SDK. When MTData’s custom CVML inference detects an event of interest, the Producer SDK is used to publish video data to the Amazon Kinesis Video Streams service in the cloud. As a result, MTData saves on bandwidth and costs by ingesting video only when there is an event of interest. Video frames are buffered on the device so that ingestion is resilient to cellular network disruptions. Video fragments are timestamped on the device, so delayed ingestion doesn’t lose timing context, and video data can be published out of order.

Video Playback

The Event Validation application uses the Amazon Kinesis Video Streams Archived Media API to download video clips or stream the archived video. Segments of clips can also be spliced from the streamed video and archived to Amazon S3 for subsequent analysis, ML training, or retention on behalf of customers.
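
For example, because fragments are timestamped on the device, a clip around a detected event can be retrieved by producer timestamp. The stream name and time window below are chosen for illustration only.

```python
from datetime import datetime, timedelta, timezone

import boto3

STREAM_NAME = "example-7000ai-driver-camera"  # illustrative stream name

# The Archived Media API is served from a stream-specific data endpoint.
kvs = boto3.client("kinesisvideo")
endpoint = kvs.get_data_endpoint(StreamName=STREAM_NAME, APIName="GET_CLIP")["DataEndpoint"]
archived = boto3.client("kinesis-video-archived-media", endpoint_url=endpoint)

# Fetch an MP4 clip covering the two minutes around a detected event.
event_time = datetime(2024, 1, 1, 3, 30, tzinfo=timezone.utc)  # illustrative
clip = archived.get_clip(
    StreamName=STREAM_NAME,
    ClipFragmentSelector={
        "FragmentSelectorType": "PRODUCER_TIMESTAMP",
        "TimestampRange": {
            "StartTimestamp": event_time - timedelta(minutes=1),
            "EndTimestamp": event_time + timedelta(minutes=1),
        },
    },
)
with open("event-clip.mp4", "wb") as f:
    f.write(clip["Payload"].read())
```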

Settings

The AWS IoT Device Shadow service is used to manage settings such as inference on/off, live-stream on/off and camera video quality settings. Shadows decouple the Hawk-Eye and the Event Validation applications from the device, allowing the cloud applications to modify settings even when the 7000AI device is offline.
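
As an example of this pattern, a cloud application requests a settings change by updating the desired state of a shadow; the device applies the change and reports back the next time it connects. The thing name, shadow name, and settings keys below are illustrative assumptions.

```python
import json

import boto3

iot_data = boto3.client("iot-data")

# Hypothetical identifiers, for illustration only.
THING_NAME = "7000AI-demo-device"
SHADOW_NAME = "settings"

# Request that the device turn live streaming on and lower video quality.
desired = {"state": {"desired": {"liveStream": True, "videoQuality": "medium"}}}

iot_data.update_thing_shadow(
    thingName=THING_NAME,
    shadowName=SHADOW_NAME,  # named shadow; omit for the classic (unnamed) shadow
    payload=json.dumps(desired).encode(),
)
```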

MLOps

MTData developed an MLOps pipeline to support retraining and enhancement of their CVML models. Using previously ingested video, models are retrained in the cloud, with the help of the NVIDIA TAO Toolkit. Updated CVML inference models are published as AWS IoT Greengrass components and deployed to 7000AI devices using AWS IoT Greengrass deployments.


Figure 5: MLOps pipeline
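
The final step of such a pipeline, publishing a retrained model as a new AWS IoT Greengrass component version, can be automated with a short script. The component name, artifact location, and recipe below are simplified placeholders rather than MTData’s actual recipe.

```python
import boto3

greengrassv2 = boto3.client("greengrassv2")

# Simplified, illustrative recipe for a model-artifact component; the real
# component name, artifact URI, and lifecycle are specific to MTData's pipeline.
recipe = """
RecipeFormatVersion: '2020-01-25'
ComponentName: com.example.DrowsinessModel
ComponentVersion: '1.3.0'
ComponentDescription: Retrained driver-monitoring model artifacts.
ComponentPublisher: Example
Manifests:
  - Platform:
      os: linux
    Artifacts:
      - URI: s3://example-model-bucket/drowsiness/1.3.0/model.etlt
"""

response = greengrassv2.create_component_version(inlineRecipe=recipe.encode())
print(response["arn"], response["componentVersion"])
```

A subsequent AWS IoT Greengrass deployment revision, as described in the Device Management section, then rolls the new model component out to the 7000AI fleet.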

Conclusion

By using AWS services, MTData has built an advanced telematics solution that monitors driver behavior at the edge. A key capability is MTData’s custom CVML inference that detects events of interest, and uploads corresponding video to the cloud for further analysis and oversight. Other capabilities include device management, operating system updates, remote settings management, and an MLOps pipeline for continuous model improvement.

“Technology, especially AI, is advancing at an ever-increasing rate. We need to be able to keep pace with that and continue to provide industry-leading solutions to our customers. By utilizing AWS services, we have been able to continue to update and improve our edge IoT solution with new features and functionality, without a large upfront financial investment. This is important to me not only to encourage experimentation in developing solutions, but also to allow us to get those solutions to our edge devices faster, more securely, and with greater reliability than we could previously.” – Brad Horton, Solution Architect at MTData.

To learn more about AWS IoT services and solutions, please visit AWS IoT or contact us. To learn more about MTData, please visit their website.

About the authors

Greg Breen is a Senior IoT Specialist Solutions Architect at Amazon Web Services. Based in Australia, he helps customers throughout Asia Pacific to build their IoT solutions. With deep experience in embedded systems, he has a particular interest in assisting product development teams to bring their devices to market.
Ai-Linh Le is a Solutions Architect at Amazon Web Services based in Sydney, Australia. She works with telco customers to help them build solutions and solve challenges. Her areas of focus include telecommunications, data analytics and AI/ML.
Brad Horton is a Solution Architect at Mobile Tracking and Data (MTData), based in Melbourne, Australia. He works to design and build scalable AWS Cloud solutions to support the MTData telematics suite, with a particular focus on Edge AI and Computer Vision devices.