DRIVE Software 8.0 Enables Surround Perception, AR for Safe Automated Driving


DRIVE Software 8.0 — which integrates enhanced perception, visualization and mapping capabilities into one release — makes it easy to see the road ahead for safe self-driving.

The newest version of the extensive DRIVE Software suite, announced this week at CES, features advanced automated driving and in-cabin user experience functionalities powered by AI. Many of DRIVE Software 8.0's capabilities are also hallmarks of the recently introduced DRIVE AutoPilot system. The release will be available to developers in the next few weeks.

NVIDIA DRIVE Software is an open software suite powered by the compute capabilities of the DRIVE AGX platform. It consists of the DRIVE OS operating system, the DriveWorks software framework, a rich SDK for developing autonomous driving applications, and the DRIVE AV and DRIVE IX software applications.

What’s in DRIVE Software 8.0

With this release, developers now have access to high-definition visualization for both vehicle sensors and driver monitoring, as well as more advanced surround perception.

In-vehicle visualization offers a real-time look inside the brain of an autonomous vehicle for its occupants. It shows what the car sees, how it classifies objects and plans future moves, and how those decisions change second by second. And, using augmented reality, autonomous vehicle software developers can visualize and track this data in a developer-friendly format.

The release also includes updates to the software’s perception capabilities, enhancing object detection and providing a 360-degree view around the vehicle, making DRIVE Software 8.0 a highly advanced, open software platform for autonomous driving development.

Tracking the Road and the Driver

By translating sensor data to a visual display, DRIVE IX now allows manufacturers to track a vehicle’s inputs, perception and path-planning both in real time and for recorded data. Developers can view the data as the car drives with displays inside the vehicle. The visualization works in five phases: sense, perceive, map, plan and drive.

In the sense phase, the display shows the raw camera sensor data the car uses for object detection. From there, the perceive phase outlines the vehicle's object detection process using bounding boxes, as well as labels for objects as far as 50 meters away. In the map phase, the display highlights lane lines perceived by the car, and the plan phase shows the vehicle's upcoming path. Finally, the drive phase graphs the steering, acceleration and brake actions the vehicle takes to follow the desired path. Developers can toggle between these five views in real time as the car drives.
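The five-phase toggle described above can be sketched as a simple state cycle. This is an illustrative sketch only: the `Phase` and `VisualizationDisplay` names are hypothetical and not part of any NVIDIA API.

```python
from enum import Enum


class Phase(Enum):
    """The five visualization phases described above (hypothetical names)."""
    SENSE = "sense"        # raw camera sensor data
    PERCEIVE = "perceive"  # bounding boxes and labels up to 50 meters
    MAP = "map"            # lane lines perceived by the car
    PLAN = "plan"          # the vehicle's upcoming path
    DRIVE = "drive"        # steering, acceleration and brake graphs


class VisualizationDisplay:
    """Minimal stand-in for an in-cabin display that toggles phases."""

    def __init__(self):
        self.phase = Phase.SENSE

    def toggle_next(self):
        """Advance to the next phase, wrapping back to SENSE after DRIVE."""
        phases = list(Phase)
        self.phase = phases[(phases.index(self.phase) + 1) % len(phases)]
        return self.phase
```

Cycling through all five toggles returns the display to the sense view, matching the real-time toggling the article describes.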

A similar process takes place for driver monitoring. Using a driver-facing camera, deep neural networks can track whether a driver is drowsy or distracted. With the new visualization update, drivers and passengers can see this process in real time.

As in driving visualization, the driver monitoring display shows the raw camera data in the initial sense phase, adding face and eye identification in the detect phase. In the track phase, the visualization details the angle of the driver's head as well as the direction of their gaze. Finally, the monitor phase displays an analysis of whether the driver is drowsy (indicated by a coffee cup icon) or distracted (time with attention off the road is tracked by a timer in the corner). In both cases, the video is tinted red when the algorithms detect these dangerous situations.
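The monitor phase's decision logic can be sketched as a per-frame classification. The function name, threshold values, and field names below are illustrative assumptions, not NVIDIA's actual implementation.

```python
DISTRACTION_LIMIT_S = 2.0   # hypothetical time-off-road threshold, in seconds
DROWSY_EYE_RATIO = 0.5      # hypothetical eyes-closed ratio threshold


def monitor_frame(gaze_on_road: bool, eyes_closed_ratio: float,
                  off_road_seconds: float) -> dict:
    """Classify a single frame's driver state, mirroring the monitor phase.

    drowsy     -> shown as a coffee cup icon on the display
    distracted -> attention off the road, tracked by a corner timer
    tint_red   -> the video is tinted red in either dangerous situation
    """
    drowsy = eyes_closed_ratio > DROWSY_EYE_RATIO
    distracted = (not gaze_on_road) and off_road_seconds > DISTRACTION_LIMIT_S
    return {
        "drowsy": drowsy,
        "distracted": distracted,
        "tint_red": drowsy or distracted,
    }
```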

Enhanced Perception and Mapping

In addition to these visualizations, DRIVE Software 8.0 builds on the existing perception capabilities in DRIVE AV with new deep neural networks.

One such network, LightNet, identifies both solid and turn lights and can discern whether they are green, yellow or red. Its companion, SignNet, recognizes a range of traffic signs, including speed limit, one-way, “Do Not Enter,” yield and stop signs.
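As a rough illustration, the outputs of the two networks described above could be mapped to readable labels as follows. The label sets and function below are hypothetical stand-ins, not the networks' real output formats.

```python
# Hypothetical label sets mirroring the classes named in the article.
LIGHTNET_STATES = {"solid", "turn"}
LIGHTNET_COLORS = {"green", "yellow", "red"}
SIGNNET_CLASSES = {"speed_limit", "one_way", "do_not_enter", "yield", "stop"}


def interpret_light(state: str, color: str) -> str:
    """Combine LightNet-style state and color outputs into one label."""
    if state not in LIGHTNET_STATES or color not in LIGHTNET_COLORS:
        raise ValueError("unknown light classification")
    return f"{color} {state} light"
```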

In this release, DRIVE AV also features DRIVE MapStream, which allows developers to upload data from the vehicle’s sensors to their own digital maps. These maps are critical for self-driving cars to locate themselves on the road, as well as to anticipate obstacles like construction zones or closed lanes.

To help get safe, automotive-grade self-driving technology on the roads sooner, DRIVE Software 8.0 will be available to developers by mid-January, bringing manufacturers one step closer to production-ready autonomous vehicles.

To get started with a DRIVE AGX Developer Kit, contact us here.

The post DRIVE Software 8.0 Enables Surround Perception, AR for Safe Automated Driving appeared first on The Official NVIDIA Blog.
