Compute

Compute very much serves the Plan component of SPA. The brains behind Homer, the first self-driving taxi at Voyage, is a Gigabyte AORUS motherboard with an Intel Core i7-7700K Kaby Lake quad-core 4.2GHz processor and an NVIDIA Titan X GPU. To make sure the sensors have an ample data pipe, the machine has 64GB of RAM and 3TB of mass storage distributed across three solid-state drives for redundancy.

Wired in

This powerful computing box runs an Ubuntu distribution of Linux, uses Docker containers to manage system environments, and runs the Robot Operating System (ROS) for quick prototyping of perception, motion planning, and controls nodes. ROS is an incredibly versatile robotics middleware that abstracts away the complexities of message passing, timing, data structures (for things like point clouds, camera frames, and obstacles), threading, and data recording. While Ubuntu (and therefore ROS) does not provide the real-time guarantees of the real-time operating system (RTOS) required for production self-driving cars, it is an incredible tool for prototyping algorithms and testing them in real-world conditions as fast as possible. It’s critical to minimize the time between ideas on the whiteboard and cars on the road here at Voyage, and these tools help us do just that.
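To give a feel for how little ceremony ROS demands, here is a minimal sketch of a node in Python. It assumes a standard ROS 1 installation (e.g. on Ubuntu); the node and topic names are made up for illustration, not taken from our stack:

```python
#!/usr/bin/env python
# Minimal ROS 1 node: publishes a heartbeat message at 1 Hz.
# Node name ('heartbeat') and topic name ('status') are hypothetical.
import rospy
from std_msgs.msg import String

def main():
    rospy.init_node('heartbeat')                       # register with the ROS master
    pub = rospy.Publisher('status', String, queue_size=10)
    rate = rospy.Rate(1)                               # ROS handles the loop timing
    while not rospy.is_shutdown():
        pub.publish(String(data='alive'))              # any node can subscribe to this
        rate.sleep()

if __name__ == '__main__':
    main()
```

Everything below the surface here, such as network transport, serialization, and timing, is handled by ROS, which is exactly the abstraction that makes rapid prototyping possible.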

Want to try ROS on real point cloud data from an actual self-driving car? Check out this open repository on GitHub, put together by one of our engineers.

ROS nodes are essentially mini-programs that run independently of each other but generally have many interconnections. One node may be responsible for reading raw data from a Velodyne LIDAR over an Ethernet interface and turning it into a PointCloud2 message. This message, which consists of an array of three-dimensional points and their metadata, can then be ‘published’ over the ROS network and consumed by any number of other nodes. One of these consumers, or ‘subscribers’, could be responsible for fitting the live incoming point cloud to an existing map for localization, while another node runs clustering algorithms to detect and track objects. These nodes then publish their own output to the network, which may be consumed by motion planning algorithms even farther down the line. At a high level, this is exactly how the Voyage cars operate: data streams in from raw sensors (LIDAR, radar, RTK GPS, cameras, CAN bus messages, etc.), is processed by a large collection of smaller nodes that all communicate with each other, and finally emerges as actual control signals to the throttle, brake, and steering wheel through our drive-by-wire units.
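As a concrete sketch of the subscriber side, the Python node below consumes PointCloud2 messages. The /velodyne_points topic name matches the standard open-source Velodyne driver, but the node name and the callback body are illustrative placeholders for real localization or clustering work:

```python
#!/usr/bin/env python
# Sketch of a ROS 1 subscriber consuming Velodyne point clouds.
# Assumes the standard velodyne_pointcloud driver is publishing
# PointCloud2 messages on /velodyne_points.
import rospy
import sensor_msgs.point_cloud2 as pc2
from sensor_msgs.msg import PointCloud2

def cloud_callback(msg):
    # Pull the (x, y, z) fields out of the message, skipping NaN returns.
    points = list(pc2.read_points(msg, field_names=('x', 'y', 'z'),
                                  skip_nans=True))
    rospy.loginfo('received %d points', len(points))
    # ... fit to an existing map for localization, or run clustering
    #     here to detect and track obstacles (placeholder) ...

def main():
    rospy.init_node('cloud_consumer')                        # hypothetical name
    rospy.Subscriber('/velodyne_points', PointCloud2, cloud_callback)
    rospy.spin()   # hand control to ROS; the callback fires per message

if __name__ == '__main__':
    main()
```

Any number of nodes can subscribe to the same topic this way, which is what lets localization, obstacle tracking, and recording all consume the same live point cloud without knowing about each other.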