Self-driving cars may still seem futuristic, but their debut is poised to reinvent driving. Designing safe and reliable perception systems, however, remains one of the toughest challenges for automakers.

AEye claims its latest innovation can overcome these challenges and deliver perception at speed and at range. The California-based artificial perception system developer has reportedly launched the world's first commercially available 2D/3D in-sensor perception system for autonomous vehicles. The company states that the new perception software improves the reliability of object detection and classification.

Apparently, this is the first instance where basic perception is distributed to the edge of the sensor network. This enables sensors to not only detect objects, but also acquire, classify and track them. The system even extends the range at which objects are detected, classified and tracked.
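To make the detect-acquire-classify-track flow concrete, here is a minimal sketch of such a pipeline running at the sensor edge. All class names, thresholds and the toy classifier are hypothetical illustrations; AEye has not published its internal API.

```python
from dataclasses import dataclass, field

@dataclass
class Track:
    """A hypothetical tracked object with a class label and position history."""
    track_id: int
    label: str
    positions: list = field(default_factory=list)

class EdgePerception:
    """Illustrative in-sensor pipeline: detect -> acquire -> classify -> track.

    Everything here is a simplified sketch, not AEye's actual software.
    """
    def __init__(self, detect_threshold=0.5):
        self.detect_threshold = detect_threshold
        self.tracks = {}
        self._next_id = 0

    def detect(self, returns):
        # Keep only sensor returns whose intensity clears the threshold.
        return [r for r in returns if r["intensity"] >= self.detect_threshold]

    def classify(self, detection):
        # Toy heuristic: tall, narrow clusters -> "pedestrian", else "vehicle".
        return "pedestrian" if detection["height"] > detection["width"] else "vehicle"

    def track(self, detections):
        # Open a new track per detection (real systems do data association
        # to match detections against existing tracks across frames).
        for d in detections:
            t = Track(self._next_id, self.classify(d))
            t.positions.append(d["position"])
            self.tracks[self._next_id] = t
            self._next_id += 1
        return list(self.tracks.values())
```

Running classification and tracking inside the sensor, as sketched above, is what avoids shipping raw point clouds to a central computer, which is where the claimed latency savings would come from.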

The ability of the system to acquire this information in real time could enable and enhance existing perception software platforms by minimizing latency. The system could further help automakers cut costs and enhance functional safety.

AEye seemingly intends to use the new in-sensor perception software to boost the availability of self-driving features in vehicles across all SAE levels of driving automation. This will enable automakers to implement the desired amount of autonomy based on the use case.

About AEye’s iDAR™ platform

Evidently, AEye’s latest in-sensor perception system is powered by its proprietary iDAR™ platform that acts as the eyes and visual cortex for self-driving cars. The platform enables smart and adaptive perception for autonomous vehicles.


The iDAR platform is based on the concept of biomimicry. It imitates how human vision works, using AI together with fused LiDAR and camera data.

AEye claims that its system is the first to take a fused approach to artificial perception. The system apparently leverages iDAR's exclusive Dynamic Vixels, which combine the camera's 2D data (pixels) and LiDAR's 3D data (voxels) inside the sensor.

Notably, iDAR is a unique software-defined platform that allows for distinct sensor functionalities to complement each other. This in turn enables the LiDAR and camera to function in sync, making each sensor more powerful while ensuring greater functional safety.
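The core idea behind combining pixels and voxels can be illustrated with a standard pinhole-camera projection: a 3D LiDAR point is projected into the camera image, and the RGB value at that location is attached to the point. This is a generic textbook sketch of camera/LiDAR fusion, not AEye's proprietary Dynamic Vixels algorithm; the function name and data layout are assumptions for illustration.

```python
import numpy as np

def fuse_point_with_pixel(point_xyz, image, K):
    """Attach a camera pixel's RGB value to a 3D point by projecting the
    point through a pinhole camera model with intrinsic matrix K.

    Returns a dict with the point and its color, or None if the point
    is behind the camera or projects outside the image.
    """
    x, y, z = point_xyz
    if z <= 0:
        return None  # point is behind the camera
    # Pinhole projection: u = fx*x/z + cx, v = fy*y/z + cy
    u = int(K[0, 0] * x / z + K[0, 2])
    v = int(K[1, 1] * y / z + K[1, 2])
    h, w = image.shape[:2]
    if not (0 <= u < w and 0 <= v < h):
        return None  # projects outside the camera's field of view
    rgb = image[v, u]
    return {"xyz": point_xyz, "rgb": tuple(int(c) for c in rgb)}
```

A real system would also apply the extrinsic transform between the LiDAR and camera frames before projecting; it is omitted here by assuming both sensors share one coordinate frame, which is plausible for a tightly integrated in-sensor design.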

Source credit: https://www.aeye.ai/press/aeye-announces-worlds-first-commercially-available-perception-software-designed-to-run-inside-the-sensors-of-autonomous-vehicles/