Corner Case or #NotaCornerCase?

Blink into the future and you will see driverless cars, motorcycles, and trucks, tons of them, entering and leaving cities, passing through small towns, speeding down the highway. Now blink back to reality, a reality made of complex driving situations all around the globe. Various factors contribute to the complexity of being on the road, such as weather and lighting conditions, human unpredictability, or disruptions to common scenarios such as broken traffic lights or animals crossing the street. Some would argue that these so-called “corner cases” are low-probability scenarios, but in fact, even if you don’t see them constantly, you probably encounter them every other day without even realizing it. They are #NotaCornerCase!

IMAGINE all the DATA

From ADAS/Driver Assistance Level to Full Automation, the automotive industry is buckling down for the future of robotics on the road. Massive amounts of real-world data are being captured, meticulously tagged, and used to train machine learning algorithms as part of the perception process. However, no matter how monumental company efforts are, real-world data is just not enough. It simply cannot cover all possible scenarios, and this is where synthetic data comes into play! But not just any synthetic data. Data needs to be photorealistic, specific, scalable, rich in variations, and shipped with metadata so you can just “plug it into” your ML pipeline. It has to be true to reality. It has to be Anyverse!
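To make the “plug it in” idea concrete, here is a minimal sketch of what consuming per-frame metadata alongside a synthetic image might look like. The field names and schema below are purely illustrative assumptions, not Anyverse’s actual format:

```python
import json

# Hypothetical metadata shipped alongside one synthetic frame.
# Field names are illustrative only, not a real Anyverse schema.
frame_metadata = json.loads("""
{
  "image": "frame_00042.png",
  "weather": "heavy_rain",
  "time_of_day": "dusk",
  "objects": [
    {"class": "pedestrian",   "bbox": [412, 310, 455, 398], "occlusion": 0.2},
    {"class": "traffic_sign", "bbox": [120,  95, 160, 140], "occlusion": 0.0}
  ]
}
""")

def to_training_labels(meta):
    """Flatten per-object annotations into (class, bbox) pairs for a detector."""
    return [(obj["class"], tuple(obj["bbox"])) for obj in meta["objects"]]

labels = to_training_labels(frame_metadata)
print(labels)
```

Because the annotations are generated together with the scene, there is no manual tagging step: every object already comes with a pixel-accurate label.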

Synthetic data is the solution to the gaps in AV perception training and testing, and here are some examples why:

Eyes ON the ROAD

Roads, with all their elements such as lanes, traffic signs, street lights, traffic lights, and other vehicles, are tricky even for experienced human drivers. But what happens when the ADAS lane-keeping system does not recognize lane lines covered by sand or ice?

The road is full of obstacles and challenges such as low-visibility turns, vandalized traffic signs, less common vehicles such as the famous tuk-tuk, missing traffic signs, messy construction sites. The list goes on and on. So the question is – how can you ensure your driverless vehicle is prepared for all the tricky scenarios?

Eyes OFF the ROAD

Sometimes off-road elements affect safety even more than road-related elements. And we don’t mean just what’s on the sidewalk or in nearby buildings. Just think of Mother Nature! Weather conditions such as snow, rain, and fog can impair visibility and, consequently, car control. What happens when sun reflections on windows or wet surfaces blind you? Or when heavy rain prevents your autonomous vehicle from measuring distances to other cars properly? Even in plain daylight, sun glare can cause trouble on the road.

With Anyverse you can mirror reality and produce synthetic data that is physically correct, no tricks applied. Furthermore, it comes with sophisticated sensor simulation and numerous lens effects such as scatter, distortion, dirt, and more.
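As a rough illustration of what a lens-distortion effect involves (a generic sketch, not Anyverse’s actual implementation), here is the standard radial distortion model applied to normalized image coordinates. The coefficient values are made-up assumptions for the example:

```python
# Minimal sketch of radial (barrel) lens distortion on normalized
# image coordinates. k1, k2 are illustrative coefficients, not values
# from any real camera calibration.
def distort(x, y, k1=-0.25, k2=0.05):
    """Map an undistorted normalized point to its radially distorted position."""
    r2 = x * x + y * y                       # squared distance from optical center
    factor = 1 + k1 * r2 + k2 * r2 * r2      # radial scaling term
    return x * factor, y * factor

# The image center is unchanged; points near the edge pull inward (barrel effect).
print(distort(0.0, 0.0))
print(distort(1.0, 0.0))
```

Rendering synthetic frames through a model like this, rather than as geometrically perfect images, helps the training data match what a real camera actually sees.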

Beware: HUMANS

Don’t CUT CORNERS

Humans… Humans everywhere! We all know a driverless vehicle does not mean a humanless world. People will make sure to get in the way and make the “life” of self-driving cars somewhat more complicated. No doubt there will be kids playing on the sidewalk, oblivious jay-walkers, protesters or a flashmob blocking the way. Because humans 🙂

We can conclude with certainty that most low-probability scenarios for some are everyday happenings for others. Life is unpredictable as it is, so what is to be expected of an autonomous vehicle?

Truth is, weather and lighting peculiarities alone are a serious enough challenge to the driverless world, and they are by no means corner cases.

To stay ahead of the game, you can start preparing for all possible scenarios by including specific synthetic data in your machine learning training. With Anyverse you can generate any scene you like, test against it, and track improving fidelity levels or remaining gaps. Stay tuned for some awesome scenes we’ve prepared to help you raise the bar. Coming soon 🙂