A group of UBC students took top honours at a global technology competition.

The team of four won 25,000 euros at the Ericsson Innovation Awards for students in Stockholm on Friday.

They invented SoundVision, a software application that turns 3D spatial information into sounds, which they hope will help people who are visually impaired better understand their surroundings.

"It's still surreal, I can't believe it," said 18-year-old Karan Grover.

The first-year computer science student was on a team with Tanha Kabir, 18, and Jonathan Ho, 21, led by YK Sugishita, 24.

A whopping 843 teams from 72 countries participated in the competition, which was judged by a panel consisting of Prince Daniel of Sweden and executives from Ericsson and Google.

"It was really nerve-racking ... but I think once we started having conversations with them, we realized they were really nice people and became more comfortable," said Grover.

The team behind SoundVision jumped to their feet after hearing they had won the Ericsson Innovation Awards 2016. (Ericsson)

'Third place is a good spot'

Despite spending 50 to 60 hours a week on their project for the past month and a half, Grover said he and his friends were shocked by their win.

They thought a team from Massachusetts Institute of Technology, the runners-up, would be the winners.

"I remember us four sitting beside each other, in our suits and SoundVision t-shirts," he said, describing the winners' announcements. "They went to 4th place ... we're like, 'Okay, I think we got third place. Third place is a good spot.'"

But as one name was called after another, the UBC students didn't hear theirs — until the very end.

"Once we got first, we all just kind of freaked out," recalled Grover with a big smile. "It's just awesome."

How it works

In a release, Ericsson praised SoundVision for helping to "create truly inclusive cities where every citizen is empowered."

The technology uses a mobile phone sensor to scan for spatial measurements up to three metres in front of the user.

Grover and his team wrote a program to read and convert that data into sounds, which are fed to bone conduction headphones.

If there are objects to the left, a sound is fed through the left ear. If there are objects to the right, the user is alerted in their right ear.

Pitch levels correspond to the height and distance of the objects — a soft, high-pitched sound could represent a tall person who is far away.
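The mapping described above — left/right panning for direction, pitch for height, and softer sound for distance — can be sketched in code. This is purely an illustrative sketch based on the article's description; the function name, parameter ranges, and audio scales are assumptions, not SoundVision's actual implementation.

```python
# Illustrative sketch of a spatial-to-audio mapping like the one described.
# All names and ranges here are assumptions for demonstration only.

MAX_RANGE_M = 3.0  # the sensor scans roughly three metres ahead

def object_to_tone(lateral_m, height_m, distance_m,
                   base_hz=220.0, max_hz=880.0, max_height_m=2.0):
    """Map one detected object to (left_gain, right_gain, pitch_hz).

    lateral_m:  offset from the user's centreline (negative = left).
    height_m:   object height above the ground.
    distance_m: distance ahead of the user, 0..MAX_RANGE_M.
    """
    # Pan: objects to the left sound louder in the left ear, and vice versa.
    pan = max(-1.0, min(1.0, lateral_m))        # clamp to [-1, 1]
    left_gain = (1.0 - pan) / 2.0
    right_gain = (1.0 + pan) / 2.0

    # Pitch rises with height; loudness falls with distance,
    # so a tall, far object yields a soft, high-pitched tone.
    height_frac = max(0.0, min(1.0, height_m / max_height_m))
    pitch_hz = base_hz + height_frac * (max_hz - base_hz)
    loudness = max(0.0, 1.0 - distance_m / MAX_RANGE_M)

    return left_gain * loudness, right_gain * loudness, pitch_hz
```

In a real system the resulting gains and pitch would drive a synthesizer feeding the bone conduction headphones, leaving the user's ears open to ambient sound.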

Grover said his team has been working with the CNIB for feedback, and his hope is to eventually patent and sell the technology.