Microsoft’s CEO, Satya Nadella, said that it’s important to have a team of people with a diverse background to reduce bias in AI models. He was speaking at the company’s Future Decoded event in India.

Nadella said that while it’s great to have so much AI and data available, developers should treat privacy as a human right and work to remove biases while building applications:

One of the best things you can do (as a developer) is to have ethics around AI. For example, a solution to remove bias (in models) would be to have diversity (in the teams building it). If there’s one thing that’s going to be the real currency for the next 10 years, it is how devoted and inclusive your teams — the ones building all these technologies — are, in terms of gender diversity and ethnic diversity.

He also added that as a South Asian living in the US, he realized that AI models made for cardiac purposes didn’t have South Asian representation for a long time, and that means “we basically had incorrect models trying to diagnose cardiac issues.”


Last year, Microsoft’s top man said there’s a pressing need for principles that can govern AI. Earlier this year at Davos, he repeated the same message and called for the formation of global rules.

The last time the India-born CEO visited the country, he talked about the power of the Aadhaar ID, India Stack, and connecting more than a billion people through them. This time around, apart from numerous mentions of the Azure cloud, AI and inclusive technology took center stage.
