For months, there has been a steady march of controversies over how tech companies collect, manage, process, and share massive (and passive) amounts of data. Even as the executives and founders of these companies profess a renewed commitment to privacy and corporate responsibility, people are beginning to worry about surveillance and power, and to reconsider how much faith they should put in the leaders and services behind these quickly evolving technologies. The latest manifestation of these concerns came out of San Francisco, home of the tech economy, where the city banned facial recognition technology in order to “regulate the excesses of technology.”

WIRED OPINION
Daniel Dobrygowski is head of governance and policy at the World Economic Forum's Centre for Cybersecurity. William Hoffman is the World Economic Forum's project lead for data policy. Both are based in New York City.

As tech winds its way deeper and deeper into our lives, harder questions arise: How can you trust someone you’ll never see? How can you trust an algorithm that makes thousands of decisions a second, none of which you’re even aware of? How can you trust a company that tracks your movements every day? And the biggest question of all: Given that trust is foundational to the global economy, and the global economy is digital, what is a meaningful definition of “digital trust”?

To start, trust in digital products and the companies that produce them is already eroding. Edelman’s 2019 Trust Barometer shows that more than 60 percent of respondents, globally, believe tech companies have too much power and won’t prioritize our welfare over their profits. “If the lifeblood of the digital economy is data, its heart is digital trust,” notes a recent PwC report, which argues that the most consequential companies of the next generation will be the ones that prioritize security, privacy, and data ethics. The ones that don’t face a costly problem. A recent study by Accenture found that over the next five years, CEOs could reclaim more than $5 trillion in lost value with new governance approaches for safeguarding the internet. For a global company, that could mean the equivalent of 2.8 percent in revenue growth. Yet a recent report on Digital Trust and Competitiveness from Tufts University found that few business leaders are confident they have sufficient “digital trust” controls in place.

So, how do you build “digital trust” and what does it look like? At the World Economic Forum, our new report provides a framework for a more efficient and effective global dialogue on digital trust built on two main components: mechanical trust and relational trust.

Mechanical trust, especially as it relates to cybersecurity, is the heart of digital trust. It comprises the means and mechanisms that deliver predefined outputs reliably and predictably. An automobile’s braking system provides a good metaphor. Step on the brakes, and the car stops: no ambiguity, no uncertainty, the same predictable, reliable output every time. If a system is secure and performs predictably, individuals will be more willing to use it. They’ll be able to trust it.

But we need another, equally important, form of trust to support this: relational trust. Even if all the mechanical systems work, if people don’t believe that we’re all playing by the same rules, trust breaks down. That is why relational trust—the social norms and agreements that address life’s complex realities—is vital. While the brakes in a car may be highly reliable, we also need a shared agreement that a red light means to use them. Similarly, we need a shared agreement on when, where, why, and how technologies are used.

To establish these rules, we need people, processes, and tools. For emerging tech, that means creating frameworks that incorporate accountability, auditability, transparency, ethics, and equity. By building these principles into the early-stage design of digital products and services, stakeholders can have a more meaningful say in how emerging networked technologies are bound by (and in turn affect) our long-standing normative and social structures. Relational trust also ensures that the promise and value of new technologies can be more equitably delivered, fostering a virtuous cycle: trust leads to improved outcomes, which lead to greater trust.