Rather than the traditional method of calculating a company's market share, Smith said regulators should also consider how much consumer data a company possesses when determining whether it is a monopoly. That method could spell trouble for other tech giants, like Google and Facebook, that are currently facing antitrust investigations in the United States. It would likely have a lesser effect on Microsoft itself.

But Smith says tech companies can learn from such probes — and he knows from experience. He was the general counsel for and a trusted adviser to Bill Gates throughout Microsoft's own antitrust case in the late 1990s and early 2000s.

"I think you can think of Microsoft, the antitrust battles that started in the 1990s, as sort of technology's first collision with the modern world as we know it," Smith told CNN's Poppy Harlow in a recent interview for Boss Files. "Microsoft had to change. We had to do more to listen to other people, understand their concerns, acknowledge their concerns and then ultimately address them and that required a lot of change."

A new method of measuring monopolies is just one of the ways Smith said he'd like to see laws change so that big tech is better regulated in the United States.

Those changes, Smith said, are necessary in light of the massive power and influence technology now has in society, in business and in government. Smith explores that power in his new book, Tools and Weapons: The Promise and the Peril of the Digital Age, co-written with Microsoft senior director of external relations Carol Ann Browne.

"I think there are lots of technology companies that have been founded on a desire to do good for the world," Smith said. "But I also think that there is an opportunity for introspection, because it's one thing to do what you love to do and be committed to it doing good for the world. It's another thing to step back and ask the harder questions ... Are we [doing good for the world]? Are there unintended consequences?"

Privacy

Smith, like Apple CEO Tim Cook, said he thinks privacy (or the lack thereof) has reached a "crisis" point, "and it would benefit us to treat it that way," he said.

The solution, he said, should be twofold: Federal regulators should pass a national privacy law, similar to the General Data Protection Regulation (GDPR) in Europe, the likes of which Smith has been advocating for since 2005. GDPR gives consumers the right to know when their personal data is being collected online and how it will be used. As the United States awaits such a federal law, Smith said companies in the industry should begin offering customers those rights on their own.

Smith recalled a December 2013 meeting he and several other tech executives had with then-President Barack Obama, not long after Edward Snowden leaked secret documents showing the National Security Agency was spying on American citizens. As the executives worked with Obama on a government solution, the President told them the tech industry would one day soon have to do the same.


"President Obama said, 'I have a suspicion that the guns will turn (on you),'" Smith said. "He followed up that statement by explaining that in his view — which I think was also correct — the companies around the table had as much or more data about the American population, [about] people as consumers, than the government did."

One bright spot, he said, is a privacy law set to go into effect in California in 2020 that grants internet users more control over their personal data, allowing them to opt out of having their information shared or sold to third parties, for example. Smith said he thinks California's consumer privacy law will push most companies to change the way they handle user privacy even before any federal legislation passes.

Facial recognition

Another area in urgent need of regulation is facial recognition, Smith said.

For one thing, facial recognition technology can be biased since it more accurately detects men and white people's faces than others'. Smith said it could also have potentially problematic implications for companies' or governments' abilities to track people's movements, their spending habits, or their participation in political movements — the kind of tracking that leaders in China may already be using to monitor pro-democracy protesters in Hong Kong.

"If you just think about the right to assemble peacefully, which I think is at the core of democratic freedoms, it puts that at risk — the mass surveillance future that George Orwell imagined," Smith said.

Still, Microsoft is one of many companies working on developing facial recognition technology, and Smith doesn't think it should be banned altogether.

He spoke about one bill that was proposed in Microsoft's home state of Washington that would have put stipulations on the use of facial recognition software by government agencies and would have required the makers of such software to allow third-party testing of their products, as well as to establish GDPR-like privacy protections. Microsoft urged lawmakers to pass that law, rather than another Washington bill that would have put a moratorium on the use of facial recognition by local and state governments.

"I actually don't think a ban makes sense because you cannot improve a technology if you can't use it, and you can't use it if it's banned," Smith said.

Neither Washington bill passed, and Smith said he thinks companies themselves need to be selective about who they will provide their software to until regulations are in place.

"There should be regulations and restrictions on its use. Companies should be applying these voluntarily at this stage," he said. "That's what we're doing ... If it leads to the risk of bias, we've turned down deals that do that. If it is going to put peoples' fundamental human rights at risk through mass surveillance, we should be and are saying no."

This kind of cooperation between tech companies and regulators could be key to the industry's future.

"I don't think it's enough to say that this is some problem that the government alone should solve and that we, who create the technology, have no responsibility to address ourselves," Smith said.