The principle refers to Gordon Moore's 1965 prediction that engineers would pack twice as many transistors onto a computer chip at a steady pace, popularly quoted as a doubling every 18 months. That pace is slowing. The end comes into view when Google starts building its own computer chip, called the TPU, because standard chips can no longer keep pace with the rise of artificial intelligence.

Gordon Moore, co-founder of Intel, whose prediction became known as Moore’s Law: We ran out of gas sometime during the last decade. We couldn’t continue to shrink things anymore. I have been amazed that the engineers have been able to keep it moving as long as they have.

All of a sudden, the barriers we were tackling turned out to be fundamental limits. Engineers had to live with the fact that transistors are made of atoms.

That was a prediction that Stephen Hawking made during one of his visits to Silicon Valley. Somebody asked him what the fundamental limits would be. He said the velocity of light and the atomic nature of matter, both of which are fundamental.

John Hennessy, board member of Google: We’re probably about seven to 10 years into a slowdown on Moore’s Law. We’ve strayed away from Gordon’s original prediction. And we’re beginning to see the effects of this.

One complication is energy efficiency. If you think about an Intel microprocessor, it currently generates, let’s say, 135 watts. Imagine something, some tiny object that is about a centimeter and a half on a side, that’s generating more heat than a 100-watt light bulb.

You have to get the electricity in, and then you have to get the heat out. That’s become a major limitation in our ability to build chips, particularly in the mobile space, because you care about battery life. You also don’t want your phone to be so hot that it’s burning holes in your pocket.
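The heat problem Hennessy describes is one of power density, and the numbers he quotes make it concrete. A back-of-the-envelope sketch, using the 135-watt and 1.5-centimeter figures above, plus an assumed roughly spherical bulb envelope about 6 centimeters across (the bulb geometry is an assumption, not from the interview):

```python
import math

# Power density of the chip, from the figures quoted above:
# ~135 W dissipated across a die 1.5 cm on a side.
chip_watts = 135.0
chip_side_cm = 1.5
chip_area_cm2 = chip_side_cm ** 2              # 2.25 cm^2
chip_density = chip_watts / chip_area_cm2      # W per cm^2

# A 100 W incandescent bulb spreads its heat over a much larger
# surface. Assumed: a spherical envelope ~3 cm in radius.
bulb_watts = 100.0
bulb_radius_cm = 3.0
bulb_area_cm2 = 4 * math.pi * bulb_radius_cm ** 2   # ~113 cm^2
bulb_density = bulb_watts / bulb_area_cm2

print(f"chip: {chip_density:.0f} W/cm^2")   # 60 W/cm^2
print(f"bulb: {bulb_density:.1f} W/cm^2")   # ~0.9 W/cm^2
```

The chip concentrates heat roughly 60 times more densely than the bulb, which is why getting the heat out becomes the binding constraint.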

Imagine you’re only a little bit behind on Moore’s Law. What would you do? You would put more transistors on, perhaps not twice as many as in the previous generation but perhaps 1.5 times as many, and then you would try to figure out how to use those transistors to make the CPU faster.
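The difference between a 2x and a 1.5x pace compounds quickly. A minimal sketch, where the 1.5x figure is Hennessy's illustrative number and the generation counts are arbitrary:

```python
def growth(factor: float, generations: int) -> float:
    """Multiple of the starting transistor count after `generations` steps."""
    return factor ** generations

# Transistor budget after n generations: historical 2x pace vs. a 1.5x pace.
for n in (1, 3, 5):
    print(f"gen {n}: 2x pace -> {growth(2.0, n):g}x, "
          f"1.5x pace -> {growth(1.5, n):.2f}x")
# After 5 generations: 32x at the historical pace, only ~7.6x at 1.5x.
```

Falling even slightly behind the doubling cadence therefore costs a large multiple of transistors within a few generations.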

The problem is, if you put those transistors on the chip and activate them all, you’re going to burn too much power, and you won’t be able to get the heat out. We have already reached the regime where these limits constrain our ability to use even what remains of Moore’s Law.

Artificial intelligence is also speeding the end of Moore’s Law.

Mr. Hennessy: Why do computers need to be fast? The answer to that is the rise of artificial intelligence and machine learning. Your home speaker, whether it’s a Google Home or an Alexa or whatever, offers speech recognition. The best speech recognition in the world is based on machine learning.

When Google adopted its “A.I. first” strategy, probably seven or eight years ago, we quickly did a computation that said that if we were going to use conventional, general-purpose processors, two things would happen. One was that we couldn’t afford the energy; the other was that we probably couldn’t afford to buy all the processors we would need to do everything we needed to do.

That’s what motivated our thinking with the TPU. We had to think about really trying to improve efficiency dramatically. The result is that you can get about a factor-of-100 improvement in computation per watt. That’s like giving you an extra seven years on Moore’s Law.
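The "extra seven years" figure can be sanity-checked with the doubling arithmetic implied by Moore's Law: a 100x efficiency gain equals log2(100), about 6.6 doublings, so how many calendar years that buys depends on the doubling period you assume. A sketch (the candidate doubling periods are assumptions, not from the interview):

```python
import math

efficiency_gain = 100.0
doublings = math.log2(efficiency_gain)   # ~6.64 doublings

# Years "bought" under different assumed doubling periods.
for months in (12, 18, 24):
    years = doublings * months / 12
    print(f"{months}-month doubling: ~{years:.1f} years")
# A ~12-month doubling period matches the "seven years" quoted above;
# the popular 18-month pace would put it closer to ten years.
```

Under any of these assumptions, the point stands: a 100x jump in efficiency is worth the better part of a decade of conventional scaling.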