"This is a new way to see inside living human cells," said Rick Horwitz, executive director of the Allen Institute for Cell Science. "It's like seeing the whole cell for the first time. In the future, this will impact drug discovery, disease research and how we frame basic studies involving human cells."

To create the breakthrough model, scientists gene-edited a large collection of live human cells to incorporate fluorescent protein tags, which illuminate specific structures inside the cells. The team took tens of thousands of pictures of these glowing cells and used AI to create a probabilistic model that predicts the most likely shape and location of those structures in any cell, based on the shapes of the plasma membrane and the nucleus.
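The core idea, predicting where a structure most likely sits given only the cell's outline and nucleus, can be illustrated with a toy sketch. This is a hypothetical simplification, not the Allen Institute's actual model: here each cell is reduced to a nucleus centroid, and the "model" simply learns the typical offset of a tagged structure from the nucleus across many training cells.

```python
import numpy as np

# Toy sketch (hypothetical): learn where a tagged structure tends to sit
# relative to the nucleus, then predict its most probable location in a
# new, unseen cell.

rng = np.random.default_rng(0)

# Simulated training data: each cell summarized by its nucleus centroid;
# the tagged structure lies at a roughly fixed offset from the nucleus.
n_cells = 1000
nucleus_xy = rng.uniform(20, 80, size=(n_cells, 2))
true_offset = np.array([5.0, -3.0])
structure_xy = nucleus_xy + true_offset + rng.normal(0, 1.0, size=(n_cells, 2))

# "Training": estimate the mean offset and its spread across cells.
offsets = structure_xy - nucleus_xy
mean_offset = offsets.mean(axis=0)   # most probable relative location
std_offset = offsets.std(axis=0)     # cell-to-cell variability

# "Prediction": for a new cell, the most probable structure location is
# the nucleus centroid plus the learned mean offset.
new_nucleus = np.array([50.0, 50.0])
predicted_structure = new_nucleus + mean_offset
```

The real system works on full 3D images and many organelles at once, but the principle is the same: structure positions are predicted probabilistically from reference landmarks rather than observed directly.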

The team then applied a different machine-learning algorithm to those pictures, which used what it learned from cells with fluorescent labels to find cellular structures in cells without fluorescent labels. By combining all the data, the system was able to generate images that look nearly identical to those obtained by traditional fluorescence microscopy, whose labeling can be expensive and toxic to cells.
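This label-free step is an image-to-image regression: the model is trained on paired labeled and unlabeled images, then applied to unlabeled images alone. The published approach uses deep convolutional networks; the sketch below substitutes a per-pixel linear fit purely to make the idea concrete, with simulated data standing in for real micrographs.

```python
import numpy as np

# Hypothetical illustration of label-free prediction: learn a mapping from
# transmitted-light (unlabeled) pixel intensities to fluorescence
# intensities using paired training images, then apply it to a new
# unlabeled image.

rng = np.random.default_rng(1)

# Paired training data: 50 brightfield images and their fluorescence
# counterparts, related here by a linear law (unknown to the model)
# plus noise.
brightfield = rng.uniform(0, 1, size=(50, 32, 32))
fluorescence = 2.0 * brightfield + 0.5 + rng.normal(0, 0.05, size=brightfield.shape)

# Fit slope and intercept by least squares over all paired pixels.
slope, intercept = np.polyfit(brightfield.ravel(), fluorescence.ravel(), 1)

# Predict a fluorescence image for a new, unlabeled brightfield image --
# no fluorescent tag was ever applied to this "cell".
new_image = rng.uniform(0, 1, size=(32, 32))
predicted = slope * new_image + intercept
```

A deep network replaces the linear fit in practice because the true relationship between transmitted-light contrast and organelle fluorescence is highly nonlinear and spatial, but the training signal is the same: paired labeled and unlabeled views of the same cells.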

"Until now, our ability to see what is going on inside of human cells has been very limited," said Michael Elowitz, professor of biology, bioengineering and applied physics at the California Institute of Technology. "Previously, we could only see the proteins that we deliberately labeled. But the Allen Integrated Cell is like the ultimate free lunch. We get to sample a 'buffet' of many different proteins and organelles, without having to label anything at all. This opens up a totally new and much more powerful way of doing cell biology. It's a total game changer."