
Google Maps for your cells.

A new MIT framework just gave cell biology something it's needed for decades: a way to actually read the whole map.

(First published on my Substack.)


Your body contains around 37 trillion cells. We know far more about them in 2026 than we did a decade ago. But medicine still treats the cell like a spreadsheet. Problem is, it isn't.

 

“Your cells are a city at rush hour — genes, proteins, and chromatin all pushing and pulling at once.” – Mark van Rijmenam, Synthetic Minds

 

A hard cell.

 

When a biologist wants to understand what's happening inside a cell from a cancer patient, they choose a measurement. RNA. Chromatin. Proteins. Morphology. Each gives them one view of the same cell. One piece of the puzzle.

 

Stitching those pieces together has meant running each analysis separately, then eyeballing the overlaps. Machine learning has helped, but most methods still squash everything into one blob of numbers.

 

You lose the ability to see what’s shared across measurements and what’s unique to each one. Which signal came from which part of the cell? Which readout is actually telling you something new? Mostly a mystery.

 

Ok, cell it to me.

 

Researchers at the Broad Institute of MIT and Harvard, with colleagues at ETH Zurich, just published a new AI framework in Nature Computational Science that takes a different approach.

 

It’s called APOLLO, and instead of treating all the data as one soup, it learns a shared map of the cell plus smaller side‑maps for each measurement type.
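If you think better in code, here's roughly the shape of that idea. To be clear, this is a toy sketch of "shared map plus side-maps" in general, not APOLLO's actual model (that's in the paper); every name, layer, and number below is made up for illustration.

```python
import torch
import torch.nn as nn

class SharedPlusPrivate(nn.Module):
    """Toy multimodal autoencoder: one shared latent ("the map")
    plus a small private latent per modality (the "side-maps")."""

    def __init__(self, modality_dims, shared_dim=16, private_dim=4):
        super().__init__()
        self.shared_enc = nn.ModuleDict(
            {m: nn.Linear(d, shared_dim) for m, d in modality_dims.items()})
        self.private_enc = nn.ModuleDict(
            {m: nn.Linear(d, private_dim) for m, d in modality_dims.items()})
        self.dec = nn.ModuleDict(
            {m: nn.Linear(shared_dim + private_dim, d)
             for m, d in modality_dims.items()})

    def forward(self, batch):
        # Shared latent: pool every modality's view of the same cell.
        z_shared = torch.stack(
            [self.shared_enc[m](x) for m, x in batch.items()]).mean(0)
        recon = {}
        for m, x in batch.items():
            z_priv = self.private_enc[m](x)  # modality-specific signal
            recon[m] = self.dec[m](torch.cat([z_shared, z_priv], dim=-1))
        return z_shared, recon

# Made-up sizes: 2,000 genes, 100 surface proteins, 8 fake cells.
dims = {"rna": 2000, "protein": 100}
model = SharedPlusPrivate(dims)
cells = {m: torch.randn(8, d) for m, d in dims.items()}
z_shared, recon = model(cells)
loss = sum(nn.functional.mse_loss(recon[m], cells[m]) for m in dims)
```

Training something like this to reconstruct each measurement pushes what the modalities agree on into the shared latent, while the private latents soak up what's unique to each, which is the property the rest of this post leans on.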

 

Picture the old way as a long column of numbers.

 

Thousands, maybe hundreds of thousands. All the same colour and font size. Technically rich, visually dead.

 

Now imagine those numbers floating in a 3D cube. Different colours and sizes. Threads connecting matching colours across the space. Some clusters tightly tangled, others off on their own. The threads show which signals genuinely belong together and which just happened to land nearby.

 

Suddenly the data looks less like a list and more like a living map of the cell.

 

“By putting the information from all these measurement modalities together in a smarter way, we could have a fuller picture of the state of the cell.” – Xinyi Zhang, lead author

 

AI – that means ChatGPT, yeah?

 

Yeah, nah.

 

The systems are similar up to a point.

 

ChatGPT learns an incredibly detailed map of how millions of words relate to each other, so it can predict the next word in a sentence. APOLLO does the same trick with cells: it learns a detailed map of how different measurements from the same cell — RNA, chromatin, proteins, images — relate to each other, so it can rebuild the measurements we made and even predict the ones we didn't.
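Sticking with the toy sketch from earlier (my illustration, not the paper's method): once the shared map is learned, predicting a measurement you never made is just a matter of encoding what you do have and running the missing modality's decoder.

```python
# Predict a modality we never measured (illustrative only):
# encode RNA into the shared latent, then run the protein decoder
# with a neutral private latent, since we have no protein data.
with torch.no_grad():
    rna_only = torch.randn(8, dims["rna"])
    z = model.shared_enc["rna"](rna_only)
    z_priv = torch.zeros(8, 4)  # private_dim from the sketch above
    protein_guess = model.dec["protein"](torch.cat([z, z_priv], dim=-1))
```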

 

Editing DNA is now comparatively easy. Predicting how that one change ripples through the rest of the cell is not. If frameworks like APOLLO deliver, they could help drug developers separate true therapeutic signal from off‑target noise: what's actually working versus everything else cluttering the picture.

 

Cancer specialists could track how a tumour fights back across multiple fronts at once. Scientists could see how engineered changes interact with the cell's existing systems before things go sideways.

 

There’s also a brutally practical question: a cell has far more things worth measuring than any experiment can capture. So which do you choose?

 

Which can you safely predict from what you already have?

 

APOLLO starts to answer that by showing which measurements carry shared information and which add something uniquely new.
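In toy-sketch terms, that becomes something you can actually score (a hypothetical check, not the paper's metric): if protein reconstructs nearly as well from RNA's shared latent as from its own encoding, the protein readout is mostly shared information; a big gap means it adds something unique.

```python
# Rough redundancy check: how much worse does protein get when
# predicted from RNA's shared latent alone?
with torch.no_grad():
    own = model.dec["protein"](torch.cat(
        [model.shared_enc["protein"](cells["protein"]),
         model.private_enc["protein"](cells["protein"])], dim=-1))
    cross = model.dec["protein"](torch.cat(
        [model.shared_enc["rna"](cells["rna"]),
         torch.zeros(8, 4)], dim=-1))
    err_own = nn.functional.mse_loss(own, cells["protein"])
    err_cross = nn.functional.mse_loss(cross, cells["protein"])
    # Small gap: protein is mostly shared info. Big gap: uniquely new.
    print(f"redundancy gap: {(err_cross - err_own).item():.3f}")
```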

 

Cell-ebrate good times come on!

 

Well, there you go. A positive AI story.

 

We talk about AI in terms of disinformation, energy use, and job losses. As we should; those risks are real. But at the same time, systems like this are quietly giving us new ways to read the living world.

 

A Google Maps for your cells.

 

Who knew.

 

Hey, I'm now also on Substack.

 

