The Australian biotech company Cortical Labs recently posted a video in which 200,000 living human neurons grown on a silicon chip played the 1993 first-person shooter Doom. The neuron-controlled character wandered corridors, encountered enemies and fired weapons. It played clumsily and died often, but the neurons were playing nonetheless.
The demo could mark a genuine inflection point. The neurons appeared to exhibit what Cortical Labs’s chief scientific officer, Brett Kagan, calls “adaptive, real-time goal-directed learning.” The stakes extend well beyond gaming, in part because AI’s appetite for electricity has been rapidly growing. Though neurons are unlikely to replace microchips, they can perform some calculations far more efficiently, and studying them could offer new approaches to computing—and, perhaps, to testing neurological drugs.
To be clear, Cortical Labs’s neural cells aren’t extracted from brains. “You can essentially take a small bit of blood or skin,” Kagan explains, “isolate certain types of cells, turn them into stem cells and then, from those stem cells, generate an indefinite supply of neural cells.” Each of the company’s computing units can house about 800,000 neurons in a self-contained life-support system that can keep them alive for up to six months. The interface relies on electricity—“the shared language between biology and silicon,” as he puts it. When brain cells are active, they generate small electrical pulses, and the system can deliver pulses back to them.
But wiring is the easy part. The hard part is getting cells in a dish to do anything purposeful. “The temptation is to anthropomorphize and say, oh, they like [playing Doom],” Kagan says. “But this isn’t an animal or a human or anything even as complex as an insect. It’s a system. It’s kind of like saying, ‘Does a computer like or dislike the reward function on a [reinforcement-learning] model?’”
The solution to motivating neurons drew on the free energy principle, which was developed by neuroscientist Karl Friston of University College London. The principle holds that neural systems are driven to predict their environment. “If I reach for an empty can of drink and I successfully predict the outcomes of my actions, that’s sort of a world I can live in,” Kagan says. “But if I reach for it and sometimes it turns into a chicken and sometimes it turns into a firework, that world would be impossible to live in.”
To train the neurons, the team built a simple feedback loop. Wrong moves produced random, unpredictable signals—white noise. Right moves produced structured, predictable ones. “Any signal that the cells could not possibly predict is something that the cells would then just have to learn to avoid,” Kagan says, “because that would be the only way to create predictability in this environment.” In effect, chaos was punishment, and order was reward.
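The logic of that feedback loop can be illustrated with a toy simulation. Everything here is invented for illustration and is not Cortical Labs’s actual protocol: the signal shapes, the learning rule and the two-action “game” are hypothetical stand-ins for the idea that an agent rewarded with predictability and punished with noise will settle on the predictable move.

```python
import random
import statistics

# Toy sketch of predictability-as-reward, loosely inspired by the free
# energy principle described in the article. A "right" move earns a
# structured, perfectly predictable signal; a "wrong" move earns white
# noise. All specifics here are hypothetical.

random.seed(0)

TARGET = 1  # the "right move" in this two-action toy environment

def feedback(action):
    """Structured signal for the right move, white noise otherwise."""
    if action == TARGET:
        return [0.5] * 8  # constant, hence fully predictable
    return [random.uniform(-1, 1) for _ in range(8)]  # unpredictable

def surprise(signal):
    """Crude proxy for unpredictability: the signal's variance."""
    return statistics.pvariance(signal)

# Simple preference learning: chaos is punishment, order is reward.
prefs = {0: 0.0, 1: 0.0}
for _ in range(200):
    # Explore occasionally; otherwise repeat the preferred action.
    if random.random() < 0.1:
        action = random.choice([0, 1])
    else:
        action = max(prefs, key=prefs.get)
    prefs[action] -= surprise(feedback(action))

print(max(prefs, key=prefs.get))  # prints 1, the predictable move
```

The noisy action accumulates penalty while the predictable one costs nothing, so the agent quickly converges on the predictable move—the same pressure the team applied to the cells.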
In October 2022 Cortical Labs published a proof-of-concept study in the journal Neuron. Kagan and his colleagues showed that within minutes, neurons on microchips could learn to play Pong, the classic video game in which a player repeatedly intercepts a ball—think two-dimensional ping-pong. But Pong involves only a bouncing square and a moving line. Doom has corridors, enemies, three-dimensional navigation and a lot of things that are trying to kill you.
To make that leap, Cortical Labs organized a hackathon with Stanford University. Independent researcher Sean Cole paired the neurons with a standard learning algorithm. The hybrid system outperformed the algorithm running on its own—suggesting that the biological cells were contributing to the learning process.
Cortical Labs frames its ambitions around two tracks. The first is medical: “93 to 99 percent of clinical trials, depending on how you cut it, in the neuropsychiatric space fail,” Kagan says. Many of those drugs are tested in neurons in a dish, but he points out that brain cells are not meant to sit in an information void. “We’ve actually published and shown that when you have cells in a game environment or a world environment, they’re fundamentally different in how they respond to drugs, how they exhibit disease,” he says.
The second track is computational. Neurons form “the most powerful information-processing system that we’re aware of,” Kagan says. “The complexity of it far exceeds anything we’ve built with silicon.” Silicon transistors, he says, have first-order complexity—a binary state, 0’s and 1’s. “Biological neurons have at least third-order complexity, probably much higher. They can hold at least three interacting dynamic states at any one time.”
That complexity, researchers argue, could translate into major energy savings. Feng Guo, an associate professor at Indiana University Bloomington, sees Cortical Labs’s biocomputing platform as capable of “high-level computing.” In a 2023 paper in Nature Electronics, Guo and his colleagues introduced “Brainoware,” a system that uses three-dimensional brain organoids for computing. For Guo, the energy argument is decisive. The human brain uses just 20 watts—less than a dim lightbulb. “If you want to create a similar computing power for the silicon-based AI computing system, that would be at least a million times higher,” he says.
Still, Kagan is careful not to oversell the future. “A pocket calculator will outperform me at long division any day,” he says. “But your best state-of-the-art [reinforcement-learning] AI algorithm isn’t as good as going into someone else’s house and finding the way to make a cup of tea.” Biological computing is “a new tool in the intelligence toolbox,” he says.
Don’t expect a personal computer run on a brain in a vat anytime soon. Kagan speaks realistically about the research still to be done but says that “you move from science fiction to science once you can work on the problem.” A few years ago biological computing had one published game of Pong to its name. Now it has a commercial platform, an application programming interface that developers can connect to and a video of neurons stumbling through Doom—badly, but they’re learning.
