Cortical Labs to Launch Brain-Cell-Powered Data Centers

The culture hood hummed. I watched a grainy clip of human neurons guiding a cursor through DOOM and felt the floor tilt under familiar assumptions about computing. You should be uneasy and curious at the same time.

A lab bench smelled faintly of culture media before the headline arrived

I saw the video that made Cortical Labs a viral curiosity: a petri-dish brain nudging pixels around to play DOOM. That clip was more than a stunt. It exposed a radically different compute substrate—chips seeded with living human neurons called CL1 units.

Each CL1 pairs a multi-electrode array with roughly 200,000 neurons grown from blood stem cells. For context, the human brain carries about 86 billion neurons, so these are tiny, highly networked islands of biology wired to electronics.

How do neuron-based computers work?

The short answer: you stimulate living neurons and record their electrical responses as computation. Electrical inputs travel through cultured networks; electrodes read spikes and network patterns; software interprets those patterns as outputs. I like to think of the setup as a tiny orchestra where cells keep time—software conducts and the biological tissue performs.
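That stimulate-record-decode loop can be sketched in a few lines of code. To be clear, this is a toy illustration of the closed-loop pattern described above, not Cortical Labs' actual interface: every function name, electrode count, and firing rate below is an assumption made up for the sketch.

```python
import random

random.seed(0)  # reproducible toy run

# Hypothetical sketch of the stimulate -> record -> decode loop.
# None of these names come from Cortical Labs' API; they only
# illustrate the closed-loop pattern: software poses an input,
# the culture responds, software reads the response back.

def stimulate(electrodes, pattern):
    """Encode an input as pulses on a subset of electrodes."""
    return [e for e, bit in zip(electrodes, pattern) if bit]

def record_spikes(active_electrodes):
    """Stand-in for reading spike counts off the multi-electrode array."""
    # Real hardware would return measured firing rates; we fake them here.
    return {e: random.randint(0, 50) for e in active_electrodes}

def decode(spikes, threshold=25):
    """Interpret aggregate spiking as a binary output."""
    if not spikes:
        return 0
    mean_rate = sum(spikes.values()) / len(spikes)
    return 1 if mean_rate > threshold else 0

electrodes = list(range(8))           # a toy 8-electrode array
pattern = [1, 0, 1, 1, 0, 0, 1, 0]    # the "question" posed to the culture
spikes = record_spikes(stimulate(electrodes, pattern))
output = decode(spikes)               # the "answer" software reads back
```

The real system replaces the faked spike counts with measured electrical activity, and the decoding step with models trained to map spiking patterns onto task outputs, but the conducting-an-orchestra shape is the same.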

Cortical Labs’ CEO Hon Weng Chong told Bloomberg the chips run on orders-of-magnitude less power than standard AI accelerators. The company has already coaxed neurons into goal-directed behavior on-screen, and now it wants to fold that biology into infrastructure.

A suburban power grid groaned the last time a hyperscale center expanded

You’ve seen the headlines: Google, Amazon, Microsoft, OpenAI and others are pouring money into new data centers. They buy racks of GPUs by the thousands and chase every watt of efficiency.

Those projects are expensive: tech giants have committed billions of dollars to capacity and expansion in recent years, and that spending reshapes local grids and politics. Cortical Labs is offering an argument for lower-footprint compute: its CL1 stacks reportedly sip less power than a handheld calculator.

Are neuron-powered data centers more energy efficient than GPUs?

The claim is simple and intoxicating: biological chips can deliver compute with a drastically smaller energy bill. If true at scale, that could blunt a frequent community complaint—data centers pushing up local electricity rates and straining infrastructure. You should still ask for third-party benchmarks and real-world operational data; an impressive lab demo doesn’t yet equal production throughput or reliability.
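A back-of-envelope calculation shows why the claim is so intoxicating. The numbers here are loudly assumed, not benchmarks: a high-end AI accelerator draws on the order of 700 W, and "less than a handheld calculator" implies something in the milliwatt range.

```python
# Back-of-envelope energy comparison with assumed, illustrative figures.
GPU_WATTS = 700.0    # assumed: rough draw of a high-end AI accelerator
CL1_WATTS = 0.005    # assumed: calculator-class draw (~5 mW)

HOURS_PER_YEAR = 24 * 365
gpu_kwh = GPU_WATTS * HOURS_PER_YEAR / 1000   # annual kWh for one GPU
cl1_kwh = CL1_WATTS * HOURS_PER_YEAR / 1000   # annual kWh for one CL1
ratio = GPU_WATTS / CL1_WATTS                 # raw power ratio

print(f"GPU: {gpu_kwh:.0f} kWh/yr, CL1: {cl1_kwh:.4f} kWh/yr, "
      f"~{ratio:,.0f}x difference in draw")
```

Note what the arithmetic does and does not say: it compares raw electrical draw, not useful work per watt. Whether one CL1 does anything comparable to one GPU is exactly the question the third-party benchmarks would need to answer.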

A single rack at a university clinic will test whether the idea can leave the lab

At the Yong Loo Lin School of Medicine in Singapore, a prototype rack will house 20 CL1 units. Melbourne will host a facility with 120 units. DayOne, a sustainability-focused data center firm, is the local partner for Singapore; Cortical Labs signals that a commercial DayOne site could host up to 1,000 units down the road.

Can biological data centers scale to commercial use?

Scaling raises hard questions: tissue maintenance, contamination risk, regulatory oversight, and standard uptime guarantees that enterprise customers expect. Cortical Labs plans phased validation—start small, test under operational loads, then expand—while partners like DayOne and institutions such as the National University of Singapore observe and measure performance. I’d want to see how these racks behave under steady-state AI inference and what failure modes look like.

There are also governance angles. The White House recently pushed major tech firms to guarantee local energy costs after community pushback, even prompting public commitments from CEOs. A biological data center swaps in a different set of public concerns: ethics, biosafety, and the optics of “brains” inside commercial server rooms.

An engineer adjusted a cable while lawyers drafted consent forms in another room

Practicality matters. Cortical Labs says its CL1 units record neurons’ electrical responses as computing outputs; the company has demonstrated playful proofs of concept and now wants real infrastructure tests in Melbourne and Singapore. You should care because this is where science hits procurement, SLAs, and municipal zoning hearings.

Big names already dominate the AI training market—NVIDIA GPUs, Google’s TPU efforts, Microsoft and Amazon’s global datacenter footprints—and they set the bar for latency, throughput, and contractual reliability. Cortical Labs is asking those incumbents and customers one blunt question: what if parts of compute could run with a human cell’s metabolism instead of a datacenter’s transformer?

Caveats litter the path. Biotech requires lab-grade facilities, constant supply chains for media and stem cells, and protocols that are unfamiliar to traditional ops teams. Still, the possibility is seductive: a new compute class that trades raw FLOPS for a lower electrical bill and a very different set of engineering trade-offs.

The lab felt oddly irreverent the day I sat with this story—like a nightclub for neurons—and that single image keeps you from treating the idea as purely speculative. Are we ready to debate whether a city should host a bank of living processors?