The big idea
For the first time, my colleagues and I have built a single electronic device that is capable of emulating the functions of neurons in a brain. We then connected 20 of them together to perform a complicated calculation. This work shows that it is scientifically possible to make an advanced computer that does not rely on transistors to calculate and that uses much less electrical power than today’s data centers.
Our research, which I began in 2004, was motivated by two questions. Can we build a single electronic element – the equivalent of a transistor or switch – that performs most of the known functions of neurons in a brain? If so, can we use it as a building block to build useful computers?
Neurons are very finely tuned, and so are the electronic elements that emulate them. I co-authored a research paper in 2013 that laid out in principle what needed to be done. It took my colleague Suhas Kumar and others five years of careful exploration to find exactly the right material composition and structure to produce the properties the theory predicted.
Kumar then went a major step further and built a circuit of 20 of these elements connected to one another through a network of devices that can be programmed to have particular capacitances, or abilities to store electric charge. He then mapped a mathematical problem onto the capacitances in the network, which allowed the device to find the solution to a small version of a problem that is important across a wide range of modern analytics.
The simple example we used was to look at the possible mutations that have occurred in a family of viruses by comparing pieces of their genetic information.
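To make the mapping concrete, here is a minimal software sketch of the general approach, not of our actual circuit: a small optimization problem is encoded in the couplings between simple model neurons, which stand in for the programmable capacitances, and the network then settles into a low-energy state that represents an answer. The toy problem below, splitting the nodes of a ring-shaped graph into two groups so that as many links as possible run between the groups, is purely illustrative and is not the genetics problem from our experiment.

```python
import numpy as np

# Minimal sketch (not the actual hardware): encode a small optimization
# problem in the couplings between simple model "neurons," then let the
# network relax to a low-energy state that represents an answer. In the
# real device the couplings are programmable capacitances; here they are
# just a weight matrix. Toy problem: max-cut on a 6-node ring graph.

rng = np.random.default_rng(seed=1)

n = 6
adjacency = np.zeros((n, n))
for i in range(n):                        # ring: node i linked to node i+1
    j = (i + 1) % n
    adjacency[i, j] = adjacency[j, i] = 1.0

weights = -adjacency                      # map max-cut onto the couplings
state = rng.choice([-1.0, 1.0], size=n)   # random starting partition

for _ in range(300):                      # asynchronous Hopfield updates;
    i = rng.integers(n)                   # no single update raises energy
    field = weights[i] @ state
    state[i] = 1.0 if field >= 0 else -1.0

# A link is "cut" when its two endpoints end up in opposite groups.
cut = sum(adjacency[i, j] * (1 - state[i] * state[j]) / 2
          for i in range(n) for j in range(i + 1, n))
print("partition:", state.astype(int), "links cut:", int(cut))
```

The essential point carries over to the hardware: the problem lives in the couplings, and the “computation” is simply the network settling into a stable configuration.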
Why it matters
The performance of computers is rapidly reaching a limit because the smallest transistors in integrated circuits are now approaching just 20 atoms across. Any smaller, and the physical principles that determine transistor behavior no longer apply. There is a high-stakes competition to see if someone can build a much better transistor, a method for stacking transistors or some other device that can perform the tasks that currently require thousands of transistors.
This quest is important because people have become used to the exponential improvement of computing capacity and efficiency of the past 40 years, and many business models and our economy have been built on this expectation. Engineers and computer scientists have now constructed machines that collect enormous amounts of data, which is the ore from which the most valuable commodity, information, is refined. The volume of that data is almost doubling every year, which is outstripping the capability of today’s computers to analyze it.
What other research is being done in this field
The fundamental theory of neuron function was first proposed by Alan Hodgkin and Andrew Huxley about 70 years ago, and it is still in use today. It is very complex and difficult to simulate on a computer, and only recently has it been reanalyzed and cast in the mathematics of modern nonlinear dynamics theory by Leon Chua.
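To give a sense of that complexity: in its standard form, the Hodgkin-Huxley model couples the voltage V across a neuron’s membrane to three nonlinear “gating” variables, m, h and n, through four coupled differential equations:

\[
C_m \frac{dV}{dt} = I_{\text{ext}} - \bar{g}_{\text{Na}}\, m^3 h\,(V - E_{\text{Na}}) - \bar{g}_{\text{K}}\, n^4 (V - E_{\text{K}}) - g_L\,(V - E_L)
\]
\[
\frac{dx}{dt} = \alpha_x(V)\,(1 - x) - \beta_x(V)\, x, \qquad x \in \{m, h, n\}
\]

Here C_m is the membrane’s capacitance, the g terms are ion-channel conductances, the E terms are the voltages each ion pushes toward, and the rates α and β themselves depend on V. Those feedback loops are what make the system so strongly nonlinear and so expensive to simulate.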
I was inspired by this work and have spent much of the past 10 years learning the necessary math and figuring out how to build a real electronic device that works as the theory predicts.
There are numerous research teams around the world taking different approaches to building brainlike, or neuromorphic, computer chips.
What’s next
The technological challenge now is to scale up our proof-of-principle demonstration to something that can compete against today’s digital behemoths.
R. Stanley Williams was previously employed by Hewlett Packard Enterprise and presently owns stock in the company. He has received research funding from Texas A&M University. He is a member of the IEEE.

