Universe Might Be One Big Neural Network: Physicist


TEHRAN (Tasnim) – In a thought-provoking new paper, a physicist suggests the whole universe could be a single neural network—a competing “theory of everything” that could unite quantum and classical mechanics, he says.

If this is true—and that’s a really, really big if—it would mean pretty enormous things for the nature of the universe.

A neural network is built on what’s called a data structure, a shape or format for organizing data inside computer hardware. If you’ve ever made a shopping list or written down the steps to complete a task, you’ve made a data structure. If you’ve “opened a ticket” in an IT support system at your job, that ticket probably joined a data structure called a queue. You may have even programmed these structures yourself, writing code for stacks, trees, and more.
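The queue and stack mentioned above can be sketched in a few lines of Python. This is a minimal illustration, and the ticket names are made up for the example:

```python
from collections import deque

# A queue: items leave in the order they arrived (first in, first out),
# like IT tickets being handled in the order they were opened.
tickets = deque()
tickets.append("printer offline")   # first ticket opened
tickets.append("password reset")    # second ticket opened
first_handled = tickets.popleft()   # the oldest ticket is handled first

# A stack: the most recent item comes off first (last in, first out),
# like an undo history in a text editor.
undo_history = []
undo_history.append("typed a sentence")
undo_history.append("deleted a word")
last_undone = undo_history.pop()    # the most recent action is undone first
```

Here `first_handled` is the first ticket opened, while `last_undone` is the most recent edit; the same `append` operation feeds both structures, and only the removal rule differs.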

“Modeled loosely on the human brain, a neural net consists of thousands or even millions of simple processing nodes that are densely interconnected,” the Massachusetts Institute of Technology explains. And the neural net isn’t just the physical data structure—it’s an umbrella term for both the structure and the weighted, programmed approach to using the structure in artificial intelligence.

In the simplest terms, a neural net is like any other net. A fish can push one part of a net far out of shape; a knitted sweater shifts to fit your body. The weights in a neural net, like a struggling fish or a flexing arm, record what is deforming the net. The best-fitting data points peek through and are skimmed off. And the approach is thought to loosely model the human mind, hence the name neural net.
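A single node of such a net can be sketched in Python. This is a simplified illustration, not any particular library’s implementation; the weights, bias, and inputs are arbitrary example values:

```python
# One neural-net node: each input is multiplied by a weight, the results
# are summed with a bias, and the sum passes through an activation
# function. Training nudges the weights, "deforming the net" toward
# the shape of the data it sees.
def node(inputs, weights, bias):
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return max(0.0, total)  # ReLU activation: negative sums are cut to zero

# Two inputs, with weights that favor the first input over the second.
output = node([1.0, 0.5], [0.8, -0.2], bias=0.1)
```

A full network chains thousands or millions of these nodes together, which is the “densely interconnected” structure MIT describes above.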

With this idea in mind, how can it be that the whole universe is like a neural net? Take it from Vitaly Vanchurin, a physics professor at the University of Minnesota Duluth, who published his paper “The World as a Neural Network” on the arXiv pre-print server last year:

“We discuss a possibility that the entire universe on its most fundamental level is a neural network. We identify two different types of dynamical degrees of freedom: “trainable” variables and “hidden” variables. We consider the trainable variables to argue that near equilibrium their dynamics (are) well approximated by Madelung equations and further away from the equilibrium by Hamilton-Jacobi equations. This shows that the trainable variables can indeed exhibit classical and quantum behaviors.”

Basically, Vanchurin says we can use the idea of a neural net to model the universe in a way that could bring together quantum and classical mechanics. This is a key mismatch in physics, a phantom barrier between phenomena explained by Einstein’s theory of general relativity, for example, and the close-up, spooky effects of quantum mechanics.


Per Vanchurin’s theory, the same set of variables is governed by quantum behavior at one end of its range and classical behavior at the other. That means the same values are shaped by both at once, with some pivot point, or even overlap, somewhere in the middle.

“In this paper, I consider another possibility that a microscopic neural network is the fundamental structure and everything else, i.e. quantum mechanics, general relativity and macroscopic observers, emerges from it,” Vanchurin told Futurism. “So far, things look rather promising.”

The theory could still be borne out under further scrutiny, including peer review before publication in a journal. But what’s arguably most interesting about Vanchurin’s concept is what it implies for ideas: a unifying theory could still be lying just out of sight, and the structure we imagine modeling the human brain could also model the most fundamental level of the entire universe.

“Would this theory mean we’re living in a simulation?” Futurism’s Victor Tangermann asked Vanchurin. “No, we live in a neural network,” he replied. “But we might never know the difference.”