The 2024 Nobel Prize in Physics was awarded to the American scientist John Hopfield of Princeton University and the British-Canadian scientist Geoffrey Hinton of the University of Toronto “for foundational discoveries and inventions that enable machine learning with artificial neural networks.”
This is the first prize awarded for work at the intersection of computer science, physics and biology. The Nobel committee noted that it is largely thanks to the research of Hopfield and Hinton that “computers can reproduce the processes of memorization and learning, although they cannot yet think.”
The laureates became famous for inventions from the early 1980s, when the power of computers left much to be desired. The American biophysicist John Hopfield created an associative neural network that could remember and recreate images and other data sets. He relied on statistical physics models that describe the behavior of ferromagnets, substances that “remember” the magnetic field in which they were placed. The British-born Geoffrey Hinton, the “godfather” of artificial intelligence and a great-great-grandson of George Boole, the founder of mathematical logic, created a computational model called the “Boltzmann machine.” It, too, was based on statistical physics methods and could classify images and create new ones according to a given pattern.
These works by the laureates, published during the “AI winter,” when interest in the field had begun to wane, marked the beginning of a new era and set off the rapid development of machine learning. Interest in Hopfield’s and Hinton’s algorithms surged again in the 2000s; today they are used less widely in their original form, but still find niche applications, for example in quantum computing.
The Royal Swedish Academy of Sciences recognizes the physicists for developing methods that have become the basis of modern machine learning. “The work of the laureates has already had enormous benefits. In physics, we use artificial neural networks in a wide range of areas, such as the development of new materials with specific properties,” said Ellen Moons, chair of the Nobel Committee for Physics.
When people talk about artificial intelligence, they often mean machine learning with artificial neural networks. The technology was originally created by analogy with the structure of the brain: where a biological network has neurons, an artificial network has nodes that take on different values, and the connections between nodes can become stronger or weaker. Training such a network consists, for example, of strengthening the connections between nodes that simultaneously have high values.
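This rule, under which connections grow between nodes that are active together, is the classic Hebbian idea. Below is a minimal sketch of it in Python with NumPy; the names (hebbian_update, weights, activity, learning_rate) are purely illustrative and not taken from the laureates’ work or any particular library.

```python
import numpy as np

# A minimal sketch of the Hebbian rule described above: the connection between
# two nodes is strengthened when both are active at the same time. All names
# here are illustrative, not from any particular library.

def hebbian_update(weights, activity, learning_rate=0.1):
    """Strengthen the connection between every pair of simultaneously active nodes."""
    # Outer product: entry (i, j) is large when nodes i and j are both active.
    weights += learning_rate * np.outer(activity, activity)
    np.fill_diagonal(weights, 0.0)  # no self-connections
    return weights

n_nodes = 4
weights = np.zeros((n_nodes, n_nodes))
activity = np.array([1.0, 1.0, 0.0, 0.0])  # nodes 0 and 1 are active together
weights = hebbian_update(weights, activity)
print(weights)  # only the connection between nodes 0 and 1 has grown
```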
Hopfield was born in 1933 in Illinois, USA, and studied at Cornell University. In 1982, he invented an associative neural network, known as the Hopfield network. The physicist created an associative memory that can store and recreate patterns such as images, with the network’s nodes playing the role of pixels. The Hopfield network is an auto-associative memory: it can complete or correct an image, but it cannot associate the recovered image with a different one. The network draws on the physics of magnetic materials, more precisely on atomic spin, the property that makes each atom a tiny magnet. The network as a whole is described by a quantity equivalent to the energy of a spin system. It learns by finding values for the connections between nodes such that the stored images have low energy. If the Hopfield network is fed an incomplete or distorted image, it methodically goes through the nodes and updates their values, causing the energy of the network to drop. In this way the network works step by step toward the stored image that is most similar to the one it was given.
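To make the mechanics above concrete, here is a minimal sketch of a Hopfield network in Python with NumPy. It assumes binary pixels coded as -1/+1, uses the standard Hebbian storage rule, and recalls a pattern through asynchronous updates that can only lower the network’s energy; all function and variable names are illustrative.

```python
import numpy as np

def train(patterns):
    """Hebbian storage: connections reflect how often two pixels agree across the stored patterns."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w

def energy(w, state):
    """The spin-system-style energy the description refers to."""
    return -0.5 * state @ w @ state

def recall(w, state, sweeps=5):
    """Asynchronous updates: each node flips toward the sign of its input, lowering the energy."""
    state = state.copy()
    for _ in range(sweeps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if w[i] @ state >= 0 else -1
    return state

# Store one tiny 9-"pixel" pattern, then recover it from a corrupted copy.
pattern = np.array([1, 1, 1, -1, -1, -1, 1, 1, 1])
w = train(pattern[None, :])
noisy = pattern.copy()
noisy[:3] = -noisy[:3]                        # corrupt three pixels
restored = recall(w, noisy)
print(energy(w, noisy), energy(w, restored))  # the energy drops during recall
print(np.array_equal(restored, pattern))      # True: the stored image is retrieved
```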
Hinton was born in 1947 in a suburb of London. He is a great-great-grandson of the English mathematician and logician George Boole. Hinton graduated from the University of Cambridge in 1970 and received his doctorate from the University of Edinburgh in 1978. He developed a method that can autonomously find characteristic properties in data and perform tasks such as identifying specific elements in images. Building on the Hopfield network, Hinton, together with Terry Sejnowski, invented a new network. It was called the Boltzmann machine, after the Austrian physicist Ludwig Boltzmann, one of the founders of statistical physics, the branch of physics that studies systems built from many similar components. Hinton used tools from statistical physics in his work on the new network. Like the Hopfield network, the Boltzmann machine is a network of neurons with a specific notion of “energy”. It turned out to be the first neural network capable of learning internal representations and of solving complex combinatorial problems. It can learn to recognize characteristic elements in a given type of data, and it can be used to classify images or to create new examples of the kind of pattern it was trained on. The machine is trained by feeding it examples; training adjusts the connections so that those example patterns become the ones the machine is most likely to produce when it runs.
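As a rough illustration of that learning rule, the sketch below implements a heavily simplified Boltzmann machine in Python with NumPy. It makes two simplifying assumptions that depart from Hinton and Sejnowski’s full model: all units are visible (there are no hidden units) and states are coded as -1/+1. The weight update compares how often pairs of units are active together in the training examples with how often they are active together in patterns the machine generates for itself by Gibbs sampling; all names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_sample(w, b, state, steps=10):
    """Sample states with probability proportional to exp(-energy)."""
    state = state.copy()
    for _ in range(steps):
        for i in range(len(state)):
            field = w[i] @ state + b[i]
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * field))  # P(unit i = +1 | all others)
            state[i] = 1 if rng.random() < p_plus else -1
    return state

def train(data, n_epochs=100, lr=0.05, n_model_samples=30):
    n = data.shape[1]
    w = np.zeros((n, n))
    b = np.zeros(n)
    # "Positive" phase: correlations observed in the training examples.
    data_corr = data.T @ data / len(data)
    data_mean = data.mean(axis=0)
    for _ in range(n_epochs):
        # "Negative" phase: correlations in patterns the machine generates itself.
        samples = np.array([gibbs_sample(w, b, rng.choice([-1, 1], size=n))
                            for _ in range(n_model_samples)])
        model_corr = samples.T @ samples / len(samples)
        model_mean = samples.mean(axis=0)
        # Nudge the connections until the machine's own statistics match the data.
        w += lr * (data_corr - model_corr)
        np.fill_diagonal(w, 0.0)
        b += lr * (data_mean - model_mean)
    return w, b

# Train on two example patterns, then let the machine generate a new one.
data = np.array([[1, 1, -1, -1], [-1, -1, 1, 1]])
w, b = train(data)
print(gibbs_sample(w, b, rng.choice([-1, 1], size=4)))  # usually resembles a training pattern
```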
Last year, the Nobel Prize in Physics was awarded to Pierre Agostini, Ferenc Krausz and Anne L’Huillier “for their experimental methods of generating attosecond pulses of light to study the dynamics of electrons in matter.” In 2022, the awards were given to French scientist Alain Aspect, American physicist John Clauser and Austrian scientist Anton Zeilinger for their research in quantum mechanics – for “experiments with entangled photons, the study of violations of Bell’s inequalities and work in quantum information science.”
The prize money of 11 million Swedish kronor (around one million euros) will be divided equally between the laureates. The awards will be presented in Stockholm on December 10, the anniversary of Alfred Nobel’s death.
The 2024 Nobel Prize in Physics laureate Geoffrey Hinton has expressed concerns about his own discoveries and inventions that made machine learning possible using artificial neural networks.
At a press conference held Tuesday to announce the Nobel laureates, Hinton, who spoke by phone, warned of “possible negative consequences, in particular, the neural network getting out of control.”
He noted that if he had to make the discovery again, he would “do exactly the same thing.” “But I am concerned that the consequences may be unfavorable, and that control will be exercised by systems that are smarter than us,” the scientist added, speaking “with some regret” about the research conducted.
“We don’t have that experience: what’s it like when something is smarter than us?” Hinton answered a journalist’s question about the possible consequences of his scientific work. At the same time, the scientist noted that the research “can bring a lot of benefits, for example, in the field of health care.”
“This year’s two Nobel laureates in physics used tools from physics to develop methods that became the basis of today’s powerful machine learning,” the Nobel committee said in a press release.