I'm surprised Terry Sejnowski isn't included, considering it seems to be for Hopfield nets and Boltzmann machines, and Terry played a large role in the latter.
I know it's one day late but I personally don't agree with this change.
Using the main entry webpage makes it much easier for people to find relevant info, including but not limited to: various articles about this specific award, other info about this year's Nobel, and other info about the Nobel in general.
This is exactly where the Internet (or WWW) is better than traditional print media. Using a PDF as the link (which can itself easily be found from the entry page) defeats that.
He was probably considered, since he is mentioned in the reasoning paper; still, it could be one of those unfortunate omissions in Nobel history, since those deciding the prize might have a hard time measuring impact.
Impact is hard to quantify. There have been several occasions where someone who very well deserved a Nobel prize didn't get one. There are all kinds of reasons. Given he is mentioned in the reasoning he was probably considered. We can't know the reason he did not get the prize.
A funny one is that there was so much objection to General Relativity that they compromised by giving Einstein a Nobel for his work on the photoelectric effect (early quantum theory) instead.
His acceptance speech was about general relativity.
That would probably leave the prizes awarded in a very narrow field, also the prize is supposed to be given to the thing that has "conferred the greatest benefit to humankind".
So in this case they picked something that might be viewed as only having a tangential connection to the field, but the impact has been so immense that they probably went outside their regular comfort zone (and how many prizes can we give for LHC work that really don't touch regular human lives in the foreseeable future anyhow?).
Is this a widely accepted version of neural network history? I recognize Rosenblatt, Perceptron, etc., but I have never heard that Hopfield nets or Boltzmann machines were given any major weight in the history.
The descriptions I have read were all mathematical, focusing on the computational graph with the magical backpropagation (which frankly is just memoizing intermediate computations). The descriptions also went out of their way to discourage terms like "synapses" and rather use "units".
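On the "memoizing intermediate computations" point, here's a minimal two-layer sketch of what that means in practice (plain numpy, made-up shapes): the forward pass caches its intermediates so the backward pass can reuse them instead of recomputing anything.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(x, W1, W2):
    # Cache ("memoize") the intermediate values computed on the way forward.
    z1 = W1 @ x
    a1 = sigmoid(z1)
    z2 = W2 @ a1
    return z2, (x, a1)

def backward(dL_dz2, W2, cache):
    # Reuse the cached intermediates; nothing is recomputed.
    x, a1 = cache
    dL_dW2 = np.outer(dL_dz2, a1)
    dL_da1 = W2.T @ dL_dz2
    dL_dz1 = dL_da1 * a1 * (1.0 - a1)   # sigmoid'(z1) expressed via the cached a1
    dL_dW1 = np.outer(dL_dz1, x)
    return dL_dW1, dL_dW2

rng = np.random.default_rng(0)
x, W1, W2 = rng.normal(size=4), rng.normal(size=(3, 4)), rng.normal(size=(2, 3))
out, cache = forward(x, W1, W2)
dW1, dW2 = backward(np.ones(2), W2, cache)  # pretend the upstream gradient is all ones
```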
Restricted Boltzmann Machines were a huge revolution in the field, warranting a publication in Science in 2006. If you want to know what the field looks like back then, here it is: https://www.cs.toronto.edu/~hinton/absps/science.pdf
I remember in 2012 for my MS thesis on Deep Neural Networks spending several pages on Boltzmann Machines and the physics-inspired theories of Geoffrey Hinton.
My undergraduate degree was in physics.
So, yes, I think this is an absolutely stunning award. The connections to statistical entropy (inspired by thermodynamics) and, of course, to the biophysics of human neural networks should not be lost here.
Anyways, congratulations to Geoffrey Hinton. And also, since physics is the language of physical systems, why not expand the definition of the field to include the "physics of intelligence"?
Yeah I agree with the 2006 Hinton paper. I read it and reread it and didn't get it. I didn't have the math background at the time and it inspired me to get it. And here I am almost 20 years later working on it.
> I have never heard that Hopfield nets or Boltzmann machines were given any major weight in the history.
This is mostly because people don't realize what these are at more abstract levels (it's okay, ironically ML people frequently don't abstract). But Hopfield networks and Boltzmann machines have been pretty influential to the history of ML. I think you can draw a pretty good connection from Hopfield to LSTM to transformers. You can also think of a typical artificial neural network (easiest if you look at linear layers) as a special case of a Boltzmann machine (compare Linear Layers/Feed Forward Networks to Restricted Boltzmann Machines and I think it'll click).
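To illustrate the "special case" point (a rough sketch, not a formal claim): the conditional p(h=1|v) in a restricted Boltzmann machine has exactly the same functional form as a sigmoid feed-forward layer; the difference is in how the result is then used.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# A linear/feed-forward layer followed by a sigmoid nonlinearity...
def feedforward_layer(v, W, b):
    return sigmoid(W @ v + b)

# ...computes the same quantity as an RBM's conditional p(h_j = 1 | v).
def rbm_hidden_given_visible(v, W, b):
    return sigmoid(W @ v + b)

# The RBM, being an undirected energy-based model, can also run "backwards":
def rbm_visible_given_hidden(h, W, c):
    return sigmoid(W.T @ h + c)

# In the feed-forward net the numbers are deterministic activations passed to
# the next layer; in the RBM they are Bernoulli probabilities you sample from
# (e.g. during contrastive-divergence training).
```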
Either way, these had a lot of influence on the early work, which does permeate into the modern stuff. There's this belief that all the old stuff is useless and I just think that's wrong. There's a lot of hand engineered stuff that we don't need anymore, but a lot of the theory and underlying principles are still important.
Boltzmann machines were there in the very early days of deep learning. They were a clever hack to train deep nets layer-wise and work with limited resources.
Each layer was trained similarly to the encoder part of an autoencoder. This way the layer-wise transformations were not random, but roughly kept some of the original data's properties. Up to here, training was done without the use of labelled data. After this training stage was done, you had a very nice initialization for your network and could train it end to end according to your task and target labels.
If I recall correctly, the neural layers' output was probabilistic. Because of that you couldn't simply use backpropagation to learn the weights. Maybe this is the connection to John Hopfield's work. But here my memory is a bit fuzzy.
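Roughly, the recipe looked like the sketch below (the helper names are hypothetical, just to show the structure): each layer is trained unsupervised on the outputs of the layers beneath it, and only afterwards is the whole stack fine-tuned on labels.

```python
def greedy_layerwise_pretrain(data, layer_sizes, train_unsupervised_layer):
    """Sketch of greedy layer-wise pretraining: each layer (an RBM or a
    one-layer autoencoder) is trained without labels on the representation
    produced by the layers below it."""
    layers, representation = [], data
    for size in layer_sizes:
        layer = train_unsupervised_layer(representation, size)  # hypothetical helper
        layers.append(layer)
        representation = layer.transform(representation)        # feed outputs upward
    return layers

# The pretrained weights are then used to initialize an ordinary feed-forward
# network, which is fine-tuned end to end with backprop on the labelled task.
```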
Boltzmann machines were there in the 1980s, and they were created on the basis of Hopfield nets (augmenting with statistical physics techniques, among other reasons to better navigate the energy landscape without getting stuck in local optima so much).
From the people dissing the award here, it seems like even a particularly benign internet community like HN has little notion of ML with ANNs before Silicon Valley bought in for big money circa 2012. And media reporting from then on hasn't exactly helped.
ANNs go back a good deal further still (as the updated post does point out), but the works cited for this award really are foundational for the modern form in a lot of ways.
As for DL and backpropagation: Maybe things could have been otherwise, but in the reality we actually got, optimizing deep networks with backpropagation alone never got off the ground on its own. Around 2006 Hinton started getting it to work by building up layer-wise with optimizing Restricted Boltzmann Machines (the lateral connections within a layer are eliminated from the full Boltzmann Machine), resulting in what was termed a Deep Belief Net, which basically did its job already but could then be fine-tuned with backprop for performance, once it had been initialized with the stack of RBMs.
An alternative approach with layer-wise autoencoders (also a technique essentially created by Hinton) soon followed.
Once these approaches had shown that deep ANNs could work, though, the analysis pretty soon showed that the random weight initializations used back then (especially when combined with the historically popular sigmoid activation function) resulted in very poor scaling of the gradients for deep nets, which all but eliminated the flow of feedback. It might have optimized eventually, but after a far longer wait than was feasible on the computers of the time. Once the problem was understood, people made tweaks to the weight initialization, the activation function, and the optimization more generally, and then in many cases it did work to go directly to optimizing with supervised backprop. I'm sure those tweaks are usually taken for granted to the point of being forgotten today, when one's favourite highly-optimized dedicated Deep Learning library will silently apply the basic ones without so much as being asked, but take away the normalizations and the Glorot or whatever initialization and it could easily mean a trip back to rough times getting your train-from-scratch deep ANN to start showing results.
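A quick numerical illustration of that scaling point (toy numbers, not any particular paper's setup): with naive unit-variance weights and sigmoids, the per-layer derivative factor that backprop multiplies through collapses, while Glorot-style scaling keeps it in a usable range.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
n, depth = 256, 10
x = rng.normal(size=n)

for name, std in [("naive, std=1", 1.0), ("Glorot-style, std=sqrt(1/n)", (1.0 / n) ** 0.5)]:
    a, grad_factor = x, 1.0
    for _ in range(depth):
        W = rng.normal(scale=std, size=(n, n))
        a = sigmoid(W @ a)
        deriv = np.mean(a * (1.0 - a))   # sigmoid'(z), the factor backprop multiplies in per layer
        grad_factor *= deriv
    print(f"{name}: last-layer sigmoid' ~ {deriv:.3f}, product over {depth} layers ~ {grad_factor:.1e}")
```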
I didn't expect this award, but I think it's great to see Hinton recognized again, and precisely because almost all modern coverage is too lazy to track down earlier history than the 2010s, not least Hopfield's foundational contribution, I think it is all the more important that the Nobel foundation did.
So going back to the original question above: there are so many bad, confused versions of neural network history going around that whether or not this one is widely accepted isn't a good measure of quality. For what it's worth, to me it seems a good deal more complete and veridical than most encountered today.
The deep learning variety of neural networks consists of heavily simplified, mostly linear versions of biological neurons. They don't resemble anything between your ears. Real-life neurons are generally modeled by differential equations (in layman's terms, they have many levels of feedback loops tied to time), not the simplified functions used as dense-layer activations.
Ish, take a look at the curves of the spiking neural network function, they are very different from the deep learning nets. When we "model" biological neural nets in code, we are essentially coming up with a mathematical transfer function that can replicate the chemical gradient changes and electrical impulses of a real neuron. Imagine playing a 3D computer game like Minecraft, the physics is not perfect but they are "close enough" to the real world.
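For contrast with a static activation function, here's about the simplest differential-equation neuron model there is, a leaky integrate-and-fire unit (the constants below are illustrative, not physiological):

```python
import numpy as np

def leaky_integrate_and_fire(input_current, dt=1e-4, tau=0.02,
                             v_rest=-0.065, v_thresh=-0.050, v_reset=-0.065, R=1.0):
    """Integrate dv/dt = (-(v - v_rest) + R*I) / tau and emit a spike
    whenever the membrane potential crosses threshold."""
    v, trace, spike_times = v_rest, [], []
    for step, I in enumerate(input_current):
        v += dt * (-(v - v_rest) + R * I) / tau
        if v >= v_thresh:              # threshold crossing -> spike, then reset
            spike_times.append(step * dt)
            v = v_reset
        trace.append(v)
    return np.array(trace), spike_times

trace, spikes = leaky_integrate_and_fire(np.full(2000, 0.02))
print(f"{len(spikes)} spikes in {2000 * 1e-4:.1f} s of simulated input")
```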
Would the multi-head attention (Wv) not be equivalent to the chemical gradient changes?
(There are multiple value matrices in multi-head attention, one for each attention head, which is what I imagine would be the equivalent of different gradients.
This allows each attention head to learn different representations and focus on different aspects of the input sequence.)
And would the output produced after applying W0 (the output projection) to the concatenated heads then be equivalent to the different electrical outputs, such as spikes, passed on to the next neuron equivalent or attention head?
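For reference, here's roughly what those pieces compute mechanically (a minimal single-sequence sketch in numpy; shapes and weights are illustrative): each head gets its own slice of Q/K/V, attends over the sequence independently, and W0 mixes the concatenated head outputs.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads):
    """X: (seq_len, d_model). Each head attends over the sequence using its
    own slice of Q, K, V; Wo projects the concatenated head outputs."""
    seq_len, d_model = X.shape
    d_head = d_model // n_heads
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    heads = []
    for h in range(n_heads):
        s = slice(h * d_head, (h + 1) * d_head)
        scores = Q[:, s] @ K[:, s].T / np.sqrt(d_head)   # (seq_len, seq_len) attention scores
        heads.append(softmax(scores) @ V[:, s])          # this head's output
    return np.concatenate(heads, axis=-1) @ Wo           # concat, then output projection

rng = np.random.default_rng(0)
d_model, seq_len, n_heads = 16, 5, 4
X = rng.normal(size=(seq_len, d_model))
Wq, Wk, Wv, Wo = (0.1 * rng.normal(size=(d_model, d_model)) for _ in range(4))
print(multi_head_attention(X, Wq, Wk, Wv, Wo, n_heads).shape)  # (5, 16)
```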
Of the people who are still alive, Hopfield and Hinton make sense.
Hopfield networks led to Boltzmann machines. Deep learning started with showing that deep neural networks were viable in Hinton's 2006 Science paper, where he showed that by pre-training with a Restricted Boltzmann machine (essentially a stacked self-supervised auto-encoder) as a form of weight initialization, it was possible to effectively train neural networks with more than 2 layers. Prior to that finding, people found it was very hard to get backprop to work with more than 2 layers due to the activation functions people were using and problematic weight initialization procedures.
So long story short, while neither of them are in widespread use today, they led to demonstrating that neural networks were a viable technology and provided the FIRST strategy for successfully training deep neural networks. A few years later, people figured out ways to do this without the self-supervised pre-training phase by using activation functions with better gradient flow properties (ReLUs), better weight initialization procedures, and training on large datasets using GPUs. So without the proof of concept enabled by Restricted Boltzmann Machines, deep learning may not have become a thing, since prior to that almost all of the AI community (which was quite small) was opposed to neural networks except for a handful of evangelists (Geoff Hinton, Yoshua Bengio, Yann LeCun, Terry Sejnowski, Gary Cottrell, and a handful of other folks).
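On the "better gradient flow" point, a toy comparison (not a training run): the sigmoid's derivative never exceeds 0.25 and vanishes away from zero, so stacking many layers shrinks gradients geometrically, whereas ReLU passes gradients through unchanged wherever the unit is active.

```python
import numpy as np

z = np.array([-6.0, -3.0, 0.0, 3.0, 6.0])
sig = 1.0 / (1.0 + np.exp(-z))
d_sigmoid = sig * (1.0 - sig)        # <= 0.25 everywhere, tiny for large |z|
d_relu = (z > 0).astype(float)       # exactly 1 wherever the unit is active

for zi, ds, dr in zip(z, d_sigmoid, d_relu):
    print(f"z = {zi:+.0f}   sigmoid' = {ds:.4f}   relu' = {dr:.0f}")

print("best case after 10 sigmoid layers:", 0.25 ** 10)   # already ~1e-6
```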
Whether or not these fields are meaningfully distinct is a matter of taste, despite the fashion being to imagine a plurality of interconnected but autonomous domains.
"As an Academy member I could publish such a paper without any review (this is no longer true, a sad commentary on aspects of science publishing and the promotion of originality)."
National Academy members still get to pick the reviewers (if they choose to go that route rather than regular submission), and the review is not blind. The reviews themselves are not public, but the identities of the reviewers are made public once the paper is out. So members can't just say whatever sh*t they want (and you can imagine some do), but it's still a highly unusual process.
Too late now to edit my original comment, but I should have added that I was talking very specifically about the Proceedings of the National Academy of Sciences (as was Hopfield).
Everyone's first thought when they read something is whatever the social norms say you're supposed to think (peer review = good, publishing without peer review = not science somehow?), but shouldn't you stop and wonder why the esteemed scientist wrote that line instead of just dismissing it? Otherwise you are only chiming in to enforce a norm that everyone already knows about, which is pointless.
One of the really refreshing things about reading older research is how there used to be all these papers which are just stray thoughts that this or that scientist had, sometimes just a few paragraphs of response to some other paper, or a random mathematical observation that might mean nothing. It feels very healthy. Of course there were far fewer scientists then; if this were allowed today it might be just too crowded to be useful; back then everyone mostly knew about everyone else and it was more based on reputation. But dang, it must have been nice to have such an unrestricted flow of ideas.
Today the notion of a paper is that it is at least ostensibly "correct" and able to be used as a source of truth: cited in other papers, maybe referred to in policy or legal settings, etc. But it seems like this wasn't always the case, at least in physics and math which are the fields I've spent a lot of time on. From reading old papers you get the impression that they really used to be more about just sharing ideas, and that people wouldn't publish a bad paper because it would be embarrassing to do so, rather than because it was double- and triple-checked by reviewers.
We still have lots of stray thoughts, responses and observations, now they just happen on blog posts, on social media and in other non-peer-reviewed venues. The Internet has driven the cost of publishing to 0, and peer review is the only thing left that makes academic publishing qualitatively different. If anything, publishing your thoughts online is better than publishing a traditional paper in every single way except for peer review.
Well, publishing online also has a reach problem. The nice thing about journals is that they consolidate all the material on a subject. Arxiv does this for some fields (and I guess similar aggregators in other fields) but really it is nice to have the thoughts still be _curated_, like a magazine, without necessarily being to a citeable/publishable standard.
>shouldn't you stop and wonder why the esteemed scientist wrote that line instead of just dismissing it?
I go to forums like Hacker News and Reddit and regularly see software engineers who are outraged about having to have their code reviewed and even more outraged about actually having to implement feedback from their reviewers rather than receiving a rubber stamp.
I go to work and see the effects on product, team, and world of what would happen if those coders were allowed to bypass supervision.
So no, even someone who is intelligent and good at what they do should have peer review.
You are talking about something different than I am. I'm not saying you should have un-peer-reviewed major research. But there are other types of communication that are useful but do not need to be rigorously vetted. (not that peer review is all that good at the vetting anyway)
I'm not a scientist and don't know how these things work, but wouldn't it be possible for scientists to simply publish their papers online without peer review if that is what they want?
The only way for work to have an impact is if it gets exposure. Publishing in journals got you an audience, but that audience is gatekept by peer review, which has its problems.
So sure, you could publish but the chance of having an impact was low. Thankfully that's changed a bit with arxiv.
Nothing stops them, some people do do that. Two examples that come to mind are Aella's research on fetishes[1] and Scott Alexander's research on birth order effects[2]. But you don't get academic credibility by publishing online without peer review, and it's much harder to get university funding.
Very true and it's wonderful. But only a thing in some fields, as I understand it. In the past that was the role that a lot of papers played but the conflation of publications and citations with career advancement messes that all up.
I think this is the Royal Academy of Sciences way to admit that Physics as a research subject has ground to a halt. String theory suffocated theoretical high energy physics for nearly half a century with nothing to show for it, and a lot of other areas of fundamental physics are kind of done.
I think this is (very) inaccurate. It feels more like them trying to jump on a "hot topic" bandwagon (machine learning/AI hype is huge).
Physics as a discipline hasn't really stalled at all. Fundamental physics arguably has, because no one really has any idea how to get close to making experimental tests that would distinguish the competing ideas. But even in fundamental physics there are cool developments like the stuff from Jonathan Oppenheim and collaborators in the last couple of years.
That said "physics" != "fundamental physics" and physics of composite systems ranging from correlated electron systems, and condensed matter through to galaxies and cosmology is very far from dead.
I don't know exactly what they hope to gain by jumping on that bandwagon though; neither the physicists nor the computer scientists are going to value this at all. And dare I say, the general populace associated with the two fields isn't going to either - case in point, this hn post.
If there weren't any Nobel-worthy nominations for physics, maybe skip it? (Although that hasn't happened since 1972 in any field.)
I kinda doubt it. The kind of people who end up nominating people for Nobels or even making the decisions on these aren't really struggling for grant funding.
But the system they have succeeded in optimises for people who can sell themselves well enough to get that funding. These people live and breathe selling themselves for funding. Every buzzword, sexy plot, and dynamic presentation has got them here and it's not like they plan to stop.
There's no need to skip it, there's probably a big backlog from previous shortlists :)
But yeah, they could have passed. That would have been cool.
Also, there's a ton of extremely amazing shit in astronomy, or even photolithography, or simulations of physics (though that's basically what the chemistry prize was this year).
I just briefly looked into what Jonathan Oppenheim is working on, and I’d say he’s part of the problem. More speculative work that might or might not be testable in a distant future.
It used to be that there was some experimental result or other phenomenon that required explanation, which led to a theoretical model that could be tested. That worked very well.
Now there’s some theoretical considerations that leads to a theoretical model that can’t be tested. It didn’t work for Aristotle and it doesn’t work for string theorists (and similar).
Why doesn't this experimental result count as requiring explanation?
We know (for example) silver atoms have mass, and that massive objects exert gravity (which we understand as warping of space-time according to GR).
We know that we can put silver atoms in quantum superpositions of being in different positions (for example in a sequential Stern-Gerlach type experiment).
We have (essentially) absolutely no theoretical understanding of what is going on to space-time when a thing with mass is in such a superposition. Quantum mechanics does not successfully model gravity, and general relativity contains no superpositions, so the situation is completely beyond our theoretical understanding. This isn't a theoretical consideration, this is something real that you can do in an undergrad physics lab experiment pretty easily.
Now the problem is that the models we have developed so far to deal with this situation turned out to be (wildly) too difficult for us to test. I think it is very far from clear that the Oppenheim & co model falls into this category - imo its completely reasonable for them to be spending theoretical effort working out what is needed to test their model.
Because it's not an experimental result. There are two disparate experimental results, one about superpositions and one about gravity. There's no experimental result about gravity being or not being in superposition. What will happen to gravity (if there is any) in a double-slit experiment is pure theoretical speculation.
And I readily admit that it would be interesting to know what would happen. But many decades of more or less convoluted hypotheses have proved unfruitful. We need a new way to do fundamental physics, or if possible go back to the old way, because the current one clearly doesn't work.
It really has not, though. There is more to physics than high-energy and cosmology, and there is no shortage of deserving contributions of smaller scope. It really is bizarre that deep learning would make it to the top of the list.
Could you give me some examples of areas of fundamental physics that are vital and have done some significant discoveries lately? I genuinely would like to know, because I really can't think of any.
I'm probably not the right person to ask, but off the top of my head: superconductivity of high-pressure hydrides; various quantum stuff like quantum computing, quantum cryptography, quantum photonics, quantum thermodynamics; topological phases; rare decays (double beta, etc.); new discoveries in cosmic rays, etc.
My point was that physics is a big and active field, stagnation at the smallest and largest scales notwithstanding. Note also that the Nobel committee is not in any way limited to "newsworthy" stuff and has in many cases awarded prizes decades after the fact.
"Vital" is completely subjective but I'd throw stuff around quantum information into the ring. Maybe you'd consider the loop-hole free Bell tests performed in 2015 and awarded the 2022 Nobel prize to count?
I think the prize in 2022 was a nice prize, but it could still be considered just tidying up the corners. In the end it just proved that things really work the way most of us have thought they worked for decades.
My sense is that we might have reached the limits of what we can do in high-energy or fundamental physics without accessing energy levels or other extreme states that we currently can't access as they are beyond our capacity to generate.
From what I've read (not a professional physicist) string theory is not testable unless we can either examine a black hole or create particle accelerators the size of the Moon's orbit (at least). Many other proposed theories are similar.
There is some speculation that the hypothetical planet nine -- a 1-5 Earth mass planet predicted in the far outer solar system on the basis of the orbits of comets and Kuiper Belt / TNO objects -- could be a primordial black hole captured by the solar system. A black hole of that mass would be about the size of a marble to a golf ball, but would have 1-5g gravity at the distance of Earth's radius.
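For anyone curious, those figures are easy to sanity-check (rounded constants; the 5-Earth-mass case comes out a bit bigger than a golf ball, but the order of magnitude holds):

```python
G, c = 6.674e-11, 2.998e8          # SI units
M_earth, R_earth = 5.972e24, 6.371e6

for n in (1, 5):
    M = n * M_earth
    r_s = 2 * G * M / c**2         # Schwarzschild radius of the black hole
    g = G * M / R_earth**2         # gravitational acceleration one Earth radius away
    print(f"{n} Earth mass: diameter ~{2 * r_s * 100:.1f} cm, "
          f"pull at Earth-radius distance ~{g / 9.81:.1f} g")
```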
If such an object did exist it would be within space probe range, which would mean we could examine a black hole. That might get us un-stuck.
If we can't do something like that, maybe we should instead focus on other areas of physics that we can access and that have immense practical applications: superconductivity, condensed matter physics, plasmas / fusion, etc.
> My sense is that we might have reached the limits of what we can do in high-energy or fundamental physics without accessing energy levels or other extreme states that we currently can't access
How can we know, as past decades theoretical high-energy physics has studied made-up mathematical universes that don't tell much about our real universe. We haven't really given it that much of a try, yet.
Regarding the primordial black hole:
"Konstantin Batygin commented on this, saying while it is possible for Planet Nine to be a primordial black hole, there is currently not enough evidence to make this idea more plausible than any other alternative."
Regarding planet 9 in general:
"Further skepticism about the Planet Nine hypothesis arose in 2020, based on results from the Outer Solar System Origins Survey and the Dark Energy Survey, with the OSSOS documenting over 800 trans-Neptunian objects and the DES discovering 316 new ones.[94] Both surveys adjusted for observational bias and concluded that of the objects observed there was no evidence for clustering.[95] The authors go further to explain that practically all objects' orbits can be explained by physical phenomena rather than a ninth planet as proposed by Brown and Batygin.[96] An author of one of the studies, Samantha Lawler, said the hypothesis of Planet Nine proposed by Brown and Batygin "does not hold up to detailed observations" pointing out the much larger sample size of 800 objects compared to the much smaller 14 and that conclusive studies based on said objects were "premature". She went further to explain the phenomenon of these extreme orbits could be due to gravitational occultation from Neptune when it migrated outwards earlier in the Solar System's history.[97]"
> Physics as a research subject has ground to a halt
Max Planck was told by his professor to not go into Physics because "almost everything is already discovered". Planck said he didn't want to discover anything, just learn the fundamentals.
First, I didn't say that I thought everything already was discovered, but that the fundamental physics community doesn’t discover new things. That is due to how physics research is practiced today and has nothing to do with how much that is left to discover.
Second, even if it obviously wasn't true when Planck was told that almost everything is discovered, it doesn't say anything about the state today.
Upon reading the Hopfield paper back in 1982, I concluded that it's not worth it to pursue a physics career, and more efficient to put the effort into AI research, as at some point the AI will solve all the remaining science problems in a couple of milliseconds. I might have erred by a few decades, but overall seems like we are on track.
This does indeed smell of desperation. Which is really, really sad. Advances in _real_ physics are central to the absolutely needed sustainability transition. In a sane society that values its self-preservation you would not need to grasp at second-order straws to justify the need for all sorts of both fundamental and applied physics research.
We need to think seriously whether our collective hallucinations (pun) have got us to some sort of tipping point, undermining our very ability to act according to our best long-term interests.
ps. not to imply anything negative about the worthiness of the awardees in general
Hopfield made substantial contributions (Nobel-contention work) in multiple fields, which is truly astonishing: Kinetic proofreading (biochemistry/biophysics), HopNets (ML), long distance electron transfer (physics), and much more.
My perspective as a PhD in theoretical physics, who's been doing deep learning in the last 4 years:
1. The prize itself makes zero sense as a prize in _physics_. Even the official announcement by the Nobel Prize Committee, taken at a face value, reads as a huge stretch in trying to link neural networks to physics. When one starts asking questions about the real impact on physics and whether the most important works of Hinton and Hopfield were really informed by physics (which is a dubious link to the Nobel prize anyway), the argument stops holding water at all.
2. Some of the comments mention that giving prize for works in AI may make sense, because physics is currently stalled. This is wrong for several reasons:
2.1. While one can argue that string theory (which is, anyway, only a part of high-energy theoretical physics) is having its "AI winter" moment, there are many other areas of physics which develop really fast and bring exciting results.
2.2. The Nobel Prize is often awarded with quite some delay, so there are many very impactful works from 80s which haven't been awarded with a Nobel prize (atomic force microscopy is a nice example).
2.3. It is wrong to look at the recent results in some sub-field and say "okay, there was nothing of value in this field". For example, even if one completely discards string theory as bogus, there were many important results in theoretical physics such as creation of conformal field theory, which was never recognized with a Nobel Prize (which is OK if Nobel Prize is given to other important physical works, but is quite strange in the light of today's announcement).
To finish on a lighter mood, I'll quote a joke from my friend, who stayed in physics: "Apparently the committee has looked at all the physicists who left academia and decided that anything they do is fair game. We should maybe expect they will give a prize for crypto or high-frequency trading some time later".
Even if it's not completely true, maybe some introspection is required?
I understand developing new theories is important and rewarding, but most physics for the last three decades seems to fall within two buckets. (1) Smash particles and analyze the data. (2) Mathematical models that are not falsifiable.
We can be pretty sure that the next 'new physics' discovery that gives us better chips, rocket propulsion, etc etc is going to get a nobel prize pretty quickly similar to mRNA.
Those two buckets only contain the work in physics that have a sustained presence in popular media. But take gravitational wave astronomy as a counterexample. It doesn't make it into the news much, but I'm pretty sure the entire field is less than ten years old.
So then you could argue that that's what the folks who are smashing particles and building new models are trying to get to.
Smashing particles helps test existing theories and hypotheses. We do it with particle accelerators because that's one of the ways of getting to the uncharted territory, which is where you need to be if you want to push the boundaries. And maybe remember that the sexy stuff that makes it into the news isn't the whole of the thing. The LHC is also, for example, doing practical climate science: https://en.wikipedia.org/wiki/CLOUD_experiment
And building new mathematical models is part of figuring out how to make sense of observations that don't quite fit the current models. That is a messy process, and I think that our retrospective perspective on what that process is like might be colored by survivor bias. We remember Einstein and his theory of special relativity. We mostly don't remember the preceding few decades' worth of other attempts by other physicists to resolve conflicts between existing non-unified theories (in this case Newton's and Maxwell's models) or making sense of things like the Michelson-Morley experiment. I don't really know that history myself, but I would not at all be surprised if many of those efforts were also having trouble figuring out how to produce testable hypotheses.
And also, big picture, I think that it's important for any lover of science to remember to celebrate the entire enterprise, not just its headline successes. Expecting consistent results is tacitly expecting scientists to have some way of knowing ahead of time which avenues of inquiry will be most fruitful. If we had access to an oracle that could tell us that, we wouldn't actually need science anymore.
> They already got a Nobel prize, one of the quickest in history.
In some respects, it was late. Gravitational waves were predicted decades ago. It's almost unfair to predict something but then have engineering take decades to catch up to be able to prove/disprove the theory. This is just commentary on the notion of being right decades before the world is ready for it. Of course, it can go the other way, where one is assumed to be right but then isn't (e.g., many components of string theory).
Just because a field happens to be economically viable at the moment doesn't mean that field needs less introspection. Exactly what research contributions are the people who throw hundreds of millions of dollars worth of GPUs at the next random "research" problem at the top of the queue at Microsoft or Google making to deserve a Nobel?
Too often there is near zero intuition for why research in AI yields such incredible results. They're mainly black boxes that happen to work extremely well with no explanation and someone at a prestigious institution just happened to be there to stamp their name on top of the publication.
Big difference between research in AI and any non-computational/statistical/luck-based science.
That's an interesting definition of "most physics". I mean, I find high-energy physics as fascinating as the next guy but there are other fields, too, you know, like astrophysics & cosmology, condensed-matter physics, (quantum) optics, environmental physics, biophysics, medical physics, …
The corners of physics I have some contact with (climate/weather modelling and astrophysics) seem pretty dynamic to me. Each generation of CMIP models seems to be significantly better than the previous.
A Hopfield network is a lot more like physics than biology, but agreed that the conformal field theorists should have been recognized before Hopfield and/or Hinton. Jim Simons would have been deserving too, IMHO, far more for his work at RenTec than for Chern-Simons theory
The Hopfield paper was published in the Biophysics section of Proc. Natl. Acad. Sci., and was followed by a flood of spin-glass papers in Phys. Rev. A and similar journals. So there is some connection to physics.
By the same people who guaranteed they'd see certain things at the existing energy levels but now all of the sudden need higher energy levels after they didn't find what they were looking for.
Was just listening to a live radio interview with Hinton; they found him in a small hotel room somewhere in California, quite flabbergasted at the news. The interviewer was all happy for him etc., but when delving more into what it was for, he started to go off on AI concerns etc. and the interview didn't last much longer.
In some sense it makes sense to award AI researchers on behalf of the physics community, because I know many physics PhDs who owe their jobs to AI; they work as data scientists now.
Jokes aside, physics is a bit stuck because it’s hard to do experiments at the boundaries of what we know, as far as I understand. So then it makes sense I guess to award people who made useful tools for physics.
> it’s hard to do experiments at the boundaries of what we know
This applies primarily to fundamental physics. There are many areas of applied physics (materials, fusion, biophysics, atmospheric physics, etc.) where the main constraint is understanding complex systems. These areas are quite crucial for society.
"Highly sought-after fundamental particles, such as the Higgs boson, only exist for a fraction of a second after being created in high-energy collisions (e.g. ~10-22 s for the Higgs boson). Their presence needs to be inferred from tracking information and energy deposits in large electronic detectors. Often the anticipated detector signature is very rare and could be mimicked by more common background processes. To identify particle decays and increase the efficiency of analyses, ANNs were trained to pick out specific patterns in the large volumes of detector data being generated at a high rate." (emphasis mine)
It concerns me reading stuff like this (one can find similar for the original LIGO detection of gravitational waves) without accompanying qualification. B/c I want to hear them justify why it shouldn't sound like 'we created something that was trained to beg the question ad nauseam'. Obvs on a social trust basis I have every reason to believe these seminal discoveries are precisely as reported. But I'd just like to see what the stats look like - even if I'm probably incapable of really understanding them - that are able to guarantee the validity of an observation when the observation is by definition new, and therefore has never been detected before, and therefore cannot have produced an a priori test set (outside of simulation) baseline to compare against.
Did Hinton win for the restricted Boltzmann machine? I believe Paul Smolensky has some priority with the Harmonium, but Hinton certainly deserves it. But worth reading Smolensky’s paper, it is a classic!! https://stanford.edu/~jlmcc/papers/PDP/Volume%201/Chap6_PDP8...
It's a chapter from here: Rumelhart, D. E., McClelland, J. L., & the PDP research group. (1986). Parallel distributed processing: Explorations in the microstructure of cognition. Volume I. Cambridge, MA: MIT Press.
I find the prize a bit odd this time since it focused on Hopfield networks and Boltzmann machines. Picking those two architectures in particular seems a bit arbitrary. Besides, Parisi got the prize last year (edit: actually 2021, time flies) for spin glasses. Hopfield networks are quite related. They could have included Hopfield & Hinton too, and it would have looked more coherent.
It is also concerning that lately the Nobel Committee seems to be ignoring fundamental broad theoretical contributions. In this case, backpropagation, where Seppo Linnainmaa could have been one of the awardees. It is a bit sad he and others who have already passed away get little credit for something so fundamental.
The Nobel in Physics only goes to experimentally confirmed discoveries; Peter Higgs didn't get his (deserved since the '70s) until the LHC directly observed the particle.
I agree that Hopfield networks and Boltzmann machines are surprisingly arbitrary choices. It is like they wanted to give a prize to someone for neural networks, but had to pick people from inside their own field to represent the development, which limited the range of options. There is also the aspect of the physics community wanting to give somebody that they liked a Nobel, and then trying to fit them in. (The prize isn't handed out by a shadowy committee of Swedes; there's an involved and highly bureaucratic process for nomination that requires your colleagues to take up your case.)
I've never heard that it had to be tied to experimental discoveries. For example, Feynman got the prize for Feynman diagrams, path integrals and QED calculations. None of that directly tied to experimental work.
It has definitely been awarded for both theoretical and experimental contributions throughout its history. Many theoretical physicists have received the prize for their conceptual breakthroughs, even without direct experimental verification at the time.
That was relatively early and it had clearly changed later when they started giving folks like Feynman the prize for something that wasn't at all experiment related.
But, as you said, Higgs got his prize once theories were tested. Hence, theoretical contributors (still alive as per prize rules) could have been included here as well.
Nor did Georges Lemaître, nor other theorists, and in that case the experimental physicists who (accidentally!) discovered the evidence did win Nobels.
The photoelectric effect was known before Einstein, so the award was for theoretical achievements. (But only after Millikan had done precision measurements on it)
Think of this as a Nobel prize for systems physics – essentially "creative application of statistical mechanics" – and it makes a lot more sense why you'd pick these two.
(I am a mineral physicist who now works in machine learning, and I absolutely think of the entire field as applied statistical mechanics; is that correct? Yes and no: it's a valid metaphor.)
Lots of ML is heavily influenced by fundamental research done by Physicists (eg. Boltzmann Machines), Linguists (eg. Optimality Theory / Paul Smolensky, Phylogenetic Trees/Stuart Russell+Tandy Warnow), Computational Biologists (eg. Phylogenetic Trees/Stuart Russell+Tandy Warnow), Electrical Engineers (eg. Claude Shannon), etc.
ML (and CS in general) is very interdisciplinary, and it annoys me that a lot of SWEs think they know more than other fields.
Parisi won in 2021, not last year. His work was more about establishing spin glasses as a way to study complex systems. Hopfield definitely built on that, showing how those ideas could be applied to neural networks and info storage in state-space machines.
As for focusing on Hopfield networks and Boltzmann machines, I get where you're coming from. They’re just a couple of architectures among many, but they’re pretty foundational. They’re deeply rooted in statistical mechanics and have had a huge impact, finding applications across a range of fields beyond just machine learning.
Maybe I should know better, but is there no Nobel category for computer science or mathematics? This isn't physics; this is absolutely embarrassing. Maybe all those bitter elder physicists who didn't get a prize can feel a little justified in their derision of the institution.
Computer science has the Turing award and mathematics the Fields medal. Neither is exactly equivalent to the Nobel but they're similar levels of prestige.
The Nobel prize fields and criteria are a bit random, they're essentially just whatever Alfred Nobel wrote in his will.
Within their respective fields, not in general. What makes the Nobel so unique and desirable is that everybody knows what it is and is impressed by it. Mentioning that you've won a Nobel prize will impress people and open doors in virtually any circumstance. Saying you have a Turing award will mostly lead to blank stares from anybody outside the field.
Perhaps not a bad idea in that specific instance, but they're so embarrassed today about getting bought for the economics one that doing something similar again has become effectively out of the question (and on a balance I think that's for the best).
At least CS is a real discipline in the actual spirit of the original prizes, whereas (as the Nobel family has pointed out) the economics prize probably has Alfred Nobel rolling in his grave.
This is like that sidebar you have with someone after you have joined a series of meetings and listened intently and have all these nagging suspicions and you are in denial about people who you think could not possibly be talking this much nonsense because they clearly should know more than you. A few seconds into the sidebar the person tells you that everyone is full of shit, and you find relief in that you were actually understanding everything you were hearing and yes, you were drawing the right conclusions.
The response of the physicists they say should get a slap is, in programming terms, basically shut up and show me the code. It's a fairly one sided debate that we're blessed with seeing in literally every thread anywhere about it
It still kind of baffles me that there isn't a Nobel Prize in Mathematics and/or computer science.
The latter makes a bit more sense, computer science wasn't really a thing when Alfred Nobel was around, but mathematics certainly was! It seems like it would be perfectly reasonable to add a category for math, and I think Neural Nets would fit in there considerably better.
The Swedish central bank later added the prize in economics (although it's controversial). So other things can be amended. However, I suspect we won't see any more amendments to the Nobel prize.
A different institution can step up and make a prize on its own, though I'm not sure what institution would have that amount of prestige without weaponizing it for commercial purposes.
I'd prefer a prize for mathematics that includes a bit of computer science over a prize for computer science. I don't think there is much room for a "society-changing innovation" within CS that isn't either an engineering feat (Linux, Docker, FFmpeg) or an algorithm that could fit under a mathematics prize (FFT, Navier-Stokes).
If a new prize is added, it would need its own funding. The five original Nobel Prizes are funded by interest from the fortune that Alfred Nobel left behind.
The economics prize is funded by the Swedish central bank, and is therefore officially their prize "in memory of Alfred Nobel", not a "Nobel Prize" as such.
> I don't think there is much room for a "society-changing innovation" within CS that isn't either an engineering feat (Linux, Docker, FFmpeg) or an algorithm that could fit under a mathematics prize (FFT, Navier-Stokes).
I'm not sure I agree with that. There's plenty of theoretical computer science that isn't really "engineering" and would fall into a pretty different category than stuff like FFT or Navier Stokes.
If you look at something like Concurrency Theory, for example, and work with stuff like Pi Calculus or CSP or Petri Nets, those aren't "engineering feats", but also kind of fall into a different category than the rest of math, or at least pretty different than Navier Stokes. I think you could make a pretty strong argument that CSP has been a pretty big innovation in regards the academic state of the art while not simply being engineering.
Let me add RSA, Elliptic curves, Runge–Kutta, Finite element analysis, and Hamming codes to the list.
I would still consider CSP, Petri Nets, and Pi-Calculus mathematical enough to be wrapped under a mathematics prize if they're influential enough. The first true computer scientists were mathematicians, and I still feel that much of the theoretical work in the field is closer to "mathematics useful for computers" than a separate field.
In the spirit of the Nobel prize, "mathematics with the greatest humanitarian impact" leaves plenty of room for the inclusion of influential pieces from theoretical computer science, especially as those prizes within mathematics that do exist already include loads of mathematics that requires computers to prove or solve.
> I would still consider CSP, Petri Nets, and Pi-Calculus mathematical enough to be wrapped under a mathematics prize if they're influential enough.
I guess, but they certainly feel categorically different than something like Runge-Kutta. They're more about the study of algorithms, which is generally where I've drawn the line of "computer science vs math".
The Nobel prize has the best branding and name recognition pop-culture-wise. I'm sure winning all these others means nearly as much to those communities.
The computation of the cosmic microwave background fluctuations hasn't received a nobel prize yet. It's had a deep impact on how we understand the Universe.
Some people still alive who made important contributions to this are Rees and Sunyaev.
If you define "benefit to humankind" narrowly, and don't view gaining pure knowledge about the workings of the universe as beneficial to humanity, then most physics Nobel Prizes over the last few decades fail the test.
Detecting gravitational radiation from the merger of two black holes was an incredible step forward for our understanding of the universe. It will not practically change your life in any way.
Exoplanet science is not physics, it's chemistry or planetary science. By your logic, teams who send probes to the outer solar system planets could also be given prizes.
What's "exoplanet science"? The above are applications of knowledge of physics to astrophysics, as far as I understand it. Certainly they sound more relevant to physics than neural networks.
I would argue that the first measurements of exoplanets' existence is definitely physics. This was a leap in our understanding of the makeup of the universe.
Just watched the nobel prize live stream, surprised by the topic, looks very engineering to me rather than physics, do algorithms make physics subsidiary?
The original wording explicitly mentions "discovery or invention within the field of physics", so that wording has been reflected in some of the prizes over the years.
Also, it's a prize based on a will from over 100 years ago, managed by a private institution. The institution can be as arbitrary as it wants. They don't need to answer to anyone.
Well, in 1912 the physics prize went to a guy who designed a better light for lighthouses and buoys to prevent shipwrecks! Actually, most early Nobels went to practical things.
I am really surprised. I would have guessed that a Nobel Prize would be awarded to advancements in the field itself. Not for inspirations from it or to tools that led to advancements. Although as I write this I'm sure there have been several prizes awarded to scientists / engineers who have developed tools to advance physics. Like radio astronomy? Still surprised though.
Some other recent cases of the prize being given to an engineering contribution:
- 2018 was for chirped pulse amplification, which is most commonly used in medicine (LASIK surgery for example)
- 2014 was basically for LED lights
- 2010 was for a method for producing graphene
- 2009 was for both charge-coupled device, which is a component for digital imaging (including regular consumer digital cameras), and fibre-optic cables
Well yeah, so are neural nets. I just meant that these are engineering accomplishments, not scientific per se. Of course experimental science will often take advantage of cutting edge technology, including from computer science.
NNs have absolutely revolutionized systems biology (itself a John Hopfield joint, and the AlphaFold team are reasonably likely to get a Nobel for medicine and physiology, possibly as soon as 'this year') and are becoming relevant in all kinds of weird parts of solid-state physics (trained functionals for DFT, eg https://www.nature.com/articles/s41598-020-64619-8).
The idea that academic disciplines are in any way isolated from each other is nonsense. Machine learning is computer science; it's also information theory; that means it's thermodynamics, which means it's physics. (Or, rather, it can be understood properly through all of these lenses).
John Hopfield himself has written about this; he views his work as physics because _it is performed from the viewpoint of a physicist_. Disciplines are subjective, not objective, phenomena.
My personal theory is that Demis and John will win the Chemistry prize for AlphaFold this year and that they decided to also award this one to help bolster the idea that ML is making fundamental improvements in academic science.
I would prefer if there was an actual Nobel Prize for Mathematics (not sure if the Fields would become that, or a new prize created).
I did have a similar thought, but awarding a prize to AF this year would be a very bold move, given how things went today. Right folks, we've demoralized all the physicists today, tomorrow we can do the same for the chemists!
To be perfectly honest, I'm not really sure the physics community being demoralized by this prize award is a really negative thing. I think many aspects of HEP and other areas have stagnated, requiring exponentially more money for marginal gains (I don't have a problem with LIGO or other large projects, but "we found another high energy particle consistent with the standard models" not so much). Perhaps this prize will give the community a shot in the arm to move away from "safe but boring".
I reason about it from the perspective that it's because they've found a way to make a machine out of bits that no one's ever made before, and those are physical objects.
It’s time to consider adding computer science as a category for the Nobel Prize. They have already been awarded prizes for economic sciences and peace, so why not computer science? It's not the same as physics, and its impact on modern life is undeniable
Is that even "possible"? As in, they have to follow the rules of the organization, which have some criteria laid out. Not sure who could stop them from changing, though. Like, I think the original intent was to have done the most the preceding year, but now it's more of a lifetime award. So perhaps they can change or add different categories if wanted.
That's the only reason I can think of for this - like, Alfred Nobel or some other dead donor set rules that they aren't allowed to change now. But I haven't seen anyone confirm it.
In 1968, the Nobel Foundation accepted a donation from the Swedish central bank to establish a prize in economy, but in hindsight that was a pretty bad idea, and the probability of them accepting future donations to establish prizes in other fields is very slim.
People often say that the Turing Award is the Nobel Prize of computing, but that's not really true. The Turing Award is the most prestigious award in computing, yes, but that's not enough for Nobel-like recognition/pedigree.
What makes the Nobel prize unique is that almost anyone, even the general public or pioneers in other fields, can hear you received one and be very impressed. You'll generally be met with blank stares if you tell anyone not in computing (or an enthusiast) that you'd won a Turing. Even if you then said, "It's the most prestigious award in computing!", it wouldn't hit the same.
Awards like these are basically only really worth their social recognition, so it's no surprise people would still want a Nobel in Computing/Mathematics etc even with Turing/Field etc existing.
I don't know which is the bigger travesty: that they awarded the prize to work on non-physical systems to jump on a bandwagon, or that there was nothing else in actual physics obvious enough to the board to give this to.
If I was the awardee I'd consider declining just out of respect to the field.
The linked document connects their work to physics as follows:
"The Hopfeld network utilises physics
that describes a material’s characteristics due to its
atomic spin – a property that makes each atom a tiny
magnet. The network as a whole is described in a
manner equivalent to the energy in the spin system
found in physics, and is trained by fnding values for
the connections between the nodes so that the saved
images have low energy"
"Hinton used tools from statistical physics, the science
of systems built from many similar components."
100% agreed, as I can't think of any one individual since (1) who has done as much for all of science and engineering as he ultimately did; alas, they are not awarded posthumously.
(1) Newton would be a strong contender on a "for all time" basis, but even he would've probably needed to share it with Leibniz, which would have driven him absolutely ~b o n k e r s~, like wet hornet in a hot outhouse mad, LOL.
More out of genuine curiosity than as a gotcha: a lot of comments are saying this was the wrong choice. I'd find it really interesting to hear who the nomination should have gone to instead, in your opinions.
Congrats to the laureates! Maybe a Computing prize should be created, though, just as Nobel did not create the "Nobel prize of economics". Though you could argue that Computing is Math? What are computer scientists usually awarded?
I don't think Geoff Hinton is in the running for a Fields medal[1] any more, unless they did what they did for Andrew Wiles and give him a "quantized" Fields medal.
Hinton was never in the running for a Fields medal since he never made a single contribution to the field of mathematics. His work is about empirical discoveries in CS.
Isn't the standard AI doom position that the government taking control just kills everyone a few years later? The money would be for research, not welfare or whatever the government would use it for.
These people obviously did amazing work, but shouldn't a physics prize go to... physicists? It's not as if there aren't physicists doing active work. While the foundation can do whatever it wants, I do feel awards and prize categories should be honest.
Like if there weren't many good movies they wouldn't start giving the Oscar for Best Picture to a videogame.
Most commenters here don't know that Boltzmann machines and associative memories existed in condensed matter physics long before they were used in cognitive science or AI.
The Sherrington–Kirkpatrick model of a spin glass is a Hopfield network with random couplings.
A Boltzmann machine is the Sherrington–Kirkpatrick model with an external field.
This is a prize in physics given for a novel use of stochastic spin-glass modelling. Unexpected, but saying this is not physics is not correct.
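To make the relationship concrete, the energy functions line up almost term for term (standard textbook forms; signs and normalization conventions vary between sources):

```latex
\begin{align}
% Hopfield network: binary units s_i = \pm 1, Hebbian couplings from stored patterns \xi^\mu
E_{\mathrm{Hopfield}}(s) &= -\tfrac{1}{2}\sum_{i \neq j} J_{ij}\, s_i s_j,
  & J_{ij} &= \tfrac{1}{N}\sum_{\mu} \xi_i^{\mu} \xi_j^{\mu} \\
% Sherrington--Kirkpatrick spin glass: identical form, but the couplings are drawn at random
E_{\mathrm{SK}}(s) &= -\tfrac{1}{2}\sum_{i \neq j} J_{ij}\, s_i s_j,
  & J_{ij} &\sim \mathcal{N}\!\left(0,\, J^2/N\right) \\
% Boltzmann machine: add external fields (biases) and sample from p(s) \propto e^{-E(s)/T}
E_{\mathrm{BM}}(s) &= -\tfrac{1}{2}\sum_{i \neq j} w_{ij}\, s_i s_j \;-\; \sum_i b_i s_i
\end{align}
```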
I'm a condensed matter/statistical physicist and am very aware of the connections to statistical physics, but I still think that the committee has completely lost it with this choice. There is a sharp line for me between things that are inspired by physics and things that are physics (and I really don't buy that physics is anything physicists do) -- and this clearly falls on the "inspired by" side.
I know plenty of physicists that would be very pissed off by all this drama.
There is so much more than fundamental physics, and there is much more to physics than breakthrough discoveries. Medical physics, just to name a fun field everybody always forgets about, has been studying and using neural networks for about forty years. Applied physics, biophysics, atmospheric physics. Even particle physics is mostly data science these days.
This idea that physics should only be about fundamental theories and discoveries is really detrimental to the field and leads to the false idea of stagnation that permeates this whole thread.
However, it is weird for the committee to give a prize for theoretical physics without an experiment. It is doubly weird when they already made this "mistake" in 2021 with Parisi, who was the odd one out among the geophysicists, and are giving another prize in spin glass/stat phys... why?
In summary, it's definitely related to physics, but kind of weird choice.
Why did David Sherrington and Scott Kirkpatrick not share the prize for the Sherrington–Kirkpatrick model? Hopfield is referencing their work, isn't he?
Multiple theoretical physicists working with black holes (Hawking and others) didn't get a Nobel, because black holes were not confirmed or the theory could not be tested.
> but they have made no contribution to understanding physical laws or phenomena.
Neural networks are used in tons of data pipelines for physics experiments, most notably with particle accelerators.
The Nobel Prize is also occasionally awarded to engineers who develop tools that are important parts of experiments. 2018 for example was awarded for chirped pulse amplification, which is probably best known for being used in LASIK eye surgery, but it is also used in experimental pipelines.
> Neural networks are used in tons of data pipelines for physics experiments
With this argument you could even say Bill Gates should get an award for inventing Windows and popularizing the desktop computer... Or at least Linus Torvalds, since those pipelines are probably running Linux...
Please explain how Hopfield network influenced modern deep learning models based on supervised differentiable training. All the "impactful" architectures such as MLP, CNN, Attention, come from a completely different paradigm, a paradigm that could be more straightforwardly connected to optimization theory.
They did not bring it into existence. The MLP is older than the Hopfield network. The invention that made it practical was back propagation, which wasn't used here at all.
It already has: bottom-quark tagging has improved O(10)x in efficiency in the last decade without any new "physics" understanding, just from training with more low-level data and better ML architectures (now using Transformers).
But we haven't found new physics with or without ML, making this prize a little weird.
Maybe it's a prize for hope and change that physics will be revolutionized by neural networks? Similar to how Obama got a Nobel Peace Prize in order to repudiate Bush's legacy in Iraq and Afghanistan. While Bush's legacy absolutely deserved to be repudiated, I don't think awarding a new president the Peace Prize was the best way to do it, especially because in the foreign policy realm, he ended up not so different from Bush.
"With their breakthroughs, that stand on the foundations of physical science, they have showed a completely new way for us to use computers to aid and to guide us to tackle many of the challenges our society face. Simply put, thanks to their work Humanity now has a new item in its toolbox, which we can choose to use for good purposes. Machine learning based on ANNs is currently revolutionizing science, engineering and daily life. The field is already on its way to enable breakthroughs toward building a sustainable society, e.g. by helping to identify new functional materials. How deep learning by ANNs will be used in the future depends on how we humans choose to use these incredibly potent tools, already present in many aspects of our lives"
Whether or not the original Higgs discovery decay channels used ML, confirming that it was in fact the Higgs required measuring the decay to b-quarks, which has used ML since the LHC started taking data.
Over the lifetime of the LHC, the backgrounds got around 10x smaller for the same "efficiency" (fraction of true b-quarks tagged), if you want to be pedantic about the definitions. We've used NNs in b-tagging for decades now, so it was always possible to dial in a threshold for tagging that was e.g. 70% efficient.
Transformers gave us a factor of a few smaller backgrounds in the last few years though [1].
I sort of agree in principle, but in practice they've always taken a broad view.
Kissinger was one of the most prominent disrupters of world peace in the postwar era, but that didn't stop him winning the peace prize. Churchill won the literature prize for defeating Hitler. The blue LED guys a few years back didn't do much except make a thing that would go on every single consumer gadget and disrupt my sleep, but they won the physics prize.
Even when they get it right they often get it wrong. For example I believe Einstein supposedly won for "especially his work on the photoelectric effect" rather than relativity.
Einstein’s work on the photoelectric effect was incredibly important, and incredibly influential on other research at the time. He proposed that light was quantised - essentially the foundation of quantum mechanics.
It’s no exaggeration that Einstein’s work on the photoelectric effect was as important as special or general relativity, and it had the advantage of strong experimental verification by 1921.
The main reason that prize is remarkable is that Einstein himself hated quantum mechanics - but that doesn’t dispute the work’s importance.
The discovery of the photoelectric effect was certainly as important as relativity in terms of how much it affects society. But it was only an incremental advance on top of Planck's work on blackbody radiation.
I'm not saying that photoelectric effect didn't deserve a Nobel Prize. But relativity completely supplanted Newtonian Physics, and Einstein played a much greater role in this revolution than he did in that of Quantum Mechanics.
Also, I believe historical records have made it clear that relativity, even if it was still considered controversial in the '20s (and so not mentioned specifically), was indeed part of the reason he was awarded the prize.
Also, consider WHY it was still controversial, despite evidence piling up even for relativity. It was seen as such a fundamental shift away from common-sense understanding of the physical world that people refused to believe it, despite evidence.
Just like how many people to this day do not believe it's possible to build AI out of regular computers, as their intuition tells them that some magic voodoo needs to be there for "true" intelligence.
I would add to this that it had the advantage of something like 40 years of history as a field that was the basis for some of the biggest advances in instrumentation of that era.
Also, the prize is about the greatest benefit to humankind according to Alfred Nobel, not the most impressive research. Arguably, the photoelectric effect fits that notion better than GR or any other of Einstein's research.
Besides that, Einstein received the prize in 1921, whereas the Eddington experiment in 1919 generally counts as the first experimental verification of GR.
> Arguably, the photoelectric effect fits that notion better than GR or any other of Einstein's research
Today we could argue about it due to the importance of solar panels, but that was hard to forecast in 1921. Also, without GR there would be no GPS so it's not like it doesn't bring benefits to humanity.
Einstein laid the foundation of quantum mechanics with his description of the photoelectric effect, so you could add transistors, lasers, LEDs, CCD sensors and more to the list. Although I agree that it's doubtful that most of this could have been foreseen then.
Surely they would have just noticed a discrepancy in timing and added a few circles-upon-circles to effectively fix it up? Is deeply grokking relativity necessary for GPS to work?
On the other hand, it would be impossible to make those adjustments without someone coming up with GR :-)
More to the point, photoemission spectroscopy has been a workhorse tool for understanding the electronic properties of materials for quite a long time now (though perhaps not yet in 1921).
Nobel prizes are generally awarded for verifiable observations, but also require real-world applications.
Einstein won the physics prize for the photoelectric effect because it had real-world applications and was observable; if GPS had actually existed while he was alive (yes, I know this is a stretch) he would have gotten it for relativity.
Blue LEDs allow you to access more of the color spectrum for LEDs in general, and they were not easy to make.
For this year it does feel like a very large leaning into practical applications instead of physics though. Did we run out of interesting physics in the last year?
The Nobel peace prize was a mistake. Peace is not a science, and you can't objectively measure how much anyone has helped peace, especially not before a few decades has passed.
So I agree that the peace prize committee has made some bad choices, but they do have an impossible job.
I'm sure they are but they drive me nuts. If I ever become filthy rich and in doing so sell my soul and become a bad person, one of my priorities will doubtless be to have the blue led inventors hunted down remorselessly.[1]
[1] Note to future law-enforcement: I am honestly kidding. I wouldn't hurt a fly, officer.
A black sharpie over the offending LED indicators will fix that. Now you can enjoy your sleep uninterrupted by dreams of manhunts and Mephistophelian bargains.
So if someone invents, say, a new keyboard layout which improves the median data input rate by 10% and it gets used by astrophysicists, will it be worthy of the astrophysics prize? Or better yet - in your example the main driver for the ML is Nvidia, so should we award Jensen a prize in astrophysics? Or in any other field where ML is deployed? In my opinion we should separate the efforts of people making tools from the efforts of people doing research using said tools.
People have won it for new microscope designs and techniques... possibly telescopes, too... but I'm less familiar with that and not somewhere where it's convenient to look it up.
In 1986 and 2014, the science (electron optics, nanoscopy/nanolasers) came first, then the microscopy. Even 2017's prize was for 3D microscopy. What Nobel-worthy physics does this do?
One of the very early successful applications of ML was using neural network and other models in particle identification systems in particle physics experiments.
This reminded me of my Hopfield networks implementation in Go [1]. The algorithm is rather simple but fascinating nevertheless, and works surprisingly well for reconstructing noisy images. I actually blogged about it as well [2]. But as many here are discussing, deep memory networks based on Boltzmann machines are more powerful, yet they don't seem to have found much of a use case either.
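For anyone who wants to see how little code this takes, here is a minimal NumPy sketch of the same idea (Hebbian storage plus asynchronous recall). It is an illustrative rewrite with my own naming, not the commenter's Go code:

    import numpy as np

    def train(patterns):
        """Hebbian storage: W is the average outer product of the patterns, with zero diagonal."""
        n = patterns.shape[1]
        W = patterns.T @ patterns / n
        np.fill_diagonal(W, 0.0)
        return W

    def recall(W, probe, sweeps=20):
        """Asynchronous updates: each unit aligns with its local field, lowering the energy."""
        s = probe.copy()
        for _ in range(sweeps):
            for i in np.random.permutation(len(s)):
                s[i] = 1 if W[i] @ s >= 0 else -1
        return s

    # Store two +/-1 patterns of length 8 and recover one from a corrupted probe.
    patterns = np.array([[1, 1, 1, 1, -1, -1, -1, -1],
                         [1, -1, 1, -1, 1, -1, 1, -1]])
    W = train(patterns)
    noisy = patterns[0].copy()
    noisy[0] *= -1                      # flip one bit
    print(recall(W, noisy))             # converges back to the first stored pattern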
A lot of commentators here are saying that while fundamental physics has possibly stalled, a lot of applied physics is still bustling with activity. Even buying that premise, is there anything in the applied physics fields that is comparable in impact to the neural network?
It feels like the Nobel committee’s decision is an indictment on the lack of impact of modern physics. They had to stretch definitions and go into AI, to get something that they found impactful enough for receiving a Nobel prize. This is impact outside of physics, impact on a broader societal sense that physics in the past had in spades and AI has now.
> Even buying that premise, is there anything in the applied physics fields that is comparable in impact to the neural network?
Why does it have to be? It's a different field. You would not expect the Nobel in chemistry to be awarded to Linus Torvalds, impact or no impact.
And the connection to physics is beyond tenuous here. The toy neural networks they cite in the document, including the Boltzmann machine, have very little to do with the power of ANNs to learn complex patterns that made such a splash recently. That is basically what the Bitter Lesson is all about. The interesting stuff does not arise from clever theory, but from practical tinkering and loooooots of compute.
Hinton has no published books on physics, and his bibliography of papers, to the extent I've examined, lacks any serious contributions to physics. There are underrepresented physicists who never will come close to winning a Nobel. Not to speak of the women in Physics, and the fact that we still have Edward Witten who still isn't worthy of winning a Nobel. As someone who has seen friends and others give up on physics due to being denied for funding and other institutional issues, I am infuriated at this gesture by the Nobel Committee.
When was the last time we gave someone a Nobel in physics who hasn't bothered writing a book? We have professors dying without a hint of recognition for their work. The whole ordeal is terrible; it's like giving Einstein a Nobel in medicine because his research on the photoelectric effect has opened a new domain in biotechnology, and because that's the new cool thing in the market, we'll go with that.
A lot of outsiders think "physics is dead", but let them dare to look into the research inside it. It is not at all dead. And arguing that failing to have definitive answers to the Big questions means being 'dead' is a terrible way to look at the field. Math still doesn't have a definite way to look at primes; for centuries we didn't have a definite way to look at algebraic equations of higher degrees and general solutions to them. That didn't make math die, that's what keeps it alive. I am fine with Hopfield for once maybe, but seriously, why Hinton?
No, these are the proper laureates (for that topic anyways, whether the topic is appropriate in the first place is another matter). LeCun and Bengio's works are undoubtedly immensely impactful, but there's no denying that they are standing on shoulders of giants.
Good point. I suppose if one is going to not win the Nobel Prize, a decent "consolation prize" is at least being referenced in the prize announcement for whoever did win.
That is a really strange plot. It looks like they are fitting a really high order polynomial to what is more or less a linear or maybe quadratic trend. And the overfitting exaggerates the recent trend.
As a physicist, my reaction to this is: how bizarre. Maybe he deserves a Nobel prize, but in physics?
Also, arguing that NNs are used in physics so a Nobel prize is okay is like asking for Stephen Wolfram to be awarded a Nobel prize for Mathematica, which is much more widely used in physics as a tool. And he is a physicist and has made contributions to the field of numerical relativity (the reason he created Mathematica in the first place).
The royal science academy fucked up so much with this choice.
By this definition Claude Shannon (the father of Information Theory) clearly deserves a Nobel in Physics. The central concept in Information Theory is Entropy which is defined literally the same way as in Physics. And Shannon's Information Theory clearly revolutionized our life (tele-communications) much more than Hopfield network or Hinton's Boltzmann machine.
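For reference, the two definitions side by side (standard textbook forms): Shannon's entropy is H = -\sum_i p_i \log_2 p_i (in bits), while the Gibbs entropy of statistical mechanics is S = -k_B \sum_i p_i \ln p_i. Up to the base of the logarithm and the constant k_B they are the same functional of the probability distribution, which is exactly the overlap the parent comment is pointing at.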
In 1939 Claude Shannon won the "wrong" Nobel prize -- The Alfred Noble Prize award presented by the American Society of Civil Engineers [0]. It causes a lot of confusion.
Claude Shannon never won a "real Nobel".
Someone changed the Wikipedia article today to call Hopfield a "physicist". Previously the article called him simply a scientist, because his main work wasn't limited to physics. I changed it back now, let's see if it holds up.
I suppose some might argue that being awarded the Nobel Prize in Physics is enough to call yourself a physicist.
…it does have the unfortunate implication, however, that nominations need not be restricted to physicists at all since any winner becomes a physicist upon receipt of the prize.
It’s sort of like the No True Scotsman but inverted, and with physicists instead of Scotsmen.
The Nobel Committee doesn’t represent the field of physics. I talked to a few former colleagues (theoretical physicists) just now and every one of them found this bizarre.
>where we all know that mathematics and/or CS deserve the honor
Or semiconductor manufacturers.
All the math and CS needed for AI can fit on a napkin, and had been known for 200+ years. It's the extreme scaling enabled by semiconductor science that really makes the difference.
That's absurd. The computer science needed for AI has not been known for 200 years. For example, transformers were only invented in 2017, diffusion models in 2015.
(When the required math was invented is a different question, but I doubt all of it was known 200 years ago.)
TBF backpropagation was introduced only in the 1970's, although in hindsight it's a quite trivial application of the chain rule.
There were also plenty of "hacks" involved to make the networks scale such as dropout regularization, batch normalization, semi-linear activation functions (e.g. ReLU) and adaptive stochastic gradient descent methods.
The maths for basic NNs is really simple (see the sketch below) but the practice of them is really messy.
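To make the "chain rule" point concrete, here is a minimal sketch of backprop for a one-hidden-layer ReLU network with a mean-squared-error loss. This is illustrative NumPy with my own toy setup, not anyone's production code:

    import numpy as np

    rng = np.random.default_rng(0)
    x, y = rng.normal(size=(4, 3)), rng.normal(size=(4, 1))    # tiny toy data set
    W1 = 0.5 * rng.normal(size=(3, 5))                          # hidden-layer weights
    W2 = 0.5 * rng.normal(size=(5, 1))                          # output weights

    for step in range(2000):
        # forward pass, keeping the intermediates we will reuse
        h = np.maximum(0.0, x @ W1)           # ReLU hidden layer
        y_hat = h @ W2                        # linear output
        # backward pass: nothing but the chain rule applied layer by layer
        d_yhat = 2.0 * (y_hat - y) / len(x)   # dL/dy_hat for the mean squared error
        d_W2 = h.T @ d_yhat                   # dL/dW2
        d_h = d_yhat @ W2.T                   # dL/dh
        d_W1 = x.T @ (d_h * (h > 0))          # dL/dW1; ReLU just gates the gradient
        W1 -= 0.01 * d_W1                     # plain gradient descent step
        W2 -= 0.01 * d_W2

    print(((np.maximum(0.0, x @ W1) @ W2 - y) ** 2).mean())  # far below the initial loss

The "hacks" mentioned above (dropout, batch norm, adaptive optimizers, residual connections) are what it takes to make this toy recipe work at scale.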
Residual connections are also worth mentioning as an extremely ubiquitous adaptation, one will be hard-pressed to find a modern architecture that doesn't use those at least to some extent, to the point where the original Resnet paper sits at over 200k citations according to google scholar[1].
> All the math and CS needed for AI can fit on a napkin, and had been known for 200+ years.
This isn't really true. If you read a physics textbook from the early 1900s, they didn't really have multivariate calculus and linear algebra expressed as concisely as we do now. It would take several napkins. Plus, statistical mechanics was quite rudimentary, which is important for probability theory.
> If you read a physics textbook from the early 1900s, they didn't really have multivariate calculus and linear algebra expressed as concisely as we do now.
As a former physicist with a PhD in experimental condensed matter physics, I would say that this is one of the dumbest things I have ever seen. I suspect COVID-19 made people collectively dumber.
This prize is more of a settler colonialist land grab by physicists. ML is just a subfield of physics (like every other field), so let's make sure that everyone knows that it's in our domain.
I would agree.. But it took Computer Scientists to put neural networks on the map by getting them to scale. Basically by asking the question what happens if we turn it up to 11.
Statisticians would never have done that due to parsimony and something something Bayesian.
Engineers would never have done it, nor mathematicians either.
It took Computer Scientists because it is computation.
I also had this view, but thinking a bit more about it: what we consider 'reasonable axioms' in math all come more or less from our logical intuition, which was built by our evolution, which respects the laws of physics.
This has changed my point of view to where math is kinda derived from physics, as the axioms (and even the derivation rules, like modus ponens) are chosen because they respect what feels intuitively 'logical'. But this intuition cannot be disentangled from physics, as it was a product of physics.
The actual process of computation, sure, but machine learning was born from physics-based methods and applications to understand complexity and disorder.
I think ML depended on math/statistics/computing much more than physics. It's much easier to see what Hopfield and Hinton did as inspired by mathematical models that were created to help study statistical mechanics. Which is fine and just shows how stochastic scientific discovery is.
The contrast between the discoveries made in 'core' physics in the first 25 years of the last century and those of this century is quite insane; it was never going to be sustainable. If it had been sustained, we would be colonising a new galaxy by now.
Consider that in 1900 the atom wasn't discovered yet, within around 25 years the basic principles of quantum physics were established, to say nothing about discoveries in cosmology (GR + big bang).
Two of my research advisers wrote PhD Physics dissertations with Hopfield as their primary reference. I'm also a PhD candidate working on one right now (no longer my primary reference because of all the developments), but I can trace several of my main references back to them.
This is an appropriate application of this prize considering the adage that there are now three pillars to science, the third being simulation (after theory and experiment).
Also, many of the underlying theories in machine learning display deep analogy with physical laws we are already familiar with, e.g., thermodynamics.
I heard it from a professor who was a researcher in magnetohydrodynamics, studying flows on and under the surface of the sun. I don't know where to read more, unfortunately; I'm not sure where it's been fleshed out as an ideology.
I wonder if they ever gave a Physics Nobel to a person who held a patent! People like Graham Bell never got recognized by the Nobel people. I get the impression that Physics Nobel prizes were more or less given only to university professors. They didn't seem to particularly care for people with grease on their hands.
I don't really know what I'm talking about, but weren't there like 9 Nobel prizes awarded to Bell Labs engineers for physics? One of which (I think) being the invention of the transistor, which presumably had a patent.
You make several good points. However I'm not sure if Bell Labs people were different from university people in terms of their academic background. All three, Shockley, Bardeen and Brattain were Physics PhDs, two of them from elite universities. I was trying to say that brilliant engineers do not disproportionately figure in the list of Physics Nobel prize. In fact they are hardly to be seen.
A post from yesterday, complaining about Musk not receiving a prize despite (according to the author of that post) deserving it, has had me thinking about that too. Folks like Bell, Musk, Bezos etc are in many ways similar to Alfred Nobel, highly successful and very controversial businessmen, where their contributions to the world have had great positives and great negatives.
Putting aside the fact that it's also entirely reasonable to say that Musk, Bezos etc, while having changed the world, have not really personally made breakthroughs in fundamental science of the level as to deserve a Nobel prize; I wonder if the Nobel Foundation avoids figures like that because of the parallels.
Currently I'd flat out refuse to give any sort of prize to Musk; that could be the tipping point for his mental "stability" completely breaking down. The last few years have really taken a toll on him. Fallen from idol to conspiracy right-wing idiot crashing his companies more and more.
Philipp Lenard had a patent on cathode ray tubes, Marconi on wireless telegraphy, Dalén had plenty of patents on the automatic lighthouse regulator he got the prize for, and many others.
It seems physics is a natural continuation for those who want to understand what's behind the curtain after all. Thanks, physicists, for providing support for AI scientists. The next phase for joint research is quantum AI, where one would need expertise in both physics and ML.
There's a problem with collapsing states for back-propagation, and the fact that a quantum state can't be cloned makes backprop unusable there, but maybe something new will come up.
Has NN led to any fundamental breakthrough in physics research? I understand the impact from NN to scientific research, but I'm not aware of any big results in fundamental physics research because of NN.
It's not the Nobel Prize in Fundamental Physics though, and maybe this is correcting a bias that has been present for too long. Just because something isn't quantum or astronomical doesn't mean it's not physics.
Physics is the study of the physical world, and learning, imagination, creativity, are all phenomena that we observe in the physical world but have only a primitive understanding of. It's a staggering advancement that we can now simulate key aspects of each.
It seems like part of the motivation here is that it's possible to run many contemporary and planned very-big-science projects at all, since nobody's going to be sitting around analyzing centuries' worth of unvetted sensor data, but there are plenty of people prepared to spend a few years of their lives checking and massaging the regions of interest marked by a computer cluster.
The simplest cases will have been long enabled by simpler regressions and such, of course, but the more complex pattern recognition appears to be appreciated.
I strongly suspect consciousness will be the next great paradigm in physics, following electrodynamics, quantum mechanics, and general relativity, now that the ‘detour’ of string theory is mostly behind us. Some, like Penrose, are already thinking about it but too late in their careers to make any breakthroughs.
Machine learning research is the logical entry point as the ‘particle physics’ of cognition and consciousness.
I think in retrospect we will say it was obvious why so many physics PhDs were working on ML during this era.
> Their basic structure has close similarities with spin models in statistical physics applied to magnetism or alloy theory.
Statistical physics itself is hardly real physics, I also like how they avoided the term computational physics, which is what it's commonly known as. I suppose that might have given it away too quickly.
Why is that? I have video of Teuvo Kohonen explaining neural networks in 1985. "You can stack them" and "when a network has learned a skill, you can sell it as a separate entity". What more do you want?
No; the Nobel Committee has made a complete error in judgement with this.
These are Mathematics/CS techniques and have nothing whatever to do with core Theoretical/Experimental Physics, notwithstanding that they may have been inspired by Physics. There are plenty of Physics Researchers toiling away at real hard problems of the Physical World, and instead of recognizing them the Committee has gone with "market fads", which themselves were only realizable due to Hardware advances at scale over the past decade. With this award they have disheartened and demotivated all true Physics Researchers, which is a huge disservice to the Hard Science Community.
This is not to say that AI/ML researchers and their community are not worthy of recognition. But they should not be folded under Physics; rather, a new category should have been created and the award then given under it.
I think there has now been enough crossover between Information Theory and Quantum Mechanics, that we can stop splitting hairs between "it's an algorithm on a computer, that isn't physics".
It's not "splitting hairs" but a logical argument. When Pre-Scientific-Age "Natural Philosophy" was partitioned into "Physics, Chemistry, Biology, Mathematics" etc. there was an understanding of their boundaries (though technically there are none and everything could be argued to be just Physics) and the Nobel prizes were designed accordingly. Now of course we know better and it might be time to come up with something like "Nobel Prize for inter-disciplinary/cross-disciplinary achievements" with the disciplines listed out. So in this case it would mention Biology/Physics/Mathematics/CS.
Physics is pretty old and it has always been about understanding the fundamental structure of reality. If it doesn’t tell you how it all goes round then it is not physics: plain and simple.
Do physicists know the fundamental structure? They have some mathematical approximations that work for some measurements. They can make some predictions in some areas, but the same approximations break down in other areas. So the fundamentals aren't 'known'.
Some think measurement is engineering. So are the physicists who focus on building an apparatus to test a theory engineers? Are only the theoretical people doing math physicists? Even though at that point they are only doing math?
Is Information Theory and Entropy a Computer Science subject or a Physics subject?
Physicists have learned quite a bit about the fundamental structure of the cosmos in the last 500 years. We can get into philosophical quibbles over what is knowledge and the relationship between approximations to reality, but we have clearly developed a very rich understanding of how the world works. A lot of the fundamentals are very clearly known. Entropy and statistical mechanics have been part of physics for 150 years and have clearly enhanced our understanding of the universe. Claude Shannon’s work definitively helped us understand the world more deeply. I think deep learning is interesting but it would be a stretch to claim that this has enriched our understanding of the universe by a large margin. Definitely not as much as Shannon’s work.
Funny, on the same day there is a post from this guy who seems to think the nature of reality is computation - that the fundamental structure is computation and cellular automata.
Maybe it isn't 'machine learning' but definitely the lines are blurring between physics and other fields of information.
HN is the only place that helps laymen come to grips with information like this. One needs to read a lot of comments first and then jump to the link. Thanks guys/gals.
Congratulations to Doctors Hopfield and Hinton! It's wonderful to see them both receiving such an esteemed prize, in recognition of the outstanding work they have both done.
Real physics and its resulting breakthrough technologies have been hidden from society for a very long time. And so they simply need somebody they can give that prize to.
Hinton is only one of two people in history to win both the Nobel and the Turing award (Herbert Simon is the other, although he won the economics Nobel).
True, but that's not a very good reason for giving a Nobel Prize in Physics to something that isn't physics. I think the standard way of giving a Nobel Prize to mathematicians is to call it economics.
I think they look at the overall impact, not just one single thing. A lot of people invented backpropagation independently, but after that their impact was low. Hinton is like everywhere when you look at the state of machine learning now.
Looks like the Nobel wants to fast track its march towards irrelevancy. What a joke.
I see a lot of people saying "physics has stalled" etc., which is not the case. It may be the case for high energy physics (I would not even make that statement myself with any confidence), but there is a lot of other physics being done.
More to do with neuroscience than you think. Fukushima took direct inspiration from Hubel & Wiesel's Nobel-winning work of the 1960s when developing the neocognitron, which turned into convolutional neural networks. Hopfield networks are a model of associative memory. And, well, then there is the perceptron. There was always a link and mutual inspiration.
They're not identical but they are related. There's a series of approximations and simplifications you can go through to get from biological neurons to neural nets. Essentially the unit activations in the neural net end up corresponding to steady-state firing rates of populations of spiking neurons. See for example Chapter 7 of Dayan & Abbott's Theoretical Neuroscience.
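For concreteness, the reduction referenced there is roughly this: a firing-rate model evolves as \tau \, dv/dt = -v + F(W u), and at steady state (dv/dt = 0) this collapses to v = F(W u), a weighted sum pushed through a nonlinearity, i.e. exactly one ANN layer. That is the textbook simplification; real synaptic dynamics and plasticity are far messier, as others note in this thread.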
Discussing the right levels of abstraction is a huge thing in computational biology. At what level is 'the algorithm' of natural computation implemented?
Except that the development of deep neural networks took direct inspiration from biological neuroscience, with neurons and synapses. "Neural" is even in the name. https://en.wikipedia.org/wiki/Deep_learning
DL did not take 'direct inspiration' from neuroscience. Maybe some ideas were borrowed, such as the integrate-and-fire nature of neurons and Hebb's very vague rule, but those are very old ideas. Most neuroscience research in past decades is in molecular biology, and particularly in the study of neural diseases (that's where all the funding goes). Learning and biological plasticity is notoriously complex and difficult to study, it's still very much undeciphered, and none of that plasticity research has made its way into ANN training.
In fact it is the reverse: the recent success of deep learning has sparked a race in neuroscience to find processes in the nervous system that might mimic deep learning, and in particular to build biologically plausible models of how the brain might implement gradient descent or, more generally, credit assignment.
People always repeat these stupid things like they're lore. Ok let's suppose this is true. What else is true is that neurology itself was inspired by phrenology and the practice of exorcisms. Should we now start recognizing and exalting those connections given how divorced modern (useful!) neurology is?
Hinton's most recent paper on forward-forward acknowledges Peter Dayan explicitly for his feedback on the paper, and cites a paper they cowrote together back in the 90s. Dayan being the author of the canonical textbook on theoretical neuroscience.
A major distinction, given those practices have been abandoned as pseudoscience or even worse, so they aren't fields of science that continue to be developed and in which further useful connections might be found.
In psychiatry, there is a certain amount that we continue to study social standards of normalcy in other (including historic) societies to determine what should count as a mental disorder, but more to make sure we aren't doing a 21st century equivalent of labeling something as a demon possession because it contrasts with our current deeply held social norms.
So what is the meaning of 'to do with' and 'nothing to do with'? Inspiration seems to be a relationship.
Consider a different relationship: between cellular biology and the Cells at Work anime. Clearly the relationship is unidirectional. Cellular biology learns nothing from the anime, but the anime wouldn't exist without cellular biology.
Do we say the show has nothing to do with cellular biology? That doesn't seem right to me, given it depends upon it despite taking an amazing degree of artistic freedom.
The downvotes are very unusual to say the least. All the historical material on the subject unambiguously points to neural networks emerging from work done to formalize actual brain neurons. That formalism turns out not to be a great way to explain biological brains but the abstraction it provided proved highly effective for tasks like pattern recognition, classification, and decision making.
So much about computer science has been inspired from other fields such as biology. Polymorphism and object oriented programming, reification, neural networks and in particular convolutional neural networks, genetic algorithms...
If anything, it teaches the value in learning a topic and then applying it directly within computer science. The strength of computer science lies in its ability to adapt and incorporate ideas from other domains to push the boundaries of technology.
There are a lot of downvotes going around because a large contingent is thinking the Nobel Prize for "Physics" should not go to something involving Computer Science. That it was awarded as it was, was an error.
Seemingly because, even if the math or algorithms came from a physicist solving physics problems, since it didn't involve some theoretical particles it isn't physics-y enough to get a Nobel in Physics.
At the very least neuroscience provides an "existence proof". Somehow this stuff must be possible using some sort of trained machine comprising a large number of simple components...
Honestly, I am stunned by today's Nobel committee announcement. Hinton's Boltzmann machine is a clever construct that nobody, repeat nobody, in AI and ML is using anymore in actual practice.
Not really related to physics per se, but rather it lets physicists get out of research in physics. The most self-denying award ever. But machine learning deserves a prize. Just this... anyway, congratulations.
I know I'm going out on a limb here, but is it possible that the prize has been awarded sarcastically?
Like, let's survey what our PhD students plan to do next in their careers. Oh wow, 62% say they are going into... AI? Well damn, why don't we just give the Nobel Prize for that, if it's such a hot field in physics right now?
I'm equally confused; huge WTF moment. Seems like a paradigm breakthrough, in that Nobel Prizes can be given for discoveries in tangential fields. Or perhaps it's due to Dr. Hopfield's status as a physicist, that all his discoveries are considered physics-related? Or that NNs are considered a part of physics / nature?
> With their breakthroughs, that stand on the foundations of physical science
Uhm, no. This has negatively and retroactively impacted my appreciation of the award. If I were in academic CS I would have blatantly rejected it. This is ridiculous.
> So they gave the Nobel Physics prize to AI bros before honoring another woman. Five women were ever honored so and 221 men.
> This is your regular reminder Wikipedia refused an attempt to create a page for Donna Strickland with "This submission’s references do not show that the subject qualifies for a Wikipedia article" not half a year before she became a Nobel laureate. Katalin Kariko's page was not created until April 27, 2020.
#EverydaySexism #SystemicMisogyny
It got a decent amount of favs and retoots - and no angry responses. Now, when I posted this here, it got flagged.
That tells a hell of a lot about the people who visit the site. It's a great opportunity to check your own biases. That's why I am reposting it with this note.
Former high energy theorist here: things are not looking so good for high energy physics (both theoretical and experimental) which loosely speaking accounted for maybe 1/3-1/2 of Nobel Prizes in the 20th century. That’s part of the reason I got out. I’m inclined to say astrophysics and cosmology, another pillar of the fundamental understanding of the universe, isn’t doing that well either, probably in the okayish but not as exciting as it used to be territory. I’m not qualified to talk about other fields.
I think saying they're not looking good might be a bit of an exaggeration. Technological developments in both high energy physics and astrophysics stuff are in-between generations of technology right now, which is why things are a bit slower than usual.
With astrophysics, we're probably going to need the more sensitive gravitational wave detectors that are in development to become operational for new big breakthroughs. With high energy physics, many particle colliders and synchrotron light sources seem to be undergoing major upgrades these days. While particle colliders tend to get the spotlight in the public eye and are in a weird spot regarding the expected research outcomes, light sources are still doing pretty well afaik.
This Nobel I think is mainly because AI has overwhelmingly dominated the public's perception of scientific/technological progress this year.
> With high energy physics, many particle colliders and synchrotron light sources seem to be undergoing major upgrades these days.
AFAIK synchrotron light sources are tools for materials science and other applied fields, not high energy physics. Did I miss something?
I am also puzzled by the "many particle colliders". There is currently only one capable of operating at the high energy frontier. It's getting a luminosity upgrade [1] which will increase the number of events, but those will still be the 14 TeV proton-proton collisions it's been producing for years. There is some hope that collecting more statistics will reveal something currently hidden in the background noise, but I wouldn't bet on it.
>AFAIK synchrotron light sources are tools for materials science and other applied fields, not high energy physics. Did I miss something?
When you put it like that, yeah, I was kinda being stupid. During my stint doing research at a synchrotron light source I was constantly told to focus on thinking like a physicist (rather than as a computer engineer) and most of the work of everyone who wasn't a beamline scientist was primarily physics focused, which is what led me to think that way. But you're right in that it might not make much sense for me to say that makes them high energy physics research tools first.
>I am also puzzled by the "many particle colliders". There is currently only one capable of operating at the high energy frontier. It's getting a luminosity upgrade [1] which will increase the number of events, but those will still be the 14 TeV proton-proton collisions it's been producing for years. There is some hope that collecting more statistics will reveal something currently hidden in the background noise, but I wouldn't bet on it.
The RHIC is also in the process of being upgraded to the EIC. But overall, yes, that's why I said they were in a 'weird' spot. I too am not convinced that the upgrades will offer Nobel-tier breakthroughs.
What are you considering "high energy physics"? "1/3-1/2 of Nobel Prizes in the 20th century" is a significant overestimation unless you are including topics not traditionally included in high energy physics. For example, there were many Nobel prizes in nuclear physics, which shares various parallels with high energy physics in terms of historical origins, experimental techniques, and theoretical foundations. But nuclear physics is in a very exciting era of experimental and theoretical developments, so your "not looking so good" description does not apply.
Much of nuclear physics was effectively “high energy physics” (or more appropriately named elementary particle physics) back in the day. They ceased to be elementary or high energy at some point. My very loose categorization is everything on the microscopic path towards the fundamental theories; and there’s another macroscopic path, cosmology.
Agreed on that. My disagreement is with the statement that everything that was once referred to as high energy physics is "not looking so good". Nuclear physics in particular does not feel stuck in the way I've heard some high energy physicists talk about their field.
As a layman, the visualization of black holes, the superstructure above and below the Milky Way, JWST’s distant galaxy discoveries, gravitational wave detectors as mentioned, and some of the Kuiper Belt observations all seem to be interesting and exciting.
"theoretical physics" is such a big and ambiguous concept that physicists tend not to use the word in discussions. Thereotical work often involves a lot of numerical simulation on super computers these days which are kind of their own "experiments". And it is usually more productive to just mention the specific field, e.g. astronomy, condensed matter, AMO etc, and you can be sure there is always a lot of discoveries in each area.
Physics is not stuck in string theory as physics is not just high energy theoretical particle physics. There's also more going on in high energy theoretical particle physics than just "string theory".
Much of the experimental action in recent decades has been in low-energy physics, down near absolute zero, where quantum effects dominate and many of the stranger predictions of quantum mechanics can be observed directly. The Nobel Prizes in physics for 1996, 1997, 1998, 2001, and 2003 were all based on experimental work down near absolute zero.
Please bro just one more collider. Just one more collider bro. I swear bro we're gonna fix physics forever. Just one more collider bro. We could go up or even underground. Please bro just one more collider.
What a bunch of BS, yet another field trying to steal the thunder of CS. How often have I had to listen to physicists sneer at CS as not a proper science!
I've always sided with Feynman on this, and this proves him right: wtf do these people think they are appointing themselves fit to hand out trinkets and baubles on behalf of global scientific achievement?
It brings the award into disrepute, or at least in a Feynman way, exposes the inherent disreputability of awards themselves: who are they to award such a prize on behalf of physics?
Awards committees: self-serving self-appointed cliques of prestige chasers
>> Laypeople needs a simple way to know who's who in advanced research fields, without Nobel prices (or any other commitee) we don't get to have that.
I think first you're underestimating "laypeople" which seems to include many scientists who are not physicists, and second you are forgetting that many of the scientists the "lay" public knows as the greatest of all times never received a Nobel, or any other famous prize: Einstein, Newton, Kepler, Copernicus, Galileo, etc etc.
Neither for relativity nor mass-energy equivalence though, which laypeople are much more likely to know about than the photoelectric effect (what the prize was actually awarded for).
Depends on the quality of the '"lay" public' I guess.
Where I live, in my estimation the 'educated "lay" public' would probably have heard of all the names mentioned, but with even worse notions of what their actual contributions were for Kepler.
The economics of this topic have always been interesting to me, especially when compared to various other fields. What is there to incentivize people to enter STEM fields, and especially research?
As a point of comparison, there are ~540 premier league football players, with an average salary of 3.5 million pounds. (Yes, that's average, not median, but there's less than 20 of them that earn under 200k.) It's not _that_ exclusive of a club, and the remuneration is insanely disproportionate, compared to academics - I highly doubt there are hundreds of researchers earning millions.
So, yes, it's pretty odd to have some random people dish out these prizes, and they are a drop in the pond. However, I personally feel it's way too little, and that the targets of the prizes are far more deserving - even if it's a popularity contest - than random entertainers (even if they are quite entertaining). But, it's up for argument, and the markets obviously don't seem to agree with me.
Weirdly, if you sniff the XHR from [0] (when it loads a new page), it claims there's 1171 players for 24/25. Except if you look at a few of the teams individually, they're between 30-35 players. Which is much more in line with your ~540 than their 1171.
> the remuneration is insanely disproportionate
I once pointed out that Kevin De Bruyne, on his own, gets paid almost half as much (~21M) as the entire salary cap of the Rugby Union Premiership (~2022, 50M) (to make the point there's much more money in football than rugby.)
"I highly doubt there are hundreds of researches earning millions." -- by doing purely academic research, maybe not. But, the number of people who have moved from academia to industry off the strength of their research and made millions is probably much larger than you think. I'd wager just in ML you could round up a few hundred between OpenAI, Anthropic, Google/DeepMind, NVidia, Meta/FAIR, etc.
If Physicists could split atoms with only their arms and legs with some safety equipment, I bet they would get paid even more than 3.5 million pound salary.
Google scholar rankings of conferences or individuals by H-index or citations is a perfect way for both lay people and academics to measure each others achievements.
Even though many of the Oscars nowadays feel rigged (with full lobbying arms from the studios behind them), my understanding was that the "Academy" (of the Academy Awards) consists mostly of your fellow filmmakers.
So it is an honor bestowed by your peers, the ones who would most appreciate the quality of the work and the work that went into it.
When Alfred Nobel wrote his will in 1895, there was no computer science or information theory.
One could argue it's closer to mathematics than physics, but if you'd say to him that someone made sand think like a human he might even put it under the medicine category.
It is different for the Peace prize; it has always been political and different from day one. It is awarded by the Norwegian Nobel committee, which is appointed by the Norwegian parliament.
All the other prizes are awarded in Sweden: the Swedish Academy (for literature), the Royal Swedish Academy of Sciences (physics and chemistry), and the Karolinska Institute (physiology) were all professionally established organizations at the time of Nobel's death, with all the other activities professional organizations do.
The Norwegian Nobel committee, while in theory independent, is just people appointed by the parliament, with no need for professional standing in the field in which they are supposed to award the prize, and it always shows.
Obama's prize is hardly the first egregious one, or even the most outrageous; Henry Kissinger got one.
So it comes down to changing standards for the Swedish technical prizes, while the Peace prize has been a disaster for half a century or more, with a non-transparent process run by a committee selected from ex-MPs.
Obama’s and Kissinger’s prizes are both disastrous for entirely different reasons. And I don’t mean anything partisan by that. Kissinger was awarded a peace prize for achieving a cease fire in the Vietnam war … a war in which he’d personally been responsible for some of the worst and most illegal excesses. Obama was awarded the peace prize for literally nothing. It was anticipatory, meant to urge him towards ending American wars in the Middle East (he escalated instead). The Nobel committee gets it wrong looking both forwards and backwards.
It is less about the specific reasons that made them poor laureates and more about what is wrong with the process itself.
Unless the process for selecting the committee (unqualified, political) and the process for selecting the winner (opaque and inconsistent criteria) are fixed, we will have poor candidates winning it in the future too.
Unlike the economics prize, the Peace prize was one that Alfred Nobel himself actually wanted to be awarded.
One could also argue the Peace prize should be the most important of them all. Nobel wished to mitigate the military applications of his inventions, and the Peace prize should be what achieves that most.
The continued poor judgment of the parliament in selecting the committee, and of the committee in selecting winners, devalues the prize for future winners and devalues the good work the Swedish institutions have put into making it the premier award in their fields.
I definitely think their work is deserving of awards, but I kinda agree with other commenters in that this says more about the Nobel committee than anything
i.e. Hinton has already won a Turing Award in 2018, and there is no Nobel for computer science
And this work was already recognized to have impact ~12 years ago, when he auctioned his company of 2 grad students to Google/Microsoft/Baidu/Facebook, for over $40M, ultimately going with Google [1]
---
i.e. IMO it feels a little late / weird / irrelevant to be giving this award in physics to machine learning research – it doesn’t feel like that would have happened without the news cycle
At least IMO the scientific awards are more interesting when they're leading indicators, not trailing ones -- when they are given by peers, acknowledging impact that may happen in the future.
Because it often takes decades to have impact, and it may occur after the researcher has passed away
>scientific awards are more interesting when leading indicators
Peter Higgs waited 50 years, the Nobel is not a "leading indicator." If it was, it would be given out on the basis of the "hype cycle," which would not be very helpful to anybody.
So I guess I mean "drawing attention to something that would have not otherwise had attention", and based on the consensus of people working in the field
Not the poster, but I don't understand the downvotes: this is exactly right. Higgs was awarded the Nobel after the mechanism he theorized was experimentally confirmed, and that is 100% the reason it took so long.
Right! Einstein didn't get the Nobel because the theory of relativity is awesome, he got it after Eddington observed gravitational lensing during an eclipse, confirming a key prediction.
Brilliant theorizing can be both brilliant and wrong.
This must be frustrating to see for all the actual physicists out there. What work in physics got ignored so that a prize for AI could be shoehorned in?
I do think work on neural networks does rise to the level of a Nobel Prize. So I don't have any problem with this work getting such high-level recognition. But I really struggle with the physics classification and the side effect of omitting an award to physicists this cycle.
I was scratching my head, but then it seems the precedent for awarding a Nobel prize in Physics for something that isn't "exclusively" physics has been set before.
To me, I'd rather see a Nobel Prize in Math/CS/IS, but if I had to choose where this type of work would be shoehorned into an existing Nobel prize category, physics would be it.
The distinctions between experimental-theoretical scientific disciplines are fairly arbitrary, e.g. where does one draw the line between physics and chemistry and biology? Mathematics is something of an exception but there's no Nobel for that, nor for astronomy, planetary science, etc.
The Nobels are grossly overrated and the idea that one can follow the most important scientific developments of the past century by just listing off the Nobel Prizes since 1905 is one best abandoned.
It just makes one wonder, what is the point of categories of Nobel prizes? Should they instead hand out half a dozen or so prizes each year for whatever is most important to them?
The categories correspond to who's filling out the nomination paperwork, and voting. You might think of it as a "Nobel from physics," which only usually is a Nobel in physics.
This is totally bizarre, no precedent for it really. The reality of the prize means that less and less are the winners names every physicist has heard of, but even today they're still big names in each subfield. For e.g. Kosterlitz, Thouless and Haldane weren't exactly household names but they really deserved the prize in 2016.
In this case, there's a good argument that Hopfield had conducted strong work as a physicist and in physics, but Geoffrey Hinton has never worked as a Physicist, at best adopting some existing things from physics into cognitive science use cases. In any case, what they've been given the prize for is work where they've not contributed to the understanding of the world of physics - it's not even really an arguable case where this is work that crosses over between Physics and another field either. It'd be like if Black or Scholes had been given the Physics prize rather than Economics because their famous equation can be re-written in Schrodinger equation form.
The citation has a large section on the impact of DNNs on physics research practice. They cite mega-projects that depend on this tech, for example. Hinton's inventions are a contribution to physics, not a contribution in physics. Hopfield of course was a bona fide physicist.
I just find that justification really strange - you could have made the same argument about K&R or John Backus because C and Fortran have had enormous impacts on Physics research, much more so than AI has to date.
The key is the field. A physicist who uses maths will not get a maths prize just for using it in physics.
He uses words and his lyrics have meaning, like any literature. You cannot say poetry is not literature. Then why not poetry with music? It is probably more traditional, as many poems are songs; in some cultures poetry must be singable.
These use physics but are not in the field of physics. Otherwise anyone who uses QM could get a Nobel prize, and chemistry people could all get one since they all use physics. It really needs to be in the physics field; you can use other methods, like computers or maths.
This is embarrassing. I would say Hopfield networks aren't even very revolutionary in neuroscience, but they're so old I can't tell. In terms of AI... they've been irrelevant for thirty years. I guess you could argue a transformer is a generalized Hopfield network, but of course that's a post-hoc understanding. None of this has anything to do with physics.
So what if an energy function lets you approximate the number of macro-states it can capture? Should every mathematics paper with Lagrange multipliers be put up for nomination? Every poll that uses the law of large numbers, and thus, entropy? Surely the computer scientists building the internet need to be included as well, since their work is based in information theory.
Or maybe, hear me out, we reserve the Nobel Prize in physics for advances in the physical sciences, understanding physical reality or how to bend it to our will.
Had they wanted a good ML relevant physics Nobel, the committee had decades to award a prize to Marshall and Arianna Rosenbluth for the Markov Chain Monte Carlo method. Would have been self-evidently important and relevant to both physics and ML. Too late now -- Arianna died in 2020.
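(To ground why MCMC sits squarely in both camps: a minimal Metropolis sketch, sampling spins from a Boltzmann distribution p(s) proportional to exp(-E(s)/T). This is the generic textbook algorithm applied to a toy 1-D Ising energy of my own choosing, not the Rosenbluths' original formulation.)

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy 1-D Ising ring: E(s) = -J * sum_i s_i * s_{i+1}
    def energy(s, J=1.0):
        return -J * np.sum(s * np.roll(s, 1))

    def metropolis(n_spins=20, T=2.0, steps=10000):
        s = rng.choice([-1, 1], size=n_spins)
        for _ in range(steps):
            i = rng.integers(n_spins)
            proposal = s.copy()
            proposal[i] *= -1                  # flip one spin
            dE = energy(proposal) - energy(s)
            # accept if energy drops, else with Boltzmann probability
            if dE <= 0 or rng.random() < np.exp(-dE / T):
                s = proposal
        return s

    print(metropolis())                        # one approximate sample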
There were some predictions that Peter Shor could win this year for quantum computation. I'd say his work is a lot closer to physics than Hinton's or Hopfield's.
The goal here is to attribute a very important area in contemporary technology to physicists. This prize advances physics in terms of giving it higher importance in the minds of lay people and journalists.
“These artificial neural networks have been used to advance research across physics topics as diverse as particle physics, materials science, and astrophysics,” Ellen Moons, chair of the Nobel Committee for Physics, said at a press conference this morning.
The landmark Deep Belief Networks (stacked RBMs) paper in Science was in 2006 [1]. DBNs were completely obsolete quite quickly, but don't deny the immense influence of this line of research. It has over 23k citations, and was my introduction to deep learning, for one. And cited by the Nobel committee.
You're completely incorrect to say RBMs were of theoretical interest only. They have had plenty of practical use in computer vision/image modelling up to at least a few years ago (I haven't followed them since). Remember the first generative models of human faces?
Edit: Wow, Hinton is still pushing forward the state of the art on RBMs for image modelling, and I am impressed with how much they've improved in the last ~5 years. Nowhere near diffusion models, sure, but "reasonably good". [2]
[1] G.E. Hinton and R. Salakhutdinov, 2006, Science. "Reducing the Dimensionality of Data with Neural Networks"
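(For context on what an RBM actually is, a minimal sketch of a binary RBM trained with one step of contrastive divergence, CD-1. The shapes, learning rate and random data are made up for illustration; stacking several of these, each trained on the hidden activations of the layer below, gives a deep belief net in spirit, not the paper's exact procedure.)

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    n_visible, n_hidden, lr = 6, 3, 0.1
    W = 0.01 * rng.standard_normal((n_visible, n_hidden))
    b_v = np.zeros(n_visible)   # visible biases
    b_h = np.zeros(n_hidden)    # hidden biases

    def cd1_update(v0):
        """One CD-1 step on a single binary visible vector."""
        global W, b_v, b_h
        p_h0 = sigmoid(v0 @ W + b_h)                     # P(h=1 | v0)
        h0 = (rng.random(n_hidden) < p_h0).astype(float)
        p_v1 = sigmoid(h0 @ W.T + b_v)                   # one-step reconstruction
        p_h1 = sigmoid(p_v1 @ W + b_h)
        W += lr * (np.outer(v0, p_h0) - np.outer(p_v1, p_h1))
        b_v += lr * (v0 - p_v1)
        b_h += lr * (p_h0 - p_h1)

    data = (rng.random((50, n_visible)) < 0.5).astype(float)
    for _ in range(10):                                  # a few sweeps over toy data
        for v in data:
            cd1_update(v)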
This feels... weird. The Turing Award exists for a reason, and Hinton getting the Nobel Prize in Physics is a stretch, unless you claim his contributions extend to things such as the development of AlphaFold.
In any case, I'm anticipating a long blog post from Schmidhuber about this soon.
When the most significant advance since electrification needs to hop the fence to be recognized, perhaps it's time to add a new field. It can be done, the Economics prize was added in 1968.
Well, society would collapse without computers so I think the description is apt.
At best you could argue that they're the same phenomenon, but then you might equally well argue electrification is just the consequence of steam engines.
Hard to tell honestly. I just chose to fill in the blanks in a way that made for the strongest argument.
I mean a Nobel prize category for advances in computing makes a lot of sense and I can easily name a whole list of people who could qualify. We'll need to be quick if we don't want to award some of them posthumously though.
Sincerely, I don't like the shadow this casts over the Turing and Gödel prizes. These awards have long honored groundbreaking achievements in computing and logic.
IMO the Turing Award is plenty prestigious - and has more legitimacy, as it's awarded by the relevant community (the ACM) rather than by a small group (the Royal Swedish Academy of Sciences). On that note, I'd say the right thing to do is to ditch the Nobel and let each community in the relevant field decide which work to honor - that would prevent fiascos like this.
(and, working in the field, I completely disagree with the qualification as "most ...." - it has well known deficiencies and has not yet stood the test of time)
This is unlikely to ever happen, because Nobel explicitly excluded mathematics from the list of prizes in his will. There are plenty of awards and prizes for every field imaginable, not everything has to be a Nobel prize to be worthy of recognition.
What matters for an award is that people recognise it as a prestigious accolade.
The economics prize, while not “official”, is still recognised by everyone in economics as the highest honour in the field. Who cares if it’s “official” or not?
Awards and prizes derive their value from their social recognition, which it has a solid amount of, at the very least.
There is nothing wrong with their connection to dynamite. Nobel designed it to prevent deaths in construction and mining, because nitroglycerine was far too dangerous (and far too useful to be abandoned). Its bad reputation comes from its use in warfare, which is undeserved because it was not very well suited to that use and was quickly replaced by other solid explosives.
The significance is that it's not a Nobel prize. Saying that is simply formally wrong. It's a prize lobbied in (with a hefty donation) almost 70 years after the establishment, trying to raise the status of Economics as a scientific discipline by basking in the reflected glory of the actual Nobel prizes.
You may not care about the distinction, and if so that's your prerogative, but this Memorial Prize in Economics, despite sharing in the festivities, is not in the same category, and that's why you keep seeing the distinction pointed out.
The General Public and Economists hold it in the same regard as other Nobel Prizes so appeal to 'formality' is pointless. The social recognition is the point of these awards so if it has that and is also often called the "Nobel Prize in Economics" then it's a Nobel Prize. They're literally announced and awarded together.
Nobody but a few nitpickers cares about your distinction, because it's not a real one. Might as well say "Money is not valuable because the material it's made of has little intrinsic value." Well no, money is valuable because society has decided it is.
It is also my prerogative not to care about your opinion. You claimed you come across this a lot and don't get it. I just told you. Take it or leave it.
Apologies to NlightNFotis for implicitly accusing you of being the griefer. I replied quickly between other tasks and evidently didn't pay due attention to the username. No misplaced offense intended.
That makes it sound like it has no connection to the other prizes.
It's awarded by the Royal Swedish Academy of Sciences, who also award the Physics and Chemistry prizes. Its winner is announced with the winners of the original prizes. The winner is included in the annual Nobel Prize award ceremony in Stockholm, and receives a medal, a diploma, and a document for the monetary award from the King of Sweden at that ceremony. The Nobel Foundation counts it when they say there are 6 prize categories, and includes its winners on their lists of Nobel laureates.
It only differs from say the Chemistry prize in that it was established in memory of Nobel instead of by Nobel and the prize money doesn't come from Nobel's estate.
The economics prize is listed on nobelprize.org ("the official website of the Nobel Prize") along with the other Nobel prizes, so I don't think you can justify calling it "unofficial".
Perhaps if the ACM renamed the Turing Award to "The Alfred Nobel Memorial Prize in Computer Science", the Nobel Foundation would let them get away with it.
The economics prize is not an actual Nobel prize, but something "inspired" by the Nobel prize. It's little more than a tool to push neoliberal policies to the public, with 34 of the 56 winners tied to the Chicago School of Economics.
There was at least a credible story of fission/fusion being for the benefit of humanity. Here, we know that AI is primarily used for targeting systems, surveillance, opinion manipulation, slop content, etc. If AGI ever succeeds, it will be used to eliminate all knowledge jobs.
Nuclear weapons have not been used since 1945. Do you think that systems like Lavender won't be used in the future? Zero chance.
So, the claim is that image recognition and ML (no, wait, broadened to AI?) is primarily used for targeting, surveillance, opinion manipulation, slop content, etc, and that AI has no credible benefit story, and is therefore nothing more than a weapon?
I disagree fundamentally with that, and don't see how we could reach a mutual understanding working from that axiom.
Feynman would voice his objections if he were alive ... what about nature was discovered? ANNs are an application of a variant of the Universal Approximation Theorem ...
Feynman was a well-known proponent of AI and neural networks [1]. He even gave popular lectures on the subject [2]. He also claimed that replicating animal-like visual recognition abilities in machines would be Nobel-worthy; deep learning was certainly a breakthrough in that.
IMO this award recognizes the effort and success of modeling the acquisition of knowledge, especially toward realizing such models in usable ways that will surely redefine life on this planet forever. Few Nobels have recognized work that is so culturally groundbreaking (and so disruptive). It'd be hard to see the Nobels as a significant measure of science/technology if they did NOT acknowledge the revolution begat by ML using DL. And with Hopfield at age 91 and Hinton 76, now is the right time to do so.
"This year’s laureates used tools from physics to construct methods that helped lay the foundation for today’s powerful machine learning."
Does this mean if I'd use a deep understanding of birds to design way more aerodynamic airplanes, I could get the Nobel prize in physiology/medicine? Don't get me wrong, their work is probably prize worthy, but shouldn't the Nobel prize in physics be awarded for discoveries in the _physical world_?
> Don't get me wrong, their work is probably prize worthy
I would strongly disagree with you there. It's the exact same idea as the least squares approximation or conjugate gradient method: create an energy function from a quadratic and minimize it.
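To spell the analogy out (my notation, not anything from the laureates' papers): least squares minimizes a quadratic energy in x, while a binary Hopfield net does discrete descent on a closely related quadratic energy in its states.

    % Least squares: minimize a quadratic energy in x
    E_{\mathrm{LS}}(x) = \tfrac{1}{2}\|Ax - b\|^2
      = \tfrac{1}{2} x^{\top} A^{\top} A x - b^{\top} A x + \tfrac{1}{2}\|b\|^2,
    \qquad \nabla E_{\mathrm{LS}} = 0 \iff A^{\top} A x = A^{\top} b.

    % Hopfield: asynchronous +/-1 updates never increase this energy
    E_{\mathrm{H}}(s) = -\tfrac{1}{2} s^{\top} W s - \theta^{\top} s,
    \qquad s_i \in \{-1, +1\}.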
"This year’s laureates used tools from physics to construct methods that helped lay the foundation for today’s powerful machine learning."
Does this mean if I'd use a deep understanding of birds to design way more aerodynamic airplanes, I could get the Nobel prize in physiology/medicine? Don't get me wrong, their work is probably prize worthy, but shouldn't the Nobel prize in physics be awarded for discoveries in the _physical world_?
I studied physics in the 90s and we had an NNs course, where most of the models were inspired by physics (MLPs were just one). NNs have been used for decades for identifying e.g. the trajectories of particles at CERN. I remember Hinton's work with Sejnowski (who probably should also be awarded). I was actually surprised to find out that Hinton was not a physicist by training.
Obviously physicists take great interest in models of the brain or models of intelligence. All of physics is modeling, after all.
Not all modelling is physics, but a rather large part of modelling is. My PhD is in complex systems, and you would be surprised by the range of systems we studied. My work was on the more "traditional" topic of high-dimensional fractal surfaces, but we had a student working on public transit models, another on ecological pattern formation, and so on.
At least the somewhat free interpretation of field boundaries is nothing new. The physicist Rutherford ("All science is either physics or stamp collecting")[1] won the Chemistry Nobel Prize.
Influence and consideration of the Zeitgeist is also nothing new. Einstein got his prize for the discovery of the Photoelectric Effect and not Relativity.
[1] I know that some people have interpreted this quote in favor of the other sciences but I think that is far fetched.
But the starting point of neural networks in the ML/AI sense is cybernetics plus Rosenblatt's perceptron, research done by mathematicians (who became early computer scientists).
That's why I wrote that it was unexpected. I'm not taking a position on whether this was deserved or undeserved, but this was clearly in the realm of physics and inspired by it.
Accepting wrong arguments in support of positions you hold is not a good way to live your life. It leads to constipation.
The Society for Birdology now has the pleasure of jointly awarding posthumously Plato and Diogenes with the Distinguished Birdologist Award. Their findings on human anatomy used insights from birdology at critical points. Well done, lads!
> Does this mean if I'd use a deep understanding of birds to design way more aerodynamic airplanes, I could get the Nobel prize in physiology/medicine?
Yes I think it does. But those planes would have to create one hell of a buzz!
Hm, they have to fit them into Physics, Chemistry, Medicine, Literature, or Peace. I guess physics is the closest they can get without a gross misplacement? (Although you might be able to abuse Literature for LLMs?)
I think that you can grow mathematics through applied mathematics. It grows the domain where mathematics is useful, even though the maths themselves were already known and somewhat well understood in a more abstract way.
Considering this, it feels odd not to allow a similar thing to happen on physics.
It's definitely not how "they" work. It's not like a committee choosing an achievement across all the fields and then trying to put it into one of the 5 buckets.
We have the Turing Award, the Fields Medal and thousands of other awards for achievements that can't be categorized as physics/biology/economics/chemistry.
It IS a physics problem. Non-physicists tend to think that the only areas being studied are high energy and/or cosmology, but modern physics covers a multitude of areas, including complex systems.
Does that mean that computer scientists who do neural network research should be considered physicists? Do physics journals accept submissions on neural networks research under the same justification?
Honestly it feels a bit weird with a Nobel laureate in physics who probably knows a lot less physics than even I* do… Makes me cringe a bit to be honest.
Also makes me sad when I think about all the physicists and engineers who have made the chips that can train multi-billion parameter neural networks possible. I mean the so-called “bitter lesson” of AI is basically “don’t bet against the physicists at ASML et al”. No prize for them?
(*) I have a humble masters in engineering physics, but work in ML and software.
I work in industry supporting these supply chains; our advancements are part of a hive mind that could be harmed if individuals were artificially highlighted for achievement.
The academics can have their awards, we smile seeing the world change a bit at a time.
Nothing wrong with picking some random people out of a hive mind. There seems to be some notable contributions around EUV for example [1].
And BTW, is the same not true for machine learning? I don’t think many have even read the Boltzmann machine paper. It’s basically a footnote in the history of deep learning. It has no practical significance today.
I think one can consider what AI will bring to the field of physics. The math and science of building tools that will unlock discoveries from here into the future is quite deserving of merit.
Despite all of the talk surrounding AI in the workforce/business world, I think it is actually most important in science.
But this is more applied math than physics. There are many other scientists who contributed more toward the understanding of quantum systems, e.g. Aharonov.
Also, as a tool, it has not been as useful or as influential as they make it out to be; at the very least, it is less influential than Aharonov's work in terms of increasing our understanding.
How do LeCun and Bengio feel about being left out of the most prestigious prize of them all? (Geoffrey Hinton, Yann LeCun, and Yoshua Bengio were jointly awarded the Turing Award in 2018...)
I'm annoyed that he was awarded just now, obviously as a reaction to ChatGPT and the breakthrough of LLMs. If his work is worthy, it has been worthy many years ago.
This reinforces the reduction of ML to LLMs, just like the use of the term AI.
This is such a tired reply. The peace prize is not part of the same group as the other awards, and a significant difference is that, for the peace prize, intent is rewarded rather than results.
The guy who invented the MAD doctrine did not get the award, despite nuclear deterrence doctrine being related to the least amount of war of any era since WW2.
But his platform of de-escalation and his plans for American foreign diplomacy were rewarded. He ultimately failed to reach those goals (especially with the escalation in Afghanistan and the emergence of groups like ISIS), but tbh the Iran agreement and the Pacific trade agreement, killed and buried by the next administration, would have created a massive buffer and a solution for the two hotspots we currently face: the Middle East (where terrorism is largely sponsored by Iran) and a Taiwan takeover by the CCP (which would also have been partially neutralised by the Pacific trade talks).
He was naive, in the way the world was naive about how much prosperity some leaders are willing to sacrifice. He underestimated how dumb and suicidal Putin could be, he underestimated how much China would be willing to sacrifice in terms of potential, and he underestimated how much latent violence the Middle East was capable of. But his Nobel Peace Prize was due to his campaign running on nuclear proliferation treaties and closer relationships with the Muslim world, which had been entirely antagonistic since Bush.
Well, it's the only one selected by Norway instead of Sweden, and it's also the only one selected on intent rather than achievements. So it's not the same, in important ways.
The road to hell is paved with good intentions. The award shouldn't have been given for intentions, before he even did anything. We should not reward promises, but action. Even a long term member of the committee expressed regret in them giving it to Obama.
> Even a long term member of the committee expressed regret in them giving it to Obama.
That is nothing compared to past controversies.
People left the assembly and resigned when it was awarded to Kissinger and Arafat in the past. Regret is way milder than calling the recipient a terrorist on the floor of the award ceremony.
He received it before any of that. And Libya does actually cancel every point you mention, by the way. Because it's actually not hard to have presidents not start wars at all - both presidents since Obama did just that.
And if the real Nobel prize doesn't want the confusion around its name to happen... it should do something about it?
which is why he got it based on his plans and not his actions
> and Libya does actually cancel every point you mention by the way.
It really doesn't. Let's begin with the main reasons: he was awarded the prize for nuclear proliferation agreements and a new American policy in the Middle East. Libya is not a nuclear power and it's in North Africa, not the Middle East.
Secondly, the military intervention in Libya came at the behest of a UN Security Council resolution that put NATO in charge of securing a no-fly zone to prevent Gaddafi from bombing his own citizens after he had shot protestors during the Arab Spring. The NATO mission was led by France. The US involvement ended the day the UN Security Council ended the mission, despite the new Libyan government wanting them to remain. It is not Obama's fault that half the Arab world exploded in protests in 2011, or that the UN voted to intervene, or that the French-led mission was a bit of a clusterfuck. So no, Libya does not affect any point I mentioned, or any of the reasons for the committee to vote for him years earlier.
> it's actually not hard to have presidents not start wars at all- both presidents since Obama did just that.
Trump started a war; Iran just didn't follow through. Killing Soleimani is a casus belli and Iran had every right to retaliate against America. The fact that they didn't does not somehow exonerate Trump for his actions. That was far more belligerent than any action taken in Obama's 8 years.
Biden did not start any wars, but would 100% have intervened if ISIS had emerged under his presidency, the same way Obama did. Obama did not start any war against any country; he just had missions in countries America was already in, like Afghanistan, or contributed to international efforts like the Syrian civil war or the Libya intervention after the UN resolution on Gaddafi.
His reputation as a warmonger is artificial, designed by the same people who told Trump that if you don't test for Covid you get fewer cases. America started reporting its drone strikes less often, but carried them out more often under Trump, for example. It's the same sleight of hand people use to say Sweden is worse off because it has more rape cases: they simply report them more often. Obama was more open than later administrations about interventions; that does not mean they happened more or less often.
> it should do something about it?
They did not award it to Gandhi and gave it to Kissinger. The fact that people still care about that award is bonkers.
The amount of people who act like Obama is a war monger without understanding the situation he found himself in is shockingly high, especially on a website like this with its supposedly "educated" people.
Losing the TPP (minus the IP parts) and the Asia Pivot, and the shift of focus away from nuclear non-proliferation, are terrifying. Obama is directly the reason why Myanmar had its democracy for as long as it did, and most people in South East Asia have not found anyone from America nearly as inspirational as him since 2016, and likely won't for a while longer.
Obama was awesome, and his legacy has been unfairly maligned. He was not the "warmonger" president that revisionists like to portray him as.
> The amount of people who act like Obama is a war monger
It's deliberate. Conservative PACs designed that legacy and pushed it hard. Trump quickly stopped reporting drone strikes, so that he could pretend Obama was the big bad shooting at everyone. Not reporting != not happening.
> Losing the TPP (Minus the IP parts)
I actually see the point of the IP parts. It's a complicated mess, but China has abused it in the past, so being able to sue governments has its uses. For example, when Lenovo was accused of IP theft against HP, the CCP bought stock in Lenovo and made it impossible to take them to trial. Those kinds of abuses are an issue when you try to promote fair competition, given high R&D costs.
Obviously the can of worms it opens is huge and an issue in itself, but I see the point in why it was added to the TPP agreement and can't imagine how hard it was to put that in, before Trump came and broke the whole thing.
> Obama was awesome
Dealing with the worst recession in a century, passing the largest US healthcare change in history, preventing the Arab Spring from exploding everywhere, stopping ISIS, the shift to the Pacific, etc. There are so many achievements it's hard to single them out, and after that came a circus clown who would salute North Korean generals.
> The dude who invented the MAD doctrine did not get the award
No, he didn't win the award, because MAD doctrine (aside from it being immoral) doesn't actually work in the real world.
It's an idealized model based on game theory, which doesn't deal with pesky complexities such as irrationality, salami tactics, short-range CBMs, anti-missile defenses, tactical nukes and so on. (That's why many of these things used to be banned by treaties, to continue to pretend that MAD is actually required for peace. In reality many nations do not have nukes and live in peace.)
> In reality many nations do not have nukes and live in peace.
Not many of them are superpowers, or strategic interests of superpowers. See Taiwan, a country that until recently felt safe and at peace, and no longer does.
Most studies show that MAD allows for strategic peace for large superpowers and more regional wars for smaller countries. Ultimately it still decreases overall violence under all empirical studies on the subject.
The point I was making, though, was that the achievements of MAD are not measured when giving the award. However, Israel and Palestine sitting down to talk in the 90s was, despite the talks ultimately going nowhere and things being worse off now than before that Nobel Peace Prize.
It does work, you just need credible trigger thresholds for the salami tactics, treat tactical nukes as strategic, and have enough nukes to punch through ABM.
In addition to the other replies, he is the only US president in modern history to explicitly authorize the assassination of a US citizen without a trial, and create a legal doctrine allowing future presidents to do so; and he was the major escalator of the use of drone strikes in war (the practice started with Bush, but it expanded many fold under Obama).
> […] [Obama] is the only US president in modern history to explicitly authorize the assassination of a US citizen without a trial
Just one of the many things Obama did that upsets me so much. The precedent he set with that is criminal.
Of course I’m against terrorism, but our government MUST NOT have the right to classify Americans as terrorists and just execute them without a trial—via drone strikes!
Most Americans likely don’t even know what happened to the al-Awlakis, which is unfortunate.
Just because those countries could not realistically engage in a war with the US, seeing as they lack the necessary technology. Obviously, if you shoot fish in a barrel you're not starting a war with the fish, but that doesn't necessarily mean you're doing much to advance peace with the fish.
Tangential to your question but not to the premise of this subthread/post - he became president in January 2009 and got the award in October.
I don't think he started any new wars, but he inherited some and continued. Anyway, the point here should be the absurdity of a lot of Nobel awards and that stands - especially in his case.
I mean Trump was nominated for the award for fuck's sake! More than 2 or 3 times iirc. So anyway.
Obama intervened in the Libyan civil war. The outcome was disastrous for Libya (13 years of chaos and counting, the entrance of ISIS into Libya, the re-emergence of slavery in Libya, to name a few consequences). Obama blatantly violated the War Powers Act, which requires the President to seek Congressional approval for any war waged abroad after 60 days. The act was passed on the tail end of the Vietnam War, to prevent a repeat of things like Nixon invading Cambodia in secret. The US Constitution gives Congress the power to declare war, but that power is absolutely meaningless if the President can just wage war wherever he chooses without a declaration.
Obama specifically won the Nobel Peace Prize for talking about his "vision of a world free from nuclear weapons" as a candidate. As President, he initiated a massive program to upgrade the US' nuclear arsenal. It made a complete mockery of the Nobel Peace Prize, though Kissinger also won the Nobel Peace Prize, so it's not as if the prize has any credibility anyways.
The outcome was positive for Libya, as it experienced only a fraction of human suffering compared to Syria where the United States did not intervene against the regime.
Either way Libya operation was spearheaded by France with Obama joining only reluctantly later.
Can you explain why starting a war (still ongoing), killing >10k people, and converting Africa's best functioning and richest country into one of the world's worst functioning places is positive outcome? I don't understand this.
The Syrian Civil war was clearly (in parts) engineered by the west. Here is some evidence.
The US intervened in both civil wars, though in Syria its involvement early on was much more through funding and arming of various armed groups - notably Sunni fundamentalist groups. How you can say that the outcome was positive for Libya is beyond me. The country was utterly destroyed. It went from being the one of the most developed countries in Africa to a war-torn country with competing warlords and open slave markets.
The human death tolls in Libya and Syria differ by almost 60x. Half a million Syrians could have lived, and the refugee crisis and the rise of the far right in the West could have been avoided, had Assad been droned in 2013. Putin would also not have dared the 2014 annexation.
The US had hundreds of thousands of troops on the ground in Iraq for over a decade. More than half a million Iraqis died. There was intense violence between different religious groups and political factions. But you come here and say that everything would have magically gotten better with more US involvement in Syria.
A direct American intervention in Syria probably would have made things even worse. Droning Assad, as you suggest, probably would have led to an even greater amount of chaos (besides being totally illegal). It's bad enough as it is that the US funded Sunni extremists in Syria.
Notice how I specifically talked about Syria and Libya. I (along with a lot of other people) opposed the Iraq war as well; you had to drag it in here for lack of a consistent argument.
Don't see the point arguing with you further. Some day both Putin and Assad are going to be dead and I hope they suffer in their last minutes. I will be cheering while you will be mourning your tyrants.
You argued that the US intervening more heavily in Syria would have prevented all of the human suffering. I'm pointing out to you that the US' other interventions in the Middle East show that the opposite is likely the case.
Just imagine the chaos in Syria if the Sunni extremist groups that the US supported had won. How would the various religious minorities, like the Shiites, Alawites and Christians, have fared? What's the chance that the Sunni extremists would have carried out genocide against religious minorities? It's one thing to say that Assad is a tyrant, but another to say that everything would be better if the US toppled him.
In Iraq, supporters of a US invasion made the exact same argument. "Saddam is a tyrant? Why don't you want to get rid of him?" The US toppled him, and half a million people died as a result.
Your analysis - everything will be better if the US topples tyrants (and realistically, empowers people who might be even worse) - is very simplistic, and has a terrible track record in the real world.
That's the second withdrawal from a second presence, requested by the Iraqi government after the rise of ISIL.
> The United States completed its prior withdrawal of troops in December 2011, concluding the Iraq War.[9] In June 2014, the United States formed Combined Joint Task Force – Operation Inherent Resolve (CJTF-OIR) and re-intervened at the request of the Iraqi government due to the rise of the Islamic State of Iraq and the Levant (ISIL).
> On 9 December 2017, Iraq declared victory against ISIL, concluding the 2013–2017 War in Iraq and commencing the latest ISIL insurgency in Iraq.
Perhaps those troops should have been withdrawn for the second time in early 2018. Alas, it took place after messier circumstances.
> On 31 December 2019 through 1 January 2020, the United States Embassy in Baghdad was attacked in response to the airstrikes.[6] On 3 January 2020, the United States conducted an airstrike that killed Iranian Major General Qasem Soleimani and Kata'ib Hezbollah commander Abu Mahdi al-Muhandis.[6] Iraq protested that the airstrike violated their sovereignty.[13]
>
> In March 2020, the U.S.-led coalition, Combined Joint Task Force – Operation Inherent Resolve (CJTF–OIR), began transferring control over a number of military installations back to Iraqi security forces, citing developments in the multi-year mission against the Islamic State of Iraq and the Levant (ISIL).
Or perhaps the second withdrawal has never actually completed.
> In February 2021, NATO announced it would expand its mission to train Iraqi forces in their fight against ISIL,[14] partially reversing the U.S.-led troop withdrawals. In April 2021, U.S. Central Command stated that there were no plans for a total withdrawal of U.S. forces from Iraq, citing continued threats posed by the ISIL insurgency and Iran-backed militias.[3]
The Nobel peace prize is awarded by a different institution than the science ones. And there are hundreds of people that can nominate, doesn't mean that a nomination reflects anything upon the committee that awards the prize.
Each of the Nobel prizes is awarded by a different committee from a different organization. The Nobel Peace prize was established at the same time and in the same way as the Literature, Physics, Physiology or Medicine, and Chemistry prizes (through Alfred Nobel's will). Of course, by its nature, it is the most political of the prizes.
The only Nobel prize that is separate is the Economics one, which was established much later and has no connection to Alfred Nobel (it is paid for by Sweden's central bank instead of the Nobel estate). But even that one is administered by the same Nobel foundation.
> I mean Trump was nominated for the award for fuck's sake
Being nominated only means that one of the thousands of people allowed to nominate candidates wrote your name on a piece of paper and mailed it in. There is at least one right-wing Swedish politician who's been sending in Trump's name every year for a while now.
The Nobel peace prize committee is not really responsible for nominating candidates[1], only for selecting a winner from the list of nominated candidates.
[1] Although I believe they are allowed to suggest names.
"Although not one of the five Nobel Prizes established by Alfred Nobel's will in 1895, it is commonly referred to as the Nobel Prize in Economics, and is administered and referred to along with the Nobel Prizes by the Nobel Foundation. Winners of the Prize in Economic Sciences are chosen in a similar manner as and announced alongside the Nobel Prize recipients, and receive the Prize in Economic Sciences at the Nobel Prize Award Ceremony."
It's propaganda for liberalism. It's just that at a certain point that propaganda became so successful, that you sound like a lunatic if you call it propaganda. Unfortunately there was no reason to make propaganda for Mathematics so they never got their own Nobel prize.
He also got it for being half black since by the cutoff date he'd been president for all of 11 days. Had they waited a year they would have had the pleasure of finding out he ordered 50% more drone strikes than Bush did.
We're well into flame war territory here, so I apologize, and am treading carefully.
The list of war crimes I can pin on the US during that time is mostly indefinite imprisonment in Guantanamo, if you allow for the efforts Obama made to reduce torture.
Many of those are unfulfilled promises of the type that have been around for 30 or 40 years at least. And climate modelling: what's the point? You can't predict climate change from history. That's the whole point of the research.
So then wait until those promises have been fulfilled, as has so often been the case in Nobel prizes. Remember Higgs?
But the negative effects have been clear. Might just as well give the Nobel Peace Prize to Zuckerberg.
I don't think there is a take I can disagree with more strongly. New technology can always be used for good or bad. His work in ML sets the course for a better future in spite of the people who use it for ill, whether advertisers or warlords.
Atomic bombs are a product of atomic research. So are atomic energy, cancer treatments, electron microscopes, etc.
I disagree that the negatives outweigh the positives. Spellcheck, Google Maps traffic, and electricity distribution are three applications I've used this morning. We don't tend to think about the successful applications, instead focusing solely on negative uses like adtech.
I genuinely think there is potential for a silly Internet tradition here. Google should pick a bunch of candidate winners by ML, hire them six months before, fire them all a week before the awards, and then programmatically re-hire them moments before awarding. It can't be more malicious than most academic pranks, and it shouldn't matter whether the conspiracy is real; it'll just be funny.
I think calling it a conspiracy theory is a bit of a stretch. I could be wrong. I agree that's how it should be. But I don't get the impression there are lot of fans of Google in the Prize Committee. Either way, it's not something that matters too much. Just a thought.
> I think calling it a conspiracy theory is a bit of a stretch.
It's the textbook definition of a conspiracy theory, isn't it? I mean, a group conspiring to not awarding the most prestigious prize in science to someone who deserved it because of who their employer was, and suddenly awarding it once he switched employers?
> But I don't get the impression there are lot of fans of Google in the Prize Committee.
This is a conspiracy-oriented line of reasoning. Who anyone's employer was is something that never surfaced when discussing Nobel prizes. Suddenly it became the basis of a theory on how people conspired to first not award it and afterwards award it, and somehow the guy's accomplishments don't even register in the discussion.
That's what these conspiracy theories bring to the table.
I get what you're saying. I have no evidence and no inside information. It could be a conspiracy, but I doubt it. It could just be multiple individuals independently being uncomfortable with tacitly approving a huge company they see as potentially responsible for privacy problems, ethics problems and AI misuse. I don't see these as invalid concerns either. And I don't see being conflicted about giving an award to an employee of a company tied to big ethical concerns as anti-science or having a lack of integrity.
Discovering breakthroughs in machine learning is a profound achievement and deserves to be recognized. Wielding powerful tools against humanity for the sake of money, not so much. But, like I said, I could be dead wrong, and this is probably why I wouldn't be a good person to serve on one of these committees.
The age too, and the fact that Google lost prestige in AI over the years.
It's the company that didn't see the potential of Transformers, and that presented a half-assed Bard when LLMs were already in production in other companies.
But Hinton was not in favor of LLMs anyway, he argued backprop is not what the brain does and that we should do better than these models. I'd say Google would be a great place for someone thinking like that.
Since most papers nowadays are written by AI, and peer reviewed by AI, it only seemed logical for AI to be used by the Nobel committee to award the godfather of AI.
Thanks to AI, you now only have to ask any GPT for the source code of the universe to get the code. Since physics is now a solved problem, we should recenter ourselves on more important questions, like why did AI create the universe?