This paper was not widely noticed back when it was written in July (both inside and outside the theoretical physics community) and still has not been accepted in a journal.
In fact, it is rather unfortunate, but most of what Stephen Hawking did in his latter years was highly speculative: perhaps food for thought for other experts, but not worth reporting on in the mainstream media, since it is (in my opinion) very likely to be wrong.
I apologize for not debunking this paper in more detail, but a scientist could spend a lifetime doing so... However, I can make one falsifiable prediction (like a true physicist): the serious news media will not report on this.
Graham Hutton, one of the authors of the original paper, also wrote a wonderful book "Programming in Haskell". I would wholeheartedly recommend it to both the programming novice and the experienced programmer trying to learn Haskell and monadic programming.
Chapter 13 of (the second edition of) that book also revisits the original paper from a modern viewpoint and overlaps with the current article. For example, both the book and the article mention the similarity with the state transformer, introduce instances of 'functor' and 'applicative' before the 'monad' instance, and discuss the 'alternative' instance.
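For the curious, here is a minimal sketch of such a parser type with the Functor, Applicative, Monad, and Alternative instances mentioned above. The names and definitions are my own illustration in standard Haskell, not the book's or the paper's exact code:

```haskell
-- A minimal sketch of a monadic parser in the style of the paper.
import Control.Applicative (Alternative (..))
import Data.Char (isDigit)

-- A parser consumes a string and returns all possible (result, rest) pairs.
newtype Parser a = Parser { parse :: String -> [(a, String)] }

instance Functor Parser where
  fmap f (Parser p) = Parser $ \s -> [(f a, rest) | (a, rest) <- p s]

instance Applicative Parser where
  pure a = Parser $ \s -> [(a, s)]
  Parser pf <*> Parser pa =
    Parser $ \s -> [(f a, s2) | (f, s1) <- pf s, (a, s2) <- pa s1]

instance Monad Parser where
  Parser p >>= f = Parser $ \s -> concat [parse (f a) s1 | (a, s1) <- p s]

-- The 'alternative' instance: empty always fails, (<|>) tries the
-- second parser only if the first one fails.
instance Alternative Parser where
  empty = Parser (const [])
  Parser p <|> Parser q = Parser $ \s -> case p s of
    []  -> q s
    res -> res

-- Consume one character, failing on empty input.
item :: Parser Char
item = Parser $ \s -> case s of
  []       -> []
  (c : cs) -> [(c, cs)]

-- Consume a character satisfying a predicate.
sat :: (Char -> Bool) -> Parser Char
sat ok = do { c <- item; if ok c then pure c else empty }

-- One or more digits, read as an Int ('some' comes with Alternative).
number :: Parser Int
number = read <$> some (sat isDigit)
```

For example, `parse number "42abc"` yields `[(42,"abc")]`, while `parse number "abc"` fails with `[]`. The similarity with the state transformer is visible in the type: a parser is a stateful computation whose state is the remaining input string.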
This seems like an area where a trusted organization (perhaps the EFF?) could do a lot of good by creating a "for dummies" webpage where the vulnerability disclosure process is explained in layman's terms (i.e. with suitable car analogies...) from a website owner's perspective. Those who discover a vulnerability in a company's IT infrastructure can then submit a link to this page with their reports.
It's most certainly not scientific writing. But I think that is not the point of these publications -- they are meant for a wide audience.
So I think I should interpret your criticism as saying that Quanta does a poor job of maintaining rigor while trying to explain advanced topics in mathematics and theoretical physics to a wide audience. My impression, on the other hand, is that their reporting on these topics beats similar publications hands down. I am therefore happy to recommend it to interested laypeople, for example to my (few...) friends outside of academia and to people here on HN.
By the way, I hope you realize that these two viewpoints are not completely orthogonal.
> By the way, I hope you realize that these two viewpoints are not completely orthogonal.
You win the Fields Medal. :)
I appreciate your feedback, and you seem to have interpreted my words as intended. I disagree with your conclusion, but you do understand my position, and I'm grateful for that. So much can be misunderstood on the web. I'll point others to your comment to hopefully elaborate.
I do think Quanta does a poor job of maintaining rigor, and no, they aren't as bad as other publications. But we should hold them (heck, all publishers) to a higher standard. They are closer to what I think the industry needs, but for me it's too much hype and not enough rigor.
As usual I think that Woit's blog post is unnecessarily polarizing. If you decide to read his post then I would recommend you also read the excellent comment by Marty Tysanner on the same page.
Also, sigh. If you dig deep into the dark corners of the internets then I am sure that you can find fake anything. Focus on the beauty and the truth, people. For example:
> also read the excellent comment by Marty Tysanner on the same page.
also read Woit's sensible reply to that comment and MT's re-reply. I agree Woit is polarizing, but maybe not unnecessarily so, because the problems he points out are not hidden in the dark corners of the internets but all over the mainstream media - and arXiv too - backed with millions of dollars.
- The detection of gravitational waves has been expected for decades based on the orbital decay of binary neutron star systems
- There have been dozens of 'bumps' in the data that went away over the past several years. The Standard Model still stands and there's no BSM physics that we've found. I'd bet the bump goes away just like the recent 750 GeV bump.
- The magnetic moment of the proton and antiproton being the same is also not BSM physics at all.
The problem is that fundamental elementary physics continues to boringly grind along and validate the standard model and things that we already knew had to be found (Higgs, LIGO).
The result in the popular press, though, is that multiverse mania has taken over, which is a non-solution to the problem. You'll actually find arguments that we should stop thinking about alternatives to string theory and just assume it works because it's beautiful, even though it can never be measured--which is not a scientific argument.
500 years from now, if we haven't made any progress, we might wind up going "meh, probably string theory, but we'll never know", but it's too soon to throw in the towel yet.
> very accessible overview of particle physics in 2016
In general, Symmetry is an excellent site; they describe themselves as "An online magazine about particle physics" and are funded by the US Department of Energy via Fermilab and SLAC. I recommend having a closer look: http://www.symmetrymagazine.org/
We need to make sure we're not ignoring motives we don't understand. That happened in the Republican primary, the general election, and perhaps now, too.
Quanta Magazine also has a good reputation among many theoretical physicists, including myself. They do of course suffer from the occasional misconception, but on the whole the accuracy of their reporting is leaps and bounds above many other popular science sites or blogs.
Lifehacker introduced me to them and I've been a happy reader since then.
The thing I like the most is that they don't play the "analogy game" too much and instead generally teach a small concept and then build on it.
Other popular media outlets like Wired or The Verge simply dumb things down too much or are factually incorrect.
I do like WIRED's science blogs, though, even if they are a little inactive.
The main problem is the supposed violation of momentum conservation. This law is very deeply ingrained in all current fundamental theories of physics, from the standard model to general relativity. These theories are extremely successful in describing phenomena all the way from subnuclear scales to cosmological scales, as well as nearly everything in between. This leads to two issues:
(a) If these models break down somewhere I would expect it to happen under much more extreme conditions and not with a 'table-top' experiment like this one.
(b) As far as I can see, giving up momentum conservation is not a feature that we can simply 'bolt onto' the existing models. It would require a radical rethinking of the very foundations of our current physical models.
So my personal attitude is that extraordinary claims require extraordinary evidence and this paper is not remotely convincing.
This experiment reminds me very much of the faster-than-light neutrinos. The authors of that experiment also seemed careful enough to not overstate their claims but this did not prevent a massive public interest in an ultimately debunked result. I expect the results in this paper to go the same way.
> (a) If these models break down somewhere I would expect it to happen under much more extreme conditions and not with a 'table-top' experiment like this one.
To push back on this kind of thinking a bit: Einstein's breakthrough papers on Brownian motion (which served as evidence of the existence of atoms and molecules) and the photoelectric effect (which demonstrated the validity of the quantum theory of light) were very much "tabletop" effects.
While there's no denying this requires extraordinary proof, I think it's equally important for physicists to guard against being biased by preconceived notions.
They were well-known by Einstein's time because by then they had been demonstrated many times. But someone had to have been the first to see and report the photoelectric effect.
What we need now is other physicists to test their own "Q thruster." If no one else detects a thrust, well, fine, chalk it up to weird error and move on. If other people do find a thrust, then we're into "we don't have a theory yet" territory.
> If these models break down somewhere I would expect it to happen under much more extreme conditions and not with a 'table-top' experiment like this one.
I agree, and I'm skeptical myself. But note that the models only barely break in this experiment. The deviation from theory (if real) is so small that it's almost unmeasurable. The models might break in a more severe way if extreme conditions were applied in the right way.
Peter Higgs published about five scientific papers between his Nobel-winning work in 1964 and his retirement in 1996, none of which were particularly impressive. I think this is below any reasonable standard, not just below contemporary academic standards. Therefore, barring special circumstances like an exemplary teaching record, in my opinion Edinburgh University would have been right to sack him and replace him with a more productive person. In short: I don't think that Higgs nearly getting sacked is an accurate indication that academia has too much of a 'publish or perish' culture.
First, who are you to judge the quality of his work? That guy has a Nobel. Imagine if they had followed your utterly idiotic suggestion and kicked him out: other universities that recognized the importance of his work would have instantly hired him. Years later, when he actually won the prize, Edinburgh University would have looked crazy for kicking out a Nobel-prize-winning physicist.
So no, Peter Higgs is a genius; he knew the importance of what he had achieved and took a leisurely path, and there is nothing wrong in that. Edinburgh University knew the importance of his work and correctly decided that keeping him was a great investment.
The fact that you think that a researcher who won a Nobel prize somehow did not work "hard enough" in later years and should have been fired is a great indicator of dysfunctional academic culture.
Beyond his Higgs boson papers, the rest haven't been cited particularly much. That's not indicative of a genius. Sure, no one is calling him stupid, but there are potentially many other people who have changed the field as much as, or more than, him. Winning a Nobel Prize for one work doesn't make him immune to criticism.
There is valid criticism, and then there's "this guy did not do anything after making a Nobel-worthy discovery, so he should have been kicked out".
Also, "haven't been cited particularly much" is utter bullshit, unless the goal is to optimize for mediocrity: three papers a year with 20-50 citations each counting for more than one breakthrough, Nobel-worthy work. Frankly, citations are very easy to game if you are a professor with reasonable means at a good university, and they are a really bad indicator of success.
Since he already had a very successful paper, maybe he wanted to write risky papers. In any case, that's no worse than other researchers writing cookie-cutter papers that add extra terms to equations, guaranteed to be cited by the next guy adding even more terms. By finding these minor faults with a Nobel-winning researcher, you are displaying the same dysfunctional thinking that has plagued academia.
I'm not sure why you have this idea that academia is resistant to paradigm shifts; they happen all the time. Any papers that are "risky" enough to start such paradigm shifts end up getting cited tons.
It's not me but you who has the wrong idea that all papers with 50 citations are good, or rather that more citations == better research.
Frankly, almost 95% of papers are crap and would have been better off not being written, were it not for the publish-or-perish culture, or the "let's count papers/cites to shame a Nobel-winning researcher" culture that you espouse.
Citations are a self-reinforcing metric. Once a community starts counting them, the only way to succeed is to publish more, which in turn leads to higher counts.
There is nothing wrong with publishing one good, thorough paper over four years, perhaps slowly updating it as a working paper, as is done in economics.
I wish papers were "running", like a wiki page with developments added incrementally. It'd certainly help reduce the amount of redundant reading and provide much greater coherence.
Except it wasn't a great investment. Unless he was doing something great besides research.
Say you put in amazing, groundbreaking work in the first years of a startup and then started slacking off. How long before the company you helped build has a right to kick you to the curb? A year? Five? More? How about 30?
Five papers, even good ones, is pitiful output for 30 years. Sure, it's possible that they were the culmination of long, brilliant research projects. If that's the case, then great. If not, then the university had every right to ask what was up, which it did, and it decided to keep him on anyway.
Hahahah, if you think having a Nobel-winning professor is not a great investment, you frankly don't understand how the world works. There are universities in some parts of the world that would gladly pay more than Edinburgh U. could, just to have him listed on their website.
Actually, I have met people who put in hard work in the initial years of a unicorn startup. Guess what: the value of their equity greatly exceeds their salary. Even if the startup were to kick them to the curb, they would live happily on their enormous earnings. Doing great research is, to an extent, similar.
Let's stick to the facts. A university employs a Nobel-winning scientist. He spends the next 30 years doing very little. The university says, "Hey, that's not cool. We thought you were going to keep doing research." And everyone says they're being too demanding.
I didn't say Nobel Award winners AREN'T a good investment. I said this one WASN'T. The University clearly thought he was going to produce. He didn't, and they were unhappy... with the return on their investment in him.
Right. The equity of early employees is high. My example wasn't about compensation, clearly. It was about how long you continue to keep an underperforming employee in a position just because they did good work in the past.
Come on, dude, I am sorry, but you are wrong. Not just wrong: it seems you fundamentally misunderstand how the world works.
See, Peter Higgs is a genius. The moment he published that paper, he knew that he could essentially do nothing, and the university would never "risk" losing him and wasting the potential payoff, or, even worse, being ridiculed for firing a potential Nobel laureate. Also, as Wikipedia shows, during all those years his research on bosons kept winning him awards.
The university got far more than it could have expected when he eventually won the Nobel prize. Keeping him employed was equivalent to holding an option with an enormous expected payoff at a small yearly recurring cost.
>> My example wasn't about compensation, clearly. It was about how long you continue to keep an underperforming employee in a position just because they did good work in the past.
Except in academia, having made a seminal, Nobel-worthy discovery is equivalent to holding large equity in that field. And from the point of view of the university, losing such a person is equivalent to losing its stake in the delayed recognition of that work.
What am I wrong about? Seriously, I'm not sure what you think I'm arguing for. I'm saying that a university with an employee who does nothing for decades has the right to be a little pissed about that.
I'm not saying that having a Nobel laureate on staff isn't worth something. I'm not saying that they should have fired him. I'm simply saying that I can understand the view in the earlier post that Higgs isn't a great example of why today's publishing climate is a bad thing.
The original article was about a Nobel Laureate who "wouldn't be productive enough for today's publishing climate". The implication is that this means that there is a problem with today's publishing climate. There may be, but Higgs isn't a good example of why. He was arguably not productive enough even for his earlier lower pressure time. He just put out one very brilliant piece of work that made up for it.
But that is in no way an indictment of the current academic focus on publications. There are other reasons and examples of why the current climate is a problem, but Higgs isn't one. The Higgs lesson in this context is "get a Nobel and you can do whatever you want". If you don't have a Nobel you're going to have to consistently produce research, and while the pressure to do so wasn't as high in the 70s and 80s, one paper every 3-5 years is pretty awful in an environment where publishing research is the goal.
You seem to have a problem with the concept of tenure. Why not just come out and say that professors don't deserve tenure and must keep pushing the rock up the hill, grinding out paper after paper of incremental drivel?
I just tried to clarify my position in a post close to this one. Let me add this about tenure. Tenure is meant to protect researchers so they can do the research that they want and find most interesting. The idea is to insulate them from the vagaries of others' opinions, as it is believed that this type of freedom is good for research. What it is NOT designed to do, and never was, is put tenured professors in a position where they can simply not do research. While pressuring researchers to put out a half-dozen mediocre papers a year is probably not a good idea, as it lowers the quality of the field and turns science into a commodity, "less" is not necessarily the solution. In this case, the amount being produced clearly indicated that research just wasn't getting done. If a researcher's job is to do research, then expecting a certain amount of time and effort spent doing that is not unreasonable.
For that reason Higgs just isn't a good example of why the system is broken. The university was upset about his lack of productivity in the 70s, long before the current publishing environment became an issue.
I would go even further. Look at the history of art: most artists were just alcoholic bums, such as Verlaine. We would be better off as a society if we had made those people do something more productive than writing poetry; for example, they could have been soldiers.
I'm definitely not in the humanities camp, but I don't see how someone who produced work that presumably thousands of people found value in would be less beneficial to society than yet another person to die at war.
If you are being sarcastic, it isn't working. Did Verlaine have a lifetime paid appointment to not create any poetry? While other artists were soldiers because they needed income?
The point is that it's about the time horizon. The parent would have decided, in the early days of Higgs' career, that he wasn't productive enough, and would thus have prevented him from making the discovery for which he is famous. Likewise, in Verlaine's time, it would have seemed more productive to just send him to war.
But on a longer time horizon, nobody really cares that people in the 1950s had to support Peter Higgs' living, or that the army did not get Verlaine. We value their results differently now.
I wish timescale were the one thing that people who use the words "productivity" or "efficiency" would understand. There is no universal optimum; it depends on the time horizon you're looking at.
And while I am at it, let me make another comment on the article. I think that today we are so obsessed with the efficiency of other people's work because there simply aren't enough jobs for everybody (due to automation). Since having a job (in a general sense) is customarily a requirement for being fed, most humans optimize not for actual productivity but for an appearance of productivity. All these attempts to measure productivity are just a symptom of this problem: we desperately need something with which we can bang the people in control of resources over the head, so that we can eat.
In other words, most jobs are changing from doing actual work into proving to other people in society that you did some ever-diminishing amount of work. It's a shift in the focus of the competition, and unless we collectively realize that we can just lay back and don't need to actually compete, it won't get any better.
> The parent would have decided, in the early days of Higgs' career, that he wasn't productive enough, and would thus have prevented him from making the discovery for which he is famous.
You are stretching the meaning of my comment. Did you know that he published three papers in the three years before getting hired in Edinburgh in 1960? I only pointed out that (on any 'time horizon') his scientific productivity after 1964 was basically non-existent.
In fact I agree on some level with a lot of the comments here, including yours. I just think that Peter Higgs is not the right example to justify the cause.
> and unless we collectively realize that we can just lay back and don't need to actually compete, it won't get any better.
Ah, OK. But I am not sure what cause we are justifying here: tenure? I guess the idea of tenure is predicated on grinding somebody through graduate and postgraduate studies hard enough that only people who really want to work in the field remain, and those people will then continue working on their own.
It seems to me that there is a tradeoff. We can look at Peter Higgs today and say, whoa, what a failure he was after 1964. But could that have been said in 1974? I am not sure. What if in 1975 he had come up with another breakthrough?
So the trade-off is in the timescale on which we judge a scientist's output. If you shorten the timescale, you decrease your accepted risk, but you can miss some rare wins (and I think that's where the Higgs example shines, because it is an example of such a rare event). If you lengthen the timescale, you accept a greater risk of people turning out badly. The idea of tenure advocates the maximal practical timescale of such trust, one human lifespan, because with tenure the wins are arguably worth more than the accrued losses.
To balance your view: I think js8's sarcastic point worked very well. There is much more to a person's achievement than 'raw productivity' measured on some arbitrary scale.
I think you highlight an important issue. I had the impression from the interview that he talks about going for years without publishing with a considerable amount of braggadocio. The question is, what did he do all those years? If he pursued ambitious and risky avenues of research that didn't pan out, then he has every right to brag, as that's what tenured academics are supposed to do in theory (though he would still do everyone a favor by writing up some of that stuff). If, OTOH, he just decided to coast after doing some excellent work, then I am much less sympathetic.
Contemporary academia may be obsessed with publishing, but there is some middle ground between that and publishing a paper every 6 years. A certain webcomic comes to mind: http://www.smbc-comics.com/?id=2495
It's a little complicated. The idea is that one uses this Leech lattice to define a 24-dimensional space, and we throw in little quantum strings that live on that space. These strings are inspired by (but not exactly the same as) the strings we know from standard 'string theory'.
The quantum field theory is then the one living on the worldsheet of the strings; it describes their embedding in this peculiar space. The monster group turns out to be a symmetry group of this field theory.
This is all conjecturally related to quantum gravity (according to Witten's paper) through the so-called AdS/CFT correspondence, which states that field theories are sometimes equivalent to theories of quantum gravity. The conjecture has its problems, and it remains to be seen whether it is really true. (AdS/CFT is fine in itself; the question is whether it holds in this particular example.)
A word of warning based on previous experience: this is highly technical stuff and what is mentioned here are just words. Please don't think you can make contributions to this field without seriously studying the equations.
https://arxiv.org/abs/1707.07702