
“If you pay a man a salary for doing research, he and you will want to have something to point to at the end of the year to show that the money has not been wasted. In promising work of the highest class, however, results do not come in this regular fashion, in fact years may pass without any tangible result being obtained, and the position of the paid worker would be very embarrassing and he would naturally take to work on a lower, or at any rate a different plane where he could be sure of getting year by year tangible results which would justify his salary. The position is this: You want one kind of research, but, if you pay a man to do it, it will drive him to research of a different kind. The only thing to do is to pay him for doing something else and give him enough leisure to do research for the love of it.”

-- Attributed to J.J. Thomson (although I've not been able to turn up a definitive citation -- anyone know where it comes from?)

Does make me think some people have been pondering this for a long while though.



There's a long tradition of important advances in knowledge being made by people with an undemanding job [1] [2]. The 19th-century university became a hotbed of research because the teaching workload was relatively light and universities didn't require that you also do research.

[1] https://en.wikipedia.org/wiki/Parson-naturalist

[2] https://en.wikipedia.org/wiki/Baruch_Spinoza#Lens-grinding_a...


Siddhartha Gautama (the Buddha) was (became; established a whole tradition of being) a hermit/beggar/monk; Socrates was a stone mason (occasionally); Diogenes the Cynic lived in a barrel and begged for food; Jesus was a carpenter and then a wandering preacher; Spinoza (as you note) was a lens grinder; Kant was a tutor; Einstein was a patent clerk. And, as you note, especially in the 19th and 18th centuries, many proto-scientists were people with either an established income or independent wealth, who had the freedom to pursue whatever interested them.

We ought to be finding ways to broaden the number of people who can pursue their interests without forfeiting security, dignity, and social respect. This is one of the arguments for universal basic income: that it could decouple (some) people's desire to pursue knowledge from the need to earn a living, and maybe especially from the need to produce regular, measurable results. Right now we're channeling most of the people who are interested in and capable of developing new knowledge into jobs which demand "results" more than they demand that truth be followed wherever it leads, however long it takes. This will lead to a great deal of dubious Normal Science [0] and very little revolutionary thought [1].

[0] https://en.wikipedia.org/wiki/Normal_science

[1] https://en.wikipedia.org/wiki/The_Structure_of_Scientific_Re...


Note that having a job seems necessary, though. Maybe it's a case of the golden mean? On one side you have people whose job absorbs all their leisure. On the other you have people who draw sufficient income without having a job at all. These are Marx's infamous "coupon clippers", and historically they're not as prominent in intellectual history as people with undemanding jobs (personally I'm not aware of many philosophers or scientists who were rentiers). If leisure were the only variable, we'd expect people with guaranteed incomes (annuities, for example, have been sold since the Middle Ages) to have been the most prolific thinkers in history.


> Note that having a job seems necessary though.

I'm not sure that's true. It would be interesting to see some numbers. My naive impression is that there are at least two things needed to make intellectual history: genius, and the opportunity to practice your genius. Both are rare, and given that there are far, far more people who need to earn a living than people who don't, I would expect the absolute number of (Genius)*(NeedsAJob & HasAMenialJob) to be greater than (Genius)*(Doesn'tNeedAJob).
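To make the base-rate point concrete, here's a toy calculation; every value in it is an illustrative assumption, not data:

    # Toy numbers only: all values are illustrative assumptions, not measurements.
    population = 100_000_000
    genius_rate = 1e-5          # assumed fraction with genius-level talent
    needs_job = 0.99            # assumed fraction who must earn a living

    geniuses_with_day_jobs = population * genius_rate * needs_job    # ~990
    genius_rentiers = population * genius_rate * (1 - needs_job)     # ~10

Even if independent wealth made a genius several times more likely to produce, the day-job group would still dominate in absolute numbers.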

My impression is that gentlemen of leisure, especially in the 17th through 19th centuries, made up a wildly disproportionate share of intellectual history, and that even the nominal jobs within that group (parson, Lucasian Professor of Mathematics at Cambridge) were wholly orthogonal to whether they produced.

What do you think having an unrelated, undemanding job would contribute to intellectual work?


I don't have a good theory - it's just based on reading biographies. Almost no famous thinkers lived on passive income (although, as I've said, annuities and bonds have been widely available for over 500 years). And probably not just any kind of job will do. The most plausible explanation I've read comes from Richard Feynman:

> I have to have something so that when I don't have any ideas and I'm not getting anywhere I can say to myself, "At least I'm living; at least I'm doing something; I am making some contribution" -- it's just psychological. When I was at Princeton in the 1940s I could see what happened to those great minds at the Institute for Advanced Study, who had been specially selected for their tremendous brains and were now given this opportunity to sit in this lovely house by the woods there, with no classes to teach, with no obligations whatsoever. These poor bastards could now sit and think clearly all by themselves, OK? So they don't get any ideas for a while: They have every opportunity to do something, and they are not getting any ideas. I believe that in a situation like this a kind of guilt or depression worms inside of you, and you begin to worry about not getting any ideas. And nothing happens. Still no ideas come. Nothing happens because there's not enough real activity and challenge.

http://www.pitt.edu/~druzdzel/feynman.html


The general sentiment seems reasonable to me: humans are more effective and more productive when they have something to do and work on, even when they're not getting anywhere with their main project(s).

It's not at all clear to me that a menial job is a strong solution to that problem. I would expect that in, say, a world with UBI that also recognized this problem, people would find better ways to keep feeling that they are at least doing something and making some contribution than working on things they don't care about or believe in. There are plenty of meaningful volunteer opportunities in the world that would be much more effective "make you feel useful" solutions than filing papers in a bureaucratic office.


I think you have to go back to the 18th century for that. I'm thinking of Britain, where science was something acceptable for gentlemen to do in their spare time (with the smart-but-poor doing a lot of the heavy lifting), and in particular of Antoine Lavoisier in France, who literally bought the rights to be a tax farmer so he could fund his research. He lost his head in the revolution, so there's that. YMMV.


Well, Lavoisier's tax farming was not passive income: "Lavoisier's researches on combustion were carried out in the midst of a very busy schedule of public and private duties, especially in connection with the Ferme Générale." [1] The gentleman scientists were also usually involved in some sort of business, even if it was just administering their estates. The people who truly had a passive income - relying on annuities or government bonds, mostly inherited - are hard to spot in the history of science. They were a pretty numerous class though [2].

[1] https://en.wikipedia.org/wiki/Antoine_Lavoisier#Gunpowder_Co...

[2] https://douthat.blogs.nytimes.com/2014/04/25/piketty-and-the...


Kelvin, De Broglie, and Brahe come to mind as well.


Kelvin taught at Glasgow, Brahe was a cathedral canon, and De Broglie did his best work for his PhD thesis.


The parson-naturalist list you link to lacks one very prominent scientist-cleric: Gregor Mendel (though I believe this to be deliberate, as he performed his research for economic rather than spiritual reasons, iirc). The earliest research on genetics took years to complete and was done as a side project in addition to his other monastic duties.


I don't know where the quote is from but it represents my only hope for the future of research.

Of course, the hardcore bean-counters would not let that middle ground happen either; they are the people in power, and they absolutely, positively won't give you any real leisure.

It's still a good idea, however. Few managers would embrace it, since the notion of "paying somebody for possible benefits 5-10 years in the future" is something our current form of capitalism hates. But some of them are reasonable people and would accept a middle ground.

Another thing I was thinking about in the past is rotational work: given a team of 6 people, have 1 person always working on research for at least a month, with zero responsibility for the money-making activities. After that month -- or two, or six -- expires, return the person to the capitalistic loop and put somebody else on research duty (a sketch of the schedule is below). This is rather clunky, however, because people will constantly have to be caught up on where things stand in both the money-making job and the research job.
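A minimal sketch of that rotation, assuming a simple round-robin with month-long slots (the names and the period length are placeholders):

    from itertools import cycle

    # Round-robin rotation: one person per month on research duty,
    # everyone else stays on the money-making work.
    def research_rotation(team, n_months):
        slot = cycle(team)
        return [(month, next(slot)) for month in range(1, n_months + 1)]

    for month, researcher in research_rotation(
            ["Ann", "Ben", "Cas", "Dee", "Eli", "Fay"], 12):
        print(f"Month {month:2}: {researcher} is on research duty")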

Perhaps there are other ideas as well, but I am not aware of them.


In a university setting, it seems to me that teaching combined with support from an endowment (not research grants) can be one scenario for this... the key is still to have fundamentally good people with passion and joy for the endeavor.


The thing that always infuriated me about computer science (my field) is that I don't actually need a big research grant. There are people who do, of course, but I could do most of what I find interesting and useful with a high-end desktop PC and some free time.

But universities don't operate that way anymore. It doesn't matter that all I need is $5K and some time -- the university needs their 40% cut of a multi-year, multi-million dollar collaboration. And so the entire system is set up to fund those projects and those projects only.


So now that you are out of academia, do you occasionally pursue those low-cost research ideas in computer science on your own? I mean, while $5k isn't anything to sneeze at, it is still within the realm of DIY research budgets (i.e., if I wanted to, I could save up that amount in a few months from my workaday salary).

So, unless such research no longer interests you (or you have zero free time to devote to it), it would seem that, since you don't need to kowtow to academic whims about getting massive amounts of money for research, you should now be able to pursue these questions...


I generally stay current in my research field, but that's a different thing than actually contributing. The free time is one aspect of the problem, but there's also the problem of attending conferences regularly. I could pay the expense with the excess salary that industry pays, but unless you work for a research group, it's unlikely that you can make a workable schedule.

Not to say it couldn't be done. But there are enough hurdles that I end up mostly just not participating much. That's not ideal to me, but it's where my actions show my priorities to be, I guess.


But you can only have so much free time, which means you eventually will have to "buy" time. Grants help you do this by allowing you to fund grad students and postdocs.


Not everyone who is a good independent researcher is a good man-manager. For some people, researching independently -- or perhaps collaborating with a few other independents -- might be a better route than being forced to spin up a research group (and more-or-less-inevitably finding themselves with limited time to do the hands-on stuff themselves).


> 40% cut of a multi-year

40%?!? I would kill for overhead that low. Most places are over 50 now. I've heard of 70% at the absolute top.


I've been out of academia for a few years now, but I think my university took 40%, and the department took some more on top. The total was probably 60% or so.
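For readers unfamiliar with how these cuts work, here's a rough sketch; the award size and rates are made-up numbers, and universities differ in whether the rate is applied to direct costs or quoted as a share of the total:

    grant_total = 1_000_000                    # hypothetical award size
    overhead_share = 0.40                      # "40% cut" read as a share of the total
    indirect = grant_total * overhead_share    # 400,000 goes to the university
    direct = grant_total - indirect            # 600,000 left for the actual research

    # The same money quoted as a rate on direct costs looks much higher:
    rate_on_direct = indirect / direct         # ~0.67, i.e. a "67% overhead rate"

That ambiguity is part of why the overhead numbers quoted in the comments above vary so much.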


I can do research with a ThinkPad while working as a part-time barista...


Indeed you can. It's much harder though. If you have a full-time job unrelated to your research, you need to self-fund a lot of expenses. You need to convince your boss to let you take time off to go to that conference to present your work. You need to foster connections to other researchers through activities like reviewing for journals, etc.

Of course, you can bypass all that and just publish straight to the internet. Personally, I find that to be enormously difficult. The interactions with my peers -- the conversations over beers in the evenings during conferences -- are so useful. Losing those hurts your ability to do good work.


I get that we're all human and feel better if we converse face to face sometimes. I certainly appreciate people doing research at their own expense and on their own time for the general good.


It shouldn't be a surprise, given that physics was revolutionized by a patent clerk.

Additionally, having a relaxing job with lots of free time doesn't just free you to explore hard problems, it also promotes creativity. This is because stress is the ultimate creativity killer. Thus these sorts of jobs are really a recipe for deep thinking.


Solutions usually come from people who see in the problem only an interesting puzzle, and whose qualifications would never satisfy a select committee. -- John Gall in Systemantics[1]

A very short read, and quite interesting. It touches on a lot of the same points already made in this thread (people will game metrics, etc.). The biggest argument the book makes is that trying to set up a system to accomplish something will cause the system to do everything but accomplish its goal, which leads to the above quote. You can't set up a system to achieve a goal, only set up an environment that allows the goal to be achieved.

[1] https://en.wikipedia.org/wiki/Systemantics


If you're a private enterprise, anything you do can be reasonably related to money (revenue, profit, long term forecast etc.), and we have ample tools to measure performance, align incentives and evaluate risk over the money-abstraction. This works because money flows through the whole process and partakes in every transaction.

Academic pursuits are, monetarily speaking, money sinks. They might be worthwhile investments, but only insofar as they deliver patents or attract grants. Academic professionals want to advance the state of the art, and only small parts of that work are applicable to the money-abstraction.

In order to prevent Science(TM) from grinding to a halt, we replace Money with some other measurable so we can apply our existing tools and make risk assessments before granting money to academics who will (most certainly) spend every last cent of it to indulge their pursuits. We choose measurables as a trade-off between academic freedom + infinite spending and academic constraints + finite spending, because the supply of Science(TM) is practically unbounded.

I'm stating the obvious here in order to ask the question: What is the gosh-darned alternative?


A good solution for universities would be to pay researchers a full salary for a 30-hour week, where 10 hours must be dedicated to teaching and the rest can be used at their leisure. It is a happy arrangement for all parties.


The number of stated hours that a salary is allocated for is irrelevant. Nominally, professors are supposed to be working 40 hours a week. This stops exactly nobody from working between 60 and 80 hours regularly. The problem is cultural, not policy-based.


This sounds an awful lot like Google's 20% time and the "hack days" done by many other companies. Very few guidelines and no consequences for doing something insufficiently useful, but it ends up with a lot of valuable ideas anyway (and granted, some automated foosball tables).


The one place where the university and the professor are perfectly aligned is summer salary. Most professors are paid for only 9 months by the university; they need to cover the remaining three months out of their grants. This gives the professor and the university a mutual incentive to secure one or more grants: the professor gets paid, and the university gets its overhead tax on the grant.
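As a rough illustration with made-up numbers (real salary splits and overhead rates vary widely by institution):

    nine_month_salary = 90_000                  # hypothetical academic-year salary
    monthly = nine_month_salary / 9             # 10,000 per month
    summer_salary = 3 * monthly                 # up to 30,000 charged to the grant

    overhead_rate = 0.55                        # assumed rate the university adds on top
    cost_to_grant = summer_salary * (1 + overhead_rate)   # 46,500 total

    # The professor receives the 30,000; the university collects the 16,500
    # overhead, so both sides want the grant to be won.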


How different is Thomson's idea from the modern academy?

A PhD is granted for showing that you know how to do research. Tenure is granted for showing that you can do research. Tenured professors are free to research however they like, and they get paid as long as they teach.


It should be noted that "modern academy" involves a lot of people who don't teach, and are in institutional settings where teaching is not a thing.


I could see this working in mathematics and many branches of computer science. How would this work in environments that require large outlays of money to do experiments? Do you just hand untested individuals large sums of money and leave them alone?


I believe letting academic communities run in autogestion (that is, workers' self-management) is the way to go. For really large projects you could also include citizens (selected by sortition) and elected politicians, in addition to the temporarily mandated members of the community, on the committee taking the decision. Of course the citizens and politicians would have to be trained a bit, so the decision process might be long, but these kinds of projects are never in a hurry.


I just want to say to you and the others in this thread, excellent discussion, comments, thoughts, and references.


Another reason why UBI could turn out useful.



