This is a bizarre essay by someone who understands neither functional programming nor the history of computers.
> To be kind, we’ve spent several decades twisting hardware to make the FP spherical cow work “faster”, at the expense of exponential growth in memory usage, and, some would argue, at the expense of increased fragility of software.
There is not one iota of support for functional programming in any modern CPU.
Totally agree. In addition, one of his examples (Mars Pathfinder) has absolutely nothing to do with functional programming or simplifying assumptions of any kind. The Mars Pathfinder problem was caused by a priority inversion on a mutex - exactly the sort of thing that all programmers rightly consider hard, and that things like software transactional memory in FP would prevent. Here's the famous email "What Really Happened on Mars?", written by a Pathfinder software engineer, which explains the issue.
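For the concretely curious, the shape of the bug and of the uplinked fix can be sketched in a few lines of POSIX C (a minimal sketch, not the actual VxWorks/Pathfinder code): a mutex shared between a low-priority and a high-priority task, with priority inheritance initially switched off.

```c
#include <pthread.h>
#include <stdio.h>

pthread_mutex_t bus_mutex;   /* stands in for the shared "information bus" lock */

int main(void) {
    pthread_mutexattr_t attr;
    pthread_mutexattr_init(&attr);
    /* Without priority inheritance, a medium-priority task can preempt the
       low-priority holder of the lock while the high-priority task sits
       blocked on it: classic priority inversion.  With PTHREAD_PRIO_INHERIT,
       the holder is temporarily boosted to the priority of the highest
       waiter.  The Pathfinder fix was essentially flipping the equivalent
       VxWorks option on for the offending semaphore. */
    pthread_mutexattr_setprotocol(&attr, PTHREAD_PRIO_INHERIT);
    pthread_mutex_init(&bus_mutex, &attr);
    pthread_mutexattr_destroy(&attr);

    pthread_mutex_lock(&bus_mutex);      /* low-priority task would work here */
    pthread_mutex_unlock(&bus_mutex);

    pthread_mutex_destroy(&bus_mutex);
    puts("mutex configured with priority inheritance");
    return 0;
}
```
(Compile with -pthread; the thread bodies are elided, this only shows the mutex configuration that mattered.)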
The definition of spherical cow is also butchered beyond recognition.
Spherical cows are about simplifying assumptions that lead to absurd conclusions, not simplified models or simplified notation in general.
Calling functional programming a spherical cow, when you mean that automatic memory management is a simplifying assumption, is such a gross sign of incompetence that nobody should keep reading the rest of the blog.
> Spherical cows are about simplifying assumptions that lead to absurd conclusions
There aren’t any commonly-accepted conclusions from spherical cows, because the bit is the punch line. It’s the joke a physics 101 student makes while slogging through problems that assume away any real-world complexity, and thus any real-world applicability.
Spherical cows, in the real world, are pedagogical tools first, approximations second, and mis-applied models by inexperienced practitioners third.
“Hello World” is a spherical cow. Simplifying assumptions about data are spherical cows. (And real dairy farmers implicitly assume flat cows when using square feet to determine how much room and grazing area they need per head.)
The joke as I recall it was about a physics student who brags that he can predict the winner of any horse race, so long as all of the horses are perfectly spherical and perfectly elastic.
I'm actually not sure where cows came in, but maybe there's a different version of the joke out there.
The spherical cow joke generally goes that a farmer has some problems with his cows (maybe it’s how much milk they’re producing, I don’t remember), and so his daughter says “you should ask my boyfriend to help - he’s a physicist and really clever”. So the farmer asks the boyfriend, and he says “Well, assume the cows are spherical…”
The joke being that when you do mechanics you generally start modelling any problem with a lot of simplifying assumptions - in particular, that certain things are particles: spherical and uniform.
Trying to be as kind as possible in my interpretation of the article, my take was that the author got stuck on the "spherical cow" analogy early on and couldn't let it go. I think there are nuggets of good ideas here, which generally try to get at leaky abstractions and impedance mismatches between hardware and software, but the author was stuck in spherical cow mode and the words all warped toward that flawed analogy.
This is a great example of why rewrites are often important, in English essays and blogs as well as in software development. Don't get wedded to an idea too early, and if evidence starts piling up that you're going down a bad path, be fearless about a partial or even total rewrite from the ground up.
Yes. And the two nuggets I took were looking at the Unix pipe as a concurrent-processing notation, and pointing out that Unix R&D into great notations (or the communication thereof?) stopped right before splitting, cloning and merging concurrent streams. I've rarely seen scripts nicely setting up a DAG of named pipes, and I'm not aware of a standard Unix tool that would organize a larger such DAG and make it maintainable and easy to debug.
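To make the "DAG of named pipes" idea concrete, here's a minimal sketch of wiring up a tiny two-branch split/merge by hand (my own illustration, with placeholder commands; in a script this would just be mkfifo plus tee, which is exactly the part that never got nicer tooling):

```c
/* Fan one producer out into two concurrent branches over named pipes,
   then merge the results.  Error handling omitted; the commands are
   placeholders. */
#include <stdio.h>
#include <stdlib.h>
#include <sys/stat.h>
#include <sys/wait.h>
#include <unistd.h>

static void run(const char *cmd) {            /* launch one pipeline stage */
    if (fork() == 0) {
        execl("/bin/sh", "sh", "-c", cmd, (char *)NULL);
        _exit(127);
    }
}

int main(void) {
    mkfifo("branch_a", 0600);                 /* two edges of the DAG */
    mkfifo("branch_b", 0600);

    /* Split: duplicate the producer's stream into both branches. */
    run("seq 1 10 | tee branch_a > branch_b");

    /* Two stages run concurrently, one per branch. */
    run("grep -v 5 < branch_a > out_a");
    run("awk '{print $1 * 2}' < branch_b > out_b");

    while (wait(NULL) > 0)                    /* join: wait for all stages */
        ;
    run("cat out_a out_b");                   /* merge (here: just concatenate) */
    wait(NULL);

    unlink("branch_a");
    unlink("branch_b");
    return 0;
}
```
Even for three processes and two pipes this is fiddly, which I think supports the point that the notation stopped short of concurrent split/merge.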
To the best of my understanding, the author describes the structured imperative programming style used since the 70s as "functional" because most languages used since the 70s offer functions. If so, it makes sense to describe hardware as optimized for what the author calls "functional programming", since hardware has long been optimized for C compilers. It also makes sense to describe callbacks, async, then, and thread-safety as extensions of this definition of "functional programming", because yes, they're extensions of structured imperative programming.
There are a few other models of programming, including what people actually call functional programming, or logic programming, or synchronous programming, or odder beasts such as term rewriting, digraphs, etc. And of course, each of them has its own tradeoffs.
But all in all, I don't feel that this article has anything to offer to readers.
The most credit I could give is that the post itself is a spherical approximation of the subject, and the point being made is that they discovered async dataflow programming and think it's underrepresented. I've only seen it compared to command-line pipes to explain the concept, not to understand its implementation characteristics.
I agree that code tends to be overrepresented - we don't 'data golf'. Even non-async, dataflow-oriented programs are much easier to follow, which happens to play exceptionally well with FP.
If you squint so hard that SSA is functional programming[1] and register renaming is SSA, modern CPUs are kind of functional. But that of course has nothing to do with functional programming done by the user; it’s just the best way we know to exploit the state of the art in semiconductors to build CPUs that execute (ostensibly) serial programs.
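To spell out the squint (my own toy example, not the footnote's): SSA gives every assignment a fresh name, which is what makes it look vaguely functional, and register renaming is the hardware analogue of that trick for architectural registers.

```c
#include <stdio.h>

/* The same imperative function, with a plausible SSA form in comments:
   every write to x gets its own version. */
int f(int a) {
    int x = a + 1;      /* SSA:  x1 = a0 + 1   */
    x = x * 2;          /*       x2 = x1 * 2   */
    x = x - a;          /*       x3 = x2 - a0  */
    return x;           /*       ret x3        */
}

int main(void) {
    printf("%d\n", f(3));   /* prints 5 */
    return 0;
}
/* Register renaming does the same in silicon: each write to the
   architectural register holding x gets a fresh physical register, so the
   versions can be tracked and scheduled independently. */
```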
What does that mean in the context of the comment you reply to - which includes the literal quote about "twisting hardware to make the FP spherical cow work 'faster'"? The article may not be exclusively about FP, but nobody said it was.
It's a theoretician's trope. "Identical and spherical" is the baseline state of the objects in a system one wishes to model. There are several jokes with this as the punchline.
An executive is retiring. He's been very fond of horse races, but has been very responsible throughout the years. Now with some free time on his hands, he spends more time than ever at the tracks and collects large amounts of data. He takes his data, along with his conviction that he's certainly onto something, to a friend in research at a nearby university. He convinces his friend to take a look at his data and find a model they can use to win at betting. After many delays, and the researcher becoming more disheveled over months of work, he returns to the retired executive to explain his model. He begins: "If we assume all the horses are identical and spherical..."
The author uses it to mean “model”, as he calls a variety of programming models “spherical cows”.
Well, for sure, a core tenet of computer science is that all models of computing are equally powerful in what inputs they can map to what outputs, if you set aside any other details.
I would say "sequential execution of CPU instructions" and "O(1) memory access" are two major spherical cows in computing. Probably the biggest, though, is the "fast, reliable network". We build systems that treat networked resources as if they were local: always there and instantly available. Heck, most of our stuff wouldn't even run without the database being online, and that's usually provided over the network.
You are, of course, right, although I'd say that it's fairly unusual for the "sequential execution of CPU instructions" abstraction to leak.
The wrongfully assumed "O(1) memory access" (or worse, wrongfully assumed O(1) data structure access even when the data structure actually isn't O(1)) has shown up more frequently in my experience. And I still don't understand how we keep writing code that assumes a "fast, reliable network" when we're reminded every day that the network is neither fast nor reliable.
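To put a rough number on the O(1) cow (a standalone sketch, not anything from the article): chase the same count of dependent loads through a big array sequentially and then in a randomly permuted order, and the "constant" access time changes by roughly an order of magnitude once the working set falls out of cache.

```c
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N ((size_t)1 << 23)            /* 8M entries, 64 MB of indices */

static size_t rnd(size_t bound) {      /* tiny xorshift64; good enough here */
    static unsigned long long s = 88172645463325252ULL;
    s ^= s << 13; s ^= s >> 7; s ^= s << 17;
    return (size_t)(s % bound);
}

static double walk(const size_t *next) {
    struct timespec t0, t1;
    size_t i = 0, sum = 0;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    for (size_t step = 0; step < N; step++) {   /* dependent pointer chase */
        i = next[i];
        sum += i;
    }
    clock_gettime(CLOCK_MONOTONIC, &t1);
    if (sum == 1) puts("");            /* keep the loop from being optimized out */
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void) {
    size_t *next = malloc(N * sizeof *next);
    size_t *perm = malloc(N * sizeof *perm);

    for (size_t i = 0; i < N; i++)              /* sequential ring */
        next[i] = (i + 1) % N;
    printf("sequential walk: %.3f s\n", walk(next));

    for (size_t i = 0; i < N; i++) perm[i] = i; /* random ring: shuffle, */
    for (size_t i = N - 1; i > 0; i--) {        /* then link the cycle   */
        size_t j = rnd(i + 1), t = perm[i];
        perm[i] = perm[j]; perm[j] = t;
    }
    for (size_t i = 0; i < N; i++)
        next[perm[i]] = perm[(i + 1) % N];
    printf("random walk:     %.3f s\n", walk(next));

    free(perm);
    free(next);
    return 0;
}
```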
> You are, of course, right, although I'd say that it's fairly unusual for the "sequential execution of CPU instructions" abstraction to leak.
On x86. x86 and x86-64 usually do a good job of enforcing memory access order as if the instructions were executed sequentially. Other architectures, not so much. On PowerPC you had to insert explicit memory barriers, such as the 'eieio' instruction, to guarantee a certain access order. ARM processors feature similarly weak memory ordering, but the Apple M1 and later include a special mode that guarantees strong x86-64 memory ordering, to make emulation of programs written for Intel Macs easier.
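The place this usually bites in practice is the publish/subscribe pattern. A minimal C11 sketch of it (mine, not from the thread): on strongly-ordered x86-64 a plain flag store often appears to work anyway, while on ARM or PowerPC the release/acquire pair is what makes the compiler emit the required barrier instruction.

```c
#include <stdatomic.h>
#include <pthread.h>
#include <stdio.h>

int data;                        /* plain, non-atomic payload */
atomic_int ready;                /* publication flag, zero-initialized */

void *producer(void *arg) {
    (void)arg;
    data = 42;
    /* Release: the store to data must become visible before ready == 1.
       On weakly-ordered CPUs this is where a barrier gets emitted. */
    atomic_store_explicit(&ready, 1, memory_order_release);
    return NULL;
}

void *consumer(void *arg) {
    (void)arg;
    while (!atomic_load_explicit(&ready, memory_order_acquire))
        ;                        /* spin until published */
    printf("data = %d\n", data); /* acquire pairs with release: prints 42 */
    return NULL;
}

int main(void) {
    pthread_t p, c;
    pthread_create(&c, NULL, consumer, NULL);
    pthread_create(&p, NULL, producer, NULL);
    pthread_join(p, NULL);
    pthread_join(c, NULL);
    return 0;
}
```
(Compile with -pthread and a C11 compiler.)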
Ah, good point, I haven't programmed for anything other than x86-64 in a while.
But even then, if you're using any kind of compiler, either this is successfully hidden from you... or you've hit some UB in C or C++ and you're in nasal demon territory anyway, no?
I’m more DevOps, but we have issues where we discuss the spherical cow of build pipelines as if we had a handful of static environments in the same cloud account, with the same permissions and database requirements. That doesn't stand up to the actual requirements: we have both destructible and static environments, an environment in a different cloud account for our customer to review per contract requirements, and a prod environment with heavy security and data-protection requirements hosted by an entirely different cloud vendor with bad infrastructure support resources.
At least most functional language tutorials claim to be based on abstract machines, unlike the C language, which is a spherical cow that people are often not aware of. https://queue.acm.org/detail.cfm?id=3212479
Calling random things “spiritual cows” is a fine comedy bit, but has no place in a professional environment where you have to be able to communicate with others to achieve a common goal. TFA is just shitposting in blog form. Not sure if the author realizes that.
I’m also having trouble figuring out why mentioning sacred cows would be a problem in business. Other than potential career suicide for handling the news poorly.
"Sacred Cows" crop up in business surprisingly often, if you deal with agricultural services in parts of the world with a significant Hindu population.
You have to be absolutely certain you've got the right cows in the right trailer, for one thing.
Oh you mean literal sacred cows. Yeah I was talking metaphysical.
(I used to know someone with an ostensibly Hindu coworker who she caught eating a burger. His response to being caught was to say that Indians aren’t reincarnated as American cows.)
I know a guy who is a very devout Muslim, whose "home cuisine" is basically potatoes, sauerkraut, and pork sausage. He comes from a part of the fiddly little "-istans" that weren't quite Asia and weren't quite the former Soviet Union, where really all they had to eat were potatoes, cabbages, and the pigs that ate the scraps.
The general idea is that if you live in a desert then - as with Jewish dietary restrictions - eating stuff like shellfish and pigs that don't naturally lend themselves to being eaten unrefrigerated a hundred miles from the sea in 40°C weather is not a good idea. But if it's roughly the same climate as Scotland and you know what you're doing, it's all just fine, it's far more haraam to starve yourself to death when you can eat a big bowl of lentil and ham soup.
Most of Leviticus is about hygiene, including the part that people take as a hall pass for bigotry and murder. Which is on the same page and in the same phrasing as “don’t eat shellfish.” There are some people who think someone mistranslated “unclean” a lot.