As a self-taught programmer, I have just "experienced" Dijkstra through various tidbits like this[1], and from the bits I've read he comes across as a bit of a pompous asshat. However, strong opinions are fertile ground for discussion, so I assume it was a deliberate strategy, and I can see why he keeps getting mentioned.
That said, I found this one particularly interesting given the recent rise of LLMs:
> Projects promoting programming in "natural language" are intrinsically doomed to fail.
I assume he's referring to languages like COBOL and SQL, the latter still going strong, but I can't help but think that this part will change a lot in the coming decades.
Sure, we'll likely still have some intermediary language with strict syntax, just as LLVM has its IR and C# its IL, but it's hard to imagine that in 2050 the majority of programming will still be done by typing in regular programming languages like JavaScript or C++.
He comes across like that because he was. From him you always get profound wisdom mixed with denigrating statements, often about people's mental capacity. He was an asshole: a brilliant one, but an asshole.
I, by the way, don’t celebrate him as other people do. I don’t think there’s ever an excuse for anyone to behave like that, no matter how brilliant.
It's hard to get people to accept that intelligence and arrogance (or other bad traits) often go hand in hand. Nor does society encourage benevolence. I think Dijkstra was right about many things, but I can take those views and choose how to act on them, rather than parrot Dijkstra as an all-knowing god. Although I will be the first to say that I am too arrogant for my own good. But still: don't be like me, or like Dijkstra, I guess.
I think some people mistake arrogance for wisdom. It's more prevalent in these days of social media, where people mistake strong words and loud proclamations for truth, but I suspect similar things happened on a smaller scale in the past.
The former is also still going strong: the US Social Security Administration runs on sixty million lines of the stuff, as do roughly half of banks and approximately 80% of card transactions.
COBOL, at this point, has outlived Dijkstra and is poised to be a language-in-use for longer than he was a human-in-breathing. So I suspect he missed the mark on that one.
(I think, personally, we hackers have a bad habit of deciding a language that doesn't fit our favorite problem domains is a bad language. There are reasons, other than simple inertia, that COBOL sticks around in places where the main task is turning written laws into computer code...).
I don't consider COBOL's longevity to be a good indicator of programming quality. Although, frankly, I will say that about most software. Generally, products stick around not because they're good but because replacing them would be too costly. And generally, products get adopted not because they're good but because they're decent and get lucky with timing. I saw another HN post today about Excel. I think Excel is on the better end of the software spectrum. Still, the main reason we don't yet have an "Excel++ but actually better" everywhere is simply that developing such a thing and then getting traction for it would be infeasible. I will once again say that I don't consider COBOL remotely impressive for still underlying many important computer systems. The computer systems are important because of their purpose, and the implementation is always in tension with that purpose.
> don't consider COBOL to be a good indicator of programming quality
COBOL wasn't designed for that. The intention was a language that non-coders could code in. It was the precursor to things like FIT, which in turn was a precursor to today's Cucumber: like some history of child abuse carried on over the generations, so we still suffer today.
[1]: I have, of course, learned about his algorithm.