This seems like a good time to remind everyone of a letter from David Packard to his employees: https://aletteraday.substack.com/p/letter-107-david-packard-... There is more morality, common sense, and insightful business advice here than in any 1000 business titles you would care to name.
I think that OP's essay identifies that something bad happened at HP but completely misses what it was. Look at this quote:
> Around 1997, when I was working for the General Counsel, HP engaged a major global consulting firm in a multi-year project to help them think about the question: “What happens to very large companies that have experienced significant growth for multiple successive years?”
OP says that the findings and recommendations included: "the decade long trend of double-digit growth was unlikely to continue", and "the company [should] begin to plan for much slower growth in the future."
OP then goes on to talk about fighting for resources for investments, a "healthy back and forth" on these tradeoffs, and then losing the "will to fight" following this report. "The focus became how not to lose".
Unlike OP, I did not work at HP. But I have seen up close startups, middle-sized companies, and huge companies, and the transitions among these states. So I feel justified in saying: OP has missed the point. And in particular, he makes no reference to that letter from David Packard.
Look at this quote from the letter:
> I want to discuss why a company exists in the first place. ... why are we here? I think many people assume, wrongly, that a company exists simply to make money. While this is an important result of a company's existence, we have to go deeper and find the real reasons for our being. ... a group of people get together and exist as an institution that we call a company so they are able to accomplish something collectively which they could not accomplish separately. They are able to do something worthwhile—they make a contribution to society .... You can look around and still see people who are interested in money and nothing else, but the underlying drives come largely from a desire to do something else—to make a product—to give a service—generally to do something which is of value.
I think this is the essence of what it means to do useful and interesting work in any technical field. Unfortunately, there are many, many examples of companies that have lost their way, forgetting this key insight. HP was certainly one of them. I would argue that Google and Microsoft are examples too. Boeing, for sure.
And sadly, there are very, very few companies that actually embody Packard's ideas. I think that JetBrains is such a company, familiar to many HN readers. Another one that comes to mind, from a very different field, is Talking Points Memo -- an excellent website that does news reporting and analysis, mostly on US politics. It started as a "blogger in a bathrobe", and 25 years later, it is a small, independent news organization, supporting itself mostly through paid subscriptions by a very loyal readership.
To me, the saddest part of the letter is this:
> In the last few years more and more business people have begun to recognize this, have stated it and finally realized this is their true objective.
(This is right before the "You can look around ..." section quoted earlier.) It seems to me that very, very few business people recognize the way to run a business, as outlined by Packard.
No, "we" are not replacing OOP with something worse. "We" are replacing layers of stupid shit that got layered on top of, and associated with OOP, with different renderings of the same stupid shit.
I have been programming since 1967. Early in my college days, when I was programming in FORTRAN and ALGOL-W, I came across structured programming. The core idea was that a language should provide direct support for frequently used patterns. Implementing what we now call while loops using IFs and GOTOs? How about adding a while loop to the language itself? And while we're at it, GOTO is never a good idea; don't use it even if your language provides it.
Then there were Abstract Datatypes, which provided my first encounter with the idea that the interface to an ADT was what you should program with, and that the implementation behind that interface was a separate (and maybe even inaccessible) thing. The canonical example of the day was a stack. You have PUSH and POP at the interface, and the implementation could be a linked list, or an array, or a circular array, or something else.
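To make that interface/implementation split concrete, here is a quick sketch in Python (the function names are mine, just for illustration; Python does not actually enforce the opacity, but the discipline is the point). Client code only ever goes through push and pop:

    # A stack as an ADT: PUSH and POP are the interface; the representation
    # behind them is a separate concern that callers never touch directly.

    def new_stack():
        return []                # today's representation: a Python list

    def push(stack, item):
        stack.append(item)

    def pop(stack):
        return stack.pop()

    # Client code uses only the interface:
    s = new_stack()
    push(s, 1)
    push(s, 2)
    assert pop(s) == 2

If the list were later swapped for, say, a linked representation, only new_stack, push, and pop would change; the client code stays exactly the same.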
And then the next step in that evolution, a few years later, was OOP. The idea was not that big a step from ADTs and structured programming. Here are some common patterns (modularization, encapsulation, inheritance), and some programming language ideas to provide them directly. (As originally conceived, OOP also had a way of objects interacting, through messages. That is certainly not present in all OO languages.)
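As a minimal illustration of those patterns getting direct language support, here is a sketch in Python (the Counter names are mine, purely illustrative):

    # Encapsulation: the state lives inside the object.
    # Inheritance: a subclass reuses and extends behaviour.
    # "Messages" are rendered here as ordinary method calls.

    class Counter:
        def __init__(self):
            self._count = 0          # encapsulated state

        def increment(self):         # the interface callers talk to
            self._count += 1

        def value(self):
            return self._count

    class LoggingCounter(Counter):   # inheritance: reuse, then extend
        def increment(self):
            print("incrementing")
            super().increment()

    c = LoggingCounter()
    c.increment()
    assert c.value() == 1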
And that's all folks.
All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.
The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.
These pro- and anti-OOP discussions, which can get pretty voluminous and heated, seem a lot like religious wars. Look, we can all agree that the Golden Rule is a pretty good idea, regardless of the layers of terrible ideas that get piled onto different religions incorporating that rule.
I'm the kind of person that sees a bowl as a large cup without a handle.
Likewise, I see these patterns as equivalent style choices: the problem fundamentally dictates the required organization and data flow, the same optimal solution will be visible to any skilled developer, and these weak style choices of implementation are the only freedom they actually have.
For example, these three are exactly the same:
state = concept_operation(state, ...args)
and
class Concept:
    def operation(self, ...args):
        self.state = <whatever with self.state>
and an API call to https://url/concept/operation with a session ID where the state is held.
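To spell that equivalence out as something runnable, here is a sketch in Python (the counter operation, the names, and the session-ID dict are mine, purely illustrative; the third form would really sit behind that URL rather than a local dict):

    # Form 1: a function that takes the state and returns the new state.
    def counter_increment(state, step):
        return state + step

    state = 0
    state = counter_increment(state, 1)

    # Form 2: the same operation, with the state held inside an object.
    class Counter:
        def __init__(self):
            self.state = 0

        def increment(self, step):
            self.state = self.state + step

    c = Counter()
    c.increment(1)

    # Form 3: the same operation again, with the state held server-side
    # and the caller identified only by a session ID.
    sessions = {"abc123": 0}

    def handle_increment(session_id, step):
        sessions[session_id] = sessions[session_id] + step
        return sessions[session_id]

    handle_increment("abc123", 1)

    assert state == c.state == sessions["abc123"] == 1

The organization and data flow are identical in all three; only the packaging differs.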
I suspect people who get emotional about these things haven't spent much time with the other styles, and so don't understand why they are in such widespread use.
It's like food. If you go anywhere and see the common man eating something, there's a reason they're eating it, and that reason is that it's probably pretty OK, if you just try it. It's not that they're idiots.
OOD/OOP has not gone away, has not shifted, etc., but is alive and well; it is just packaged under a different-looking gloss. The meta-principles behind it are fundamental to large-scale systems development, namely: Separation of Concerns, Modularization, Reuse, and Information Hiding.
I thought structured programming was about language support for control flow (simplification and formalization of control flow, and single return), not ADTs.
I can see what you mean; I interpreted everything as elaboration on your introductory remark:
> "We" are replacing layers of stupid shit that got layered on top of, and associated with OOP, with different renderings of the same stupid shit.
Meaning that both structured programming and ADTs are different names for the same "stupid shit", i.e. the same ideas as OOP. I agree with this for ADTs, which really are just the same thing under another name, but I failed to see how structured programming has anything to do with OOP.
I now see that the paragraph wasn't meant to be read like that.
---
This:
> All the glop that was added later -- Factories, FactoryFactories, GoF patterns, services, microservices -- that's not OOP as originally proposed. A bunch of often questionable ideas were expressed using OO, but they were not part of OO.
> The OOP hatred has always been bizarre to me, and I think mostly motivated by these false associations. The essential OOP ideas are uncontroversial. They are just programming language constructs designed to support programming practices that are pretty widely recognized as good ones, regardless of your language choices. Pick your language, use the OO parts or not, it isn't that big a deal. And if your language doesn't have OO bits, then good programming often involves reimplementing them in a systematic way.
is really the summary under every explanation or criticism of OOP. It deserves to be written up more than most blog posts do, but it is so concise that it hardly even warrants being one.
These are not the same thing. The GOTO that people complained about, and that the famous article "GOTO considered harmful" is about, is called longjmp in C. Nearly all C programmers will agree with you that you shouldn't use longjmp. The goto of C allows less freedom of control flow than the try-catch constructs in other languages.
This is a very minor but pleasant surprise. An action like this is beyond what I thought the US government (my government, sadly) was capable of. It is kind of puzzling to me that this issue didn't get politicized like every other one, with right-wing talking heads bemoaning progress of any sort and appealing to the good old days, when America was great, the days that MAGAs want to return to.
I came to source code control reluctantly. CVS, SourceSafe, others I’ve forgotten. One of them was very expensive, very complex, took months of customization, and then it scrambled our bits following a disk crash and the vendor had to piece things back together. An expensive nightmare.
Then I started using Subversion, and it finally clicked. Easy to understand and use. It did everything I needed, and it was intuitive. But git was gaining popularity and eventually that was the only “choice”. And I don’t get git at all. I can do a few things that I need. I often have to consult Google or experts for help. While I get the concepts, the commands are incomprehensible to me. I hate it.
Turbo Pascal was completely amazing. I remember resisting it for a long time, because IIRC it implemented non-standard Pascal. But the competing tools were less powerful and far more expensive (e.g., the Microsoft tools). And then I tried it, and was completely blown away. I no longer cared about the non-standard stuff. I had a fast, intuitive IDE running on my original IBM PC.
As for modern IDEs, IntelliJ has been orders of magnitude better than any of the competition for more than 25 years (I think). I have stayed away from Microsoft products for a very long time, so I can't comment on VSCode and its predecessors. The main competition I remember was Eclipse, which I always found to be sluggish, unintuitive, and buggy. The fact that it wasn't even mentioned in this article is telling.
JetBrains, the company that created IntelliJ (and then PyCharm, CLion, and many others), is one of those extremely rare companies that defined a mission, has stuck to it and excelled at it for many years, and has not strayed from the path, compromised, or sold out. It is so impressive to me that they maintain this high level of excellence as they support a vast and ever-growing collection of languages, coding standards and styles, and tools.
I chose it because I don't have access to Neovim on my cloud desktop, and IdeaVim is a superior solution to any Vim-like plugin for VSCode. It is struggling with 4 cores and 16 GB of RAM with only a few projects open at a time. Some of that is due to it being Win11 with the amount of security malware installed by my company, but still, VSCode doesn't seem to make it suffer that much.
Visual Studio still supports WinForms, including the graphical form designer, which is very close to the OG Delphi experience of the late 90s (esp. since WinForms is such a blatant rip-off of VCL).
You are missing a step there: before Windows Forms there was WFC, Windows Foundation Classes (not to be confused with the other WFC from .NET), used in J++, one of the reasons for Sun's lawsuit.
Alongside events, and J/Direct, the precursor to P/Invoke.
One point though: It is not necessarily the case that visual imagery is the only alternative to an inner dialogue. In both cases, that dialogue, or those images, are things you are aware of. There is something going on behind the scenes to generate those experiences. An alternative to dialog/images is just nothing being generated. I.e., there is something going on subconsciously, but there is nothing related to that activity that breaks through to awareness.
We have all experienced this: some "inspired" breakthrough just suddenly appears in your mind. That breakthrough must have come from somewhere in your subconscious mind.
Anecdotally, this is a conversation that my wife and I have occasionally, about our different mental landscapes. She is a very organized person, with lots of lists and internal (and sometimes external) dialog. I need to let problems just simmer in my mind, without paying attention to them, and I eventually get an answer. She believes that my mind is usually empty. My view is that both of us think subconsciously, but the difference is that she has some mental dialog/imagery accompanying her subconscious thinking.
More generally, maybe the real thinking is always subconscious, and what we call thinking (the awareness of reasoning) is just accompanying imagery.
Go away kitten, your comment appears to be low-quality.
"Generate in a parallelogram" is pretty obvious, unless you are truly clueless. Every post of johndcook that I find posted to HN is interesting, and definitely of high quality.
Serious question: Is Ada dead? I actually had to google Ada, and then "Ada language" to find out. It's not dead, and it has a niche.
When I was in grad school in the late 70s, there was a major competition to design a DoD-mandated language, to be used in all DoD projects. Safety and efficiency were major concerns, and the sponsors wanted to avoid the proliferation of languages that existed at the time.
Four (I think) languages were defined by different teams, DoD evaluated them, and a winner was chosen. It was a big thing in the PL community for a while. And then it wasn't. My impression was that it lost to C. Ada provided much better safety (memory overruns were probably impossible or close to it). It would be interesting to read a history of why Ada never took off the way that C did.
I don't think the Ada story is particularly interesting:
(1) It was very expensive to license at a time when C was virtually free.
(2) It was a complicated language at a time when C was (superficially) simple. This made it harder to port to other platforms, harder to learn initially, etc.
(3) All major operating systems for the PC and Mac happened to be written in C.
Ada had virtually nothing going for it except being an amazingly well-designed language. But excellence is not sufficient for adoption, as we have seen repeatedly throughout history.
Today? Virtually nothing stops you from using Ada. For lower level code, it's hands-down my favourite. Picking up Ada taught me a lot about programming, despite my experience with many other languages. There's something about its design that just clarifies concepts.
(3) wasn't relevant (or even really true) in 1977-1983 when Ada was being standardized.
MS-DOS was mostly x86 assembly, Classic MacOS was a mix of 68k assembly and Pascal, CP/M was written in PL/M, UCSD P-System was Pascal, and this leaves out all of the OS options for the Apple II - none of which were written in C. I'm hard pressed to identify a PC OS from that time period that was written in C, other than something Unix-derived (and even sometimes the Unix-derived things were not C; Domain/OS, for example, was in Pascal).
If we leave the PC space, it gets even less true - TOPS10/20: not C, RSX-11: not C, VMS: also not C - and I can keep going from there. The only OS from that time period that I can point at that was written in C is UNIX.
I'd actually argue that C/C++ were not enshrined as the de facto systems programming languages until the early '90s; by that time Ada had lost for reasons (1) and (2) that you noted.
What would you recommend for getting started with it? Looks like there's GNAT and then also GNAT Pro and then the whole SPARK subset, which one would be best for learning and toying around?
GNAT. Upgrade to gprbuild when you start to find gnatmake limiting.
SPARK is best considered a separate language. It gives up some of the things that make Ada great in exchange for other guarantees that I'm sure are useful in extreme cases, but not for playing around.
Ada isn't dead and it's superior to Rust in many ways, but it is less trendy. adacore.com is the main compiler developer (they do GNAT). adahome.com is an older site with a lot of links.
Assuming "solve" is meant loosely: much like in C++, with RAII-style resource management. The Ada standard library has what are called _controlled types_ which come with three methods: Initialize, Adjust, and Finalize. The Finalize method, for example is automatically called when a value of a controlled type goes out of scope. It can do things like deallocate dynamically allocated memory.
That said, Ada also has features that make C-style dynamic allocation less common. Ada does not have pointers but access types, and these are scoped like anything else. That means references cannot leak, and it is safer to allocate things statically, or in memory pools.
It kind of doesn't at the moment. That's an area where Rust is ahead. They are working on a borrow-checker-like thing for Ada. But the archetypal Ada program allocates at initialization and not after that. That way it can't die from malloc failing, once it is past initialization.
No. Just google NVIDIA and AdaCore to see how Ada is quite alive in NVIDIA land. Ada is quite a nice language that more or less anticipated a lot of the current trends that safe languages like Rust and friends are following. SPARK is quite a cool piece of work too. I think the perception of old-ness is the biggest obstacle for Ada.