
> They couldn't even write UNIX in it.

Are you referring to C here, or to B? If C, I'd like to see your source. If B, that's a highly misleading statement. You're attributing to C failings that belong to another language, and which C was designed to fix.

> Now, what would UNIX look like if they subsetted and streamlined a language like ALGOL68 or Modula-2 that was actually designed to get real shit done and robustly?

But they weren't better in practice. Yes, those languages were designed to get stuff done, and robustly, but they were much worse at actually getting stuff done than C was, especially at the level of writing an OS. Sure, you can try to do that with Algol. It's like picking your nose with boxing gloves on, though. (The robust part I will give you.)

> Their sales numbers indicated people wanted to use those enough they paid millions for them. ;)

But, see, this perfect, wonderful thing that I can't afford is not better than this piece of crap that crashes once in a while, but that I can actually afford to buy. So what actually led to widely-used computers was C and Unix, and then assembly and CP/M, and assembly and DOS.

> Once you have goals, you derive the language, tools, whatever to achieve those goals. Thompson failed to do this unless he only cared about easy compilation and raw speed at the expense of everything else.

Nope. You don't understand his goals, though, because they aren't the same as your own. So you assume (rather arrogantly) that he was either stupid or incompetent. (The computing history that you're so fond of pointing to could help you here.)



"Are you referring to C here, or to B? If C, I'd like to see your source. If B, that's a highly misleading statement. You're attributing to C failings that belong to another language, and which C was designed to fix those failings."

My side of this discussion keeps saying C's design is bad because it avoided good attributes of (insert prior art here) and has no better design because its foundations effectively weren't designed. The counters, from two or three of you, have been that the specific instances of the prior art were unsuitable for the project due to specific flaws, some you mention and some not. I countered that error by pointing out that C's prior art, BCPL to B to original C, had its own flaws. Rather than throw it out entirely, as you all say should have been done with the C alternatives, they just fixed the flaws of its predecessors to turn them into what they needed. That's the same thing we're saying they should've done with the alternatives.

So, on one hand, you talk as if we had to use Modula-2 and the others as-is or not at all. Then, on the other, you grant that C's prior work had to be modified to become something usable. That's an unjustified double standard. If they could modify and improve the BCPL family, they could've done it with the others. The results would've been better.

"The robust part I will give you."

As I've given you speed, ease of porting, and working best in HW constraints. At least we're both trying to be fair here. :)

"but that I can actually afford to buy. So what actually led to widely-used computers was C and Unix, and then assembly and CP/M, and assembly and DOS."

That lineage did lead to PL/M, which CP/M was written in. And to the Ceres workstations that ETH Zurich used in production. And to the A2 Oberon system, which I found quite useful and faster than Linux in recent tests despite almost no optimization, and which took almost no labor compared to UNIX and its basic tools. I imagine data, memory, and interface checks in a micro-ALGOL would've served them well, too.

"Nope. You don't understand his goals, though, because they aren't the same as your own. "

That's possible. I think it's more likely they were very similar to my own, since security and robustness were a later focus for me too. I started out wanting a reliable, fast, hacker-friendly OS and language for awesome programming results. Then I started noticing other languages and platforms that, with a tiny fraction of the investment, beat UNIX/C in various metrics or capabilities with various tradeoffs. I started exploring that while pursuing INFOSEC and high-assurance systems independently. Eventually I saw the connection between how things were expressed and what results came from them, and found empirical evidence in papers and in the field backing some of it. The ideas you see here regularly started emerging and solidifying.

No, I think he's a very smart guy who made many solid achievements and contributions to IT via his MULTICS, UNIX, and Plan 9 work. UNIX has its own beauty in many ways. C a little, too, especially when I look at them as an adaptation to survive under specific constraints (e.g. the PDP's) using specific tech (e.g. BCPL, MULTICS) he had learned before. Thing is, my mental view of history doesn't begin or end at that moment. So I can detach myself to see the foolish choices a smart guy made in specific areas without thinking negatively of him outside of that. And remember that we're focusing on those specific topics right now, which makes it appear I'm 100% anti-Thompson, anti-C, or anti-UNIX rather than against them in certain contexts or conditions while thinking better approaches were immediately apparent but ignored.

"The computing history that you're so fond of pointing to could help you here."

I've looked at it. A ton of systems were more secure or robust at the language level before INFOSEC was a big consideration. A number of creations like QNX and MINIX 3 achieved low-fault status fast, while UNIX took forever due to bad architecture. The Oberon systems were more consistent, easier to understand, faster to compile, and eventually included a GC. NeXTSTEP and SGI taught it lessons for desktops and graphics. BeOS, like Concurrent Pascal before it, built into the OS a consistent, good way of handling concurrency to get great performance in that area. System/38 was more future-proof, plus object-driven. VMS beat it on cross-language design, clustering, and putting the right functions in the OS (e.g. distributed locking). LISP machines were more hacker-friendly, with easy modification and inspection even of running software, with the same language from apps to OS. And so on.

The prior history gave them stuff to work with to do better. Hence my accusation. Most of the above are lessons learned over time, building on aspects of prior history plus plain cleverness, that show what would've happened if they had made different decisions. If not before, then at least after those techs showed their superiority we should've seen more imitation than we did. Instead, there was almost outright rejection of all of it, with entrenched dedication to UNIX style, bad design elements, and the C language. That's cultural, not technical, decision-making, and it led to all the related problems.


> My side of this discussion keeps saying C's design is bad because it avoided good attributes of (insert prior art here) and has no better design because its foundations effectively weren't designed. The counters, from two or three of you, have been that the specific instances of the prior art were unsuitable for the project due to specific flaws, some you mention and some not.

No, my counter in the specific bit that you are replying to here is that your history is wrong. Specifically, you said that C was initially so bad that they couldn't even write Unix in it. That statement is historically false - except if you're counting BCPL and B as "part of C" in some sense, which, given your further comments, makes at least some sense, though I still think it's wrong.

I'm not familiar enough with Modula or Oberon to comment intelligently on them. My reference point is Pascal, which I have actually used professionally for low-level work. I'm presuming that Modula and Oberon and that "type" of languages are similar (perhaps somewhat like you lumping BCPL and C together). But I found it miserable to use such a language. It can protect you from making mistakes, but it gets in your way even when you're not making mistakes. I would guess that I could write the same code 50% to 100% faster in C than in Pascal. (Also, the short-circuit logical operators in C were vastly superior to anything Pascal had).
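For the curious, a minimal sketch of the short-circuit point in plain C (standard Pascal left it to the implementation whether 'and' evaluates both operands, so the same guard generally needed a nested if):

    #include <stdio.h>
    #include <string.h>

    /* && stops at the first false operand: strlen is never
       called when s is NULL, so guard and test fit in one
       expression. */
    int nonempty(const char *s) {
        return s != NULL && strlen(s) > 0;
    }

    int main(void) {
        printf("%d\n", nonempty(NULL)); /* prints 0, no crash */
        printf("%d\n", nonempty("hi")); /* prints 1 */
        return 0;
    }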

So that's anecdote rather than data, but it's the direction of my argument - that the "protect you from doing anything wrong" approach is mistaken as an overall direction. It doesn't need for later practitioners to re-discover it, it needs to die in a fire...

... until you're trying to build something secure, or safety-critical, and then, while painful to use, it still may be the right answer.

And I'm sure you could at least argue that writing an OS or a network-facing application is (at least now) a security situation.

My position, though, is that these "safety-first" languages make everything slower and more expensive to write. There are places where that's appropriate, but if they had been used - if C hadn't won - we would be, I estimate, ten years behind where we are today in terms of what software had already been written, and in terms of the general availability of computers to the population. The price of that has been 40 years of crashes and fighting against security issues. But I can't say that it was the wrong choice.


" Specifically, you said that C was initially so bad that they couldn't even write Unix in it. That statement is historically false "

Well, if you watch the Vimeo video, the presenter looks at early references and compares C side-by-side with its ancestors. A lot of early C is about the same as BCPL and its squeezed-down version, B. The first paper acted like they created C's philosophy and design out of thin air based on B, with no mention of BCPL. I already linked to it in another comment. Fortunately for you, I found my original source for the failed C attempt at UNIX, which doesn't require a video and side-steps the BCPL/B issues:

https://www.bell-labs.com/usr/dmr/www/chist.html

You'll see in that description that the B-to-standard-C transition took many intermediate forms. There were several versions of C before the final one. They were simultaneously writing UNIX in assembly, improving their BCPL variant, and trying to write UNIX in the intermediate languages derived from it, and they kept failing at that last part. Ritchie specifically mentions an "embryonic" and "neonatal" C, followed by this key statement:

"The language and compiler were strong enough to permit us to rewrite the Unix kernel for the PDP-11 in C during the summer of that year. (Thompson had made a brief attempt to produce a system coded in an early version of C—before structures—in 1972, but gave up the effort.)" (Ritchie)

So, it's a historical fact that there were several versions of C, that Thompson failed to rewrite UNIX in at least one of them, and that adding structs let them complete the rewrite. That's ignoring BCPL and B entirely. The story that they just produced a complete C magically from BCPL or B and then wrote UNIX is part of C proponents' revisionist history. In reality they iterated it with numerous failures. Which is normal for science/engineering and not one of my gripes with C. Just got to keep them honest. ;)
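To make the struct point concrete, here's a hypothetical sketch (field names invented, not taken from any real UNIX source) of the kind of kernel record that's natural with structures and that pre-struct C, like B and BCPL, had to fake with parallel arrays and manual offsets:

    /* Hypothetical illustration only -- not actual UNIX code. */
    struct file {
        int f_flag;   /* open-mode bits */
        int f_count;  /* reference count */
        int f_offset; /* current read/write position */
    };

    struct file file_table[100];

    /* With structs, the compiler handles layout and
       member access: */
    void f_release(struct file *fp) {
        if (fp->f_count > 0)
            fp->f_count--;
    }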

" I would guess that I could write the same code 50% to 100% faster in C than in Pascal. (Also, the short-circuit logical operators in C were vastly superior to anything Pascal had)."

Hmm. You may have hit sore spots in the language with your projects, or maybe it was just Pascal. Ada would've been worse. ;) Languages like Modula-3, Component Pascal, and recently Go [but not Ada] are usually faster to code in than C. The reasons that keep turning up are straightforward: designs that compile fast to maximize flow; default type-safety that reduces hard-to-debug problems in modules; often fewer interface-level problems across modules or during integration of 3rd-party libraries. This is why the little empirical work I've read comparing C, C++, and Ada kept showing C behind in productivity and with 2x the defects. As for low-level work, the common trick was wrapping the unsafe stuff in a module behind safe, simple interfaces. Then use it as usual, but be careful.
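A minimal sketch of that wrapping trick, transposed into C for familiarity (the module and names here are my own invention, not from any particular codebase): the raw indexing lives in one small module, and callers only ever see checked operations.

    #include <assert.h>
    #include <stddef.h>

    /* Hypothetical example module: the one place raw
       indexing is allowed. */
    typedef struct {
        int    *data;
        size_t  len;
    } CheckedBuf;

    int cbuf_get(const CheckedBuf *b, size_t i) {
        assert(i < b->len);   /* trap out-of-range reads */
        return b->data[i];
    }

    void cbuf_set(CheckedBuf *b, size_t i, int v) {
        assert(i < b->len);   /* trap out-of-range writes */
        b->data[i] = v;
    }

Everything outside the module goes through cbuf_get/cbuf_set, so a bounds bug can only live in one file.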

". until you're trying to build something secure, or safety-critical, and then, while painful to use, it still may be the right answer."

Not really. Legacy software is the counterpoint: much of what people build sticks around to become a maintenance problem. These languages are easier to maintain because their type protections counter the common issues of maintenance mode. Ada is strongest there. The simpler ones sit between Ada and C in catching issues but allow rapid prototyping thanks to less debugging and fast compiles. So reasons exist to use them outside safety-critical work.

"My position, though, is that these "safety-first" languages make everything slower and more expensive to write. "

In mine, they're faster and less expensive to write, but more expensive to run at the same speed, if that's even possible. Different, anecdotal experiences, I guess. ;)


What I think is interesting is that Intel is adding bounds-checking registers to their processors. That should eliminate a lot of the issues people complain about. (Except your program's footprint will be larger due to needing to manage bounds information.)

https://gcc.gnu.org/wiki/Intel%20MPX%20support%20in%20the%20...
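For the curious, this is the kind of off-by-one such instrumentation is meant to trap at run time. A sketch only: going by the linked wiki page, the relevant GCC flags were roughly -mmpx -fcheck-pointer-bounds, and MPX support was later dropped from GCC, so treat this as historical.

    #include <stdio.h>

    int main(void) {
        int buf[4] = {0, 1, 2, 3};
        /* <= walks one element past the end; a bounds-checked
           build traps here instead of silently reading garbage. */
        for (int i = 0; i <= 4; i++)
            printf("%d\n", buf[i]);
        return 0;
    }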



