
> My side of this discussion keeps saying C's design is bad because it avoided good attributes of (insert prior art here) and has no better design because its foundations effectively weren't designed. The counters, from two or three of you, have been that the specific instances of the prior art were unsuitable for the project due to specific flaws, some you mention and some not.

No, my counter in the specific bit that you are replying to here is that your history is wrong. Specifically, you said that C was initially so bad that they couldn't even write Unix in it. That statement is historically false - unless you're counting BCPL and B as "part of C" in some sense, which, given your further comments, makes at least some sense, though I still think it's wrong.

I'm not familiar enough with Modula or Oberon to comment intelligently on them. My reference point is Pascal, which I have actually used professionally for low-level work. I'm presuming that Modula and Oberon and that "type" of languages are similar (perhaps somewhat like you lumping BCPL and C together). But I found it miserable to use such a language. It can protect you from making mistakes, but it gets in your way even when you're not making mistakes. I would guess that I could write the same code 50% to 100% faster in C than in Pascal. (Also, the short-circuit logical operators in C were vastly superior to anything Pascal had).
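
(To make the short-circuit point concrete, a minimal sketch in C with a made-up function name of my own: && stops evaluating as soon as the answer is known, so the null check actually protects the strlen call, whereas classic Pascal's 'and' gave no such guarantee and forced nested ifs.)

    #include <stdio.h>
    #include <string.h>

    /* Illustrative only: && short-circuits, so strlen() is never called
       when s is NULL.  In classic Pascal, 'and' could evaluate both
       operands, forcing a nested if instead. */
    int is_nonempty(const char *s) {
        return s != NULL && strlen(s) > 0;
    }

    int main(void) {
        printf("%d\n", is_nonempty(NULL));   /* 0, no crash */
        printf("%d\n", is_nonempty("abc"));  /* 1 */
        return 0;
    }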

So that's anecdote rather than data, but it's the direction of my argument - that the "protect you from doing anything wrong" approach is mistaken as an overall direction. It doesn't need later practitioners to re-discover it; it needs to die in a fire...

... until you're trying to build something secure, or safety-critical, and then, while painful to use, it still may be the right answer.

And I'm sure you could at least argue that writing an OS or a network-facing application is (at least now) a security situation.

My position, though, is that these "safety-first" languages make everything slower and more expensive to write. There are places where that's appropriate, but if they had been used - if C hadn't won - we would be, I estimate, ten years behind where we are today in terms of what software had already been written, and in terms of general availability of computers to the population. The price of C winning has been 40 years of crashes and fighting against security issues. But I can't say that it was the wrong choice.



" Specifically, you said that C was initially so bad that they couldn't even write Unix in it. That statement is historically false "

Well, if you watched the Vimeo video, he looks at early references and compares C side-by-side with its ancestors. A lot of early C is about the same as BCPL and its squeezed-down version, B. The first C paper acted like they created C's philosophy and design out of thin air based on B, with no mention of BCPL. Already linked to it in another comment. Fortunately for you, I found my original source for the failed C attempt at UNIX, which doesn't require a video and side-steps the BCPL/B issues:

https://www.bell-labs.com/usr/dmr/www/chist.html

You'll see in that description that the B -> Standard C transition took many intermediate forms. There were several versions of C before the final one. They were simultaneously writing UNIX in assembly, improving their BCPL variant, and trying to write UNIX in intermediate languages derived from it. They kept failing to do so. Ritchie specifically mentions an "embryonic" and "neonatal" C followed by this key statement:

"The language and compiler were strong enough to permit us to rewrite the Unix kernel for the PDP-11 in C during the summer of that year. (Thompson had made a brief attempt to produce a system coded in an early version of C—before structures—in 1972, but gave up the effort.)" (Ritchie)

So, it's a historical fact that there were several versions of C, Thompson failed to rewrite UNIX in at least one of them, and adding structs let them complete the rewrite. That's ignoring BCPL and B entirely. The idea that they magically produced a complete C straight from BCPL or B and then wrote UNIX in it is part of C proponents' revisionist history. The reality is that they iterated with numerous failures. Which is normal for science/engineering and not one of my gripes with C. Just got to keep them honest. ;)
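
For anyone who hasn't read chist.html, the "before structures" detail matters in practice: without structs, kernel bookkeeping like a file table has to be faked with parallel arrays indexed in lockstep. A rough illustration of the difference (my own made-up names, not actual Unix source):

    /* Illustrative only, not real Unix code.  Pre-struct style: one parallel
       array per field, all indexed by the same slot number by hand. */
    int  file_inode[100];
    int  file_offset[100];
    char file_flags[100];

    /* With struct - the feature Ritchie says unblocked the 1973 rewrite -
       the fields travel together as one typed object. */
    struct file {
        int  inode;
        int  offset;
        char flags;
    } file_table[100];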

" I would guess that I could write the same code 50% to 100% faster in C than in Pascal. (Also, the short-circuit logical operators in C were vastly superior to anything Pascal had)."

Hmm. You may have hit sore spots in the language with your projects, or maybe it was just Pascal. Ada would've been worse. ;) Languages like Modula-3, Component Pascal, and recently Go [but not Ada] are usually faster to code in than C. The reasons that keep turning up are straightforward: they're designed to compile fast to keep you in flow; default type-safety reduces hard-to-debug problems within modules; and there are often fewer interface-level problems across modules or when integrating 3rd-party libraries. This is why what little empirical work I've read comparing C, C++, and Ada kept showing C behind in productivity and with 2x the defects. As far as low-level work goes, the common trick was wrapping the unsafe stuff in a module behind safe, simple interfaces (rough sketch below), then using it as usual while being careful.
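
Rough sketch of that wrap-the-unsafe-part trick, written in C for familiarity (the languages above would do it with a module or package; the names here are mine, not from any real library):

    #include <stddef.h>
    #include <string.h>

    typedef struct {
        char   data[256];
        size_t len;
    } safebuf;

    /* The only place a raw memcpy() is allowed.  Every caller goes through
       this checked interface, so the unsafe operation lives in one small,
       auditable spot instead of being scattered across the codebase. */
    int safebuf_set(safebuf *b, const char *src, size_t n) {
        if (b == NULL || src == NULL || n > sizeof b->data)
            return -1;          /* reject instead of overflowing */
        memcpy(b->data, src, n);
        b->len = n;
        return 0;
    }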

". until you're trying to build something secure, or safety-critical, and then, while painful to use, it still may be the right answer."

Not really. Legacy software is the counterpoint: much stuff people build sticks around to become a maintenance problem. These languages are easier to maintain due to type protections countering common issues in maintenance mode. Ada is strongest there. The simpler ones are between Ada and C in catching issues but allow rapid prototyping due to less debugging and fast compiles. So, reasons exist to use them outside safety-critical.

"My position, though, is that these "safety-first" languages make everything slower and more expensive to write. "

In mine, they're faster and less expensive to write, but more expensive to make run at the same speed, if that's possible at all. Different, anecdotal experiences I guess. ;)


What I think is interesting is that Intel is adding bounds-checking registers to their processors. That should eliminate a lot of the issues people complain about. (Except your program's footprint will be larger due to needing to manage bounds information.)

https://gcc.gnu.org/wiki/Intel%20MPX%20support%20in%20the%20...
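
Roughly what that looks like from the programmer's side, if I'm reading that wiki page right (the flags below are the ones it documents; MPX hardware keeps the bounds in new BND registers and faults on the bad store):

    /* Sketch only.  Compile with something like:
         gcc -mmpx -fcheck-pointer-bounds oob.c
       per the GCC MPX wiki.  On MPX-capable hardware the out-of-bounds
       store below is trapped at runtime instead of silently corrupting
       whatever sits after buf. */
    #include <stdio.h>

    int main(void) {
        int buf[4];
        for (int i = 0; i <= 4; i++)   /* off-by-one: i == 4 is past the end */
            buf[i] = i;
        printf("%d\n", buf[0]);
        return 0;
    }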



