Nowadays CPU pipelining is the norm, so every serious optimizer has to do instruction scheduling. Beating naive assembly written one instruction after another isn't novel any more.
Somebody has to come up with the intended results of the optimization for certain chips and the actual benchmarks in the first place. So they do write assembly and test various things.
However, that person probably has extensive experience in that narrow set of problems. Most programmers don't.
There are still jobs where you end up writing large amounts of asm regularly and this is absolutely table-stakes for that. Otherwise there would be no point using asm at all, you’d just use C or another high-level language.
There was once a case where someone dropped from asm to machine code to shave a little more performance off. Sometimes asm is too high-level.
I feel like disabling all the error checking is a significant change when comparing performance, because the phrasing implies removing bounds checks, etc., which can be costly.
That aside, a 2x performance improvement speaks to something algorithmic rather than the canonical "sufficiently intelligent compiler". Does anyone have the before/after source for this "study"? (Obviously it does matter that algorithmic improvements may be easier in a higher-level language, but that's likely true for any high-level language vs. assembly.)
> With the exception of the 1996 update note below, this article is reprinted from the Proceedings of TRI-Ada '92, Orlando, FL, November 16-20, Association for Computing Machinery (ACM), New York, 1992.
I am a big proponent of having proper liability across the software industry, just like in any other industry.
As a software engineering professor of mine used to joke, general computing quality is akin to buying a pair of shoes that randomly blow up if tied incorrectly, and people have been educated to put up with it.
In any other industry, if you buy something that doesn't work properly, the first reaction is usually to return it to the shop and ask for your money back.
Thankfully, digital stores, warranty contracts in project delivery, and ongoing cybersecurity laws are already some steps in the right direction, yet there is still much to be done.
EULA: By using this product, you (user) agree that:
1. It might kill your dog, and it will be your fault.
2. Any other harm or defects will be your fault as well.
3. You abandon all rights to sue us, ever, for anything.
4. Any dispute will be handled by our arbitration department.
5. We are free to spy on you and share all information with
our marketing partners or anyone else who asks.
> C is used in situations where code doesn't have to be reliable

Wanna expand?
In my superficial understanding of computers, Linux is the paragon of reliability, so naively I would say that shows C is good enough. However, I'm also aware that some people don't have such a great view of Linux, but I don't know much.
And that many of Linux's severe vulnerabilities were facilitated by C/gcc:
"The net result is that a PL/I programmer would have to work very hard to program a buffer overflow error, while a C programmer has to work very hard to avoid programming a buffer overflow error."
Aerospace, defense, and safety critical systems still use it. It's less common these days but still out there. Ada 2012 is pretty nice if you're going to be writing imperative/procedural code anyways.
I worked for a defense contractor in the early 2000s. We had MS2 and ATC programs in our facility. Anything that had to do with the operations of ATC was written in Ada. Almost all the guys I worked with went to school at Embry-Riddle. Pretty sure those systems are still written in Ada.
I'm not sure "replaced" so much as "didn't grow" or "didn't grow as fast". Even in the 90s, when the US DOD mandated Ada, it wasn't used universally in defense projects because people got waivers. As the number of projects grows, if Ada isn't growing as fast as the others in adoption, then it gets a smaller and smaller share of the market. C and C++ took a lot of the marketshare in newer developments and in rewrites of older Fortran, JOVIAL, and assembly systems.
So as a percentage of the market in the aerospace, defense, and safety space it has shrunk. In absolute numbers, I'm not sure if there are more or fewer systems developed and currently being maintained in it than 20 or 25 years ago. But C and C++ have definitely grown in that space.
Thanks, the usual sentiment about C and C++ is that they're unsafe for anything beyond a hello world. Happy to hear that defense is using them. Last time I had to write Ada was in university!
I graduated recently and my first project out of college was in Ada. It was defense-related. I have also heard rumors of banks still having some Ada code.
My impression of the language is that it is a lot closer to C than other languages like Java or Rust.
The latest version of the Ada standard came out in 2022. It has its own package manager, Alire, similar to Cargo, and I'm actively working in Ada. Favorite language.
In France I learned Ada at computer science school. Maybe because it is useful for the aerospace and aviation industry? Or because the teachers were used to it and deemed it a great way to learn programming in a rigorous manner.
I have to say, at least when you're coding in Ada there are no ideological fights over functional programming or silly design patterns from OOP. Refreshing.
The same question can be asked about many languages that are very actively used. Just not by the web crowd.
From my personal experience - I use many languages and I mostly base my choice on perceived ROI for the particular project / situation. I could not give a flying fuck if "anyone still uses those in 2023".
My operation is very small and I've never needed more than a couple of subcontractors at the same time. With the whole world at my disposal, I've had zero problems finding decent freelancers for any language.