UB was added to C when it was 15 years old, and the current understanding of UB is from about 10 years after that. It isn't essential to C; an implementation that defines all the undefined behavior would still be compliant and capable of running nearly all existing C.
> There is a perf penalty from GC.
Not really, no. There's a perf penalty from bounds checking and runtime type checking. GC takes a little time but saves you time on free(), although it only becomes a real performance win when you remove the other, less efficient ways of tracking lifetimes, such as reference counting.
> Plus you need all the original sources, right?
Yes, but from my point of view, loss of sources is not a significant problem. I know it happened historically, especially last millennium, but really only for proprietary software running on a single platform such as MS-DOS. Unix software, free software, and software using source control systems have suffered almost no source code loss, except for particular old versions.
> Declining usage of C is going to make you lose intellectual heritage
I think there are more C programmers today than there have ever been, and I doubt that that number will ever fall to zero while there are still humans.
Tracing GC involves either a big performance penalty or a big memory footprint penalty compared to the ownership-based memory management in a language like Rust.
The pervasive use of reference counting that you find in languages like Swift is worse on throughput than typical GC, but can often avoid the memory overhead of GC due to deterministic destruction and typically gives you better worst-case latency, so there isn't a single winner between ARC and GC.
I think that Fil-C, unlike normal C allocators, could use a copying GC instead of a tracing GC, because it isn't exposing the machine pointers to user code. But I don't know if it does. Copying GCs often suffer from high memory overhead, but for example OCaml's GC is pretty frugal with memory, and RC by itself doesn't guarantee good worst-case latency—decrementing a reference to the root of an arbitrarily large tree can have arbitrarily bad latency in RC. You probably already know all of this, but someone else reading the thread may not.
So I'm not sure there isn't a single winner between ARC and GC, but you could be right.
Its not being essential doesn't matter. If you have a Fil-C program that terminates on UB and a C program that doesn't, you have two provably, if subtly, semantically different programs.
Proof: you have one program that halts on UB and another that keeps running through it.
> I think there are more C programmers today than there have ever been, and I doubt that that number will ever fall to zero while there are still humans.
I think that depends on more things than just there being humans. Where my BASIC programmers at?
You seem to be suggesting that code that crashes in Fil-C will often run successfully in YOLO-C, but that doesn't seem to be the case from Pizlo's LFS exercise. He's had to modify a few things, but not much. Generally, UB in C code is a bug, and usually a portability problem, so C code that runs on a VAX, a 68000, a SPARC, a 386, an Alpha, and AMD64 will also probably run on Fil-C.
Old BASIC programmers are mostly working on Pick and other "business BASIC" systems, and I still run into them from time to time. But most of that code is only useful within a single business, so I expect it to eventually die out. (Meanwhile, new BASIC programmers are proliferating in the retrocomputing hobby.)
By contrast, on my system here I have over a thousand libraries written in C or C++. A random sampling reveals libraries for: LevelDB; various JS interpreters; file format handlers for zipfiles, OpenEXR, DjVu, and JPEG; gamepad interaction; a sparse matrix solver (used in Octave); the RIST protocol (used by OBS Studio); simulation with finite element models, which uses a different sparse matrix solver (used by FreeCAD); inspecting and manipulating configuration of PCI devices; the MTP protocol; the Icecast protocol; the protocol FTDI devices speak over USB; and so on.
Nearly all software written today is either written in C, written in C++, or interpreted or compiled by an interpreter or compiler written in C or C++.
> You seem to be suggesting that code that crashes in Fil-C will often run successfully in YOLO-C, but that doesn't seem to be the case from Pizlo's LFS exercise.
No. I'm saying they will behave subtly differently: one will not stop where the other might, and so on.
It's kinda like running the same code in debug and release mode and expecting identical results.
> Old BASIC programmers are mostly working on Pick and other "business BASIC" systems
I've never heard of those things either. Nor have I met a programmer outside retrocomputing groups who used BASIC.
> Nearly all software written today is either written in C, written in C++, or interpreted or compiled by an interpreter or compiler written in C or C++.
For the time being. Up until about 10 years ago you had no other realistic options.