> Zig is not only a new programming language, but it’s a totally new way to write programs
I'd say the same thing about Rust. I find it the best way to express what code should run at any given point in the program, and the design is freakin' interstellar: it is basically a "query engine" where you write a query for some code against the entire available "code space", including the root crate and its dependencies. Once you understand that, programming becomes naming bits of code and then writing queries for the ones you wish to execute.
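For a concrete sketch of what I mean (a toy example I just made up, so all the names are hypothetical): the trait bound on `report` below is the "query", and the compiler answers it by searching every impl visible across the root crate and its dependencies.

```rust
// "Naming bits of code": any type that can describe itself gets a name here.
trait Describe {
    fn describe(&self) -> String;
}

// These impls could live anywhere in the "code space": this crate or a dependency.
struct Server { port: u16 }
struct Client { addr: String }

impl Describe for Server {
    fn describe(&self) -> String {
        format!("server listening on port {}", self.port)
    }
}

impl Describe for Client {
    fn describe(&self) -> String {
        format!("client connecting to {}", self.addr)
    }
}

// The "query": give me anything, from any crate, that satisfies this bound.
// The compiler resolves it against every impl visible in the dependency graph.
fn report<T: Describe>(item: &T) {
    println!("{}", item.describe());
}

fn main() {
    report(&Server { port: 8080 });
    report(&Client { addr: "10.0.0.1:8080".to_string() });
}
```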
As someone not really familiar with Rust, this sounds intriguing, but I don't fully understand it. Do you have any links or examples that could clarify this for someone who is just starting out with Rust?
I don't use my phone much other than, you know, for calling and occasional messaging. For me the most annoying thing is the constant asking for a password on both the phone and the Mac. It's so secure.
"Video card" was the more general word. "VGA" is one of the IBM video cards for PCs that later became a de facto standard, as its behavior was cloned by other companies. It's sometimes used descriptively to talk about the 640x480 resolution, or the DE-15 connector that remained a standard connection for analog video output on personal computers for a long time.
Along with some others like the Hercules, which was upward-compatible with the MDA and did graphics as well as text.
They didn't really do any graphics "processing"; they just displayed memory-mapped pixels in various formats.
They were memory-mapped, and the MDA used a different memory block than the CGA/EGA/VGA, so you could drive two separate monitors simultaneously, doing things like running Turbo Debugger on the MDA text display.
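For anyone who never saw that setup: both adapters were just memory-mapped text buffers at different physical addresses (0xB0000 for the MDA, 0xB8000 for the color text modes), so writing to one never touched the other. A rough bare-metal sketch of the idea (Rust just for illustration; the strings are made up, and it only makes sense in real mode or a toy kernel where those addresses are actually mapped):

```rust
// Physical addresses of the two IBM PC text buffers: the MDA's lives at
// 0xB0000, the CGA/EGA/VGA color text buffer at 0xB8000, which is why both
// adapters (and both monitors) can coexist in one machine.
const MDA_TEXT: usize = 0xB0000;
const COLOR_TEXT: usize = 0xB8000;

/// Write `s` at row 0, column 0 of a memory-mapped text buffer.
/// Each cell is two bytes: the ASCII character, then an attribute byte
/// (0x07 = light grey on black works on both adapters).
unsafe fn put_str(buf: *mut u8, s: &str, attr: u8) {
    for (i, ch) in s.bytes().enumerate() {
        unsafe {
            buf.add(i * 2).write_volatile(ch);
            buf.add(i * 2 + 1).write_volatile(attr);
        }
    }
}

fn main() {
    // Only meaningful where these physical addresses are actually mapped;
    // under a modern protected-mode OS these writes would simply fault.
    unsafe {
        put_str(MDA_TEXT as *mut u8, "debugger on the mono screen", 0x07);
        put_str(COLOR_TEXT as *mut u8, "program output on the color screen", 0x07);
    }
}
```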
> I used to believe that Apple were unequivocally ‘the good guys.’
So did I about YC, until I watched their videos on bootstrapping and the rest of that wonderful series about a year ago. This happens because your made-up image of 'the good guys' is based on their past, which is not going to hold up indefinitely by accident.
> and found its capabilities to be significantly inferior to that of a human, at least when it came to coding.
I think we should step back and ask: do we really want that? What does that imply? Until recently, nobody would use a tool and think, yuck, that was inferior to a human.
If I dare suggest a technical solution to a non-technical problem: I think this just shows a mismatch between how we design applications and something called requirements, broadly speaking. If this has become a subject of discourse, I think we should just write cookies off as a component to rely on and try to deal with that, instead of fighting over where the annoyance should be placed. It's just bad design.
They are going to seriously let it loose when we talk about "revert to human control within a reasonable amount of time".