That's certainly one opinion, and it's even right, to a degree. But there are also movies like The Imitation Game about Alan Turing, a drama rather than a documentary, though there are countless documentaries about him too. There's no shortage of hagiographies about Steve Jobs in book, film, and TV miniseries form, along with Pirates of Silicon Valley, which covers Microsoft's side of the story. There's a documentary about Aaron Swartz (The Internet's Own Boy). The movie Antitrust (2001) is fictional, but in the same vein.
Hackers (1995) is fictional but a cult classic. Freedom Downtime (2001) is about Kevin Mitnick and hacking culture. There are smaller documentaries too: the one about WikiLeaks, the one about Cambridge Analytica. There are books like The Dream Machine (M. Mitchell Waldrop), Unix: A History and a Memoir (Brian W. Kernighan), and The UNIX-HATERS Handbook (Simson Garfinkel et al.). There's http://folklore.org about the early days at Apple. Asianometry has a 20-minute piece about the Unix wars: https://www.youtube.com/watch?v=Ffh3DRFzRL0.
The source code to the original Microsoft DOS is at https://github.com/microsoft/MS-DOS. The Anarchist Cookbook is on Kindle, and 2600's digital back issues (https://www.2600.com/Magazine/digital-back-issues) go back to 2010.
DEF CON got too big for the Riviera and the Sahara, and is now at the LVCC. Yeah, it's not the same. It's never going to be the same, but some still make the yearly pilgrimage, watch Hackers, and get drunk in hotel suites paid for by corporate sponsors. Others stay home for various reasons.
Do we keep doing what we're doing? I mean, I don't program in Z80 assembly anymore. There are still classes in my code, but the focus on OOP isn't what it once was. I'm not sure I want to call it progress, but I don't write Win32 applications anymore. I can spin up a web app with an LLM in an afternoon and have it scale to the whole world in less time than it used to take to get a machine racked in a colo.
It's not 1979; the cable I use to go from USB-C to HDMI is more powerful than the computer that took us to space. By, like, a million times.
Look, I'm not saying we shouldn't respect our elders. By this point, though my beard's not yet grey, relatively speaking I am an elder. I learned to program from paper books, before ChatGPT, before Stack Overflow, before Google. There are some here who predate me by decades. If you're competing with a $10 million Oracle DB system, and going from six assembly instructions to five in the inner loop will eke out that extra percent of performance and win you the contract, then by all means: sit down, roll up your sleeves, and hand-optimize the assembly until you figure out how to get rid of that one instruction.
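For a concrete flavor of what that kind of hand-tuning looks like, here's a minimal, hypothetical sketch in C (mine, not from anything above): the old strength-reduction trick of swapping an indexed multiply for a walking pointer, which is exactly how you shave one instruction out of a hot loop. Modern compilers usually do this for you, which is rather the point.

    /* Hypothetical illustration (not from the post): the sort of inner-loop
       change that drops one instruction per iteration.  Sum every stride-th
       element of a table. */
    #include <stddef.h>
    #include <stdint.h>

    /* Indexed version: the i * stride address math happens inside the loop. */
    uint64_t sum_strided_indexed(const uint32_t *tbl, size_t n, size_t stride)
    {
        uint64_t acc = 0;
        for (size_t i = 0; i < n; i++)
            acc += tbl[i * stride];       /* multiply + load each time around */
        return acc;
    }

    /* Strength-reduced version: walk a pointer instead, so the body is just
       load, add, pointer bump, compare/branch. */
    uint64_t sum_strided_ptr(const uint32_t *tbl, size_t n, size_t stride)
    {
        uint64_t acc = 0;
        const uint32_t *end = tbl + n * stride;
        for (const uint32_t *p = tbl; p != end; p += stride)
            acc += *p;                    /* no multiply in the hot path */
        return acc;
    }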
The joke is often made that other fields stand on the shoulders of giants, while in computer science we stand on their toes. And it's not wrong. I can't wait for the next new language to pop up, reimplement a DAG solver for its package management woes, and invent yet another sandbox for running untrusted code; that one is still an unsolved problem. If this stuff interests you, the Computer History Museum in Mountain View, California is worth the visit. The only problem is that at the end of the tour are the computers I grew up with, which has a certain way of making a fella feel old.
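For what it's worth, the DAG solver everyone keeps reimplementing isn't much code at its core; it's a topological sort. A toy sketch in C (my illustration, with made-up package IDs, not any real package manager) that installs each package only after its dependencies and bails out on a cycle:

    /* Toy dependency resolver: a naive topological sort over a dependency DAG. */
    #include <stdio.h>

    #define N 5                      /* hypothetical packages 0..4 */

    int main(void)
    {
        int dep[N][N] = {0};         /* dep[a][b] = 1: package a depends on b */
        dep[2][0] = 1;
        dep[2][1] = 1;
        dep[3][2] = 1;
        dep[4][2] = 1;

        int pending[N], installed = 0;
        for (int i = 0; i < N; i++) pending[i] = 1;

        while (installed < N) {
            int progressed = 0;
            for (int i = 0; i < N; i++) {
                if (!pending[i]) continue;
                int blocked = 0;
                for (int j = 0; j < N; j++)
                    if (dep[i][j] && pending[j]) blocked = 1;
                if (!blocked) {              /* all deps already installed */
                    printf("install %d\n", i);
                    pending[i] = 0;
                    installed++;
                    progressed = 1;
                }
            }
            if (!progressed) {               /* cycle: no valid install order */
                fprintf(stderr, "dependency cycle detected\n");
                return 1;
            }
        }
        return 0;
    }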
The travesty happening right now, in the wake of Paul Allen's death, is the debacle with his surviving sister and the Seattle Living Computer Museum.