Weird. Today, I plug in a computer and it works. I install an OS and it works. It detects all of my hardware and gets online right away. My computer hasn’t randomly crashed in years. When I save my work, I don’t even consider the possibility that the save might fail and I’ll lose all of my changes. When I wonder how to do something, I can find the answer in seconds by typing plain English into one of several powerful web browsers. I also have a supercomputer in my pocket. And I have git, which is vastly better than anything I had in the 90s.
We seem to have two very different experiences of past technology.
One last thing. I installed Elementary OS on an old computer recently, and every app opened instantly. As in, I clicked it, and it was just there. No perceptible delay. I’m not entirely sure how they do it, but it seems they’re close to the goal in that department.
I remember shrink-wrapped software failing to install, or crashing as soon as it was opened, as something that happened semi-regularly. It was worse than trying to open an old Flash website on a phone; a lot of the time it was just totally unusable.
I guess not working at all is simple, but I would say the quality and reliability increase has been wonderful.
There is still software on the shelves? The last two I saw were Windows Vista and Norton. Admittedly I haven't looked in that section recently! (And to be fair, there are plenty of PC games to this day :)
> We seem to have two very different experiences of past technology.
Or you are talking about two different aspects. Older systems were more fragile, sure, but that doesn't have much to do with their usability. You can have a simpler UX on a strong foundation.
I didn't grow up with classic Mac (so no rosy glasses here), but the few times I used it (mainly via emulation, though I did use one real 'pizza box' Macintosh in the past) I really liked how downright simple the system is. This is a sentiment that was also echoed by my aunt, who used classic Mac both at work and at home back in the day and switched to Mac OS X at some point; decades later she still says that the classic Mac was much simpler and easier to use (she said it was so approachable that one of her coworkers actually referred to it as if it were some pet :-P). Note that when she started using computers she was already a full-grown adult, so she hasn't associated classic Mac with her childhood or anything like that, which people sometimes like to claim is the reason for liking simpler stuff (and I'll admit that's partially true, but it only holds up when you haven't used the supposedly simple stuff any time recently, and with modern tech and emulators this is really easy to confirm).
Also, hardware is still sucky at times. My RX 5700 XT powered PC was randomly hanging and crashing all the time until AMD managed to stabilize things a bit a few months after the GPU's release. I still do not regularly update the drivers, and I keep an archive of the older drivers that worked, just in case something breaks, because I do not trust the newer drivers to work without issues. Similarly, on another system with a 2400G APU I installed Linux, which would randomly lock up at startup. That was a brand new system BTW, just a system I bought with the cheapest components I could find.
Classic Mac OS was remarkably simple indeed and that simplicity died with Mac OS X. You could put the “System” and the “Finder” inside a folder called “System Folder” at the root of the drive and that was enough for you to have a bootable disk.
You could name your files whatever you wanted, as one should, no magical dot 3 letters extension at the end required.
There was a one-to-one relationship between a folder, a window, and the files within it. One folder always had one window showing its contents, which were always arranged the way you set them up. The arrangement was even preserved if you copied the folder to another drive.
I could go on and on. Memory management, process isolation and multitasking were deeply flawed and bolted on, but the rest of the experience was incredibly simple and cohesive and hasn’t been matched since.
Maybe a nit, but FWIW, on OS X you can still name your files whatever you want, 3-letter suffixes do not influence the file type, and folders still store their layout if you're in that view mode (click the four squares in the toolbar, and you may have to change one setting in the Cmd-J window; not at a Mac right now).
(Not attacking you; the fact that it is a hidden setting probably proves your point. I just find it interesting that this stuff is still around.)
I install an OS and I have to set it up, which for some OSes requires an internet connection capable of downloading multiple gigabytes of data.
When I watch the 8-bit guy, I see him turn on a computer from 1984 and while it sometimes fails for hardware related issues due to age and use, when it works it frequently just starts up in a few seconds, right into a basic prompt.
Meanwhile my laptop will sometimes get stuck displaying the boot logo when I start it with the button on the dock, sometimes it will refuse to see the external screen.
Not all micros used tapes, many used floppies. The classic Mac mentioned in the linked article even knew when you had a floppy inserted and could show (and open) an icon on the desktop once you did.
But you could also say the same for Windows 95 though, after you inserted the CD and had the window open then what? You still had to read a manual or whatever to figure out how the application worked.
Or even Windows 95 itself. Which isn't something that modern PCs have solved either - people who haven't used computers before to this day treat them as (black) magic boxes that they might break if they do something slightly odd because of how complex things are. And a modern Windows 10 computer has a ton more stuff on screen than a Windows 95 computer ever had.
Windows sorted that with auto-execution of an application delivered with the CD; the problem was that, like many productivity features, it got abused by virus writers and was eventually disabled by default.
Tell me why Windows shouldn't be able to start up in a few seconds, as a C64 did? My computer has multiple cores running at multiple gigahertz each. Detecting the connected hardware should be a trivial task.
I am fully aware that modern computers are supposed to be easier to use for the average person, and they are certainly easier to _start_ to use, but having spent time teaching my grandparents to use their computer I am honestly not sure they wouldn't benefit from a simpler text-based interface with less bullshit: random confusing crap that appears to change (live tiles in Win10 are an especially annoying example), programs that ask/demand to be updated, and so on.
Of course they would have to learn to use it, and it would somehow have to access their garden websites, but it is not ridiculous. Not if you are a very new computer user and not if you are a very advanced computer user.
Anyway my childhood computer was a win98 computer, so I don't approach this from a nostalgic perspective but from the point of view that a modern computer should be better in all cases than an old machine.
> Tell me why Windows shouldn't be able to start up in a few seconds, as a C64 did?
Why don't you tell us why it should?
On an SSD it boots in <1min which is a feat considering how much functionality is just there on boot, unlike a C64 which literally does nothing on boot except show a BASIC prompt.
A modern computer can boot into UEFI shell in seconds which is comparable to what a C64 boots to... but why would I do that?
Even if you get your OS to a bare minimum like a bare Arch Linux install it takes a while to load for a reason.
We trade off some boot time for a lot of functionality and smooth experience once the initial RAM has been set up with the needed data structures for normal operation.
That ignores a lot of things, like why it has to load so much information off the disk to boot. The entirety of Windows 95, the interface of which is not much less computationally intensive than modern Windows, fit in <50MB. Loading that from disk today, even with spinning rust, should be nigh-instantaneous. What justifies orders of magnitude more data loading?
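For rough scale, assuming a typical ~150 MB/s sequential read from a modern spinning disk (my figure, not from the comment above):

  50 MB / 150 MB/s ≈ 0.3 seconds

Even with generous seek overhead on top of that, you're still talking well under a second or two.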
While I ultimately agree with you, a lot of things working better today comes down to your skill and your ability to buy nicer things.
Seeing my family try and throw bargain bin components at a computer is a painful experience. Worse, as now things are likely to be malware vectors.
And heaven help you if you are trying to get by with a cheap computer. At least a raspberry pi is just slow, similar ideas with windows leave me baffled at how bad it is.
True for me, but not for my mom. Today, she can print from her phone or computer on her very first attempt with a brand new printer. Holy hell, printing was such a PITA in the past. Do you remember those finicky cables with the screws? And installing drivers which then crashed or hung your computer? It’s amazing how much better most experiences are today.
> Do you remember those finicky cables with the screws?
I remember cables with screws (which didn't actually need to be screwed, really.) They were quite easy to use; not any harder than configuring WiFi is, and much more reliable. Less convenient as far as choosing locations, though.
> And installing drivers which then crashed or hung your computer?
I've been using PCs since before OS-level printer drivers were a thing and through the whole era where they were, and I've never had problems with printer drivers on PC (well, outside of mismanagement by IT on work computers.) I've had problems with drivers on Linux, and with printer support apps (effectively drivers) on Android on personal devices, though.
Amusingly, the last few times I have tried to print from my phone have not been successful. I recall one printer wanting me to have a particular app. Or, now, one expecting to have internet access, which I don't have at home other than my phone connection right now.
That said, I would still count this as having nice things. When we got rid of our ancient printer and bought a current one, it does work more often. Ironically, it works better from Linux, as I don't have to install drivers; because of course it came with a Windows disc that installed some crap on that computer.
So, try this with what you could find at Goodwill some day. Or work with a budget where you're stuck trying to get refills to work.
I had Windows hard crash using Google Chrome on Hulu.
I had Windows reset itself in the middle of a long-running calculation overnight, because obviously 100% CPU usage and programs that refuse to quit is a signal it’s safe to power cycle the computer.
I routinely have to fight my devices to do what I say, because they re-interpreted valid words into other ones.
I think modern tech reliability is as overstated as historic rosiness — this turd is polished, but it’s not more functional.
Agreed, I used Windows throughout the 90’s. I remember running one of the early web servers on my pc and was used to rebooting daily to avoid an unresponsive server.
I think what also added to it was NT's responsiveness. Today there are so many housekeeping jobs that might grab your CPU that you never know how long e.g. opening Settings will take. Clicking anything in Explorer on a good Pentium 3 running NT4 always had the same satisfying latency, almost like an RTOS.
It's "more stable" but is it "more reliable"? Not when it decides it's going to reboot in the middle of a long-running job even when I've told it not to. The inability to disable windows from updating itself is one of the most terrible decisions they ever made.
I don't even understand your comment. Yes, it's definitely more reliable than randomly BSODing for no reason while doing just about anything, like plugging in peripherals, or one piece of software taking the whole OS down.
It doesn't matter. If I'm running a job overnight, I'm not plugging or unplugging any peripherals or starting or stopping any other tasks. My sole expectation is that the system will run my job to completion. Not to rudely interrupt it to restart the system to install updates which it has decided are of a higher priority than the sole reason I am running the system.
Another situation is that I'm running virtual machines on a Windows 10 host. I want those machines to be available continuously, and I want any system reboots to be planned and manually executed. Windows 10 has taken that control away.
Windows 10 restarting itself on its own whims is a regular occurrence. It's completely and utterly unacceptable. Even if you change all the settings to prevent it, it still occurs.
Yes, you can try, and it does respect it to a degree. But not all the time. And it likes to reset the settings following updates. There isn't a single and unambiguous knob to set which will guarantee it will never ever happen. Even for "emergency" security updates.
I'd much prefer the MacOS or Linux approach which is to notify you but still requires you to action it. Nag me all you like. But always leave the "reboot" step under my control to explicitly sanction. Then I can ensure it happens at my convenience and that it doesn't interrupt any business-critical work in process.
What version of Windows? 10 definitely seems less stable and slower than 7. I haven't had any BSODs with 7, which is amazing looking back now (with 10 having unrecoverable blue screens, damn).
FWIW, to throw my anecdotes in: my Windows machine, built in 2018, has never crashed on me once, to my recollection. My Linux laptop (Thinkpad X1) has...locked up a handful of times, I think. My work laptop, a 2017 Macbook Pro, crashes on me near-daily, and the keyboard is slowly approaching unusable.
On a 2015 MacBook Pro my Xubuntu was rock solid. On the ThinkPad P1 Extreme it's a joke. The Nvidia driver core dumped on me yesterday. The USB bus got into a bad state the day before (this started happening after I started using a USB 3.0 HDMI adapter, to avoid having to use Nvidia to power external monitors). That caused the webcam on the same internal hub to stop working, and in our WFH times, a webcam is essential.
The crazy part is that the webcam driver I used on the MacBook was reverse engineered and never actually reached "stability" or got added to the kernel. I just compiled it and packaged it with DKMS and then forgot all about it.
In the last three years I think the only OS-level crashes I've had on Windows 10 were failing hardware. Of course, knowing what hardware to avoid is half the battle!
I remember having to edit autoexec.bat and config.sys files in a text editor to install a mouse driver. I also remember having to set IRQs using actual jumpers on my soundcard. I actually loved all of that but it wasn't really easy.
Not only did you have to worry about IRQ conflicts, some hardware didn't work well (or at all) at some of the offered IRQs. And there was no Google to help you.
Indeed. I remember cycling to the local library, searching the old serial terminals for a particular book that could explain an issue I was having, then order it to be sent to my local branch (as they invariably didn't have it) so I could pick it up a few days later. Just to solve a problem (usually pretty complex ones involving things like datasheets or programming manuals but still...).
The internet has provided us a way to handle much more complex issues in much less time just by offering immediate access to shared knowledge. That alone makes things much easier these days than back then.
But I feel it makes us dumber as well in a sense. Before the internet you couldn't start a major project without knowing all about it before you started. Now often we will just start, and google any problems on the way, often realising only later on that the initial approach was poorly chosen.
I can still remember with horror a Windows NT networking class that needed every student machine to have three ISA Ethernet cards manually configured with non-conflicting settings.
Can you have that ready by tomorrow, they asked after lunch...
It's a matter of perspective. It surely wasn't simpler, but it may have been more joyful somehow..
Maybe it's the fact that it was slower, less connected, had fewer parts overall, and that you had to get into it.
I mean, I'm a dev with experience, but I have near zero control over my smartphone. There's tons of nice stuff on it (hardware and software), but it doesn't feel much nicer because I have no real clue about what happens on it or what I can do with it.
I think this is one of those preach-to-the-choir HN topics. Like, you may feel joyful if you're the type that likes fiddling with gizmos and wants to know how everything works, but that's not everyone.
I don't deny it, but it might be something that is way more prevalent than people may assume.
We're complex animals; we like feeling stuff. For instance, I've heard people say they miss playing LPs. You have to fiddle with it: set up the player, take time to put on the disc, position the arm, etc. It's 100x more complicated than an iPod, yet people miss it. Ease of use is not everything.
80s Macs, maybe. But not 90s. I had to help admin an office full of System 7.5 & 7.6 PowerMacs in 1996/1997, all constantly crashing because FileMaker could not run in a stable way at the same time as Netscape, or some extension in the System folder would freeze up the system randomly.
Classic MacOS had no memory protection, memory management itself was terrible and not particularly dynamic (reserve N bytes for this app etc.), and the multitasking in System 7 was bolted on in an awkward way, and the migration from 68k to PowerPC introduced a boatload of bugs.
The screenshots on the article's page show System 7.5. Those were the dark ages for the Macintosh, frankly. The _concept_ was "it just works" and "the computer for the rest of us", but that was the reality only if you stayed on a fairly narrow path.
You had extension conflicts all the damn time. There was a utility called Conflict Catcher (https://en.wikipedia.org/wiki/Conflict_Catcher) that would identify the culprit by doing basically a binary search and turning off half the extensions and rebooting (many times) and noting if there was a crash, and deducing the problem extension that way.
It was kind of like "git bisect" before that was a thing.
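For anyone who never saw it in action: the search it automated is the classic bisection idea. A toy sketch in Python, purely to illustrate (the crashes_with callback stands in for the enable-half-and-reboot step; this is obviously not how the real utility was written):

  # assumes exactly one misbehaving extension and no interaction effects
  def find_culprit(extensions, crashes_with):
      candidates = list(extensions)
      while len(candidates) > 1:
          half = candidates[:len(candidates) // 2]
          # enable only this half, reboot, and see whether the crash recurs
          if crashes_with(half):
              candidates = half
          else:
              candidates = candidates[len(half):]
      return candidates[0]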
This craziness is no longer necessary.
I mean... when you DID get like 20 extensions all running together stably though, it was AWESOME.
Configuring IRQs on hardware and in BIOS always made me feel like a hacker. Windows 2000 and earlier also had problems with DHCP, I fixed many network setups back in the day, often half-assing it and just using static IPs/mask :D. Worked surprisingly well for small businesses where they used the same hardware for years.
Yeah. I did, too. I really don't feel inspired to subject my kids to that sort of bullshit. In fact, as a person who takes pride in calling himself a "software engineer," I find it... disheartening... that this is as far as we've managed to get with personal computing in the 30 years since I was young and interfacing with a machine. It's actually somehow worse while managing to be "better."
And remember the cut fingers from inserting and removing all the cards constantly trying out new jumper combos just to get the SoundBlaster, graphics card, ZIP drive and whatever other goodies all working at the same time :D
It wasn't easy but it was simple, or something along those lines. You knew which jumpers set what, and when you set an IRQ you knew what you'd set it to. If two devices were conflicting you could understand how and why.
Now we're basically all cargo culting. Every few hours one of my USB devices reconnects itself and I'm genuinely not sure whether it's because of a firmware bug in my docking station or what.
The author seems to think PC users like to run multiple OSes on their machine. I think that might just be him; regular people have enough to learn in mastering a single OS, no need to add more.
Also, macOS still has a nice GUI to set startup disk, and most of the other features he shows old macOS having.
And the rest of this multi-OS pipe-dream, who wants any of this?
That may seem like an exaggeration but in my experience it's literally true.
I tried to explain to my wife what "Linux" was but had to explain what "Operating System" is first, and I don't think I ever made much of an inroad on either. It's just not interesting or relevant to her life at all. My in-laws, foggedaboutit.
>It's just not interesting or relevant to her life at all.
This is the key point I think a lot of tech people don't get. My dad's smart. He worked in pharma for many years and knows tons about a lot of the science (and business) related to that.
He just never had any interest in learning about things like file systems and so forth. A tablet for him was my best gift to myself ever. (And he now just uses a large smartphone. He doesn't use the vast majority of its capabilities because he's just not interested in them; he uses what's of interest to him.)
> He just never had any interest in learning about things like file systems and so forth.
Most people (especially today in 2020) don't know what a file is. And that's okay, or great even. A filesystem is a leap yet beyond that.
Apple takes great pains to hide the filesystem from the user in iOS. And that was probably a really, really good move. They have the freedom to change how the internals of devices use files and may have gotten some security benefit as well.
Mobile computing provided a great opportunity to enable some cool tech that breaks backwards compatibility. Like signed bootloaders, indexed-database-as-filesystem, permissions/capabilities. All of these were available on desktop OSs but haven't found a wide audience. Or hadn't, until much more recently. Sorry- bit of a tangent there.
Point being: way back when, in order to be productive with a computer you had to know more about its design. But today, many things "just work" without need to understand how/why.
"I used to be dismissive about tech issues; I started caring when I realized that tech is where the power struggles of this era will take place." -friend of mine
I love sharing this quote because it is so prescient. Sleep too long, and we may wake in chains.
My girlfriend is incredibly intelligent, but doesn't care much for computers or programming the way I do and just buys Apple stuff for its "just works" factor.
That said, I was having trouble getting sound working on a Linux laptop one day, and it turned out to be because my user didn't belong to the "audio" group and so didn't have permission to play sound. I explained this to her, and she told me she understood why that would make sense, and why someone would want to design a system to work that way, even though it is not a system she, personally, would use.
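For the curious, that check is easy to do yourself. A minimal sketch, Linux only, and assuming the distro actually gates sound on an "audio" group the way that laptop did:

  # is the current login session actually in the "audio" group?
  import grp, os

  try:
      audio_gid = grp.getgrnam("audio").gr_gid
      if audio_gid in os.getgroups():
          print("in the audio group")
      else:
          print("not in the audio group; usermod -aG audio <user>, then log back in")
  except KeyError:
      print("this system has no 'audio' group at all")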
Aaaand this is why she is my girlfriend. :)
Not to take away from your wife, I'm sure she's lovely. I'm just glad to be with someone who's both smart and empathetic enough to understand why I do what I do, and why I like the weird computer shit I like. (My girlfriend on the Amiga 500 I bought: "I like this. Why didn't this take over?")
And if you try to argue the merits of one OS over another, having had much experience with both, they will argue back with you even though 99.9999% of their experience is with one OS.
They have to be taught, and normies learn about things like this the way they learn about much of the world: through marketing.
If the enormous hypetrain behind Windows 95 did one thing, it was to make "operating system" a household term -- because in order to care about Windows 95, people had to care about operating systems.
"Oh, I run my games on GameOS because it doesn't take a bunch of resources for GUI animations and multitasking, but I use PrettyOS for just regular work. I've got high hopes for VROS, they say I can use a weaker system with my Vive than on Windows."
People don't do it now because it's ridiculously difficult to do for limited utility.
What if you want to use the same form factor, same system, same files, you just want the system to allocate resources differently? That's what this hypothetical would be.
These kinds of articles come up occasionally about how good things were back in the day, but I think it's largely due to forgetting all the problems, like:
- BSOD
- Running out of memory and trying to shift stuff to high mem.
- Software would come with 20 install floppies. Good luck if it failed on #18.
- editing autoexec.bat to get your cd drive to work
- hoping your game and your graphics card are compatible (as much as we can hate Microsoft, DirectX was a good thing they did)
- wanna chat with girls? First a dialup modem, then knowing AT commands to disable "wait for dial tone", because somehow that didn't work in my country, installing an IRC client, knowing the right server address and port to connect to, to finally be able to send "asl?" to some random "_HornyGirl_" and getting back "13/f/cali"
It is true that things were not perfect in the 80s, but the author has a point. The point is that software designers had the goal of making things at least seem easier and consistent for users. That is the reason for standardized interfaces and APIs. Nowadays, computing seems to have become a free-for-all, where developers don't care about (or have lost hope of) presenting software in a way that is easy to use. Users are supposed to magically "figure out" how things work, and if they don't, it's because they're "stupid". Software users nowadays are supposed to put up with whatever crappy interface was designed last week, just for the sake of "refreshing the UI" (see the disastrous UI updates promoted by Google and FB).
Well, the point of making users "figure it out" is so that you don't need a 100-page manual, or a training session to use the software.
As for the fact that interfaces change - I noticed that it's an issue mostly with Windows&Linux. MacOS has a high consistency of interfaces across the apps - might be due to a HIG that Apple publishes?
I could write a book on the headaches of getting certain hardware to actually go to an IRQ and/or address that won't conflict with another device. That hell went from MS-DOS all the way into Windows 98 and only went away once Windows 2000/XP made PnP actually work reliably.
I feel like the third sentence of this post should've been the title:
"Can we make a friendly Libre Desktop operating system with focus on simplicity, minimalist elegance, and usability?"
I like that idea.
Having lived through the 80s and 90s personal computing era, I can say there was very little that was easy. Especially the 80s. Everything was also super expensive. Things crashed all the time, I can remember hitting Save and having a program crash and losing work. Because I saved, so I wouldn't lose work. It took so much work to do anything back then. I guess it's easy to forget how much everything sucked.
I don't disagree with the central idea here though. I do think something that focuses on simplicity, minimalist elegance, and usability is a nice idea. Though I can't imagine many people agree.
So a commercial Ubuntu fork with some custom apps, got it. Still hardly groundbreaking though right? I'm thinking more of something like HaikuOS if you want something completely out there.
I do like the concept the author puts forward, but there's also quite a bit there I disagree with. The author rails against security features, which I think we can all agree are pretty important these days.
> I don't disagree with the central idea here though. I do think something that focuses on simplicity, minimalist elegance, and usability is a nice idea.
Indeed. I've been playing around with OpenBSD on an old ThinkPad I picked out of some recycling, and I think I'm getting there. I'm also looking at throwing Plan9 on a spare Raspberry Pi I have floating around and playing with that.
"Can we make a friendly Libre Desktop operating system with focus on simplicity, minimalist elegance, and usability?"
The author mentions "Decoupling of the hardware from the OS and of the OS from the applications"
I've been fooling around with Raspberry Pis, and like the idea of bare metal programming. Unfortunately, it isn't so easy, because something like USB requires a software stack. How much better it would be if all the hardware was memory-mapped!
Then we wouldn't need an OS at all if we didn't want one.
So, I'm still waiting for Raspberry Pis to be what they should have been in the first place. Pi's seem to be going the wrong way simplicity-wise.
Maybe one day someone will come up with a RISC-V machine that satisfies this niche. Such a machine would be many many years away, though.
Don't get distracted by the OP's nostalgia. The point they are trying to make is that older OSes had much more decoupled concepts for things like changing the OS, applications, and settings. Decoupled enough that everything was accessible to the user through a small handful of concepts. You had to understand a disk drive, how to move things in a file manager, and how to run things... and that was about it. Our current systems are the opposite: tightly integrated to the point where every action can cause an avalanche of secondary effects, and flexibility is greatly reduced. The short list of main recommendations is not crazy at all for this community:
- boot from a read only, immutable OS on easily-replaceable media. (Ahem containers, nixOS)
- applications should be encapsulated in a single object which can be executed from anywhere in your filesystem. (Flatpak, containers)
- applications and the OS should be designed for offline by default. (Any user of EA games would back this idea)
- only persist user settings across boots, ideally in an OS-neutral format (containers; see the sketch below)
- expose less to zero UI configuration to the user (core design concept in Apple land, I gather)
- single user no login by design (many linux distros, windows until recently...)
There are more but you should read TFA. These are not radical concepts, they're the norm in other contexts.
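To make the settings item concrete, here's a minimal sketch of "persist only user settings, in an OS-neutral format" (hypothetical app name and keys; plain JSON so any OS, or any future OS, can read it):

  import json, pathlib

  SETTINGS = pathlib.Path.home() / ".myapp-settings.json"   # hypothetical location

  def load_settings():
      try:
          return json.loads(SETTINGS.read_text())
      except FileNotFoundError:
          return {"theme": "light", "recent_files": []}      # defaults ship with the app

  def save_settings(settings):
      SETTINGS.write_text(json.dumps(settings, indent=2))    # the only thing that survives a reboot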
Nooo. There's the problem. GUIs have always been a shitshow w.r.t. compatibility. I've pretty much come to the conclusion at this point that, if we want simple, we have to give up on wanting it with GUIs. Vendors just aren't going to ever agree. There's too much vision and financial incentive to compete. However, terminals are boring, and operating systems no longer have any reason to engineer incompatibilities. As of last year, they've all pretty much been like, yeah, we'll implement vt100 because we don't care anymore. Even the Windows CMD prompt! It's great. This new consensus is going to make so much toil go away.
For example, here's an app I wrote a few months ago. It's a single file TUI application. The binary runs on all the different operating systems and looks exactly the same. Truly build-once run-anywhere. It literally looks the same on Linux as it does in Windows command prompt and Mac too etc. It doesn't have any dependencies. It's about 200kb. https://justine.storage.googleapis.com/blinkenlights/index.h...
I've been using it lately as a testing tool, because I want these single-file programs to boot from BIOS on bare metal too. I've got it working for stdio so far. So once I add a virtio ethernet driver it'll be golden. Especially for cloud deploys. Every program you compile is its own operating system which runs on all the other ones too. I call it the Cosmopolitan C Library (NOTE: this page is work in progress!) https://justine.storage.googleapis.com/cosmopolitan/index.ht...
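To give a flavour of that consensus: the same raw VT100/ANSI escapes now work unmodified in Linux and macOS terminals and in recent Windows consoles. A tiny sketch (not taken from the app above):

  # clear the screen, park the cursor at home, print some bold green text
  print("\x1b[2J\x1b[H", end="")
  print("\x1b[1;32mhello, portable textmode\x1b[0m")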
For me that would be exactly opposite of the direction I want things to go. I want more files and more editability. I want to be able to track down the audio file for that godawfully loud "CLANG!" sound Google Assistant makes and tear it out by the roots, then print it out and burn it for good measure.
The NeXTSTEP/Mac solution is to make the GUI treat application bundles as a single unit, while behind the scenes they're a transparent, self-describing directory hierarchy.
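Roughly, the layout is something like this (simplified; real bundles carry more metadata in Info.plist):

  MyApp.app/                <- what the Finder shows as a single icon
    Contents/
      Info.plist            <- describes the bundle to the system
      MacOS/MyApp           <- the actual executable
      Resources/            <- icons, localizations, other assets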
> GUIs have always been a shitshow w.r.t. compatibility.
I completely agree with your assessment that vendors will never agree on a common GUI toolkit.
However, if one were to think about it, anything shown on the screen is a gui of sorts. It's just that text mode was long ago standardised with the use of ascii and the corresponding cga/ega/vga drivers.
> Before 1963, computer manufacturers had over sixty different ways of representing characters in computers. Machines could not communicate with one another [0]
There were some wild inconsistencies with the different display drivers back in early days. It's just that no one seemed to want to standardise on renderable gui elements afaik. Every platform has redesigned the wheel, so to speak, giving us the mess we've always had.
It'd be interesting to see hardware level support for UI elements, but as you suggest, we're probably too far down the road for anyone to give up their niche GUI library.
Yeah it's pretty old. Textmode interfaces began with the teletypewriter in the nineteenth century and UNICODE began around that time too. The programming interface for these machines wasn't standardized until the 1960's with ASCII, which was extended a decade later by ANSI X3.64-1979, and then again in the 90's by UTF-8. That's the trifecta of textmode definitions which, only as of these last few years, became basically universally supported. CGA is kind of just a uint16_t[25][80] array for personal computers. See also: https://github.com/jart/cosmopolitan/blob/ea0b5d9/tool/build...
You're absolutely right that the inconsistencies with textmode used to be about as awful as the inconsistencies we see with GUIs today. I'm certain things will get better if we consider that GUIs are a relatively recent invention by comparison. So it'll probably take another fifty years for widget authors to forge out a consensus that's as good and widely supported as the one we have now for terminals.
A source of inspiration for good ideas might be had by looking at how early computers worked and handled this. My understanding, from stories my parents told (both of whom were active in early computing), is that machines from the 1950s and 1960s used punched cards and paper tape, and common routines would be included as a deck of cards or spliced paper tape.
Uhhhh.... holy crap, sorry... but what is the author smoking? I started out with PC's in the early 80's. They were HARD to use (granted, as a pre-teen, I fell in love right away - but I was already a nerd). They were not intuitive. They didn't do very much. They generally sucked for most people, unless they were using a word processor (and even those were tough for some people)
Today, everyone can use one. They are very powerful, do a lot of things, and are comparatively simple to use - I mean, my MOM can use the damn things, which she never could have done in the 80's and 90's. And with very little support from me!
+1, around half of the things the author mentions seem like pure nostalgia.
Some of the points contradict each other - he praises having independent OSes (and read-only disks) at one point, and then he wishes for settings to be consistent across OSes.
Also, some of the things he mentions already exist in MacOS, which is weird because he uses an old MacOS as an example. For example one application-one file is still true - most apps are packages, and you can have multiple versions of each app on the same MacOS, and it doesn't matter where you run them from.
Sure, the settings are stored in separate directories, but this was also a conscious decision. In the days of DOS, you had your user files and folders scattered everywhere; now they're in your home folder.
He also argues for an abandonment of linking ("each object representing a file, not a link to a file") - that's actually quite interesting. I think it would go bad quickly, but it's a nice idea.
Ah, and finally - no auto updates. Sorry, but if we have persistent internet, we need auto-updates for security reasons. I absolutely hate it at times (especially on Windows), but it's a necessity to keep the security :/
+1 also. I too came through that period, and it was a tricky time, but it was an exciting time, as there were many different kinds of computers and every advance felt like a leap. Knowledge was hard to come by and we relied heavily on books, magazines and other people to find things out. If you went from one computer system to another, there was only a slim chance you could do anything unless you had a manual or something to refer to.
We got an IBM PC XT in 1984. I disagree 180° with every claim about the past this article makes. Computers of today, and particularly mobile devices, are infinitely simpler and easier.
But the IBM PC XT and the Macintosh were already orthogonal to each other in 1984. Your argument would stand if the Macintosh and the XT were on par in 1984, but they were not.
Things weren't much better on Macs; it's just that on DOS, when things weren't working you would type a bunch of different command line options and then give up, whereas on a Mac you would just give up.
I'm having 90's flashbacks to debugging nonworking printers and my girlfriend asking why, after many months since the installation, "is Netscape showing up as a hard drive"?
And then there's "no, you just closed all the windows; you didn't close the app"
I was a DOS user at the time the Mac first came out and there were a few Macs around my grad school. My recollection is similar to yours. The fix on DOS might be something arcane related to config.sys files or whatever but if you banged at it long enough you could often (though certainly not always) get something to work eventually. Macs were more likely to just work out of the box. But if it didn't, you pretty much gave up because there was much less to twiddle with.
The “good old days” of simplicity OP talks about can be experienced by using a tablet. Turn on, use, turn off. It’s sufficient for lots of everyday computing tasks and can even be paired with a keyboard and mouse. Everything “just works” for the most part and it’s a very simplified version of the typical desktop experience.
Not really. The author seems to want to retain a great deal of control over the system while maintaining simplicity.
The classic Macintosh system software is a reasonable example of this. Up to System 7.5 the operating system and the bulk of software only exposed components that were intended to be manipulated by the end user. For example: you could install a printer driver by dropping the appropriate file into System Folder:Extensions or install a font by dropping the appropriate file into System Folder:Fonts. If you wanted to remove a component, you could drag it to the trash. While you would lose access to a particular feature, everything else would continue to function. Much of that was facilitated by structured files, since everything could be bundled into the component's resource fork.
Application software behaved in much the same way.
Of course, that changed around System 7.5. The functions of the various components were less evident than a printer driver, font, or application dictionary. Dependencies increasingly became things that must exist for the software to function, rather than optional features. Mac OS eventually dropped the concept of resource forks, so things that used to be contained in one file were distributed across many.
There are many reasons for those changes and the complexity of modern software likely makes returning to that old model impractical, if not impossible. That being said, for all of the improvements we have seen a lot of control has been lost.
I work with macOS a lot, both for professional and personal use. I feel like Apple is making macOS increasingly abstract and taking away more control from the user with every update. For example, you used to be able to turn off auto-mounting of external/internal disks with a few command-line entries, but as of several versions ago, that feature was mysteriously removed and no one in the mac community seems to know how to fix it. It’s the kind of thing that makes me drift towards *nix OSs for personal computing and side projects. I still use and prefer Mint for such things.
System 6 was pretty much the pinnacle point for combining ease of use with space for tinkering. Something like that (which is what the author seems to be longing for) combined with solid, multitasking underpinnings, would be a welcome player in the current landscape.
You reminded me of an interesting feature in System 6: it was possible to turn multitasking off. I found the feature useful when it came down to crunch time in my university studies. Just as people today talk about disconnecting from the Internet when they need a distraction free work environment, I disabled multitasking.
As for System 6 being the pinnacle, I have a hard time deciding between System 6 and System 7.1. The early versions of System 7 added a few quality-of-life improvements while maintaining most of the simplicity of System 6. System 7.5 (or maybe System 7.1 Pro) is when the Macintosh System Software went off the rails in my opinion.
System 6 stays with me as much better, mainly because System 7 randomly crashed unless I had just the right set of extensions, while System 6 only went down for me when an application (usually predictably) misbehaved. 7.5(.5) was truly terrible, though 7.6.1 ended up being remarkably stable.
Fascinating...how did the system behave with multitasking off? I’ve never encountered an OS with such a feature. Closest I can imagine is something like iOS.
From the end user perspective: you could only use one application at a time. In most cases, switching between programs meant quitting the current program and launching another program from the Finder. There were a couple of exceptions to this:
* Some programs let you launch another program without quitting. If I recall correctly, those programs would remain in memory but would not run in the background. If you wanted to return to the original program, you would have to exit the program it launched. This was typically available in development tools.
* Desk accessories were device drivers that looked like programs. This allowed the end user to run small programs, like a calculator or clipboard manager, while continuing to use another piece of software.
The more interesting question is how did the system behave with multitasking turned on, since the original system software did not support multitasking. For a while multitasking support could be added with a separate piece of software. System 6 was the first version of the Macintosh system software to incorporate multitasking and it was turned off by default (if I recall correctly). System 7 was the first version where multitasking could not be disabled.
I suspect that multitasking could be added in a relatively seamless fashion to the system software since developers were supposed to use the Macintosh Toolbox (API), so things like memory management were always mediated by the operating system. The main user-visible quirk was the ability to partition memory, which ensured that properly written software would not overwrite the memory of other applications. (Granted, there was no memory protection, so it could happen.)
Except when you want to type something - which is pretty much every 30 seconds or so; you have to put the iPad down on a surface, then type it with your fingers and off you go. If I am using a pencil, it is annoying as hell to type with one hand.
iPad is only great for taking notes + consuming long form content. Browsing the internet requires keyboard input.
Which is why I mentioned that with most modern tablets you can connect a mouse and keyboard. I agree it’s not for advanced users who are keystroke command junkies (like myself) but for many others — I am thinking of users like my mom, specifically — it’s great.
But you can get keyboards for your iPad. I’m using one now, a Brydge. It’s pretty decent. It’s not quite like a full-size desktop keyboard, but typing stuff like this is just fine.
Voice dictation .... if you are comfortable with sending all your input to a random 3rd party server. That's how the 2 major providers work nowadays unfortunately.
We had mediocre on-device voice dictation when Pentiums and 486's were all the rage (i.e. Dragon NaturallySpeaking 1.0 in 1997); why can't we get on-device voice dictation on these ARM devices at similar or slightly better quality, since these SoCs are 10x what we could do in the 90s?
I sort of agree with the parent. I get into serious browsing/searching/etc. and I usually just end up grabbing a laptop. I can do it on my tablet but it just seems like too much work. Of course, that may reflect habits and if I were to only have a tablet I'm sure I could make it work. My dad certainly has no trouble ordering from Amazon all the time just using his phone.
Listen, I love retro computing. And sure, the systems were much simpler. Systems get more complex over time. I wouldn't be typing in this edit field in a web browser within a window in a multitasking GUI OS if not.
With that said, things certainly were not easier. Plug in a device nowadays and voila, it works! Transfer an archive from Linux -> OS X -> Windows and you can view it (even using the same multi-platform apps) across the board. ...the list goes on forever.
Can things be easier? Sure. Simplicity? Yes! Is it all rainbows and unicorns? Nah (as people have pointed out already, you lose control).
Many systems exist for things like single file applications. Of course no one can agree on what that should look like it seems. AppImages, .apks (which are just Zip files), static binaries, containerized applications, etc. OS X has had .app's for a long time which are just a folder of files (and the common way to transfer that is a mountable image or archive).
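Side note on the ".apks are just Zip files" bit: you can see it with nothing but the standard library (file path hypothetical):

  import zipfile

  # an .apk opens like any other zip archive
  with zipfile.ZipFile("some-app.apk") as apk:
      for name in apk.namelist()[:10]:
          print(name)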
People have tried many times to come up with embedded icons in ELF. Why some Linux users push back is still beyond me. Yes, we know we need multiple formats and resolutions. OK, support that. .desktop files are nice, but that shouldn't be something I generally have to mess around with. Most systems still don't have a nice way to even create these files (which, let's be honest, are just the equivalent of a Windows .lnk desktop shortcut).
Oof I could go on forever here. I think I'll stop.
I think this guy has a too-rosy view of the 80s and 90s. Probably used only Macs as most of his examples are about Mac.
By the way, a Mac still lets you select a boot drive in the exact same way. And current Macs still remember the positioning of icons and on a Mac you can still run apps from anywhere (unless the developer didn't follow Apple's standards but that's not Apple's fault). If you take his comments purely within the Mac field of view which he seems to do, some of these statements are simply false. And the unrestricted, non-password users? Well yeah it would be nice but the cold dark world of the internet has caught up with that. Good luck with your botnets.
I don't think he's ever edited config.sys and autoexec.bat files, struggled with setting non-conflicting IRQs with jumpers, or set up XMS/EMS high-memory drivers just to load all the TSRs you needed and still have enough real memory available to run a program.
Or used Word Perfect which required a 'cheat sheet' with all the different function key combos. Most programs required classroom training for all non-geek people to use them.
I loved that time too. But it was definitely not easier than it is now.
By the way, someone looking for this simpler experience can find it in the mobile OSes. An iPad already provides most of these things while being secure (but at the expense of Apple deciding what you can and can't do with your device). A lot of people already use a mobile or tablet as their sole computing device.
4) Spend 10 minutes saving your carefully written code to tape.
5) Scream at the tape the next day because it stretched a bit and the computer now won't read it back, or your little sister recorded the spice girls over it :P
I did love that time too though! But it did have its drawbacks.
But really, this experience is not far away if you're looking for it. A raspberry pi kinda offers that. Maybe not the 500ms, but it boots in seconds to a CLI and it's cheap enough to just leave running. Windows 10 is also really fast at booting (basically because it secretly just hibernates its kernel but it's a smart and useful trick).
Well in effect that same thing happened to me.. Atari's tape drives were horribly unreliable. It was a tense time listening to the screeching during read operations. You could usually hear it go wrong, and some error would appear, having to start all over again (and the position counters were cheap and unreliable too).
My 1010 tape drive also had really really thin plastic stems under the buttons, which pressed on quite heavy mechanical tape controls.. Obviously they didn't last much longer than the warranty period :X I removed the front panel and pushed the metal parts by hand for a while until the whole thing just gave out.
The new-style XE tape drive I got after that was better but still had the poor reliability.
There was also a disk drive option but it cost more than the computer itself (and in fact it was almost another computer as it was powered by a similar CPU, just with less RAM). So being a schoolboy I never owned one.
6) You remember your friends brought some games yesterday and you had to tweak tape player's head position. Where is your screwdriver again?
7) Spend 20 minutes trying to re-align the heads so you can load the tape.
...
If you want "turn on and start coding" experience, then programmable calculators are your friend. Really, give them a try -- they have the same BASIC, support your familiar "PLOT 10,0" commands, but at the same time do not lose all your data when you turn then off, and do not require tapes.
It's possible, but it's just not the same thing. There are orders of magnitude more layers of abstraction in play. "Fullscreen a terminal" implies graphics - so X, toolkits, window managers, on top of kernel and userland etc. It would take you decades to wrap your head around all that; a little 6502 computer you can grok in a month.
Quick, what stovepipe of software would let you type "PLOT 10, 10: DRAW 20, 20" to draw a line :)
Any system with Python (so any Linux system)? Here is a real example, just tested on my Ubuntu PC:
$ python3
Python 3.6.9 (default, Oct 8 2020, 12:12:24)
[GCC 8.4.0] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> from turtle import *
>>> penup()
>>> setposition(10, 10)
>>> pendown()
>>> setposition(100, 100)
Not quite the same commands, but pretty close. And you have more modern things like "bgpic", "onclick" and entire Python library.
Yeah, but you're missing the point the parent is making. On a C64, you boot into bare BASIC and can immediately start playing with sprites and lines and stuff. With a modern computer you have the BIOS, 80 bazillion lines of Windows or Linux, your Python installation and about 90 layers in between. It's simply not possible to fully understand a modern computer. Just take a look at all the lines of code in Windows and Python that let you play with turtle graphics (forget about the other million lines of code I don't even know about). Then compare that sum with the amount of code in the C64. See the difference? Sure, a modern computer does a lot more, but it's also a black hole/accretion disk of bad design glued together.
If you want to play with sprites and lines and stuff, use a modern PC. It does things that the poor C64 never dreamed of, and if you want constraints, there are always things like "1k demos" which let one express their creativity while keeping the minimal amount of code around.
If you want to understand stuff down to the metal, there is a whole embedded world out there. From the simple 8051 with a half-dozen registers to 32-bit ARMs where you can learn how to set up DMA transfers. (My favorite one is the 8-bit AVR: the datasheet is small enough you can read it end to end, there are plenty of peripherals, but at the same time it is slow enough you occasionally have to solve a "how do I code this quickly" puzzle.)
I see no reason to mix those two, except for nostalgia. Yes, in the past you had to be a blacksmith and a woodworker -- but those days are gone. You can still mix the two, and the very first rasp you made still feels very special, but let's not pretend it was somehow better than the modern store-bought one.
I'm not sure if there is a goal here, just pointing out how times have changed. I shift between we should throw out the whole modern desktop and completely redesign it from the ground up to resigning myself to the understanding that the complexity is unavoidable in a fashion if you want networking, USB, web browser, GPU...etc.
The desktop itself doesn’t really seem very ‘modern’. It’s really not very different from what was available in the 80’s and 90’s, just with better graphics on some systems.
Most of the complexity in those things you mention comes from asynchrony.
Come on. I think you're remembering the past with rose colored glasses. I loved my C64 in the 80s but it was not user friendly, compared to say an iPad running playgrounds.
And it’s not like we understood how graphics worked, or how the basic interpreter was built in order to type those commands anyway. Even if some of us do now, most people who used those machines never understood them.
I do agree with you in essence - I want that simplicity back, but I don’t miss actually using those machines. I wrote some quite complex software in BASIC, and have no desire to return to that.
I want something with the immediacy and programmability of that era, but with modern capabilities and programming concepts.
Yeah, I basically want a machine with a fast CPU and lots of RAM that boots into something like LISP or Forth with a very minimal OS (SD card navigation, text editor, image and audio file player, simple graphics).
We have a group that has been playing with that concept, songseed.org/dinghy, including the notion of a ‘forth workstation’.
Through this, I have found networking to be a significant design pressure on the rest of the system, inviting complexity. The comparison of retrobsd and litebsd in the pic32 paper talks about this.
Another conclusion - if you want the platform to be relevant on high end hardware, you must use Windows or Linux as part of your foundation due to the gpu driver situation.
It sounds like you've thought this through pretty well. Once you add in networking and a web browser, it might not be possible to avoid all that complexity.
I feel like I have only scratched the surface :) If you take interest in this, we hang out in #dinghy on freenode. We have bursts of activity, and then long periods of silence.
But I do feel like the world is ripe for a revolution in a few directions here.
-- Browser
To your point that a conventional browser creates unavoidable complexity: I agree.
It may be possible to disrupt that stack entirely, by creating something that developers would prefer to develop for.
The new conversational interaction between whatsapp users and corporate chatbots (e.g. airline rebooking over whatsapp) shows developer and user appetite for non-HTTP internet activity. As does Slack, somewhat.
This gopher-like 'Gemini' thing that has been popping up recently is interesting.
On the Dinghy website, there is a proposal for an engine, /Limit/. It would be a Forth VM that would maintain an async connection back to a webserver-like server. Instead of having HTTP+HTML+CSS+Javascript, you would deliver sites as forth bytecode. This could be tighter yet more flexible than a web browser. An Everything-Is-A-X based overthrow of the current web browser model.
/Limit/ could be simulated in a browser (using javascript and websockets) so that people who only had a browser could browse a Limit server without adding new software.
Hypothesis: /Limit/ would have common purpose with alt-OS communities that do not have first-class browsers: Haiku, Plan9, Amiga, Suckless, people who run old SGIs for fun. Imagine if you could just recompile a C codebase, and change some headers, and get a full-featured browser-like thing running on these systems.
-- Networking
I am confident that there is a non-complex path through this problem.
Hypothesis: interaction with the system's API should be purely asynchronous, and TCP interaction will look nothing like the Berkeley sockets API.
There is an occam-based OS, RMoX, that looks to be a demonstration of the async OS concept, but I hit obstacles trying to evaluate/verify it.
On the Dinghy website, there is a paper for "Drift". This proposes an async OS for the amd64 architecture. I noticed drafting problems in this paper earlier, will revise soon.
The addition of io_uring to Linux may allow another approach: create an async API mezzanine on io_uring; regard this as your Bedrock, rather than the hardware itself.
Once io_uring is mature, it may be possible to fork Linux in order to discard /all/ syscalls except io_uring. This kernel would have ongoing Linux driver support, but with a much smaller kernel surface area.
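As a taste of what such a mezzanine would sit on, here is a minimal liburing sketch in C (assuming liburing is installed and a kernel new enough for IORING_OP_READ, roughly 5.6+; link with -luring). It submits one read and harvests its completion, which is already a completion-queue model rather than a blocking call:

    #include <liburing.h>
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        struct io_uring ring;
        if (io_uring_queue_init(8, &ring, 0) < 0) {   /* 8-entry queues */
            perror("io_uring_queue_init");
            return 1;
        }

        int fd = open("/etc/hostname", O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        char buf[256];
        struct io_uring_sqe *sqe = io_uring_get_sqe(&ring); /* grab a submission entry */
        io_uring_prep_read(sqe, fd, buf, sizeof(buf), 0);   /* describe the async read */
        io_uring_submit(&ring);                             /* hand it to the kernel */

        struct io_uring_cqe *cqe;
        io_uring_wait_cqe(&ring, &cqe);                     /* wait for the completion */
        if (cqe->res >= 0)
            printf("read %d bytes\n", cqe->res);
        io_uring_cqe_seen(&ring, cqe);                      /* mark completion consumed */

        close(fd);
        io_uring_queue_exit(&ring);
        return 0;
    }

A Bedrock-style layer would presumably hide the ring bookkeeping and expose only submit/complete, but that is my speculation, not something from the Dinghy papers.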
-- Your requirements
Regarding your stated requirements, 'SD card navigation, text editor, image and audio file player, simple graphics', you could build this on top of the Maximite architecture. But it would lack networking.
Whereas the Dinghy concept goes beyond your stated requirements.
Sounds like I'd love to play with the system y'all are hoping to build. I'd also be ok with something like a web browser, but just Forth scripts being shared like you said. As long as I have text and images and a way to view them, I don't think anything else matters.
It sounds like y'all are trying to build the original internet OS (iOS, but years before Apple took the term) that Carl Sassenrath of Amiga and Rebol fame was trying to build. Rebol is a lot like both Lisp and Forth, and is very powerful. Carl thought users would just share Rebol scripts over the network. To give some context, a fully graphical Tetris is about 3/4 of a page of Rebol code. If only he had open sourced it sooner.
Thanks for your note. I have had several bursts of play with Red, but it keeps slipping my mind. I suspect it would be an easier starting point than building up Forth from nothing.
There may be a challenge bootstrapping that ecosystem on obscure platforms. If I remember correctly, you need an existing Red or Rebol in order to build the latest Red.
I will update the Limit page with a reference to this.
No problem. Rebol is built on C, I think, and is an interpreted language that I would guess is similar to Ruby in speed (as in fairly slow). Red should be a lot faster, but I wonder if they'll ever get there with the project being so ambitious. I do think you currently need Rebol to build the Red binary, but it'll eventually be fully self-hosted and under 5 MB in size. A Red OS would be pretty neat, in that you could use the high-level words in interpreted mode and throw in some Red/System-level code for performance if you need it. I'm a little concerned, though, that some of the high-performance libraries will be commercial. I don't mind paying, but licensing always gets cloudy in those cases.
That's absolutely true. It was much simpler to start creating. Now you have a choice in platforms that can achieve the same with similar effort, but you have to start by picking and installing one, whereas those computers came with it.
PS: You also used an Atari? I think it was DRAWTO there :)
Have you seen Scratch [0]? It runs entirely in the web browser and you don't need to install it.
It is designed for kids and it looks a bit goofy with all those drag-and-drop programming commands, but don't be fooled -- it is a pretty modern language, with flow control, variables, functions and all.
At the same time it is great for "just start creating" things -- you start by drawing pictures, then you animate an object and the next thing you know you have a fun game :)
They have some modern machines meant to replicate that experience on newer hardware (e.g. a 480 MHz chip lets your BASIC code run way faster than assembly could on those 8-bit machines), but you still boot straight into BASIC (pretty much no OS) in about 3 seconds. Pretty nifty.
PCs were not the only alternative to Macs in the early 80s and 90s.
In front of me I have an Amiga 600 (with an accelerator) and the whole OS can be installed from 5 floppy disks and is roughly equivalent to MacOS. I also have an Atari ST 1040 with TOS. No autoexec or IRQ madness to deal with. They mainly just work.
> By the way, a Mac still lets you select a boot drive in the exact same way.
The UI might be similar here, but the experience is vastly different. Back in The Day it was pretty common to build customized System Folders by hand to fit your personal needs and your machine's resource limits, e.g. to fit in a RAM Disk on a PowerBook to save battery on the go. The minimum bootable (note: not "usable") blessed folder could be just two files — a System suitcase and the Finder executable.
I know what you mean, but I wouldn't say it was pretty common. Most of the systems I saw back then had very long boots while all 20 of their random extensions fired up.
Yes! This! Sooooo much time tweaking extended vs expanded memory (who came up with those names?) and fiddling with sound card IRQ settings to get a new game running.
Slightly alarmed by the implied air-quotes around "security". Does the author think there's no need for security? That it's not important?
A large chunk of the problems we have now are because no-one thought about security in the 80's and 90's when the various protocols we use were being invented.
It was only kinda safe because not connected to the internet. As soon as the internet came along all those simple protocols got abused royally, and all those simple OS's got pwned routinely.
I remember those days. Being the family IT support, once a month I had to clean my sister's laptop of all the viral crap she kept clicking on. Today is better. Much, much better.
Except the protocol/format we use for communicating text in the public is terminally convoluted, requires executing arbitrary code, is laughably energy-inefficient and has only one or two apps left that can deal with it. When SGML on which HTML is based (also released in the 80s) was a "humanist" idea to prevent exactly that from happening.
> Slightly alarmed by the implied air-quotes around "security". Does the author think there's no need for security? That it's not important?
I can't speak for the author, but I think a lot of this industry sacrifices way way too much in the name of nebulous "security". Case in point: Spectre and Meltdown are not likely to ever be a significant threat to any desktop computer user, nor to any server that is only running trusted code, yet a lot of IT people lost their shit over it and were willing to significantly degrade their performance via heavy handed mitigations.
It’s worse in networking where firewall admins block first and ask questions later, breaking VoIP, making video calls slow, and breaking anything P2P. The rationale is security but if you ask them what risk is being mitigated by blocking this or that they can’t answer. It’s superstition.
The vast vast majority of targeted attacks today are by exploiting the least secure part of the system: the meat bag in front of it.
But if the whole shebang had been built with multi-user, online, and hostile as a default we wouldn't have spent 20 years dealing with the failure of protocols in the face of black hats, and the meat bags would be more secure in their ignorance.
Ah, I would spend hours colorizing icons using ResEdit 2.1.3. Instead, I should've been out playing ball. lol.
Re: the article, as much as I have a lot of fond memories of System 7, I don't think today's modern OSes seem that bad.
MS has been trying to abstract away legacy complexity bit by bit, but it's very difficult to just throw everything away; the worst of it being the flaky file sharing functions.
But over time, I'm sure we'll get to a better place.
I'm not sure what dreamland the author lived in. Computers today are about the same, hassle-wise, as they've always been. The exact problems may have changed, but I find myself experiencing roughly the same amount of problems with computers as I always have, which is basically:
Enough that I always notice they're there, and sometimes they annoy me, but overall having a computer and having access to all the awesome things computers can do is worth the hassle.
> Back in time when things were easy, things “just worked” right out of the box and you did not have to configure much
To which my comment makes a number of counter points. You had to configure a lot. And things rarely just worked. And if they worked, they were rarely consistent.
The article is an interesting read, but on nearly every point of it, I prefer today’s computers to any of the past. The exception being his point about apps requiring internet access and being generally hard to keep around in multiple copies.
The vast majority of the article is talking about specific ways in which they were better, but HN has latched on to the "just works" and decided to dredge up everything from IRQ conflicts (a thing on PCs, while the article mostly talks about MacOS) to every software bug they ever encountered because they like being contrarian.
So true. I’d love to have a modern macOS that is as streamlined and well thought out as OS 8/9 or a modern NeXTSTEP. Instead we have bloated, buggy, fugly operating systems that restrict what you can do more and more with every new version. What a shame.
I tend to agree with this assessment. As the author demonstrates, classic Macintosh computers are/were much simpler than modern Macs and also easier to use.
They typically also sported a tiny black and white display (vs. 27" or larger 5K or 4K full/wide color and HDR displays), extremely primitive multimedia capabilities (Macromedia Director vs. the web, iMovie, Unity...), much slower storage (slow floppy disks and hard drives vs. modern flash storage), and were more likely to crash or freeze due to lack of memory protection and preemptive multitasking (though they didn't have the problem of breaking when you lose internet connectivity like modern macOS.)
Modern IP networking is powerful and connects you to the world, but AppleTalk was largely plug-and-play and had nice dynamic resource discovery (which influenced the design of Bonjour/mDNS.) Note classic Macs could be connected to the IP internet, but they lacked web browsers until ~1992. ;-)
As far as Apple is concerned, I think iOS may be the inheritor of much of the simplicity of the classic Macintosh: one app at a time, simplified multitasking, hood welded shut with minimal user-serviceable components inside. But an iPhone also does much more than a typical Macintosh of the 1980s or early 1990s, notably full wireless, mobile internet access with modern web support, streaming video, television, and music, wireless/mobile voice and video calls, multimedia messaging and email, full speech recognition and automatic dictation, Siri/digital assistant, full voice control of the device, automatic language translation, accelerated 3D graphics with augmented reality support, automatic computational photography, high-resolution, color digital painting and photo editing with dozens of layers, video editing, CD-quality sampling, music synthesis, and multitrack recording, millions of downloadable apps, etc.. And it runs all day on batteries and fits in your pocket.
You are indeed correct, but 1999 is more "late 90s" and by that point the Macintosh (and Mac OS) had become vastly more powerful, with the PowerPC G3/G4, color graphics, GPUs, etc., not to mention advances in the web platform.
Note I said "Classic Macs" (think compact Macintosh models with 68000 processors) rather than classic Mac OS.
By 1999 the shift to OS X was also underway, with Mac OS X Server 1.0 released that year.
>> Desktop computer systems, especially those based on Linux, are way more complicated than typical personal computers in the 80s and 90s.
Because in the 80s things would boot to a DOS prompt or a basic interpreter and you had to know some commands to type to get anything to happen. Today they boot to GUI with useful applications to run. Well, except for Windows.
90s (Windows 95 and later, or any version of MacOS) personal computers weren't easier to use in any dimension than modern ones; 80s pre-PC ones, and DOS PCs, were only easier in a weird masochistic enthusiast sense, where it took less work to get to a prompt for a fairly obscure and limited programming language/CLI, but more work to actually do anything significant with it.
Any given hardware platform tended to be a more stable target over its lifetime in the pre-PC era, which is a kind of simplicity (not really ease, but once you've climbed the learning curve the ground isn't shifting under you). OTOH, any given hardware platform was likely to fade into irrelevance sooner, and if you wanted to swap to something with decent current support, more of what you learned would become irrelevant, so even that stability was of fairly limited utility.
This is impossible. The internet killed that kind of simplicity. You just can't have the kind of minimalistic OS that you could back then. It wouldn't be able to handle the security and performance challenges of the modern web.
Of course, if you just want a simpler GUI for Linux, then that is indeed possible. But that's an uphill battle for another reason... the underlying libraries that you'd build your hypothetical GUI on are all built with the assumption that the user wants a rich command-line and config file interface to everything. Also, you'll have to give up modularity: if you build a GUI frontend for ALSA, you'll also need to make it work with PulseAudio and JACK.
Wow, were Macs really this easy to use? I remember having to fuss with autoexec.bat and config.sys to squeeze a few more kilobytes of RAM to get certain games to run (looking at you, Wing Commander) and having to worry about the difference between “expanded” and “extended” memory, and so on. Windows 95 and its descendants didn’t make things much better: now in addition to the actual file system there are virtual file systems (the registry) that frequently get out of sync with the actual one, causing additional problems. It’s an impressive achievement if Macs really were this easy to use!
It’s maybe something about having tools that are intrinsically more powerful.
To pick on an example given in the article:
File managers that “seamlessly” (air quotes intentional) support multiple network protocols and archive formats have a supposedly flat learning curve; so flat that no mental effort is expected.
Indeed there is plenty of it, and it's very confusing for anyone not prepared for it upfront.
Firing up a dedicated, specific-purpose application primes the user's attention and makes eventual struggles expected and even gratifying to solve.
It’s a hunch, with no hard proof for it; maybe some behavioral scientist can chime in.
Before we get all nostalgic about MacOS, let's just take a moment to remember how far behind it was in the late 90's. Windows 95 had preemptive multitasking. MacOS did not. If a program running on MacOS didn't feel like relinquishing control back to the OS, tough.
OSX finally added preemptive multitasking but, for a long time, it continued to provide little in the way of user customization. It was the Apple way or the highway. If you happened to like the Apple way this was fine. If you had other ideas, it was painful. Linux and Windows were both far more customizable.
On the Windows side of things, DOS was beautifully simple but horribly limited and, at times, not very simple to use. Half the challenge of PC gaming in the 90's was just getting the games to run! Who remembers what IRQ their soundcard used?
Things have gotten easier to use, but also more complicated. Complexity and ease of use are not necessarily opposing or entwined.
Some Windows applications are contained entirely within their program folder. Executables, libraries, configuration files, all in one place. If you move it to another drive, all you have to do is update the shortcut and it will work. If you move it to a new PC, just run the executable and it will work, complete with all your old settings. If you want to mess with its config files, you know where they are.
Other applications have bits and pieces widely distributed across various system folders hidden deep within hidden directories. If you try to move it, it will break. If you try to copy it to a new PC, it won't work. If you try to edit the config files, there might be two or three copies in different locations, one of which supersedes the others.
We don't need to return to the dark days of MacOS and DOS to make things simpler under the hood and easier for users to mess around with. We all want to write programs that are intuitive when they're running, but perhaps we also need to make them intuitive to mess around with when they're not running. OS's and programs can still be complex and user friendly even if we embrace simpler and more hackable installation footprints. Heck, just leaving comments in the config files to explain what things do would be a giant leap forward for some programs.
Stop treating software like a turnkey installation and start treating it more like a car. Expect some of your customers to look under the hood and make modifications.
I'm not entirely sure what you mean, but I think I'm all for it! :D
Ancient computer magazines seem like magical treasure troves. I recently linked to an issue of Creative Computing which included the source listing for the Oregon Trail game that you could type in and run, as well as an article describing the game and a transcript of someone playing (and actually winning, haha.)
I'd say that's pretty interesting and educational because the code has to go through your brain and out your fingers, and then you have to debug it afterward. It cracks the black box open and shows you the components inside; even if you might not fully understand them immediately, you can see what they are made of and how they work at a micro level. And limited system memory, page count, and finger endurance seem to have kept source listings reasonably short and manageable, while still enabling interesting and non-trivial applications. There may also be more incentive to explore both the gameplay and its implementation in some detail since you put so much effort into it, rather than just clicking on the next opaque binary download.
That being said... github is also a magical treasure trove, but usually without the friendly magazine articles explaining the software, so the onramp is steeper even if you don't have to type the code in yourself. I'd like to see a github index of short-form, relatively self-contained software of this sort. Ideally with accompanying articles. ;-)
I'd also like to see more modern magazines like the MagPi.
Ah, the irony of considering Linux more complicated than Windows or Mac because it doesn't hide the complexity... compared to adding an extra layer of complexity on top of the real complexity just to hide it.
I think some here are missing his point: he isn’t talking about Windows or macOS, but Linux, and even then not necessarily as a stand-alone OS but in a dual-boot scenario too.
OS X is an incredible operating system, vastly more powerful than anything else at all ever.
There’s a million ways OSX is a work of art in OS design.
More than anything, it just works and multitasks smooth as a Swiss watch.
Simple, intuitive, consistent.
Yesterday I was using my iPhone and it said “share network password from your Mac?” and it connected me to my local wireless. That’s beyond-belief ease of use.
OSX knows what it is.... a powerful consistent desktop OS, 1000x better than anything from the 1980s or 1990s.
I was there in the 1980s and 1990s and it just wasn’t easy to use. Nothing was.
But...why? Why would I ever need to interact with any of the application files? Why would I need to move it to another portion of the disk? Give it a shortcut to click on to activate it and who cares? It could be running one file or a million files. The only files I care about are the ones I create and edit. I can't imagine any need to interact with the application files. If your users need to interact with the application files or move them around in any way, you've already failed at usability.
The way the Mac handles it is a great way. An application is a directory with a special structure. The Finder treats that directory like a file: you can double-click it to launch the application, and you can move it like a single file. Usually, no installation is required. That also means uninstallation is a single delete. On the shell you can just cd into that directory like any other.
This way, the application developer gets all the benefits of a file system, while the user gets all the benefits of having a single file. While you don't need to do this constantly, being able to relocate an application on your file system is a very nice thing: moving it to a different volume or disk, having it in a user's home or the system Applications directory, and especially having any number of copies of the application (different versions) in as many places as you want.
Not having to "install" an application is a big advantage, and not having to "deinstall" is an even bigger one. Having the application around doesn't change your system. This is especially great if you run your systems for many years. Having to do a "clean install" is a symptom of a system that isn't cleanly separated.
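For anyone who hasn't poked inside one: a macOS .app bundle is laid out roughly like this (the app name here is made up for the example):

    MyEditor.app/              <- the Finder shows this directory as a single icon
      Contents/
        Info.plist             <- metadata: bundle identifier, version, icon name
        MacOS/
          MyEditor             <- the actual executable
        Resources/
          MyEditor.icns        <- icons, localizations and other assets

Everything the app needs travels with that one directory, which is exactly why copy and delete can stand in for install and uninstall.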
> Why would I need to move it to another portion of the disk?
Sometimes you want to store applications on different disks. Like say a removable USB drive so you can use it at multiple computers, or you want to run some from an SSD but others are ok running from a larger slower disk. Or maybe you just want to run them directly off the network. "Installation" is just a copy operation from one media to another, "uninstallation" is just a delete operation. In other words: there is a minimum of abstraction, the application is a single file and performing the same operations you do on any other file to them does exactly what you think it does and the application is stored exactly where you think it is stored.
The author mentioned that in the Haiku operating system, "each application is one package, and each package is one file that is loop-mounted into the filesystem to make its contents accessible."
Now, if you don't ever need to interact with application files, why invent all this complexity when you can just expose the files individually and never care about them again?
Maybe it's easier for development if they're individual? Who knows? As a user the number of files needed for an application is irrelevant. Maybe the dictionary that Word uses for the spellchecker doesn't need to be a part of the executable, so if they want to update it, they can replace just that file and they don't have to make you download the entire program again?