I had a Mac from around 1993 to 1997. I even wrote articles for minor league Mac journals.
My impression is that Microsoft was founded to bring programming to the people, and the Mac was founded to bring great software experience, which I appreciate. But Apple (represented in my mind by Steve Jobs) didn't want his platform to be flooded by crappy software. Even HyperCard never had his wholehearted support (I used it extensively, despite its odd and limited language), and I think Apple was glad to finally dump it.
If you had a reason to write crappy software, as I did, you ended up with Visual Basic. What I mean by "crappy" is software that solves a problem reliably, but can't be scaled. For instance, installation might require help from the author, and entering a bad value might make it crash, but a small handful of tolerant users can handle these issues.
The solution today, with open source, is that the people bring programming to the people.
Open source is completely impenetrable to most of the population.
Between an impressive list of incompatible languages, the vagaries of Git, the complexities of command line access, the reliance on some version of the Unix folder structure - with endless confusion between /etc, /local/etc/, /usr/local/etc and on and on, the weirdnesses of package managers, and completely unfamiliar editors like Emacs and VSCode, the scene might as well be proprietary.
VB at least made it possible to write software in a reasonably accessible and integrated way. So did HyperCard. So does Excel.
The real problem with Automator, AppleScript, etc., is that they're hopelessly underdocumented and the tools are mediocre.
If Apple had made them more prominent and documented them thoroughly they'd have been far more successful - perhaps in an industry-changing way.
But they seem more like side projects that someone spent a lot of time and a little money on. Clearly Apple was never seriously interested in them.
That is such hyperbole. I have no formal IT background and I taught myself the Linux command line, Python, and Emacs in the course of a single year after I heard about this new Linux thing that was free to install. This was 2000, and from communities like Slashdot I saw there were plenty of other nerdy young people taking up the CLI and other Linux stuff, just out of curiosity and without any connection with the IT industry.
Sure, none of this software will ever appeal to the broad masses*, but since the 1990s programming has become so much more accessible to the general public, because of both the free-of-charge software and the free-of-charge documentation. Learning to program on earlier platforms could be very expensive.
* Still, I have heard rumors of a corporation in the 1980s where the secretarial staff – middle-aged women with no formal computer-science education – wrote some custom Emacs Lisp and/or Awk. Maybe learning even relatively arcane stuff can be done by anyone if their salary depends on it.
I would still disagree. 80s home computers pretty much all came with BASIC built-in, up to the PCs with QBASIC. The documentation was right there, and you had everything you needed to just boot up and start programming. Many even booted up within the programming language by default.
And at the time, most programs were text-mode. I/O was only keyboard, screen, and disk (and maybe mouse, joystick, and serial port if you wanted to get fancy). When your program started you had full control of the entire system. There was no multitasking, no 100+ processes running in the background, no GUI architecture layered over an OS subsystem. So you only needed to learn a very, very few things to be able to build software roughly equivalent to professional store-bought software. Simple logic and data structures and 3-6 types of I/O, and that was it.
From there it was a short step to something more powerful like Turbo Pascal or Visual Basic, again both of which were pretty much "batteries included" everything you need in one tool.
Today's landscape with its enormous stacks, build tools, Cambrian explosion of languages, frameworks, and libraries, etc., and layers upon layers of abstraction and indirection between you and the machine, plus countless dependencies, package managers, containers, etc., all of which are constantly changing and shifting out from under you, is baffling even to professional developers.
Luckily, it is easier to find documentation, tutorials, and other actual people to ask about stuff. But also a lot of that is quickly outdated, faddish and evangelical, just assumes that you already know 500 other things, or assumes that you know absolutely nothing, so only covers the barest basics.
So it's really still more difficult than just picking up a book about C, QBASIC, Turbo Pascal, or Visual Basic and getting started was back in the day.
I've been mostly self-taught in programming since around 1981, and I've helped several friends and colleagues learn. My impression is that it's more difficult, like you say, but not prohibitively so.
People just instinctively steer clear of the "professional" tools and documentation, and choose their battles. Even well into the Windows 3.1 era, a lot of people who programmed stuck with MS-DOS text mode. Today, our code runs in Jupyter notebooks. We get stuff done. If it needs a framework or a container, we just don't go there.
There's a mild suspicion amongst amateurs that the professionals are creating the complexity for their own entertainment or job security. It doesn't seem to make the software better (most "users" think that software is getting worse and worse), or less costly to write and maintain.
To put it more charitably, the struggles of commercial programmers are not invisible to us. I work in a department adjacent to a large programming team. By not attempting to write commercial-scale software, we avoid many if not most of the headaches.
There was at least one popular home computer which was never just text mode: it booted into a permanent bitmapped graphics mode, directly into the BASIC line editor. Entering a command would execute it immediately; entering it with a line number in front would add that line to the program.
So right after boot a single command could draw a line or a circle. You weren't even supposed to type out the letters of a command; you entered it like on a pocket calculator: the commands were printed on the keys, in several colors, so just by looking at the keyboard you could see all the available commands.
The barrier was learning how to enter the mode that got you to the appropriate color, and learning the syntax and meaning of the commands.
Every user learned to enter at least one command and its syntax: the one which allowed loading any program from the tape. It was a full command with the empty string parameter meaning "match any name": LOAD ""
People are managing to get Python / Jupyter installations working using online tutorials, and they're writing and sharing useful code. The open source community brought us those tools. Today, all of the "top" programming languages have open-source toolchains.
I agree about the portability of VB, HC, and Excel code, including spreadsheets. At least, VB6. ;-) But people are managing with the more fragmented ecosystems of open source tools such as Python toolchains.
Interestingly, tools like ChatGPT can make some of these slept-on tools more usable: "Write me a script using __ that sends me notifications whenever __ happens"
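In practice, the scripts such a prompt tends to produce are just a short polling loop plus a call to whatever notification mechanism the OS exposes. A minimal sketch of that shape in Python, assuming macOS and a hypothetical "tell me when a new file shows up in this folder" request (the folder path, polling interval, and notification title are all made up for illustration):

    import os
    import subprocess
    import time

    WATCH_DIR = os.path.expanduser("~/Downloads")  # hypothetical folder to watch

    def notify(message):
        # Pops up a desktop notification via macOS's built-in osascript.
        # (Naive quoting; fine for a personal script, not for arbitrary input.)
        subprocess.run([
            "osascript", "-e",
            f'display notification "{message}" with title "Folder watcher"',
        ])

    seen = set(os.listdir(WATCH_DIR))
    while True:
        time.sleep(30)                       # poll every 30 seconds
        current = set(os.listdir(WATCH_DIR))
        for name in sorted(current - seen):  # anything new since the last check
            notify(f"New file: {name}")
        seen = current

None of this is hard, it's just obscure; a tool that writes the boilerplate for you removes most of the barrier.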
> My impression is that Microsoft was founded to bring programming to the people, and the Mac was founded to bring great software experience, which I appreciate.
Apple did develop a native version of Basic that let you create Basic programs that took advantage of the Mac UI, but Microsoft forced them to cancel it as a condition for continuing to license their Basic for the Apple II.
> Apple's original deal with Microsoft for licensing Applesoft Basic had a term of eight years, and it was due to expire in September 1985. Apple still depended on the Apple II for the lion's share of its revenues, and it would be difficult to replace Microsoft Basic without fragmenting the software base.
> Bill Gates had Apple in a tight squeeze, and, in an early display of his ruthless business acumen, he exploited it to the hilt. He knew that Donn's Basic was way ahead of Microsoft's, so, as a condition for agreeing to renew Applesoft, he demanded that Apple abandon MacBasic, buying it from Apple for the price of $1, and then burying it.
> He also used the renewal of Applesoft, which would be obsolete in just a year or two as the Mac displaced the Apple II, to get a perpetual license to the Macintosh user interface, in what probably was the single worst deal in Apple's history, executed by John Sculley in November 1985.
It seems like you're reacting to the phrase "bring programming to the people" occurring in the same sentence as Microsoft. I don't think GP was trying to present Bill Gates as having been on some kind of altruistic moral crusade. I think the point GP was making is that Microsoft was founded on compilers for hobbyists, whereas Apple was thinking about end user experience.
Maybe the phrase "to the people" is confusing things because it suggests some kind of noble, high-minded motivations on the part of Gates. That wasn't how Gates (or Jobs) thought. Actually, the whole idea of a company having some social responsibility mission was not something that existed at that time. These guys wanted to sell a product that people would pay money for, that's all. In that era, this was perfectly socially acceptable and completely normal.
The statement "Microsoft was founded to bring programming to the people" is true, but it may be a little unclear if people are reading it as suggesting something other than the profit motive. However, for a long time, Microsoft really did view its own success as dependent on providing customers ways to write and sell their own software. That is in fact empowering to the customer. I'm pretty sure Gates did care about this because it was what was driving sales of his products.
> They created a Mac native Basic at launch, and only Microsoft's anticompetitive arm twisting kept it from being released.
Wow, that almost makes me want to cry.
I discovered personal computers when the computer basically felt like a programming language.
I might be wrong, but if the convention of a friendly language installed on every computer out of the box had held, the evolution of “people’s” languages might still be a huge thing to this day.
Not disagreeing with your point about MS, but it's not a conspiracy theory that Macintosh dev tools were nonexistent. Apple's solution was "buy a Lisa". (well, maybe they considered this a 3rd party opportunity.) No indication they even wanted homegrown BASIC type stuff.
No, not at all, the folklore story doesn't even address Apple's strategy. Their developer relations were "Apply Here" back then. Macintosh shipped with zero developer tooling, not C/C++, not Pascal, not BASIC.
And, we have no idea what Donn's BASIC was actually like ... it probably wasn't "Visual Basic", or even a platform. Just better than Microsoft's halfazz port of their micro BASIC (which did have some QuickDraw commands).
If you'd like to see what Donn's BASIC was like, you can! The Internet Archive has a version that runs in-browser [1].
Although the Folklore article says there were two books describing Donn's BASIC, I believe there were at least three: Introduction to Macintosh BASIC, Using Macintosh BASIC, and The Macintosh BASIC Handbook. All three are available at vintageapple.org [2].
> we have no idea what Donn's BASIC was actually like
We literally know exactly what it was like. Dartmouth was already using a late beta of the software for a Basic class when Microsoft forced Apple to cancel the release of the final version.
From that story it sounds like MacBasic wasn’t a priority at all to Apple, even on the Mac team. The fact that they let MacBasic slip from the launch window is a pretty big red flag. I’d be interested to hear why Bryan Stearns decided to leave (I feel the author is saying a lot by not touching on this more).
I think the idea that Microsoft or Apple were founded for any reason but money is pretty much a joke. Every action and reaction in the story revolves around money and how to get it.
I suspect it might have been a timing/market positioning thing.
By 1984, "boots to BASIC" was reserved for low-end kit: C64, Atari XL, and increasingly Apple II. The IBM PC and clones were rapidly defining what a high-end personal computer was, and for them, BASIC was a useless appendix (Cassette BASIC on IBM kit) or a bullet point at the back of the pack-in software list (BASICA/GW-BASIC) -- it might be there, but commercial boxed software was the primary selling point.
> Gates and others at Microsoft had gone on the record repeatedly saying they intended for Windows and the Macintosh to be sufficiently similar that they and other software developers would be able to port applications in short order between the two. Few prospects could have sounded less appealing to Sculley. Apple, whose products then as now enjoyed the highest profit margins in the industry thanks to their allure as computing's hippest luxury brand, could see their whole business model undone by the appearance of cheap commodity clones that had been transformed by the addition of Windows into Mac-alikes. Of course, one look at Windows as it actually existed in 1985 could have disabused Sculley of the notion that it was likely to win any converts among people who had so much as glanced at MacOS. Still, he wasn't happy about the idea of the Macintosh losing its status, now or in the future, as the only GUI environment that could serve as a true, comprehensive solution to all of one's computing needs. So, within weeks of Jobs's departure, feeling his oats after having so thoroughly cowed Digital Research, he threatened to sue Microsoft as well for copying the "look and feel" of the Macintosh in Windows.
> He really ought to have thought things through a bit more before doing so. Threatening Bill Gates was always a dangerous game to play, and it was sheer folly when Gates had the upper hand, as he largely did now. Apple was at their lowest ebb of the 1980s when they tried to tell Microsoft that Windows would have to be cancelled or radically redesigned to excise any and all similarities to the Macintosh. Sales of the Mac had fallen to some 20,000 units per month, about one-fifth of Apple's pre-launch projections for this point. The stream of early adopters with sufficient disposable income to afford the pricey gadget had ebbed away, and other potential buyers had started asking what you could really do with a Macintosh that justified paying two or three times as much for it as for an equivalent MS-DOS-based computer. Aldus PageMaker, the first desktop-publishing package for the Mac, had been released the previous summer, and would eventually go down in history as the product that, when combined with the Apple LaserWriter printer, saved the platform by providing a usage scenario that ugly old MS-DOS clearly, obviously couldn't duplicate. But the desktop-publishing revolution would take time to show its full import. In the meantime, Apple was hard-pressed, and needed Microsoft -- one of the few major publishers of business software actively supporting the Mac -- far too badly to go around issuing threats to them.
> Gates responded to Sculley's threat with several of his own. If Sculley followed through with a lawsuit, Gates said, he'd stop all work at Microsoft on applications for the Macintosh and withdraw those that were already on store shelves, treating business computing henceforward as exactly the zero-sum game which he had never believed it to be in the past. This was a particularly potent threat in light of Microsoft's new Excel spreadsheet, which had just been released to rave reviews and already looked likely to join PageMaker as the leading light among the second generation of Mac applications. In light of the machine's marketplace travails, Apple was in no position to toss aside a sales driver like that one, the first piece of everyday Mac business software that was not just as good as but in many ways quite clearly better than equivalent offerings for MS-DOS. Yet Gates wouldn't stop there. He would also, he said, refuse to renew Apple's license to use Microsoft's BASIC on their Apple II line of computers. This was a serious threat indeed, given that the aged Apple II line was the only thing keeping Apple as a whole afloat as the newer, sexier Macintosh foundered. Duly chastised, Apple backed down quickly -- whereupon Gates, smelling blood in the water, pressed his advantage relentlessly, determined to see what else he could get out of finishing the fight Sculley had so foolishly begun.
> One ongoing source of frustration between the two companies, dating back well into the days of Steve Jobs's power and glory, was the version of BASIC for the Mac which Microsoft had made available for purchase on the day the machine first shipped. In the eyes of Apple and most of their customers, the mere fact of its existence on a platform that wasn't replete with accessible programming environments was its only virtue. In practice, it didn't work all that differently from Microsoft's Apple II BASIC, offering almost no access to the very things which made the Macintosh the Macintosh, like menus, windows, and dialogs. A second release a year later had improved matters somewhat, but nowhere near enough in most people's view. So, Apple had started work on a BASIC of their own, to be called simply MacBASIC, to supersede Microsoft's. Microsoft BASIC for the Macintosh was hardly a major pillar of his company's finances, but Bill Gates was nevertheless bothered inordinately by the prospect of it being cast aside. "Essentially, since Microsoft started their company with BASIC, they felt proprietary towards it," speculates Andy Hertzfeld, one of the most important of the Macintosh software engineers. "They felt threatened by Apple's BASIC, which was a considerably better implementation than theirs." Gates said that Apple would have to kill their own version of BASIC and -- just to add salt to the wound -- sign over the name "MacBASIC" to Microsoft if they wished to retain the latter's services as a Mac application developer and retain Microsoft BASIC on the Apple II.
> And that wasn't even the worst form taken by Gates's escalation. Apple would also have to sign what amounted to a surrender document, granting Microsoft the right to create "derivative works of the visual displays generated by Apple's Lisa and Macintosh graphic-user-interface programs." The specific "derivative works" covered by the agreement were the user interfaces already found in Microsoft Windows for MS-DOS and five Microsoft applications for the Macintosh, including Word and Excel. The agreement provided Microsoft with nothing less than a "non-exclusive, worldwide, royalty-free, perpetual, non-transferable license to use those derivative works in present and future software programs, and to license them to and through third parties for use in their software programs." In return, Microsoft would promise only to support Word and Excel on the Mac until October 1, 1986 -- something they would certainly have done anyway. Gates was making another of those deviously brilliant tactical moves that were already establishing his reputation as the computer industry's most infamous villain. Rather than denying that a "visual display" could fall under the domain of copyright, as many might have been tempted to do, he would rather affirm the possibility while getting Apple to grant Microsoft an explicit exception to being bound by it. Thus Apple -- or, for that matter, Microsoft -- could continue to sue MacOS's -- and potentially Windows's -- competitors out of existence while Windows trundled on unmolested.
> Sculley called together his management team to discuss what to do about this Apple threat against Microsoft that had suddenly boomeranged into a Microsoft threat against Apple. Most at the meeting insisted that Gates had to be bluffing, that he would never cut off several extant revenue streams just to spite Apple and support this long-overdue Windows product of his which had been an industry laughingstock for so long. But Sculley wasn't sure; he kept coming back to the fact that Microsoft could undoubtedly survive without Apple, but Apple might not be able to survive without Microsoft -- at least not right now, given the Mac's current travails.
You're exactly right that Apple has always had an ambivalent relationship with software developers. On the one hand, they needed them for their products to have any value for consumers. On the other hand, Jobs clearly saw low-quality, third party software as something that could taint the product Apple was selling. They really saw app developers as something very dangerous and difficult to control. Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.
I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.
>Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.
> I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.
That is wrong on so many levels. I grew up in a Windows environment in the '90s to '00s and had the best video games in the world at my fingertips, while Mac users at the time had what, some high-quality photo editing software? Good for them I guess.
So unless you don't count video games as software, Gates was right. DOS and Windows being the main platforms for the hottest video games of the moment was what led to the popularity of the PC over Macs at the time. I think Doom alone was responsible for millions of PC and MS-DOS sales.
Yeah, Jobs's NeXT machines and SW were cool, UNIXy and all, but to what end if nobody but the wealthiest businesses and institutions bought them, in numbers you can count on a few hands? In fact, Jobs understood from the failure of NeXT and the success of the DOS PC that you need to get your devices cheaply into the hands and homes of as many casual consumers as possible (hence the success of the colorful and affordable iMac G3) and forget the premium business workstation market.
I wrote the comment and I too grew up in the MS-DOS/Windows world. My family had a Mac and an Apple ][e, but I didn't use them as much as the various PCs.
As I say in another comment, I think Gates's view was right for that time. The Jobs view was needlessly limiting the potential market for his machines. Gates didn't make the machine, but he was making a thing that was sold with every machine, and he understood that more apps means more reasons why people might want to buy a PC.
One problem with the Gates view today is that if you make it easy for anyone to write and distribute any kind of app, you end up with not only a lot of low-quality software but even negative-quality software like ransomware and malware. It's surprising that so many people want to write that stuff, but apparently they do. Every modern platform, including Windows, has gone to extreme lengths to block this stuff, even though the more laissez-faire approach would be to just ignore it and assume other 3rd parties will write programs to block it. The problem was that malware was an existential threat to the entire Windows platform, and Microsoft couldn't afford to just let the market sort that one out.
I believe the Jobs view is the correct one for the present era. Every platform has app stores, code signing, DRM, and other kinds of security and non-security restrictions. It's certainly easier to run random code on Windows 10 than macOS Ventura (or especially iOS), but no platform vendor is completely app neutral these days.
I agree with you 100%. I think maybe the problem is that the platform vendors think it's too difficult to explain fine-grained access controls to end users, whereas an app store is dead simple to explain. And, as you observe, an app store makes money whereas ACLs do not.
Indeed, I think Jobs's approach to software dev was too far ahead of its time; Gates's pragmatic approach proved to have more leverage for computer/platform sales and growth.
In the present era, running random code is trivial - JavaScript, via a web browser. It runs in a sandbox, which limits what it can do, but it's a really big sandbox and you can do really useful things within it.
Apple has always gone for the premium end of the market, and with vastly increasing wealth inequality, that's where the money is these days. You can see this is in other luxury industries which make incredible amounts of money considering how small their markets are (in terms of numbers of participants).
This focus on high-quality software has also encouraged _better_ software developers to build in Apple's ecosystem. Even today a lot of the best software is only available for Mac or iPhone.
I agree with FirmwareBurner's sibling. All personal computers were expensive luxury items at that time. Most of them cost as much or more than a used car, which people would have considered much more useful than a computer.
Apple's machines were the most expensive, but it wasn't because they were higher quality (that was not the general perception outside the Apple world). It was because Apple refused to allow clone makers into the market, so they were manufacturing in tiny volumes compared to Dell and Compaq and a million other random names like Packard-Bell (Sears' store-brand PC) that probably no one remembers.
> Even today a lot of the best software is only available for Mac or iPhone.
I really don't see that at this point, and I do use my Mac every day. Most of the software I use is also available on Windows and Linux, and anything that isn't has a clear Windows or Linux equivalent.
The only software I use that is absolutely only available on a single platform with no equivalent is actually Windows software. I'm not a gamer, but that's apparently a whole category that is Windows-only.
I'm curious what Mac software you use in 2023 that is only available on Mac.
> I'm curious what Mac software you use in 2023 that is only available on Mac.
Ulysses, Bear, Things, Reeder, Mela, Alfred, MindNode, just to name a few. These apps are incredibly well designed, and have no equivalents in the Windows or Linux worlds. I know because I also have a PC and I’ve looked very hard for replacements.
Additionally, apps like 1Password, Scrivener, iA Writer, and Arc Browser started their life on the Mac. Some social media networks, like Instagram and VSCO, were iPhone apps before they released on other platforms. Because Apple users are generally not averse to spending money on software, all the really good software reaches the Mac and iPhone a long time before it becomes available on other platforms.
iCloud itself is something that no other platform can compete with. I never have to create a separate account for any app in the Apple ecosystem because they just sync using iCloud without me doing any extra work. When I install them on my phone later, my data is already there. The Windows/Android world has no equivalent of this.
Apps really are better in the Apple world. I blame Microsoft for underinvesting in native Windows UI toolkits.
> I'm not a gamer, but that's apparently a whole category that is Windows-only.
Not anymore, really. Proton has done an amazing job at making the huge library of Windows games out there work on Linux, so at this point it's a pretty good contender. (Hence the popularity of the Steam Deck.)
We were talking about the past. And back in those days when computers were expensive, most people and companies were writing SW for the most popular platform out there, which at the time was whatever had the best bang for the buck that home users could afford: Commodore, Sinclair, Amiga, MS-DOS, etc.
>Even today a lot of the best software is only available for Mac or iPhone.
Again, we are talking about the past. TikTok might be the hottest app today, but back then it was Doom.
I see a lot of poor people with iPhones, Apple Watches, EarPods and such. These are what you'd call an "affordable luxury", and probably a bargain when you consider you might go through five $300 Android devices in the time you get out of a $1000 iPhone, struggling with an Android all the while.
But the psychology is weirder, in that rich people know a "luxury" watch costs upwards of $10k, so it is quite the trick to convince the hoi polloi that a $500 watch is a luxury device at the same time.
I've noticed also that poor folks are also very aware of expensive wireless plans from Verizon but seem to have little awareness of wired internet plans, free Wi-Fi, etc.
... and will your Android actually charge when you plug it into a charger?
I've had roughly five Android devices (tablets and phones) with the curious problem that I'd plug them into a charger and it was always 50-50 whether they would actually charge. This would happen when the devices were new and starting from zero (often it took several tries to charge the device for the first time) and still happens when the devices are in regular use. All the time I leave one plugged into the charger overnight and... nothing. Plug it into the same charger again and maybe it charges.
I've noticed the same problem with Sony devices, both cameras and the PS Vita, if they are used with non-Sony chargers.
Not if they are kernel related, and only since APEX became a thing in Android 10.
However, it doesn't change the main point, most phone users care about updates on their phone as much as they care about updating any other device with a CPU in their home.
I meant that most vulnerabilities in the wild that pose a threat to users and end up actually being exploited by malware, are the ones usually covered by updating Google Play services.
Yes, kernel vulns will always be uncovered, but how often does the average user encounter malware that exploits that attack vector?
The biggest attack vectors on Android are Google Play services and your browser. Keep those up to date and you're good even with older phones.
Jobs' view sounds essentially the same as Nintendo's view in the wake of the video games crash, which eventually became the unanimous model in that industry. With iOS they also got to enforce it like a console, with a whitelist of acceptable software
I was so used to this view from Nintendo that it was hard to believe when I started seeing a lot of really low-quality games in the Nintendo Switch store nowadays. The worst part is that it's still not easy to publish there (or get access to the SDK in the first place) independently, without a publisher or a history of already successfully released games.
Yeah, I think a big part of how Gates and others came into their view was that there were so many examples like Nintendo where a platform vendor had artificially limited the third party ecosystem and it constrained the success of the platform as a whole.
Basically, the Gates view was the more "correct" one for the 1980s/1990s. At that time, most people did not own personal computers, even many businesses didn't, and growth came from on-boarding these non-users. The more apps available on your platform, the more likely a non-user would be to see a reason to buy a computer at all. Also, the platform Gates was selling was much cheaper to try.
Today, everyone owns at least one computing device. Platforms aren't competing for non-users (there are none left), they are competing for users of other platforms. The Jobs view works better in that world since it improves platform quality perceptions.
By the way, there is no particular reason why the computer field had to go in this direction. If you look at the history of aviation, there were a lot of kit airplanes in the 1920s and 1930s because people thought that over time more and more people were going to want to own an airplane. That turned out to be wrong. Most people are content to pay somebody else to transport them.
SaaS is the computing equivalent of an airline, and it's very popular, but it hasn't fully displaced individual computer ownership. We'll see what happens long term.
The writing is on the wall though, isn't it? Even companies that you'd expect to go all-in on user freedom and ownership have locked down their platforms and ratcheted up the service offerings. Not a single FAANG company is willing to take a definitive stand in defense of their users. They all just see a bottom line waiting to be butchered.
And for most users, I'd wager SaaS has displaced computer ownership. Stuff like the App Store, Play Store and even game consoles are ultimately a service unto themselves. Even disregarding that, the "happy path" of modern software is paved far away from user empowerment. People are generally more willing to pay for a subscription than to switch to a more convenient hardware platform.
I'm really not sure. Five years ago, I thought that SaaS would completely replace local computing. Now, I'm less certain.
Yes, SaaS is very popular and there are far fewer local applications now. However, there still seems to be some value in local applications because it's thought to be a better end user experience. Look at Slack. It started out as a website, pure browser-based SaaS. Over time they developed a local app, which is basically the same Javascript but packaged up as a local Electron app. This seems to be regarded as superior to the web version for most people.
Consider Zoom. There was initially a native client and an equivalent web version. They seem to be actually killing off the web version now -- access to it is a non-default option for the meeting creator, buried on a screen with 100+ meeting settings nobody understands or reads. They apparently want to be a local-only program.
As long as there is demand for local apps, people will continue owning general purpose, re-programmable microcomputers. Maybe the vendor can lock you out of some low-level stuff like Intel ME or the BIOS, but these platforms are still allowing people to download a compiler and run their own programs directly on the metal.
I'm not sure what direction it will ultimately go, but my hunch is that if it were possible for SaaS to 100% displace local computing, it would have already happened.
Perfectly fair response. I think your examples are also good evidence that people like native software, or at least the appearance of a cohesive platform. Apparently "non-native" apps like an SPA or WASM tool will probably turn most normal users off. Good enough native support can make a web client redundant, even if 'native' means Javascript with OS-native controls.
To counter that though, I think we have to define SaaS concretely and ask what a more service-driven world would look like. I consider SaaS to not only be the paid components of an OS, but also the "free" services like Google and the App Store. Holistically speaking, our OSes are saturated with services; Find My, Android Notify services, OCSP/Microsoft/Google telemetry, online certificate signing, and even 'assistants' that fail to assist without internet. What more could we possibly put online?
It's a bit of a rhetorical conclusion to leave things on. I'll certainly see worse examples in my lifetime, but the status quo is bad enough in my opinion.
I agree with you about the absurd degree of service saturation. I've discovered many things I cannot do without an Internet connection because one trivial step in the process absolutely requires communication with some network endpoint. I found one app where, if you block its ability to send analytics reports back to Mixpanel, it will run 3 or 5 times but after that refuse to run again until an Internet connection is available. I thought it was absurd, since it proves the developers actually considered offline use cases but decided that they couldn't allow that to go on indefinitely.
Anyway, sure, let's leave it there and see where things are in 5 years. It'll be interesting to find out!
Only if the users can be satisfied by the supposedly higher quality software, produced by devs willing to pay the Apple tax and unafraid of being Sherlock-ed
There are only two smartphone platforms, and both of them have app stores and make other efforts to control the overall end user experience. What do you disagree with?
> Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.
Gates's view was also that, once the market had sorted the good from the bad, Microsoft would, if desired, just implement whatever killer apps third party developers had discovered and make them part of Windows, cutting the floor out from under the third party developers. In other words, Gates viewed third party developers as doing market research for Microsoft that Microsoft didn't have to pay for.
> ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform
Windows is still well ahead of both macOS and iOS in terms of market share, so I think it's more that Jobs and Gates had different definitions of success.
Yep, and Apple does this too. The Spotlight feature in macOS (OS-wide search, it's the little magnifying glass in the top right) was a feature that Apple blatantly ripped off from a popular early-2000s third party extension. Adding this to the OS killed that company overnight. Forget their name, but someone else might remember.
All platform vendors seem to do this, which is unfortunate.
Twice: The usage of a giant text input box as a launcher or command palette was pioneered on the Mac by LaunchBar and Quicksilver and later Alfred, etc. Spotlight for a long time was only a menu extra.
> Gates's view was also that, once the market had sorted the good from the bad, Microsoft would, if desired, just implement whatever killer apps third party developers had discovered and make them part of Windows, cutting the floor out from under the third party developers. In other words, Gates viewed third party developers as doing market research for Microsoft that Microsoft didn't have to pay for.
Reminds me of how Amazon sees third-party sellers.
Maybe that is the standard abusive strategy of platforms that hold a dominant market position.
> Jobs clearly saw low-quality, third party software as something that could taint the product Apple was selling.
I mostly disagree with this, so long as the low quality software isn't pushed by the platform then it's not that much danger to the reputation. Lack of quality software is a much larger danger, and it's not easy to tell ahead of time where the quality will come from. The big problem is when you are running your own app store and it becomes packed with garbage.
It feels like they eventually managed to enforce this when the App Store came along, rejecting “useless” or “redundant” apps. Eventually they gave up on that idea because they loved seeing the number of apps rise, even if those apps were, in the end, low-quality software.
If they didn't have hidden 1st party api stuff that 3rd parties have difficulty leveraging this moniker would feel more useful/relevant, IMO. In the current context they use these labels to protect their workflow from other entities.
When I set up computers for family members, I've used things like Windows 10 S mode BECAUSE it limits the software that can be installed. They don't have the skills to know which applications to trust.
At my work, I similarly have gotten secure administrative workstations deployed with again, very limited lists of software. These users being compromised has such extreme consequences that one cannot trust any application to run on these computers.
So I can certainly appreciate the need to be very exclusive about the software allowed on a device. Yet I see this exclusivity being rather anti-consumer in practice. iOS is the most secure operating system I've ever seen for the average person, and yet things like ad blocking are behind a paywall. Neutering ad blocking for the sake of security is considered acceptable. Blocking alternatives to WebKit may reduce the overall attack surface Apple contends with, but it also limits the functionality of the phone, since V8 is the de facto standard. Blocking alternatives to the App Store absolutely enhances security while also forcing consumers to pay more. Then you get into things which truly have nothing to do with security, like how you can only make "Music" your default music player on iOS, not Spotify.
I really appreciate what Windows did with "S mode" because it's optional yet provides benefits to consumers in niche situations. I similarly appreciate the enterprise features of Windows to configure a laptop which has limited access to software to enhance security. Apple on the other hand forces the exclusive use of their software to benefit themselves at the expense of their customers. It is unacceptable and I would vote for anybody pledging to outlaw it as monopolistic abuse. Software exclusivity is useful, shoving it down people's throats is abhorrent.
Personally I’m not annoyed at low quality software - I just avoid it. What I draw a bright red line at is intrusive/abusive software (essentially malware). I’m glad apple enforces a standard with regard to software behavior that is acceptable versus not.
The 128k Mac that came out in 1984 might have been the first microcomputer that didn't come with some kind of BASIC, machine language monitor, or other tools for developing software. That is, the point of every micro up until then was that you were going to program it yourself. In some sense it was a real step backwards in terms of people losing control of computers that Apple has continued with iDevices.
At first you could not do any Mac software development on a Mac at all; instead you were expected to get a Lisa if you wanted to develop for the Mac. Part of that was that even though the Mac had twice the memory of most computers at that time, the GUI ate up a lot of it, so you could just barely run the bundled MacWrite and MacPaint. Part of the reason why Jobs insisted on keeping the Mac monochrome until the 1987 Mac II was to conserve memory. (When it did come out, the Mac II was priced closer to a Sun workstation than even an IBM PC AT.)
I remember that, it's why I didn't get a Mac. Although it was a few years later for me.
The only computer that I'd used up until then was an Apple II that had been donated to our school. I managed to talk my parents into buying one, and they took me to the store but I couldn't find one. Asked the salesman and he said they didn't sell them anymore but now they had these new Macintosh computers.
I looked at the rinkydink black and white screen with a mostly-blank desktop with just a few icons you could drag around or click on it. Couldn't even figure out how to get to the command line, let alone into BASIC. When I asked the salesman he said that you couldn't and seemed utterly baffled that someone would actually want to use their computer.
So I left and went and bought a PC clone instead. Plugged it in and booted it up and I was coding immediately. PCs just worked, and let you do whatever you wanted. Macs were kind of toy computers at the time.
That philosophy still kind of holds to this day, though less so now. I use a Mac for work, and luckily they did finally make it possible to get to a command line and to program on it. At some point, they realized they had to give some power to it or else become utterly irrelevant. But you can tell that it's still not intended for power users. It still can't run most software, and what it does have is limited and simplified, from the OS on up, and still defaults to making things difficult except for the simplest use cases.
> Even HyperCard never had his wholehearted support
When HyperCard was introduced Steve Jobs didn't work at Apple. When he came back, he killed it along with many other projects he had no role in, like the Newton.
I say this as someone that loved the Newton. The Newton was a millstone for Apple at the time. Just like printers, scanners, and cameras.
The Newton had several problems. The first was that it was intended to be an entirely new platform, distinct from the Mac. There was zero overlap between the runtime environments of the two platforms. Nothing you made on a Mac was ever going to run on a Newton. This would stretch already thin third-party developers even thinner.
The second major problem is it had cheaper competition with a better market fit in the form of Palm. A Palm Pilot was half the price of a MessagePad and did most of the same tasks. It also actually fit in a pocket which meant it could be carried around and used as intended.
A third problem was that its OS was an older model lacking memory protection and preemption. By 1997 it was clear that multitasking, protected-memory OSes were the future, if for no other reason than increased stability in day-to-day operation. Rebuilding NewtonOS with those features would be a major project.
The MessagePads were bulky and expensive. They were too big to fit in a pocket, meaning the only way to carry them was a bespoke case or a briefcase. They weren't that capable, so a true road-warrior worker was just going to get a laptop. Their target market was John Sculley types: executives who didn't want to tote around even bulkier laptops.
The Newton didn't make a lot of sense as a product and killing it off with the rest of the Apple peripherals made complete business sense.
I don't disagree. I have a soft spot for the Newton because I had a lot of respect for Larry Tesler, who got Apple interested in Common Lisp for a while as part of the Newton project. CL was used to invent Dylan which was originally intended as the Newton programming language. Dylan ended up like the Newton: The invention itself didn't have much impact but the project and the people who worked on it moved computer science forward in many important ways.
Oh, and the spin-off of ARM as an independent company, with Apple as a founding investor, came out of the Newton project.
The Newton had been out for about three years by the time the US Robotics Palm Pilot debuted. However, the first really good Newton was the 2000 and that was right around the time the Palm Pilot was released.
I was rooting for the Newton but at the same time, I found myself mostly using my Palm Pilot (and later my Handera 330) while my MessagePad sat on my desk unused.
> My impression is that Microsoft was founded to bring programming to the people
Why do you think that? I was not around in that period, but the impression I get is that the foundation of Microsoft[0] was antithetical to "bringing programming to the people".
Long before entering the operating system business, Microsoft was a company dedicated to making tools for programmers, like compilers and interpreters for many languages, for the CP/M operating system and for most kinds of early personal computers.
Tools for programmers remained a major part of their products throughout the MS-DOS era, until the early nineties. The documentation for their compilers and libraries was excellent, and many people learned a lot about programming from the Microsoft manuals (or from reverse engineering their programs).
Only after the transition to Windows, and their success in suppressing the competition for MS Office through their control over the development of the operating system, did the programming-tools business progressively become only a small part of Microsoft's products, and programmers only a small part of their customers.
All you need to do is head over to archive.org and look at a Byte magazine from "back in the day". It was filled with ads and reviews for programming tools like Turbo C and Turbo Pascal. It was a golden age, and I was there for it.
That famous Gates letter is not inconsistent with the idea of large numbers of hobbyist programmers. He cared a lot about the piracy issue, but he was never trying to gate who was "allowed" to be a programmer.
Microsoft's first product was a BASIC interpreter. It was all about bringing programming to the masses. Saying "to the people" implies some kind of empowerment, and I don't think that was ever quite as much a part of Microsoft's identity (although it was there in some form). "To the masses" is a better way to describe it in my opinion.
We had an original IBM PC (not a clone, genuine IBM), and I actually don't recall it doing that. Maybe that was on some models and not others?
The way I accessed BASIC was to boot up PC-DOS (IBM-branded version of MS-DOS) and then run some program that was like QBASIC although it might not have been called that yet. Later I switched to MS-DOS 3.21, and that came with real QBASIC. I also switched from BASIC to Turbo Pascal and Turbo C. Great times!
They sold software tools at the beginning and further down the line used the tools to become the platform of choice for corporations. They controlled the OS and the tooling and this gave them great advantage in business software market share.
You could get Windows source (or parts of it) if you needed to work on low-level stuff from MS and their MSDN was and is miles better than Apple developer documentation and tooling.
Bringing programming to the people does not mean giving away your work gratis. The fuss over the increasing use of non-free but source-available licenses instead of OSI-approved FOSS licenses shows that people are coming around to Gates' viewpoint.
> Apple (represented in my mind by Steve Jobs) didn't want his platform to be flooded by crappy software.
This kind of shocks me.
When you go through Macintosh Garden and look at all the software (there were tons), a hefty enough portion of it looks like the '80s/'90s equivalent of App Store fart apps: "porno" apps with interfaces and interactivity that could have been constructed in PowerPoint.
So that (gatekeeping) didn't work. And then they extended the life of all of that garbage with the Blue Box.
> My impression is that Microsoft was founded to bring programming to the people, and the Mac was founded to bring great software experience
They were both founded to make money. Microsoft was never close to the people and the "great software experience" on the Mac, if it existed, was limited to rich people.
Actually, rich people at that time were probably the least likely to own a computer. If you're talking about somebody who lives in mansion and has a butler and a chauffeur, that type of person was not buying computers and would not have had any interest in them. (I don't know what those people did with their time, but it sure as hell wasn't computing.)
Personal computers were expensive by middle class standards [1], and the sweet spot was educated professionals like doctors and lawyers because they could afford it and it was interesting/useful to them and their families. A rich person who inherited an oil fortune would have been able to afford it, but not interested. A teacher would have been interested, but unable to afford it.
[1] At a time when a decent, no-frills used car would have been around $1,000, you had something like this: ~$700 for a more limited VIC-20 or Z80-based machine, ~$2,000 for a PC-clone, and maybe double that for the Macintosh. For most buyers, it was a big but not impossible expense.
Yeah, that's totally fair. I think I misunderstood because the word "rich" in that era was really reserved for people with butlers and yachts, and excluded those who were merely highly paid professionals like doctors and lawyers. But the word "rich" today is used somewhat differently, for most people it probably means to include highly paid professionals as well as CEOs and oil fortune heirs.
Also, at that time, there was not this extreme divergence in income that you have today. A lawyer might have made more money than a police detective, but the difference wasn't really that extreme and often these two people lived in the same neighborhood. Since that doesn't seem to be true anymore, it makes sense that the definition of "rich" has shifted.
I think that Apple wanted to appeal to makers at the beginning to get traction, but really what they wanted was more consumers. Great, long-lasting hardware, but it's for the elite that can afford it; most people on HN are in that elite. Ultimately, apart from raising the bar in user experience, it's just another money-making corporation. And that does not make the world a better place.
Yeah, the whole Apple cult thing has never made much sense. I remember talking to some environmentalists in the mid-nineties who cared a lot about reducing industrial activity but were also strongly advocating for the Mac. I was trying to ask why they felt so strongly about that, given that any computer is basically a non-essential luxury good that requires a lot of toxic metals to produce [1], whereas pencil and paper has very low environmental impact and even some recyclability. There wasn't a clear answer. I think it's irrational.
[1] at that time, electronics used a lot of lead, cadmium and mercury. It's less of an issue now.
Officially, their dev tools cost money. Here is the 1997 pricing for the various editions of Visual Basic 5.0 and Visual C++ 5.0: $99, $499, and $1,199.
Unofficially, almost nobody ever paid for that stuff in my experience and it was ridiculously easy to get for "free." As was the norm in those days there was zero copy protection. You could sail the pirate seas, you could buy one "legit" copy and share it with the whole org, you could get the MSDN subscription and get "evaluation" editions of literally their entire dev tools catalog on a big stack of CDs with zero limitations besides the legality of the situation. I'm sure that with minimal effort it was easy to get a hold of Mac dev tools for free as well. But Windows was so ubiquitous, as was "free" Microsoft software, and you could build a nice little PC for cheap.
I always wondered why Microsoft charged for that stuff in the first place. Were they actually making money on it directly? Did they sort of want to give it away for free, but were wary of antitrust implications?
Apple had a different strategy. Perhaps this is an incorrect perspective, but it seemed like they just didn't care about independent garage and small-business developers. They left things up to 3rd parties. In a lot of ways this was better -- it probably led to a richer ecosystem? For example, looking at the Microsoft strategy, it's not hard to see that they drove Borland into the grave.
This was a little bit later in the timeline, but I remember attending an event in Philadelphia about F#, and I walked out of it with a licensed copy of Visual Studio .NET and SQL Server. They were just handing them out. I had an earlier copy of Visual Studio from when I was in college, and I was working in .NET at work, but no one at the event knew that. It was just - thanks for showing up to hear about functional programming, take some DVDs on your way out!
It always felt to me like there was a culture of openness in the world of Microsoft in a way that didn't exist in Apple culture. You gotta pay to use dev tools on a Mac. You want an FTP client on a Mac? Buy Cyberduck or Fetch. My impression of Apple was that everything was proprietary and had a price. Whereas I could cobble a computer together that ran Windows, and there were oodles of shareware and freeware as well as professional tools. You have full access to the registry and file system in Windows, and you could very easily hack the bejesus out of the OS if you wanted to. It was great for tinkering. Everything was backwards compatible - to the point where I had Windows 10 running on a 2005-era Dell laptop that had come with Windows XP, and I had managed to upgrade legitimately without paying (I think the Windows 8 beta converted into a full Windows 8 upgrade, free).
Today, I'm typing this from a 2021 MacBook Pro with USB-C ports - when I travel for work, I bring one charger, and it charges my laptop, my phone, my earbuds, even my mirrorless camera. When I need software, I can usually find something using Homebrew. The value you get for your money on a Mac is much better, but it's still a steep barrier to entry, IMO - even though I'm in a much better position today, and bought this without breaking a sweat. There's a lot of tinkering-related things I miss about the Microsoft ecosystem, but I've largely moved out of the weeds for my day-to-day work on a computer. All the software I was using on my Windows machine is multi-platform now, and the performance and battery life on these Apple chips is hard to ignore. As a developer, it's just as simple, if not easier, to build on Macs now - ever since OS X opened up the Unix ecosystem on these devices. That, in conjunction with superior hardware, finally convinced me to switch after at least three decades of being staunchly Microsoft.
Hahaha, I hate it, but I'm doing that thing where I agree with the other 50 things you said and point out the one thing I don't agree with:
> The value you get for your money on a Mac is much better, but it's still a steep barrier to entry, IMO
Is it that steep? A refurb M1 Macbook Air straight from Apple with 1 year of AppleCare is $850. $1,189 if you want 16GB + 512GB.
That is more money than a lot of people can afford, especially outside of the US.
But boy, it's so much cheaper than ever. That's like, two Xboxes (+ controllers + a game or two) so it feels within the reach of a broad swath of the population.
Admittedly... you can get a nice-enough Windows/Linux developer desktop or laptop with a little Craigslist or FB Marketplace effort for like, $150.
I'm not excusing it, but people had to pay for access to a computer in the first place, and ways of funding software development without selling either hardware or software, such as advertising, were not really prevalent yet.
Microcomputers greatly expanded private access to computers, and Microsoft brought programming to microcomputers.