You're exactly right that Apple has always had an ambivalent relationship with software developers. On the one hand, they needed them for their products to have any value for consumers. On the other hand, Jobs clearly saw low-quality, third party software as something that could taint the product Apple was selling. They really saw app developers as something very dangerous and difficult to control. Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.

I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.



>Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad. I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.

That is wrong on so many levels. I grew up in a Windows environment in the '90s to '00s and had the best video games in the world at my fingertips, while Mac users at the time had what, some high-quality photo editing software? Good for them, I guess.

So unless you don't count video games as software, Gates was right. DOS and Windows being the main platforms for the hottest video games of the moment was what led to the popularity of the PC over Macs at the time. I think Doom alone was responsible for millions of PC and MS-DOS sales.

Yeah, Jobs's NeXT machines and their NeXTSTEP software were cool, UNIXy and all, but to what end if nobody but the wealthiest businesses and institutions bought them, in numbers you could count on a few hands? In fact, Jobs understood from the failure of NeXT and the success of the DOS PC that you need to get your devices cheaply into the hands and homes of as many casual consumers as possible (hence the success of the colorful and affordable iMac G3) and forget the premium business workstation market.


I wrote the comment and I too grew up in the MS-DOS/Windows world. My family had a Mac and an Apple ][e, but I didn't use them as much as the various PCs.

As I say in another comment, I think Gates's view was right for that time. The Jobs view was needlessly limiting the potential market for his machines. Gates didn't make the machine, but he was making a thing that was sold with every machine, and he understood that more apps means more reasons why people might want to buy a PC.

One problem with the Gates view today is that if you make it easy for anyone to write and distribute any kind of app, you end up with not only a lot of low-quality software but even negative-quality software like ransomware and malware. It's surprising that so many people want to write that stuff, but apparently they do. Every modern platform, including Windows, has gone to extreme lengths to block this stuff, even though the more laissez-faire approach would be to just ignore it and assume other 3rd parties will write programs to block it. The problem was that malware was an existential threat to the entire Windows platform, and Microsoft couldn't afford to just let the market sort that one out.

I believe the Jobs view is the correct one for the present era. Every platform has app stores, code signing, DRM, and other kinds of security and non-security restrictions. It's certainly easier to run random code on Windows 10 than macOS Ventura (or especially iOS), but no platform vendor is completely app neutral these days.


I don't think closed app stores are the answer. They don't actually prevent malware.

Much better is a genuine permissions system that the user controls and the apps can't circumvent.
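For what it's worth, the browser's permission model is a rough sketch of how that can work: the app has to ask, the user decides, and the runtime enforces the answer. A minimal TypeScript illustration using the standard Permissions API (geolocation is just an example capability):

    // Sketch: ask the user-controlled permission layer before touching a
    // sensitive capability. The app cannot bypass the browser's own prompt.
    async function readLocationIfAllowed(): Promise<void> {
      const status = await navigator.permissions.query({ name: "geolocation" });

      if (status.state === "denied") {
        console.log("User said no; degrade gracefully.");
        return;
      }

      // Even when the state is "granted" or "prompt", the call below still
      // goes through the browser's enforcement, not the app's goodwill.
      navigator.geolocation.getCurrentPosition(
        (pos) => console.log(pos.coords.latitude, pos.coords.longitude),
        (err) => console.log("Request rejected:", err.message)
      );
    }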

The reason closed app stores are so popular isn't because of security but because everyone wants a finger in that sweet 30% take pie.


I agree with you 100%. I think maybe the problem is that the platform vendors think it's too difficult to explain fine-grained access controls to end users, whereas an app store is dead simple to explain. And, as you observe, an app store makes money whereas ACLs do not.


Indeed, I think Jobs's approach to software development was too far ahead of its time; Gates's pragmatic approach proved to have more leverage for computer/platform sales and growth.


And greater innovation in the application space.

Not that there wasn’t innovation on the Mac, but not as much, and often in the end, not as successful.


In the present era, running random code is trivial - JavaScript, via a web browser. It runs in a sandbox, which limits what it can do, but it's a really big sandbox and you can do really useful things within it.
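To make that concrete, the sandbox is something the page explicitly opts untrusted code into; a minimal sketch (only the "allow-scripts" capability is granted here):

    // Run untrusted code in a sandboxed iframe: scripts may execute, but they
    // get an opaque origin, no storage, no popups, no top-level navigation.
    const frame = document.createElement("iframe");
    frame.sandbox.add("allow-scripts");
    frame.srcdoc = `<script>parent.postMessage(6 * 7, "*");</script>`;

    // The only way out of the sandbox is explicit message passing.
    window.addEventListener("message", (event) => {
      console.log("untrusted code says:", event.data); // 42
    });

    document.body.appendChild(frame);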


Apple has always gone for the premium end of the market, and with vastly increasing wealth inequality, that's where the money is these days. You can see this in other luxury industries, which make incredible amounts of money considering how small their markets are (in terms of numbers of participants).

This focus on high-quality software has also encouraged _better_ software developers to build in Apple's ecosystem. Even today a lot of the best software is only available for Mac or iPhone.


I agree with FirmwareBurner's sibling. All personal computers were expensive luxury items at that time. Most of them cost as much or more than a used car, which people would have considered much more useful than a computer.

Apple's machines were the most expensive, but it wasn't because they were higher quality (that was not the general perception outside the Apple world). It was because Apple refused to allow clone makers into the market, so they were manufacturing in tiny volumes compared to Dell and Compaq and a million other random names like Packard Bell (Sears' store-brand PC) that probably no one remembers.

> Even today a lot of the best software is only available for Mac or iPhone.

I really don't see that at this point, and I do use my Mac every day. Most of the software I use is also available on Windows and Linux, and anything that isn't has a clear Windows or Linux equivalent.

The only software I use that is absolutely only available on a single platform with no equivalent is actually Windows software. I'm not a gamer, but that's apparently a whole category that is Windows-only.

I'm curious what Mac software you use in 2023 that is only available on Mac.


> I'm curious what Mac software you use in 2023 that is only available on Mac.

Ulysses, Bear, Things, Reeder, Mela, Alfred, MindNode, just to name a few. These apps are incredibly well designed, and have no equivalents in the Windows or Linux worlds. I know because I also have a PC and I’ve looked very hard for replacements.

Additionally, apps like 1Password, Scrivener, iA Writer, and Arc Browser started their life on the Mac. Some social media networks, like Instagram and VSCO, were iPhone apps before they released on other platforms. Because Apple users are generally not averse to spending money on software, all the really good software reaches the Mac and iPhone a long time before it becomes available on other platforms.

iCloud itself is something that no other platform can compete with. I never have to create a separate account for any app in the Apple ecosystem because they just sync using iCloud without me doing any extra work. When I install them on my phone later, my data is already there. The Windows/Android world has no equivalent of this.

Apps really are better in the Apple world. I blame Microsoft for underinvesting in native Windows UI toolkits.


That's not down to iCloud doing the sync.


> I'm not a gamer, but that's apparently a whole category that is Windows-only.

Not anymore, really. Proton has done an amazing job at making the huge library of Windows games out there work on Linux, so at this point it's a pretty good contender. (Hence the popularity of the Steam Deck.)


Not every gamer is a Steam customer.

I have a library of ISO oldies that won't "just work" on Linux without a shit-ton of googling and fiddling around.


>that's where the money is these days.

We were talking about the past. And back in those days when computers were expensive, most people and companies were writing software for the most popular platform out there, which at the time was whatever had the best bang for the buck that home users could afford: Commodore, Sinclair, Amiga, MS-DOS, etc.

>Even today a lot of the best software is only available for Mac or iPhone.

Again, we are talking about the past. TikTok might be the hottest app today, but back then it was Doom.


> Even today a lot of the best software is only available for Mac or iPhone

Only about 1% of the whole games industry is available for the Mac, and there are amazing pieces of software in games alone.


I see a lot of poor people with iPhones, Apple Watches, AirPods and such. These are what you'd call an "affordable luxury", and probably a bargain when you consider you might go through five $300 Android devices in the time you get out of a $1000 iPhone, and all that time you're struggling with Android.

It's kinda like

https://www.goodreads.com/quotes/72745-the-reason-that-the-r...

but the psychology is weirder in that rich people know a "luxury" watch costs upwards of $10k so it is quite the trick to convince the hoi polloi that a $500 watch is a luxury device at the same time.

I've also noticed that poor folks are very aware of expensive wireless plans from Verizon but seem to have little awareness of wired internet plans, free Wi-Fi, etc.


>you might go through 5 $300 Android devices in the time that you get out of a $1000 iPhone

Maybe take care of your Android as you would an iPhone and it will also last as long as an iPhone, but at 1/3 the cost.

Most people (worldwide) are not spending $1k on a phone; even $300 is a lot.


Will you get security updates for a decade for Android phones like the iPhone 5s has had?


Yes, Google Play services, where most of the attack vectors lie, will get updated.

Most people don't keep a phone that long but change it after 4-5 years or so. How many 5s still exist in the wild?


... and will your Android actually charge when you plug it into a charger?

I've had roughly five Android devices (tablets and phones) that had the curious problem that I'd plug them into a charger and it was always 50-50 whether they would actually charge. This would happen when the devices were new and starting from zero (often it would take several attempts to charge a device for the first time) and still happens when the devices are in regular use. All the time I leave one plugged into the charger overnight and... nothing. Plug it into the same charger again and maybe it charges.

I've noticed the same problem with Sony devices, both cameras and the PS Vita, if they are used with non-Sony chargers.


I used the Wileyfox Swift for seven years until LineageOS stopped updates for it. I expect similar longevity from contemporary contenders.


Probably not, most people don't care.


Most android vulnerabilities get patched via Google Play Services.


Not if they are kernel-related, and only since APEX became a thing in Android 10.

However, it doesn't change the main point: most phone users care about updates on their phone about as much as they care about updating any other device with a CPU in their home.


I meant that most vulnerabilities in the wild that pose a threat to users and actually end up being exploited by malware are the ones usually covered by updating Google Play services.

Yes, kernel vulns will always be uncovered, but how often does the average user encounter malware that exploits that attack vector?

The biggest attack vectors on Android are Google Play services and your browser. Keep those up to date and you're good, even with older phones.


Good boots are like PV panels. Poor people don't have the money to buy them. The middle class spends less in the end, since the investment pays off.

Rich people, on the other hand, usually did not get rich by saving money.


Jobs' view sounds essentially the same as Nintendo's view in the wake of the video game crash, which eventually became the universal model in that industry. With iOS they also got to enforce it like a console, with a whitelist of acceptable software.


I was so used to this view from Nintendo that it was hard to believe seeing so many really low-quality games in the Nintendo Switch store nowadays. The worst part is that it's still not easy to publish there (or get access to the SDK in the first place) independently, without a publisher or a history of already successfully released games.


Yeah, I think a big part of how Gates and others came into their view was that there were so many examples like Nintendo where a platform vendor had artificially limited the third party ecosystem and it constrained the success of the platform as a whole.

Basically, the Gates view was the more "correct" one for the 1980s/1990s. At that time, most people did not own personal computers, even many businesses didn't, and growth came from on-boarding these non-users. The more apps available on your platform, the more likely a non-user would be to see a reason to buy a computer at all. Also, the platform Gates was selling was much cheaper to try.

Today, everyone owns at least one computing device. Platforms aren't competing for non-users (there are none left), they are competing for users of other platforms. The Jobs view works better in that world since it improves platform quality perceptions.


By the way, there is no particular reason why the computer field had to go in this direction. If you look at the history of aviation, there were a lot of kit airplanes in the 1920s and 1930s because people thought that over time more and more people were going to want to own an airplane. That turned out to be wrong. Most people are content to pay somebody else to transport them.

SaaS is the computing equivalent of an airline, and it's very popular, but it hasn't fully displaced individual computer ownership. We'll see what happens long term.


> SaaS [...] hasn't fully displaced individual computer ownership.

The writing is on the wall though, isn't it? Even companies that you'd expect to go all-in on user freedom and ownership have locked down their platforms and ratcheted up the service offerings. Not a single FAANG company is willing to take a definitive stand in defense of their users. They all just see a bottom line waiting to be butchered.

And for most users, I'd wager SaaS has displaced computer ownership. Stuff like the App Store, the Play Store, and even game consoles are ultimately services unto themselves. Even disregarding that, the "happy path" of modern software is paved far away from user empowerment. People are generally more willing to pay for a subscription than to switch to a more convenient hardware platform.


> The writing is on the wall though, isn't it?

I'm really not sure. Five years ago, I thought that SaaS would completely replace local computing. Now, I'm less certain.

Yes, SaaS is very popular and there are far fewer local applications now. However, there still seems to be some value in local applications because they're thought to be a better end-user experience. Look at Slack. It started out as a website, pure browser-based SaaS. Over time they developed a local app, which is basically the same JavaScript but packaged up as a local Electron app. This seems to be regarded as superior to the web version by most people.
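For what it's worth, the "packaged up" part really is that thin. A minimal sketch of an Electron main process that just points a native window at an existing web app (the URL is purely illustrative, not Slack's actual endpoint):

    // Minimal Electron main process: a native window loading the same
    // JavaScript app the browser would, plus whatever desktop integration
    // (notifications, tray icons, etc.) gets bolted on later.
    import { app, BrowserWindow } from "electron";

    app.whenReady().then(() => {
      const win = new BrowserWindow({ width: 1200, height: 800 });
      win.loadURL("https://app.example.com"); // illustrative URL
    });

    app.on("window-all-closed", () => app.quit());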

Consider Zoom. There was initially a native client and an equivalent web version. They seem to be actually killing off the web version now -- access to it is a non-default option for the meeting creator, buried on a screen with 100+ meeting settings nobody understands or reads. They apparently want to be a local-only program.

As long as there is demand for local apps, people will continue owning general-purpose, re-programmable microcomputers. Maybe the vendor can lock you out of some low-level stuff like the Intel ME or the BIOS, but these platforms still allow people to download a compiler and run their own programs directly on the metal.

I'm not sure what direction it will ultimately go, but my hunch is that if it were possible for SaaS to 100% displace local computing, it would have already happened.


Perfectly fair response. I think your examples are also good evidence that people like native software, or at least the appearance of a cohesive platform. Apps that feel obviously non-native, like an SPA or a WASM tool, will probably turn most normal users off. Good enough native support can make a web client redundant, even if 'native' means JavaScript with OS-native controls.

To counter that though, I think we have to define SaaS concretely and ask what a more service-driven world would look like. I consider SaaS to be not only the paid components of an OS, but also the "free" services like Google and the App Store. Holistically speaking, our OSes are saturated with services: Find My, Android Notify services, OCSP/Microsoft/Google telemetry, online certificate signing, and even 'assistants' that fail to assist without internet. What more could we possibly put online?

It's a bit of a rhetorical conclusion to leave things on. I'll certainly see worse examples in my lifetime, but the status quo is bad enough in my opinion.


I agree with you about the absurd degree of service saturation. I've discovered many things I cannot do without an Internet connection because one trivial step in the process absolutely requires communication with some network endpoint. I found one app where, if you block its ability to send analytics reports back to Mixpanel, it will run 3 or 5 times but after that refuse to run again until an Internet connection is available. I thought it was absurd, since it proves the developers actually considered offline use cases but decided they couldn't allow that to go on indefinitely.

Anyway, sure, let's leave it there and see where things are in 5 years. It'll be interesting to find out!


Only if the users can be satisfied by the supposedly higher-quality software, produced by devs willing to pay the Apple tax and unafraid of being Sherlocked.

If your need is niche, then go away.


There are only two smartphone platforms, and both of them have app stores and make other efforts to control the overall end user experience. What do you disagree with?


I'm scratching my head over what, besides F-Droid and AltStore, is available for smartphone platforms. Unless you mean the OS itself.


> Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad.

Gates's view was also that, once the market had sorted the good from the bad, Microsoft would, if desired, just implement whatever killer apps third party developers had discovered and make them part of Windows, cutting the floor out from under the third party developers. In other words, Gates viewed third party developers as doing market research for Microsoft that Microsoft didn't have to pay for.

> ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform

Windows is still well ahead of both macOS and iOS in terms of market share, so I think it's more that Jobs and Gates had different definitions of success.


Yep, and Apple does this too. The Spotlight feature in macOS (OS-wide search, the little magnifying glass in the top right) was a feature that Apple blatantly ripped off from a popular early-2000s third-party extension. Adding it to the OS killed that company overnight. I forget their name, but someone else might remember.

All platform vendors seem to do this, which is unfortunate.


It's an easy thing to find, the name even became a verb. Sherlock was the program: https://www.urbandictionary.com/define.php?term=sherlocked


Twice: The usage of a giant text input box as a launcher or command palette was pioneered on the Mac by LaunchBar and Quicksilver and later Alfred, etc. Spotlight for a long time was only a menu extra.


> Gates's view was also that, once the market had sorted the good from the bad, Microsoft would, if desired, just implement whatever killer apps third party developers had discovered and make them part of Windows, cutting the floor out from under the third party developers. In other words, Gates viewed third party developers as doing market research for Microsoft that Microsoft didn't have to pay for.

Reminds me of how Amazon sees third-party sellers.

Maybe that is the standard abusive strategy of platforms that hold a dominant market position.


Copying others’ successful strategies is probably something that has existed since before apes even evolved into humans.


How is Windows ahead of iOS in terms of market share? The Windows phone was discontinued _years_ ago?


If you look at all types of computing platforms combined, then Windows is ahead of iOS.

https://en.wikipedia.org/wiki/Usage_share_of_operating_syste...


MS made killer software, literally killing the competition by any means.


> Jobs clearly saw low-quality, third party software as something that could taint the product Apple was selling.

I mostly disagree with this. So long as the low-quality software isn't pushed by the platform, it's not that much of a danger to the reputation. Lack of quality software is a much larger danger, and it's not easy to tell ahead of time where the quality will come from. The big problem is when you are running your own app store and it becomes packed with garbage.


It feels like they eventually managed to enforce this when the App Store came along, rejecting “useless” or “redundant” apps. Eventually they gave up on that idea because they loved seeing the number of apps rise, even if those apps were, in the end, low-quality software.


If they didn't have hidden first-party API stuff that third parties have difficulty leveraging, this moniker would feel more useful/relevant, IMO. In the current context they use these labels to protect their workflow from other entities.


When I set up computers for family members, I've used things like Windows 10 S mode BECAUSE it limits the software that can be installed. They are too low-skill to know which applications to trust.

At my work, I have similarly gotten secure administrative workstations deployed with, again, very limited lists of software. These users being compromised has such extreme consequences that one cannot trust any application to run on these computers.

So I can certainly appreciate the need to be very exclusive with the software allowed on a device. Yet I see this exclusivity being rather anti-consumer in practice. iOS is the most secure operating system I've ever seen for the average person, and yet things like ad blocking are behind a paywall. Neutering ad blocking for the sake of security is considered acceptable. Blocking alternatives to WebKit may reduce the overall attack surface Apple contends with, but it also limits the functionality of the phone, since V8 is the de facto standard. Blocking alternatives to the App Store absolutely enhances security while also forcing consumers to pay more. Then you get into things which truly have nothing to do with security, like how you can only make "Music" your default music player on iOS, not Spotify.

I really appreciate what Windows did with "S mode" because it's optional yet provides benefits to consumers in niche situations. I similarly appreciate the enterprise features of Windows for configuring a laptop with limited access to software to enhance security. Apple, on the other hand, forces the exclusive use of their software to benefit themselves at the expense of their customers. It is unacceptable, and I would vote for anybody pledging to outlaw it as monopolistic abuse. Software exclusivity is useful; shoving it down people's throats is abhorrent.


> yet things like ad blocking are behind a paywall

Are you referring to iCloud Private Relay?


More that most ad-blocking extensions on Safari, such as AdGuard, are paid, whereas on Android I can get better ad blockers working for free.


AdGuard ad blocking is free on iOS Safari. DNS privacy and custom filters do require a subscription.


Firefox with uBlock Origin works well on the Mac, though. I think Safari is overrated anyway.


Purify, Hush, and Ghostery are free.


Personally, I'm not annoyed by low-quality software; I just avoid it. What I draw a bright red line at is intrusive/abusive software (essentially malware). I'm glad Apple enforces a standard with regard to what software behavior is acceptable versus not.



