
>Gates's view was always to just let anybody write anything they wanted, and the market would sort the good from the bad. I like the egalitarian nature of the Gates view, but ultimately I think Jobs was correct that the mere existence of low-quality software hurts your entire platform, and no amount of high-quality software can completely offset this.

That is wrong on so many levels. I grew up in a Windows environment in the '90s to '00s and had the best video games in the world at my fingertips, while Mac users at the time had what, some high-quality photo editing software? Good for them I guess.

So unless you don't count video games as software, Gates was right. DOS and Windows being the main platforms for the hottest video games of the moment was what led to the popularity of the PC over Macs at the time. I think Doom alone was responsible for millions of PC and MS-DOS sales.

Yeah, Jobs's NeXTSTEP machines and SW were cool, UNIXy and all, but to what end if nobody but the wealthiest businesses and institutions bought them, in numbers you could count on a few hands? In fact, Jobs understood from the failure of NeXT and the success of the DOS PC that you need to get your devices cheaply into the hands and homes of as many casual consumers as possible (hence the success of the colorful and affordable iMac G3) and forget the premium business workstation market.



I wrote the comment and I too grew up in the MS-DOS/Windows world. My family had a Mac and an Apple ][e, but I didn't use them as much as the various PCs.

As I say in another comment, I think Gates's view was right for that time. The Jobs view was needlessly limiting the potential market for his machines. Gates didn't make the machine, but he was making a thing that was sold with every machine, and he understood that more apps means more reasons why people might want to buy a PC.

One problem with the Gates view today is that if you make it easy for anyone to write and distribute any kind of app, you end up with not only a lot of low-quality software but even negative-quality software like ransomware and malware. It's surprising that so many people want to write that stuff, but apparently they do. Every modern platform, including Windows, has gone to extreme lengths to block this stuff, even though the more laissez-faire approach would be to just ignore it and assume other 3rd parties will write programs to block it. The problem was that malware was an existential threat to the entire Windows platform, and Microsoft couldn't afford to just let the market sort that one out.

I believe the Jobs view is the correct one for the present era. Every platform has app stores, code signing, DRM, and other kinds of security and non-security restrictions. It's certainly easier to run random code on Windows 10 than macOS Ventura (or especially iOS), but no platform vendor is completely app neutral these days.


I don't think closed app stores are the answer. It doesn't actually prevent malware.

Much better is a genuine permissions system that the user controls and the apps can't circumvent.

The reason closed app stores are so popular isn't because of security but because everyone wants a finger in that sweet 30% take pie.


I agree with you 100%. I think maybe the problem is that the platform vendors think it's too difficult to explain fine-grained access controls to end users, whereas an app store is dead simple to explain. And, as you observe, an app store makes money whereas ACLs do not.


Indeed, I think Jobs's approach to software dev was too far ahead of its time; Gates's pragmatic approach proved to have more leverage for computer/platform sales and growth.


And greater innovation in the application space.

Not that there wasn’t innovation on the Mac, but not as much, and often in the end, not as successful.


In the present era, running random code is trivial - JavaScript, via a web browser. It runs in a sandbox, which limits what it can do, but it's a really big sandbox and you can do really useful things within it.


Apple has always gone for the premium end of the market, and with vastly increasing wealth inequality, that's where the money is these days. You can see this in other luxury industries which make incredible amounts of money considering how small their markets are (in terms of numbers of participants).

This focus on high-quality software has also encouraged _better_ software developers to build in Apple's ecosystem. Even today a lot of the best software is only available for Mac or iPhone.


I agree with FirmwareBurner's sibling. All personal computers were expensive luxury items at that time. Most of them cost as much or more than a used car, which people would have considered much more useful than a computer.

Apple's machines were the most expensive, but it wasn't because they were higher quality (that was not the general perception outside the Apple world). It was because Apple refused to allow clone makers into the market, so they were manufacturing in tiny volumes compared to Dell and Compaq and a million other random names like Packard Bell (Sears' store-brand PC) that probably no one remembers.

> Even today a lot of the best software is only available for Mac or iPhone.

I really don't see that at this point, and I do use my Mac every day. Most of the software I use is also available on Windows and Linux, and anything that isn't has a clear Windows or Linux equivalent.

The only software I use that is absolutely only available on a single platform with no equivalent is actually Windows software. I'm not a gamer, but that's apparently a whole category that is Windows-only.

I'm curious what Mac software you use in 2023 that is only available on Mac.


> I'm curious what Mac software you use in 2023 that is only available on Mac.

Ulysses, Bear, Things, Reeder, Mela, Alfred, MindNode, just to name a few. These apps are incredibly well designed, and have no equivalents in the Windows or Linux worlds. I know because I also have a PC and I’ve looked very hard for replacements.

Additionally, apps like 1Password, Scrivener, iA Writer, and Arc Browser started their life on the Mac. Some social media networks, like Instagram and VSCO, were iPhone apps before they released on other platforms. Because Apple users are generally not averse to spending money on software, all the really good software reaches the Mac and iPhone a long time before it becomes available on other platforms.

iCloud itself is something that no other platform can compete with. I never have to create a separate account for any app in the Apple ecosystem because they just sync using iCloud without me doing any extra work. When I install them on my phone later, my data is already there. The Windows/Android world has no equivalent of this.

Apps really are better in the Apple world. I blame Microsoft for underinvesting in native Windows UI toolkits.


That's not down to iCloud doing the sync.


> I'm not a gamer, but that's apparently a whole category that is Windows-only.

Not anymore, really. Proton has done an amazing job at making the huge library of Windows games out there work on Linux, so at this point it's a pretty good contender. (Hence the popularity of the Steam Deck.)


Not every gamer is a Steam customer.

I have a library of ISO oldies that won't "just work" on Linux without a shit-ton of googling and fiddling around.


>that's where the money is these days.

We were talking about the past. And back in those days when computers were expensive, most people and companies were writing SW for the most popular platform out there, which at the time was whatever had the best bang for the buck that home users could afford: Commodore, Sinclair, Amiga, MS-DOS, etc.

>Even today a lot of the best software is only available for Mac or iPhone.

Again, we are talking about the past. TikTok might be the hottest app today, but back then it was Doom.


> Even today a lot of the best software is only available for Mac or iPhone

Barely 1% of the whole games industry is available for Mac, and there are amazing pieces of software in games alone.


I see a lot of poor people with iPhones, Apple Watches, EarPods and such. These are what you'd call an "affordable luxury" and probably a bargain when you consider you might go through five $300 Android devices in the time that you get out of a $1000 iPhone, struggling with an Android all the while.

It's kinda like

https://www.goodreads.com/quotes/72745-the-reason-that-the-r...

but the psychology is weirder in that rich people know a "luxury" watch costs upwards of $10k so it is quite the trick to convince the hoi polloi that a $500 watch is a luxury device at the same time.

I've noticed also that poor folks are also very aware of expensive wireless plans from Verizon but seem to have little awareness of wired internet plans, free Wi-Fi, etc.


>you might go through 5 $300 Android devices in the time that you get out of a $1000 iPhone

Maybe take care of your Android as you would an iPhone and it will also last as long as an iPhone, but at 1/3 the cost.

Most people (worldwide) are not spending 1k on a phone, even 300 is a lot.


Will you get security updates for a decade for Android phones like the iPhone 5s has had?


Yes, Google Play services, where most of the attack vectors lie, will get updated.

Most people don't keep a phone that long but change it after 4-5 years or so. How many 5s still exist in the wild?


... and will your Android actually charge when you plug it into a charger?

I've had roughly five Android devices (tablets and phones) that had the curious problem that I'd plug them into a charger and it was always 50/50 whether they would actually charge. This would happen when the devices were new and starting from zero (often it would take several tries to charge the device for the first time), and it still happens when the devices are in regular use. All the time I leave one plugged into the charger overnight and... nothing. Plug it into the same charger again and maybe it charges.

I've noticed the same problem with Sony devices, both cameras and the PS Vita, when they are used with non-Sony chargers.


I used the Wileyfox Swift for seven years until LineageOS stopped updates for it. I expect similar longevity from contemporary contenders.


Probably not, most people don't care.


Most android vulnerabilities get patched via Google Play Services.


Not if they are kernel related, and only since APEX became a thing in Android 10.

However, it doesn't change the main point, most phone users care about updates on their phone as much as they care about updating any other device with a CPU in their home.


I meant that most vulnerabilities in the wild that pose a threat to users and end up actually being exploited by malware are the ones usually covered by updating Google Play services.

Yes, kernel vulns will always be uncovered, but how often does the average user encounter malware that exploits that attack vector?

The biggest attack vectors on Android are Google Play services and your browser. Keep those up to date and you're good even with older phones.


Good boots are like PV panels. Poor people don't have the money to buy them. The middle class spends less in the end, since the investment pays off.

Rich people on the other hand usually did not get rich by saving money.



