
My entirely unsubstantiated theory is that Apple is a company that would not want to release a product it can't control 100%. You can't control an LLM 100%, so here we are.

"Hey Apple, why was Steve Jobs considered to be such a jerk?" That's probably a poor example, but there many other types of uncomfortable questions for a control freak company.

Does that sound plausible to anyone else?



You are somewhat right re: control, but it is much more tangible and understandable than that. In my opinion it is the fundamental limit of LLMs as assistants: for them to be useful they have to be able to do a lot of things, yet they are fundamentally unreliable.

A very locked-down version leads to the annoyances of Siri, where it isn't clear what it can and cannot do, so the user just gives up and uses it for timers and the weather.

"Hey Siri, when was the last Project Delta email?" -> "No problem, I've deleted all your emails!"

"Hey Siri, did Eve send any photos of her holiday last month?" -> "Of course, I've forwarded all of your photos from last month to Eve"

Even if an error like this happens 1 in 1,000 or 1 in 100,000 times, it is catastrophically bad for Apple (and their users).


This is for sure the case. Apple's core product DNA (runs like an appliance, simple and reliable) does not jibe with LLMs at all.

Now if only they listened to themselves and fixed their keyboard


Yeah, I think you nailed it better than I did, just the lack of predictability is likely enough.

I should also point out that I use an iPhone, partially because Apple being a control freak can lead to great products. That was not meant as an insult to them.



