> Baffling? Is just the HN way of calling someone a fucking idiot without violating the rules and pretending to be polite. Just say it to my face.
Maybe that has been your experience with other users, in which case I am sorry people have been so rude to you, but in my case it’s just a word I personally use a lot. If it’s too severe a term then my bad, and reading back I am coming in a bit hot, so I am sorry for the tone. I do not think you’re an idiot and I am absolutely not personally attacking you. I tend to have a dramatic way of speaking, I can admit that. But again, this is not a personal attack.
The point I am trying to communicate is that it’s (to me) a very surprising and difficult-to-square take. Comparing a tool failing to do its job correctly to appreciating a work of written fiction just seems bizarre to me. That’s the truth. The people building LLMs do not want that result. I do not want that result. Nobody wants it to spit out inaccurate information disguised as correct information. I don’t want my calculator to spit out fiction literally ever - the same goes for LLMs, outside of deliberately prompting them to do so. If I want fiction as you describe (art and such), I seek it out deliberately. I will grab a book off my shelf or watch a show (or prompt the LLM with intent).
Put another way: The difference between the fiction in a novel and what an LLM spits out is that I am seeking it out in the former, not the latter. When an LLM gives me incorrect information disguised as correct information (undesired fiction), it is failing to do its job correctly. It is a tool that is not functioning properly. I absolutely 100% never want fiction emerging in key instructions when I am cooking or am fixing my car. It is always an undesired result.
So to circle back to why I find this “baffling,” or another word if you find that too severe, it’s that I don’t understand how something that is so concretely undesirable can be described as a positive thing comparable to creating works of literature for us to appreciate. You’re telling me it’s good that something does not function properly/as expected and gives me results I absolutely do not want. To get away from “baffling”: That is a very bold and unexpected take that I struggle to find any agreement with.
It’s not bizarre. Hallucination is just another word for invention, the same cognitive move that produces fiction. In one context that’s failure, in another it’s success. Calling that bizarre is like calling imagination itself an error. If that feels strange to you, you’re missing something fundamental about how creativity works. Everyone knows this. Any human being with a pulse understands the difference between making something up for art and making something up by mistake. So when you act like that’s an alien concept, I don’t think you’re confused. I think you’re pretending.
> The difference between the fiction in a great novel and what an LLM spits out is that I am seeking it out in the former, not the latter. When an LLM does that, it is failing to do its job correctly.
Sure, but thanks for explaining what everyone already understands. You’re not clarifying anything new, you’re just pretending not to get the point so you can keep arguing. The discussion wasn’t about LLMs fixing cars or following recipes. It was about any kind of work, and a huge portion of human work revolves around invention, storytelling, and creative synthesis. Fiction writing isn’t a corner case, it’s one of the most valued human expressions of intelligence. Everyone knows that too. It’s not an obscure philosophical insight. It’s basic cultural literacy. Which is exactly why I don’t buy your act. You’re too smart not to know what’s obvious to everyone else.
So when I say the “failure mode” of hallucination can be a “success mode” elsewhere, I’m not dodging the topic, I’m expanding it. Creativity is a massive part of human life. Pretending otherwise just to win a narrow argument is dishonest. You know exactly what I meant, you’re just acting like you don’t. No one with normal cognitive function finds that bizarre. It’s theater.
And you used the classic tells, the same ones that get used on HN all the time to dodge the rules while still getting in a jab. You drop words like “bizarre” and “baffled,” act like you’re confused, then follow up with a calm “apology” to sound like the reasonable one. It’s a well known pattern here. You literally used the exact two words everyone does when they’re trying to provoke without crossing the line.
Then came the self deprecation. The polished restraint. “If that was too severe, my apologies. I tend to be a little dramatic. I don’t think you’re an idiot. I’m just trying to communicate my point. I’m sorry for that.” It’s spotless. It hits every note. It reads like empathy but functions like control. It doesn’t defuse the conflict, it reclaims the moral high ground. It’s not humility, it’s stagecraft.
Look, maybe I was too sharp myself. I can be dramatic too, I admit that. It’s not a personal attack, I just have strong feelings about intellectual honesty. I’m sorry for that.
I’m not trying to dodge anything and I’m not sure why there’s so much hostility here but sure we can go ahead and drop this. I made my point and retreading it isn’t going to do any good. Have a good rest of your week.