
>How often do you think a ChatGPT user walks away not just misinformed, but misinformed with conviction? I would bet this happens all the time. And I can’t help but wonder what the effects are in the big picture.

This is so wrong! I simply can't get ChatGPT to agree with something clearly wrong. It can play both sides and give nuance, which is exactly what I expect, but it is so un-sycophantic that it won't leave you feeling like you are right. Any examples of it doing so are welcome! Show me examples where it takes a clearly wrong or false idea and makes it look like a good idea (unless you specifically ask it to).


