
> Not that I expect more of a language model, no matter how "large".

That's a weirdly dismissive statement. The fundamental problem is that a lot of these terms are from after the model's training cutoff. It's perfectly able to handle terms like "Emacs", "RenTech" or "MSFT", and it can guess that "4070 Series" probably refers to a GPU.

In a few years, ChatGPT will probably be able to produce the correct answers.

(Actually, ChatGPT consistently claims its current cutoff is April 2023, which should let it give a better answer, so I'm taking a few points off my explanation. But it still feels like the most probable one.)


