My point is: LLMs sound very plausible and very confident when they are wrong.
That’s it. And I was just offering a trick to help remember this, to keep checking their output – nothing else.