What people thought about LLMs five years ago and how close we are to AGI right now are unrelated questions, and it's not logically sound to argue "we were close to LLMs then, so we are close to AGI now."
It's also a misleading reading of the history. It's true that "most people" weren't thinking about LLMs five years ago, but much of the underpinning work had been studied since the 70s and 80s. The ideas had largely been worked out; the hardware just couldn't handle the processing.
> True honesty requires acknowledging that we truly have no idea. Progress in AI is happening faster than ever before, but nobody has the slightest idea how much progress is needed to get to AGI.
Maybe, but don't tell that to OpenAI's investors.