Lemme start by saying this is objectively amazing. But I just really wouldn't call it a breakthrough.
We had one breakthrough a couple of years ago with GPT-3, where we found that neural networks / transformers + scale does wonders.
Everything else has been smooth, continuous improvement. Compare today's announcement to the Genie-2[1] release from less than a year ago.
The speed is insane, but not surprising if you put it in the context of how fast AI is advancing. Again, nothing _new_, just absurdly fast continuous progress.
Why wouldn't it? I have yet to hear one convincing argument for how our brain isn't working as a function of picking the probable next best action. When you look at how amoebas work, then at animals somewhere between them and us in intelligence, and then at us, you see a progression very similar to the one in current LLMs: from almost no model of the world to a pretty solid one.
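To make the "probable next best action" framing concrete, here's a toy sketch (entirely illustrative; the action names and scores are made up) of the same mechanism an LLM uses to pick a next token, greedy decoding over a softmax, applied to an organism choosing its next action:

```python
import math

def next_action(scores):
    """Pick the most probable next action from raw preference scores,
    the way an LLM greedily picks a next token from logits."""
    # Softmax turns arbitrary scores into a probability distribution.
    exps = {a: math.exp(s) for a, s in scores.items()}
    total = sum(exps.values())
    probs = {a: e / total for a, e in exps.items()}
    # Greedy choice: the single most probable next action.
    return max(probs, key=probs.get), probs

# Hypothetical "state of the world" for a simple organism:
# higher score = that action looks more attractive right now.
action, probs = next_action({"move_toward_food": 2.0,
                             "move_away_from_light": 0.5,
                             "stay_put": -1.0})
print(action)  # → move_toward_food
```

The point of the analogy: richer internal state just reshapes the score table, the selection step stays the same.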
[1] - https://deepmind.google/discover/blog/genie-2-a-large-scale-...