I think of ChatGPT as a faster Google or Stack Overflow, and all of my colleagues are using it almost exclusively in this way. That is still quite impressive, but it isn't what Altman set out to achieve (and he admits this quite candidly).
What would make me change my mind? If ChatGPT could take the lead on building a robot through all the steps (design, contracting the parts and assembly, marketing, and sales), that would really be something.
I assume that for something like this to happen, it would need all the source code and design docs from Boston Dynamics in its training set. It seems unlikely it could make the same discoveries on its own.
> I assume that for something like this to happen, it would need all the source code and design docs from Boston Dynamics in its training set. It seems unlikely it could make the same discoveries on its own.
No, to do this it would need to be able to reason independently; if it could do that, then the training data would stop mattering. Training data is a crutch that makes these algorithms appear more intelligent than they are. If they were truly intelligent, they would be able to learn independently and find information on their own.