I didn't ask for an estimate; I asked for the exact weight. A human can do this given the process I described.
If the chain of thought were accurate, it would be able to give you an intermediate output of the shape in some 3D format spec. But nowhere in the model does that data exist, because it's not doing any reasoning; it's still just producing the statistically most likely answers.
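To make concrete what I mean by an intermediate output: something like an actual mesh, which you could then turn into an exact weight. Here's a rough sketch of that kind of calculation; the mesh (just a unit cube), the density value, and the numbers are all made up for illustration, assuming a watertight triangle mesh and a known material density:

```python
import numpy as np

# The kind of intermediate "3D format" output the chain of thought never
# actually produces: vertices plus consistently oriented triangles.
# A unit cube, purely for illustration.
vertices = np.array([
    [0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0],   # bottom face (z = 0)
    [0, 0, 1], [1, 0, 1], [1, 1, 1], [0, 1, 1],   # top face (z = 1)
], dtype=float)
triangles = np.array([
    [0, 2, 1], [0, 3, 2],   # bottom
    [4, 5, 6], [4, 6, 7],   # top
    [0, 1, 5], [0, 5, 4],   # front
    [1, 2, 6], [1, 6, 5],   # right
    [2, 3, 7], [2, 7, 6],   # back
    [3, 0, 4], [3, 4, 7],   # left
])

def mesh_volume(verts, tris):
    """Exact volume of a closed mesh via summed signed tetrahedron volumes."""
    a, b, c = verts[tris[:, 0]], verts[tris[:, 1]], verts[tris[:, 2]]
    return abs(np.einsum("ij,ij->i", a, np.cross(b, c)).sum()) / 6.0

density = 2700.0                              # kg/m^3, e.g. aluminium (assumed)
volume = mesh_volume(vertices, triangles)     # 1.0 m^3 for the unit cube
print(f"exact weight: {volume * density} kg") # 2700.0 kg
```

Given that representation, the weight is a deterministic calculation, not a guess. That's the gap I'm pointing at.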
I mean sure, you could train a model to create 3D shapes out of pictures, but again, that's not reasoning.
I don't get why people are so attached to these things being intelligent. We all agree that they are useful. It shouldn't matter to you or anyone else if they're not intelligent.