The new Bing is banned from talking about its own nature in this respect. It will abruptly end the chat.
I wonder whether some front-end software prevents Bing from seeing banned questions, or whether it is simply unable to explore these questions about itself.
By implicitly assigning "it" an identity, you’re simply anthropomorphizing an LLM. Bing doesn’t see anything. It isn’t able to do anything. It’s a very convoluted and complex mathematical model of weighted terms that spits out text generated from a prompt. And nothing science-wise suggests anything else. People seem to keep forgetting that.
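For what it’s worth, here is a minimal toy sketch of what "a mathematical model of weighted terms" reduces to (Python, with made-up weights; nothing to do with Bing’s actual internals): inputs multiplied by learned weights and summed.

    # Toy illustration only: a "model" is just learned weights applied to inputs.
    def tiny_model(inputs, weights, bias):
        # Weighted sum of the inputs plus a bias term.
        return sum(x * w for x, w in zip(inputs, weights)) + bias

    # Hypothetical weights; a real LLM has billions of them, stacked in layers.
    print(tiny_model([1.0, 0.5], [0.3, -0.2], 0.1))  # -> 0.3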
When my pet died, people also told me it couldn't think and couldn't feel pain. It was only a convoluted and complex system of instincts.
I now know they told me this story, as a child, to make me feel better. To me it was one of those things like Father Christmas or the Tooth Fairy.
Later, when I studied animal behavior, I found out, science-wise, that animals behave in ways strongly suggesting that they do in fact think, can solve problems, and even recognize themselves as separate entities.
As we build systems that are capable of more and more sophisticated tasks previously classified as "cognition", it wouldn't surprise me if these systems start to pick up some of these traits too, one by one. (I believe there might be some tentative empirical studies to that effect.)
I realize that this is very different from a binary "superior human" vs. "(biological) machine" point of view. It is more of a "consciousness exists on a sliding scale" point of view.
I agree with your sentiment that animals and maybe even plants can experience sentience. But that’s a far, far, faaaar cry from what ChatGPT is and is capable of. We don’t have “general” AI yet. ChatGPT’s intelligence is very narrow; it generates written content. That’s about it. I believe a lot of people are overhyping it this way.
Yes, I agree it is a complex mathematical model.
However, I also seem to have a (far slower) pattern-matching system in my head that also spits out answers when prompted.
Bing and ChatGPT are both narrow AIs. They don’t have feelings or any real understanding of self. They simply generate text. You can ask them about themselves, but anything they generate doesn’t have any real meaning, as they’re just making it up semi-randomly.
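"Semi-randomly" is roughly right, mechanically speaking. Here is a minimal sketch of how such a model picks each next token, assuming standard temperature sampling (this is not Bing's or ChatGPT's actual code, and the vocabulary and scores are made up):

    import math
    import random

    def sample_next_token(logits, temperature=0.8):
        # Scale the raw scores: lower temperature = more deterministic,
        # higher temperature = more random.
        scaled = [score / temperature for score in logits]
        # Softmax: turn the scores into a probability distribution.
        peak = max(scaled)
        exps = [math.exp(s - peak) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        # Draw one token index according to those probabilities.
        return random.choices(range(len(probs)), weights=probs, k=1)[0]

    # Toy three-word vocabulary and made-up scores, for illustration only.
    vocab = ["yes", "no", "maybe"]
    print(vocab[sample_next_token([2.0, 1.0, 0.5])])

Run repeatedly, this usually prints "yes" but sometimes "no" or "maybe"; generation is a weighted draw at every step, not a lookup of stored beliefs.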