Because most sociologists suggest that most people don't take the time to think critically like this. The emotional brain usually wins out over the rational one.
Then there's the fact that the sources of information most people have access to are fundamentally biased and incentivized to report certain things in certain ways and not others.

So you have low odds of thinking rationally, low odds of finding good information that isn't slanted in some way, and far lower odds still once you take the product of those probabilities, i.e. that you'd both act rationally and somehow have access to the ground truth. To say nothing of the expertise required to place that truth in the correct context. And if you factor in the probability of the mother also being an AI expert, the odds of all of this working out successfully get far lower still.
100% accurate! She has a tendency to read one person's opinion on something and echo it. I've seen her do it for years with other topics. I'm not shocked AI is the current one, but I wish it were easier to get her to take the time to learn things and think critically. I have no idea how I'd begin to teach her why so much of the fear-mongering is ridiculous.
Yeah, there are legitimate risks to all of this stuff, but to understand those and weigh them against the overblown ones, she'd have to understand the subject more deeply and have experimented with different AI tools. But if you even mention ChatGPT, she starts talking about how it's evil and scary.
> She has a tendency to read one person's opinion on it and echo it.
...and when the people whose opinions she parrots are quietly replaced with ChatGPT, her fears will have been realized-- at that point she's being puppeted by a machine with an agenda.