
Y’all can throw pithy sayings at each other all you like, but memory is not the same as understanding, and AI does offer plenty of opportunity for humans to cognitively disengage. That doesn’t necessarily mean everyone will, but it’s very likely that most people will.


Especially if we're more concrete: if your job is, say, in administration and what the machine answers is correct enough that in 8 out of 10 cases you can basically copy-paste it, I'd say it's extremely likely that this is going to increase the amount of errors made.


In the setting you describe I think it will _reduce_ the errors to 20%


No, since the setting doesn't specify the initial error rate, it could just as well increase, or stay stable at 20%.
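
To make that concrete, here's a minimal sketch in Python with purely hypothetical numbers (the 20% machine error rate comes from the example above; the baseline human error rates and the no-review assumption are mine). If the worker simply adopts the machine's answer, the resulting error rate is the machine's, so whether errors go up or down depends entirely on where the human started:

    machine_error = 0.20  # from the example above: wrong in 2 out of 10 cases
    catch_rate = 0.0      # hypothetical: pure copy-paste, no review of the output

    # If the answer is always adopted, the resulting error rate is the machine's
    # error rate times the fraction of its mistakes that slip through unreviewed.
    error_after = machine_error * (1 - catch_rate)

    for baseline in (0.05, 0.20, 0.40):  # hypothetical unaided human error rates
        direction = "up" if error_after > baseline else "down" if error_after < baseline else "unchanged"
        print(f"human baseline {baseline:.0%} -> {error_after:.0%} with copy-paste ({direction})")

A worker who was right 95% of the time on their own gets worse; one who was right only 60% of the time gets better.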

But there are other factors, like: is the volume of work completed also changing, thus affecting the absolute number of errors?

Also, does disengaging the person in most cases have side effects, like no longer paying the same attention to the cases that would stand out as big issues needing more attention and consideration than business as usual?

And so on


It's an interesting thing to think about. From the way it's talked about, I would predict that AI will enable people who are more cognitively inclined to think in more complex and refined ways, while people who over-rely on the results would be the ones who decline.

However, research[1] suggests that relying on AI tools degrades reasoning and cognitive ability regardless of where you start, and may even cause users to stop making their own choices[2].

1. https://www.404media.co/microsoft-study-finds-ai-makes-human...

2. https://www.nature.com/articles/s41599-023-01787-8



