I suspect this may not feel super helpful at first, but I thought it worth mentioning: my first reaction to something like that was strongly negative, and my guard went up immediately. It's quite a weird dynamic you're stepping into. The two people involved have, by the nature of their relationship, known each other their entire lives, and you're trying to talk to one on behalf of the other about a topic that requires a lot of trust, despite neither of them knowing you.
None of that is personal, obviously; it's just the gut reaction I had reading it initially. I suspect nobody has mentioned it before, and it might be helpful to hear on the assumption that others feel similarly.
I don’t know how this changes for people who are less sure how to have that conversation, and I suspect the fact that I’m a real-life security person might have something to do with it.
Edit: I just saw the website after writing this, and I get what you’re doing and can see how it might make sense for some people, but I’d never recommend this to my own parents. I don’t think sticking an AI in the middle of all of their personal communications is the right answer, and to be honest I’d have a lot of questions about how that data gets used.
Again, nothing personal, but there’s just no possible universe where I’m setting something up so that every personal message I ever send my parents is silently sucked up into some random company’s cloud to be read and analyzed, and then paying them money for that. One of the things I actually had to show them was how to disable that kind of shit on their Gmail accounts, for example.
This is really useful feedback, thanks for sharing. And 100% agree, it is an interesting dynamic, and I think there are a lot of people who share the "I don't want to give you my data" mindset. The original idea we started with was focused on elearning: really simple, short, effective lessons that highlight basic patterns so users can identify fraud and scams online. We have generic lessons for various channels (texts, emails, paper mail, phone, etc.) and scam-specific lessons (the grandparent scam, the forgotten-password scam, etc.). We got some early feedback from users who were more in the "can you just automatically handle/prevent the scams from getting to my inbox!" camp, so we expanded there. What do you think about the eLearning content? I kind of view it as "I'm the IT admin for my mom, and I set her the goal of completing one of the trainings." And again, thanks for the feedback you've provided already.