Hacker News

Doesn't matter. In their homes, or in their gardens, or when they are going out and have one foot on the street.

Illegal collection can't reliably be prevented: get a photograph, extract the biometric data, enter it into your database, delete the photograph. Job done.
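The point is that the photograph itself is disposable once the derived data exists. A minimal sketch of that enrollment flow, with a stand-in hash in place of a real face-embedding model (the model, `enroll` helper, and database layout are all hypothetical):

```python
import hashlib

def extract_embedding(photo_bytes: bytes) -> bytes:
    # Placeholder for a real face-embedding model; a hash stands in
    # for the derived biometric vector for illustration only.
    return hashlib.sha256(photo_bytes).digest()

database: dict[str, bytes] = {}

def enroll(person_id: str, photo_bytes: bytes) -> None:
    # Keep only the derived biometric data...
    database[person_id] = extract_embedding(photo_bytes)
    # ...and discard the photograph. After this point there is no
    # image left to find, only the extracted data.
    del photo_bytes

enroll("subject-42", b"raw image bytes")
```

Once the photo is deleted, nothing about the stored record reveals that it was collected illegally, which is why prevention is so hard to enforce.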

Maybe the only way to get control back over this is to require that these databases be open source (or publicly available) and to make it illegal to keep them hidden. Citizens must also have the right to be removed from them.



At what point do we stop trying to prevent unpreventable illegal activity, and try to put systems in place for dealing with its undeniable existence?

I'm not certain how such systems would look in this case (everybody wears Guy Fawkes masks in public?), but this debate reminds me of prohibition vs. harm reduction in the War on [People Using] Drugs.


I don't think the comparison holds. It's not as though face recognition, like drug use, has thousands of generations of history behind it pushing back against a wrong-headed prohibition; this sort of panopticon garbage is a clear violation of societal norms. We shouldn't accept misbehavior from artificial persons as a given; even as we look for a technical solution, we should also pressure legislatures to amend laws to prevent this.


Does the "Right to be Forgotten" cover the neural networks in the datacenter?


Then you would still need a reliable auditing system to verify that what they're deploying matches what's in the open-source repository. The government would solve that by creating an "Open Source Auditing Department" or somesuch. And at that point, who trusts them anyway?


I trust democratic governments and the justice system a bit more than companies. The Bill Of Rights (or equivalents in other countries) is a much nicer read than any EULA.



