
Terrible reporting. Improving health outcomes by AI analysis of patient data is a much bigger prize - morally and commercially - than anything which could be achieved through ad targeting. Google is far too smart to squander such an opportunity by abusing patients' trust.


The big problem is the precedent it sets for data access.

What are the criteria for who gets access? What are the constraints of that access?

This story covers the latter being blown apart: the constraints were poorly defined and poorly implemented, so even if the criteria are well defined, access to far more data was made possible.

I'm sure that few patients desire an end to research, or would argue that such access isn't a good thing... but what of the insurance industry? Should they have access? Would the NHS be able to define and enforce those constraints?

Perhaps that's an obvious no.

What then of an insurer partnering with a medical research company, from the viewpoint of "This costs insurance a lot of money, we'd like to fund a way to reduce that financial exposure".

The grey areas emerge immediately.

If we cannot control access to patient data (data from which anonymity could be trivially stripped, or which in aggregate reveals enough to produce net-negative outcomes; correlating by postcode alone would reveal a great deal with little extra work), and if we cannot define and enforce the constraints of access, then we really shouldn't be sharing it at all. This is highly sensitive and personal information, originally disclosed only between a patient and a doctor, on the premise that it was covered by the explicit and implicit confidentiality of that conversation.
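A minimal sketch of the postcode-correlation point (all names, records and fields below are hypothetical): even with names removed, joining a "de-identified" record against an auxiliary dataset on a few quasi-identifiers can narrow it to a single person.

```python
# Hypothetical "de-identified" health records: names stripped, but
# quasi-identifiers (postcode, birth year) retained.
health_records = [
    {"postcode": "SW1A 1AA", "birth_year": 1957, "diagnosis": "CKD stage 3"},
    {"postcode": "M1 4BT",   "birth_year": 1984, "diagnosis": "asthma"},
]

# Auxiliary data an attacker might already hold
# (electoral roll, marketing database, an earlier breach, ...).
auxiliary = [
    {"name": "A. Example", "postcode": "SW1A 1AA", "birth_year": 1957},
    {"name": "B. Example", "postcode": "M1 4BT",   "birth_year": 1984},
]

def reidentify(record, aux):
    """Return the names of everyone matching the record's quasi-identifiers."""
    return [p["name"] for p in aux
            if p["postcode"] == record["postcode"]
            and p["birth_year"] == record["birth_year"]]

matches = reidentify(health_records[0], auxiliary)
# A unique match links the diagnosis back to a named individual,
# with no "extra work" beyond a simple join.
```

The defence is to generalise or suppress the quasi-identifiers (e.g. truncate postcodes, bucket ages) so that every record matches many people, not one.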

It's always worth remembering:

Data was acquired under doctor patient confidentiality.

If we considered that data to have a licence, it is the most restrictive licence possible. One could consider what has happened here as a re-licensing without permission. Such an act could have a chilling effect on the relationship between the doctor and patient.


You are making some implicit assumptions that the data access isn't highly controlled.

I have seen a few of these sorts of deals killed because of data access concerns, and/or computation requirements ("you can have access to anonymized data, but you have to run your code in a sandbox on our health servers").

And, this is why we have legislation.


Less implicit, from the originally linked article:

> The scale of the sharing program was apparently misrepresented to the public, originally announced as an app to help hospitals monitor patients with kidney disease with real-time alerts and analytics. But since those patients don't have their own separate dataset, Google has argued it needs access to all patient data from the participating hospitals.

No assumption there, they didn't have a separate dataset and so granted access to all patient data.


"so granted access to all patient data"

Yes, but under what conditions? Many privacy laws apply here, and treating Google as some monolithic entity where everyone working there can now read anyone's personal health history is inaccurate.


It's pseudonymous data that the NHS has previously admitted can be de-anonymized given sufficient effort, but such de-anonymization carries criminal and civil penalties.
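For what "pseudonymous" typically means in practice, here is a minimal sketch (the key and NHS numbers are hypothetical; the source does not say how the NHS actually pseudonymises): the patient identifier is replaced with a keyed hash, so records stay linkable across the dataset, but mapping a pseudonym back to a person requires the secret key, which is why the remaining protection is partly legal rather than purely technical.

```python
import hmac
import hashlib

# Hypothetical secret held by the data controller, never shared
# with the data recipient.
SECRET_KEY = b"held-by-the-data-controller"

def pseudonymise(nhs_number: str) -> str:
    """Replace an identifier with a stable keyed hash (HMAC-SHA256)."""
    return hmac.new(SECRET_KEY, nhs_number.encode(), hashlib.sha256).hexdigest()

# The same patient always maps to the same pseudonym, so records
# remain linkable for research...
assert pseudonymise("943 476 5919") == pseudonymise("943 476 5919")

# ...but different patients get different pseudonyms, and without the
# key the mapping cannot be recomputed.
assert pseudonymise("943 476 5919") != pseudonymise("123 456 7890")
```

Note that even a well-keyed pseudonym does nothing about the quasi-identifiers left in the record itself, which is where the "sufficient effort" re-identification risk comes from.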


Nope. To set a precedent it would have to precede. Giving de-identified medical records to researchers is a long-standing, well-established and regulated process. The only interesting thing here is that it's Google and not some PhD's university lab.

Here's HHS on what HIPAA has to say about this: [0]

[0] http://www.hhs.gov/hipaa/for-professionals/privacy/special-t...


It so happens Google has the perfect means at its disposal for de-anonymizing large swaths of such data: trillions of user location records, calendar appointments, emails, and texts. It's not too hard to put all that together to match a specific encounter record, for example.


which would both violate their contract and also be illegal.


And Google would never break the law or breach a contract. Especially a contract they signed with the UK Government.

I mean, other than that time just a few years ago[0] where Google broke the law and then breached the contract they signed with the UK Government.

[0] http://www.bbc.com/news/technology-19014206


I do not trust Google and I am not being given a choice.


Not sure if you saw my comment downstream? I'd encourage you to read the original piece for a more nuanced presentation of the information - https://www.newscientist.com/article/2086454-revealed-google...

happy to address criticism


Hi Hal, I thought the article should have compared and contrasted with other government-run large data-sharing programs, such as the CMS qualified entity program or AHRQ HCUP.


thanks for the comment. This would be interesting, but not sure it would have made sense to pack it into one article that is already heavy with data terminology for a lay reader.

Will definitely be looking into healthcare data more, as this story has resulted in some interesting leads.


Google is not a monolithic entity. You'd have to trust the individual researchers who have access to the data, and we don't even know who they are. And the NHS didn't ask their patients for permission before handing this data over, so whether you trust them or not is irrelevant, they get your data anyway.


That seems incredibly naive. They utilize every other bit of information they collect, why wouldn't they utilize this data?

Google is a corporation. It can't have good intentions of its own. It's the thousands of employees who will potentially be working with and handling the data that you need to worry about.


Google places the most comprehensive controls on PII of any place I've seen, including hospital environments subject to HIPAA. (Mostly because, unlike the hospitals, they have the technical clue how to enforce it properly. The hospitals... are still learning about computer security, and it's not their forte: http://arstechnica.com/security/2016/03/two-more-healthcare-... )

Getting access to private information in Google is hard - my experience as a researcher here is that there's a strong incentive to find an open-source or non-PII dataset before touching user data. I'll go through my year here without ever touching even the most innocuous PII data.

It's very unlikely to me that thousands of people will have access to this data. It's much more likely that a small handful will, and that they'll be supported by others with no access whatsoever. From the article, in fact:

"The agreement clearly states that Google cannot use the data in any other part of its business. The data itself will be stored in the UK by a third party contracted by Google, not in DeepMind’s offices. DeepMind is also obliged to delete its copy of the data when the agreement expires at the end of September 2017."

From an incentive perspective, the potential value-add of abusing the data is tiny compared to the potential costs and loss of user trust. Google's very aware of how important it is to maintain user trust -- http://www.techradar.com/us/news/internet/google-we-have-a-c...

Corporations don't have brains, but they have cultures, and Google's culture -- composed of those thousands of engineers -- is quite fanatical about protecting user privacy. It's one of the non-technical things that's impressed me most during my time here.

The risk with a company like Google is if the economic winds and culture changes, but that's a long-term process, and is also the reason for legally-binding contracts to do things like delete the data (see above).

tl;dr: Google has the technical means to protect the confidential data better than almost any other agency, including from its own employees. The most important question to ask is whether the NHS structured the data sharing in a way that provides for long-term protection, and (IANAL!) it sounds like it from the article.

Source: I'm a professor who deals with our IRB occasionally, have colleagues doing joint CS-medical research, pushed patients around a hospital in a younger life, and am on sabbatical for the year at Google.



