I won’t start with the jokes about GDPR and how the regulatory framework was outdated even before the ink dried on the legislation. Even if you give people “choices”, the default options win, because the incumbent has a productivity moat.
Take search engines. Google dominates search because it is effective. There has been a rash of alternatives like Ecosia and DuckDuckGo (yes, that’s a search engine), but the default wins: Google is too big for regulation, and the alternatives cannot match its marketing or lobbying muscle. So while companies keep promising “privacy alternatives”, privacy keeps eroding.
Here’s something interesting from HBR:
Many observers blame Europe’s tough laws regarding data privacy, which are more stringent than those in the United States and China. But our survey suggests that regulation isn’t the problem. Rather, it’s a disconnect between private and public health systems, which makes it difficult for AI programmers to access enough data with which to train their algorithms.
I think this is true for most healthcare systems. I don’t foresee Epic opening up its APIs for other researchers; access would more likely remain the preserve of the other monopolies in each country, like Google. They have a tacit understanding about protecting their turf. The other option for Epic would be to invest in a startup and then acqui-hire it. Their legion of lawyers is better placed than I am to comment on the specifics.
Here is the structured summary:
- In many European countries, clinical data is kept by individual hospitals and clinics despite the existence of centralized public health systems.
- As a result, health care companies may not have access to large centralized data “lakes” that contain the right amount of the right type of data in the right format for deploying AI software, which may ultimately compromise patients’ access to better quality healthcare.
- To get access to clinical data, the first port of call for a health care AI startup is usually an internal committee of doctors, legal staff, and hospital administrators, who determine whether access should be granted in principle.
- Owkin, a Franco-American startup that deploys AI for clinical development, faced this challenge at each hospital it worked with.
- Hospitals, especially those in a public health system, are usually resource-constrained, and managing data is not often a priority.
- To make a hospital’s data usable, Owkin would bring in its own computers and deploy its own data managers to clean up and standardize the data.
- To share the setup costs in one hospital pilot, Owkin partnered with a large pharma company, giving it access to the hospital data in return.
- Algorithms trained on small quantities of data do not work well, so hospitals must combine their data “ponds” to create a lake.
- But hospitals can be as reluctant as corporations are to share their data; they don’t want to make it too easy for patients to move to other hospitals, and they have concerns about confidentiality.
- As regulations evolve, hospitals and pharmaceutical companies expect their partners to keep up with compliance in terms of protecting such data.
- Owkin and Nabta Health, a Dubai-based women’s health startup, usually build an encryption layer on top of the raw data set (creating statistical noise) to hide sensitive patient information while preserving the data’s dynamism.
- Nabta Health also developed blockchain-based technology that enables patients to manage their personal health data and tracks how the data is shared by physicians and hospitals.
- Nabta Health and ExactCure, a French startup that uses AI to personalize medication dosages, are developing AI that can tap into patients’ data from other sources, such as wearables that capture information on body temperature, heart rate, sweat, and movement. The data thus obtained can be fed to an AI algorithm that hospitals and clinics can use to tailor individual care paths for their patients.
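The summary doesn’t say how Owkin or Nabta Health actually generate that “statistical noise”, but a common technique for this kind of anonymized release is the Laplace mechanism from differential privacy: clip each value to a known range, then add calibrated noise to the aggregate before sharing it. A minimal sketch, assuming hypothetical heart-rate data and parameter choices of my own (not either startup’s actual method):

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def noisy_mean(values: list[float], lower: float, upper: float,
               epsilon: float) -> float:
    """Release the mean of a sensitive column with Laplace noise.

    Clipping each value to [lower, upper] bounds the influence of any
    single patient on the mean at (upper - lower) / n, which is the
    sensitivity used to calibrate the noise scale.
    """
    clipped = [min(max(v, lower), upper) for v in values]
    n = len(clipped)
    sensitivity = (upper - lower) / n
    return sum(clipped) / n + laplace_noise(sensitivity / epsilon)

# Hypothetical heart-rate readings (beats per minute).
heart_rates = [72.0, 88.0, 64.0, 95.0, 70.0, 81.0]
print(noisy_mean(heart_rates, lower=40.0, upper=200.0, epsilon=1.0))
```

Smaller `epsilon` means more noise and stronger privacy; the point is that researchers see useful aggregates while no individual patient’s reading is recoverable.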
I won’t comment on the specifics of the proposals lined up, but it is a compelling line of thought.