A wide-ranging, UK government-commissioned industrial strategy review of the life sciences sector, conducted by Oxford University’s Sir John Bell, has underlined the value locked up in publicly funded data held by the country’s National Health Service — and called for a new regulatory framework to be established in order to “capture for the UK the value in algorithms generated using NHS data”.
The NHS is a free-at-the-point-of-use national health service covering some 65 million users — which gives you an idea of the unique depth and granularity of the patient data it holds, and of how much potential value could be created for the nation by utilizing those data-sets to develop machine learning algorithms for medical diagnosis and tracking.
“AI is likely to be used widely in healthcare and it should be the ambition for the UK to develop and test integrated AI systems that provide real-time data better than human monitoring and prediction of a wide range of patient outcomes in conditions such as mental health, cancer and inflammatory disease,” writes Bell in the report.
His recommendation that the government and the NHS be proactive about creating and capturing AI-enabled value from taxpayer-funded health data-sets comes hard on the heels of the conclusion of a lengthy investigation by the UK’s data protection watchdog, the ICO, into a controversial 2015 data-sharing arrangement between Google-DeepMind and a London-based NHS Trust, the Royal Free Hospitals Trust, to co-develop a clinical task management app.
In July the ICO concluded that the arrangement — DeepMind’s first with an NHS Trust — breached UK privacy law, saying the ~1.6M NHS patients whose full medical records are being shared with the Google-owned company (without their consent) could not have “reasonably expected” their information to be used in this way.
And while the initial application the pair have co-developed does not involve applying machine learning algorithms to NHS data, a wider memorandum of understanding between them sets out their intention to do just that within five years.
Meanwhile, DeepMind has also inked additional data-sharing arrangements with other NHS Trusts that do already entail AI-based research — such as a July 2016 research partnership with Moorfields Eye Hospital that’s aiming to investigate whether machine learning algorithms can automate the analysis of digital eye scans to diagnose two eye conditions.
In that instance DeepMind is getting free access to one million “anonymized” eye scans to try to develop diagnosis AI models.
The company has committed to publishing the results of the research but any AI models it develops — trained on the NHS data-set — are unlikely to be handed back freely to the public sector.
Rather, the company’s stated aim for its health-based AI ambitions is to create commercial IP, via multiple research partnerships with NHS organizations — positioning itself to sell trained AI models as a future software-based service to healthcare organizations at whatever price it deems appropriate.
This is exactly the sort of data-enabled algorithmic value that Bell is urging the UK government to be proactive about capturing for the country — by establishing a regulatory framework that positions the NHS (and the UK citizens who fund it) to benefit from data-based AI insights generated from its vast data holdings, instead of allowing large commercial entities to muscle in and asset-strip these taxpayer-funded assets.
“[E]xisting data access agreements in the UK for algorithm development have currently been completed at a local level with mainly large companies and may not share the rewards fairly, given the essential nature of NHS patient data to developing algorithms,” warns Bell.
“There is an opportunity for defining a clear framework to better realise the true value for the NHS of the data at a national level, as currently agreements made locally may not share the benefit with other regions,” he adds.
In an interview with the Guardian newspaper he is asked directly for his views on DeepMind’s collaboration with the Royal Free NHS Trust — and describes it as the “canary in the coalmine”.
“I heard that story and thought ‘Hang on a minute, who’s going to profit from that?’” he is quoted as saying. “What Google’s doing in [other sectors], we’ve got an equivalent unique position in the health space. Most of the value is the data. The worst thing we could do is give it away for free.”
“What you don’t want is somebody rocking up and using NHS data as a learning set for the generation of algorithms and then moving the algorithm to San Francisco and selling it so all the profits come back to another jurisdiction,” Bell also told the newspaper.
In his report, Bell also highlights the unpreparedness of “current or planned” regulations to provide a framework to “account for machine learning algorithms that update with new data” — pointing out, for example, that: “Currently algorithms making medical claims are regulated as medical devices.”
Indeed, in 2016 DeepMind suspended testing of the Streams app it had co-developed with the Royal Free NHS Trust after it emerged that the pair had failed to register the software as a medical device with the MHRA before trialling it in hospitals.
Bell suggests that a better approach for testing healthcare software and algorithms could involve sandboxed access and use of dummy data — rather than testing with live patient data, as DeepMind and the Royal Free were.
“One approach to this may be in the development of ‘sandbox’ access to deidentified or synthetic data from providers such as NHS Digital, where innovators could safely develop algorithms and trial new regulatory approaches for all product types,” he writes.
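The report doesn’t specify how such synthetic data would be produced, but the basic idea can be sketched in a few lines of Python: fabricate records that share the *shape* of real patient data while containing no real values, so algorithms can be developed and tested without touching live records. (The field names and value ranges below are illustrative assumptions for this sketch, not NHS Digital’s actual schema.)

```python
import random

random.seed(42)  # make the synthetic cohort reproducible

# Illustrative condition list taken from the areas the report names
CONDITIONS = ["mental health", "cancer", "inflammatory disease"]

def synthetic_patient(patient_id):
    """Build one fake patient record with a plausible clinical shape.

    No field is derived from real data; every value is randomly drawn,
    so the record carries zero re-identification risk.
    """
    return {
        "id": f"SYN-{patient_id:06d}",  # synthetic identifier, not an NHS number
        "age": random.randint(18, 95),
        "sex": random.choice(["F", "M"]),
        "condition": random.choice(CONDITIONS),
        # Hypothetical lab value in a plausible range (umol/L)
        "creatinine_umol_l": round(random.uniform(45.0, 300.0), 1),
    }

def synthetic_cohort(n):
    """Return n synthetic records for sandboxed algorithm development."""
    return [synthetic_patient(i) for i in range(n)]

cohort = synthetic_cohort(1000)
print(len(cohort), cohort[0]["id"])  # → 1000 SYN-000000
```

A real sandbox would of course need synthetic data that preserves the statistical structure of the source data-set (correlations between age, condition and lab values), which is a much harder problem — but even schema-only dummy data lets developers build and debug pipelines before any regulated access to de-identified records is granted.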
In the report Bell also emphasizes the importance of transparency in winning public trust to further the progress of research which utilizes publicly funded health data-sets.
“Many more people support than oppose health data being used by commercial organisations undertaking health research, but it is also clear that strong patient and clinician engagement and involvement, alongside clear permissions and controls, are vital to the success of any health data initiative,” he writes.
“This should take place as part of a wider national conversation with the public enabling a true understanding of data usage in as much detail as they wish, including clear information on who can access data and for what purposes. This conversation should also provide full information on how health data is vital to improving health, care and services through research.”
He also calls for the UK’s health care system to “set out clear and consistent national approaches to data and interoperability standards and requirements for data access agreements” in order to help reduce response time across all data providers, writing: “Currently, arranging linkage and access to national-level datasets used for research can require multiple applications and access agreements with unclear timelines. This can cause delays to data access enabling both research and direct care.”
Other NHS-related recommendations in the report include a call to end handwritten prescriptions and make e-prescribing mandatory for hospitals; the creation of a forum for researchers across academia, charities and industry to engage with all national health data programs; and the creation of between two and five digital innovation hubs to provide data across regions of three to five million people, with the aim of accelerating research access to meaningful national datasets.
Featured Image: Rido/Shutterstock