A bold and ambitious plan: Harnessing data and improving health and care continuity for the next decade
The NHSX and DHSC draft strategy aims to improve patient outcomes by placing patients at the centre of the care journey. Dr Dipesh Hindocha at Doctor Care Anywhere explores how health data systems can help patients engage directly with their care experience and build trust in their providers
As the UK emerges from COVID-19 restrictions, some of the old debates are beginning to reassert themselves, and the minds of politicians and the public are returning to long-term public health issues. In June, the publication of a draft strategy paper from NHSX and the Department of Health and Social Care (DHSC) passed largely unnoticed. Yet the paper threw into sharp relief the debate about where patient privacy should end, and where joining up data across the NHS estate and beyond can be expected to drive better outcomes for patients and clinicians alike.
There will surely be many across the private and public sectors in healthcare and technology who feel the publication of Data Saves Lives: Reshaping Health and Social Care with Data marks a significant and welcome step. Harnessing the relatively untapped potential of joined-up patient data brings closer the prospect of significant improvements to health, social care, and services across the NHS and the wider health system. The national public health emergency caused by COVID-19 has highlighted how effective data analytics can simultaneously save lives and drive efficiencies – both of significant value to the NHS. Analytics has played a huge role in tracking the virus and infection hotspots, prioritising the distribution of critical resources, mapping regions and groups for mass vaccination and, most importantly in the case of this latest initiative, accelerating the ease of access to and sharing of patient care records.
It is therefore encouraging to see the importance that DHSC and NHSX are attaching to placing the patient at the centre of the care pathway. People should be closer to their own data, which will enable healthcare professionals to review the entire clinical journey, from GP surgery through to patient aftercare, and ultimately improve outcomes.
This is an area in which the private sector is innovating rapidly. We therefore feel well placed to provide our perspective and to assess this initiative through the lens of both a primary care clinician and a thought leader in the private, digital health space, striking a balance between the view of someone embedded within the NHS and that of an innovator outside it.
Putting patients at the centre of the care journey
The crucial point, from a patient perspective, is designing a system that can maintain trust, confidentiality, and transparency. It is also about establishing and maintaining the highest standards in data processing, which will be critical to building trust amongst patients, clinicians, and wider organisations.
The rapid rise of data use over the last ten years, and the digitalisation of many aspects of our lives, has led, somewhat understandably, to concerns about the scope for data leaks and the fear that individuals’ data will be used inappropriately. Against this backdrop, developing and enabling a transparent health data system that allows users to see how and when their data is being used is to be warmly welcomed.
Another question that needs resolving is how we come to a shared understanding of what ‘data’ means to those using the system, i.e. the patient and the caregiver. Given the complex nature of health data, data privacy, data handling, and the legalities surrounding data, if patients are going to buy in to the concept, they need to be persuaded of its value and indispensability for their own wellbeing. The Wellcome Trust’s Understanding Patient Data, a health data initiative working in partnership with the NHS that seeks to bring transparency, accountability, and, crucially, understanding, will be an important tool in delivering this. However, further initiatives will be needed to bring all stakeholders on board, such as clinicians and healthcare managers. Patients will increasingly engage with their data in the future. This is inevitable. But if participants, such as the clinical and care staff delivering healthcare, are not well positioned to explain or discuss this clearly and openly with their patients, then full acceptance cannot be guaranteed.
“The ability to review such information in innovative and more efficient ways can help clinicians, researchers, and healthcare managers make transformative differences to patient and public health outcomes”
From a technical perspective, personal data stores, in whatever form they take, will only work if all the moving parts of the system speak the same health language and information can be contextualised by the person receiving or reviewing it. The mass consolidation of data, and the joining up of all the constituent parts, will require a unified governing framework and terminology system. Universally accepted clinical vocabulary systems, digital platforms, and frameworks, such as SNOMED CT, will of course help ensure there is a common language across all systems. But we would suggest more still needs to be done.
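To make the point concrete, the minimal Python sketch below shows the kind of term normalisation a unified terminology layer performs: free-text local terms are mapped to shared concept identifiers. The concept IDs here are illustrative placeholders rather than verified SNOMED CT codes, and a production system would query a licensed terminology server rather than rely on a hard-coded lookup.

```python
# Minimal sketch: normalising free-text local terms to a shared vocabulary.
# Concept IDs below are illustrative placeholders, not verified SNOMED CT
# codes; a real integration would query a licensed terminology server.

LOCAL_TERM_TO_CONCEPT = {
    "high blood pressure": ("ILLUSTRATIVE-0001", "Hypertensive disorder"),
    "hypertension": ("ILLUSTRATIVE-0001", "Hypertensive disorder"),
    "type 2 diabetes": ("ILLUSTRATIVE-0002", "Diabetes mellitus type 2"),
    "t2dm": ("ILLUSTRATIVE-0002", "Diabetes mellitus type 2"),
}

def to_shared_concept(local_term: str):
    """Map a locally recorded term to a (concept_id, preferred_term) pair."""
    key = local_term.strip().lower()
    return LOCAL_TERM_TO_CONCEPT.get(key)  # None signals an unmapped term

if __name__ == "__main__":
    for term in ["High blood pressure", "T2DM", "frozen shoulder"]:
        mapped = to_shared_concept(term)
        print(term, "->", mapped if mapped else "needs manual coding review")
```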
Access to patient data
Access to relevant patient data is hardly a new or groundbreaking concept. However, as witnessed acutely during the pandemic, access to substantially enhanced datasets, and the ability to review such information in innovative and more efficient ways, can help clinicians, researchers, and healthcare managers make transformative differences to patient and public health outcomes.
However, such innovative analytics will only be as good as the data being supplied. Transforming raw data of varying quality, drawn from sources including digital platforms and apps, will need extensive health analytics capabilities. And as patients are increasingly likely to add their own data to the system, it is imperative that proper governance and validation frameworks are put in place to best inform the decision-making process and provide clarity and reassurance to both patients and clinicians.
“Building the right architectures and standards to allow third parties readily to innovate and share those learnings with the NHS, and supporting this with a clearer pathway to achieving effective whole-of-market strategies, is exactly what the proverbial doctor ordered”
We may see this when linking digital and remote assessment technology with real-time patient data collection, such as through home monitoring systems, which really could revolutionise care delivery to the most vulnerable patients in society. However, this will need to be supported by innovative analytics and thoughtful consideration of how and when this information flows back to healthcare workers. There would be little value in delivering a set of data that had not been through a considered process of validation and analysis, and it is easy to imagine poor outcomes resulting from a failure to achieve this.
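As a purely illustrative sketch of that feedback step, the Python below validates a patient-submitted home reading before deciding whether to escalate it to a clinician. The field names, device check, and thresholds are assumptions made for the example and are not clinical guidance.

```python
# Minimal sketch: validate a patient-submitted home reading, then decide
# whether it should flow back to a clinician. Field names and thresholds
# are illustrative assumptions, not clinical guidance.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HomeReading:
    patient_id: str
    systolic_bp: int          # mmHg, from a home blood-pressure cuff
    heart_rate: int           # beats per minute
    device_validated: bool    # True if the device is on an approved list

def triage(reading: HomeReading) -> Optional[str]:
    """Return an alert message for clinical review, or None if no action."""
    # Governance step: quarantine unverified or implausible data before analysis.
    if not reading.device_validated:
        return None
    if not (60 <= reading.systolic_bp <= 260 and 20 <= reading.heart_rate <= 220):
        return None  # implausible values are held for review, not acted on
    # Analysis step: only escalate readings crossing illustrative thresholds.
    if reading.systolic_bp >= 180 or reading.heart_rate >= 130:
        return f"Review patient {reading.patient_id}: reading outside expected range"
    return None

print(triage(HomeReading("patient-123", systolic_bp=184, heart_rate=82, device_validated=True)))
```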
Shared care records
Shared care records, i.e. the various constituent parts of the health and care system seamlessly sharing information about patients, have the potential dramatically to reduce errors from a clinician’s perspective. However, patients will in due course be likely to expect all parts of their health and care experience to be included in this process, both within the NHS and in any private or third-party healthcare companies and providers. They will want all of their health data to sit within their central shared care record, and so naturally this needs to be underpinned by clear and well understood structures governing how this data is used, in order to maintain transparency, honesty, and trust with our patients.
There will also be inevitable questions about what constitutes appropriate stewardship and curation of shared care records, and there will be challenges. For example, how can this all be done coherently and effectively in the context of a shared record that intersects with different parts of the health system, some of which speak different ‘health’ languages? Who will be responsible and accountable for the overall curation of these records? Who will ensure critical information is kept up to date and made consistent with previously entered information? We would hope that, as part of the consultative process on initiatives like Understanding Patient Data, private and third sector groups are mobilised and their inputs brought into play, in order to provide complete and compelling answers to these questions and to create the best possible, and most sustainable, solutions and systems for the future.
Supporting local and national decision makers
We don’t believe this needs to be limited to public sector healthcare leaders. Where there is a legitimate reason, underpinned by the right consents, confidentiality, and governance systems and processes, organisations providing care beyond the NHS, but still critically important parts of the UK healthcare ecosystem, should have access to shared care records. This is necessary to ensure clinical and social care continuity, especially as the delivery of services via digital platforms and operators continues to increase. In the event of a further lockdown or the emergence of a new COVID-19 variant, this will be vital.
A good example would be patients attending private hospitals, referred to by their private health insurer but via an NHS GP, for surgical procedures. Of course, they, and their physician, would benefit medically from shared access to this information. As per the draft guidance, “where access to data is granted, having met these high thresholds, it must always have the explicit aim to improve the health and care of our citizens, or to support the improvements to the broader system.”
Next steps
The plans included in this draft strategy are bold, ambitious, and necessary. But questions do arise. Are they achievable in the timeframes stated? How will the big proprietary systems that are already contracted and providing services throughout the NHS come on board? How will public and private sector organisations work together, without ever compromising or impairing the objective of improving patient outcomes?
The commitment to help innovators and health technology providers to work with health and care organisations is very much to be welcomed. Building the right architectures and standards to allow third parties readily to innovate and share those learnings with the NHS, and supporting this with a clearer pathway to achieving effective whole-of-market strategies, is exactly what the proverbial doctor ordered.
We all want to achieve brilliant outcomes for patients and their care. Patients deserve no less. But those outcomes must be the right ones, as well as achievable and sustainable ones. As always, the devil is in the detail. So, we await with eager anticipation the next iteration of NHSX’s proposals.
Dr Dipesh Hindocha, Clinical Innovation Director at Doctor Care Anywhere, started his career as a locum GP following his Master’s in Medical Sciences and Clinical Practice. In 2018 he began working with Doctor Care Anywhere as a Clinical Lead, moving on to become Head of Clinical Product Development, before being promoted to Clinical Innovation Director in 2021.
Delivering transformational data analysis for infectious disease drug discovery
Artificial intelligence and the insights it can provide offer a route for infectious diseases to stay in the spotlight and draw interest from R&D investors worldwide. Liam Tremble at Poolbeg Pharma explores how this technology can be applied to data analysis
Pharmaceutical R&D is becoming increasingly expensive, and returns on investment (ROIs) have been falling over the last several decades, with static levels of new drug approvals since the 1980s. It is estimated to cost over $2 billion to develop a single product, and over 90% of products with a filed investigational new drug (IND) application go on to fail. ROI from the pharma sector fell from 10% in 2010 to just 2% in 2018, although it has recently started to recover. (1,2)
No sector has been as severely impacted as the infectious disease market. Before COVID-19, almost two decades had passed without substantial improvement in treatment options for a range of viral illnesses, such as influenza.
Part of the dilemma of diminished returns is thought to be that each new drug raises the bar for those that follow. Consequently, making incremental improvements in new drug candidates has become more difficult, especially when coupled with the probability that the majority of intuitive drug candidates, those considered likely to succeed, have already been discovered. Now, more than ever, there is a need to look beyond conventional drug development approaches to discover the next generation of therapeutics.
In recent years, researchers have been battling against reduced returns by using artificial intelligence (AI) to help guide drug discovery and development within the pharma sector. AI is designed to handle big datasets from a variety of sources and formats, making it ideally equipped for modern cutting-edge analysis techniques. These range from single-cell transcriptomic sequencing and proteomic analysis, which produce Big Data, to the diverse data types that together describe a disease phenotype.
AI is a high-powered computing technique that uses iterative ‘learning’ algorithms to interpret, learn, and discover underlying patterns in Big Data. In addition to traditional Big Data, modern AI providers incorporate machine reading comprehension (MRC) to allow algorithms to learn from unstructured text-based publications, creating formidable ‘intelligent’ machines that are up to date with the latest literature.
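As a toy illustration of what ‘iterative learning’ means in practice, the Python sketch below fits a logistic regression by gradient descent to a synthetic patients-by-transcripts matrix. The data are random and the whole setup is illustrative only, not a description of any particular commercial platform.

```python
# Toy illustration of iterative 'learning': logistic regression fitted by
# gradient descent to a synthetic patients-by-transcripts matrix.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features = 200, 50                  # e.g. patients x transcripts
X = rng.normal(size=(n_samples, n_features))
true_w = np.zeros(n_features)
true_w[:3] = [2.0, -1.5, 1.0]                    # a few informative features
y = (X @ true_w > 0).astype(float)               # toy outcome, e.g. severe vs mild

w = np.zeros(n_features)                         # model weights, learned iteratively
learning_rate = 0.1
for step in range(500):                          # the iterative 'learning' loop
    p = 1.0 / (1.0 + np.exp(-(X @ w)))           # current predicted probabilities
    gradient = X.T @ (p - y) / n_samples         # gradient of the log-loss
    w -= learning_rate * gradient                # small correction each iteration

accuracy = (((1.0 / (1.0 + np.exp(-(X @ w)))) > 0.5) == y).mean()
print(f"Training accuracy after 500 iterations: {accuracy:.2f}")
```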
The US is the undoubted leader in pharma AI, with over 184 companies and over $12 billion of capital invested in providing AI-based tools to aid drug discovery, development, and medical treatment. In silico AI tools exist to identify disease-specific targets and interventions against those targets and, coupled with absorption, distribution, metabolism, and excretion (ADME) and toxicology profiles, to determine their probability of success in the clinic. Pharma companies are using AI to expand their pipelines and to prioritise existing assets, with many major pharma companies refraining from commercial decisions on their pipelines without the input of AI-based predictive outcomes.
The role of AI in infectious disease is particularly pertinent due to the diverse range of factors that can impact the trajectory of infection and immunity. Although conserved responses have been identified in infectious disease, such as the decoupled interferon responses that distinguish severe from mild viral infections, a multitude of factors, including age, human leukocyte antigen (HLA) type, immunological history, immunological status, the initial volume of pathogen encountered, and host comorbidities, can all result in varied responses to a single pathogen.
‘Original antigenic sin’, the process by which immunological history can impede antigen-specific responses through ‘antigen trapping’ early in infection, has been identified by vaccine developers as a major obstacle in the design of universal vaccines. Particularly in elderly or immunocompromised individuals, it can be difficult to stimulate a durable vaccine response in a consistent manner due to the multitude of underlying factors. It is likely that elements of personalisation will be needed to drive the effector mechanisms for lasting immunity.
The unbiased ability of AI to integrate multi-omic data makes it an ideal platform to deal with these wide-ranging factors that affect and predict immunity, host response, and recovery in the face of a plethora of infectious diseases. It also makes it the ideal partner in helping to identify the next generation of pharmaceutical products to prevent and treat disease.
One of the significant challenges for AI-based discovery is the quality of the data input and the availability of high-powered, comprehensive datasets that can be used to validate its findings. Commercial-grade platforms often overcome this issue by integrating gargantuan datasets drawn from publicly available information. However, these datasets are often incomplete due to the limited publishing of original datasets and the data protection legislation that regulates clinical data.
Despite the large-scale support for AI-driven improvement of healthcare delivery, progressive improvements in AI have largely outpaced the development of the regulatory framework surrounding it. Data protection legislation, designed over the past decade to counter the exploitation of personal data by corporate interests, often finds itself at odds with the principles of AI-based learning.
AI is beginning to revolutionise the delivery of healthcare. Its ability to integrate and infer from diverse data types, such as imaging data, clinical notes, demographics, and lab results, in real time to produce tools for diagnosis and prognosis can aid clinical decision-making. The ethical implications of rapidly advancing AI-based contributions to medical treatments have stirred global bodies to develop guidelines and principles for the integration of AI in medical settings.
Integration of AI into the drug discovery and drug development process is impacted less by these ethical concerns. However, issues still exist, such as the propagation of ethnic bias in medical treatments due to underlying bias in datasets, which may widen differences in healthcare outcomes across underserved minorities and developing nations.
Global initiatives, such as the Human Vaccines Project, have been engaged in sustained efforts to characterise the immune system in exquisite detail, with the underlying belief that integration of diverse, high-depth datasets will produce prognostic and interventional products to improve health outcomes. Single-cell next-generation sequencing, full-length protein microarrays, phage display, and cell phenotyping arrays produce Big Data, which can be layered onto host biology.
The power of AI will advance exponentially over the coming years. However, in order to accelerate insights, there is an opportunity to progress beyond the gradual accumulation of data snippets and to input bespoke, high-depth data from infectious disease studies. High-depth analysis of clinical data is cost-prohibitive, particularly when the insights of AI may be only the first step in the development of a new product.
In light of the COVID-19 pandemic, it is vital that national bodies recognise the potential of AI to prepare for and respond to future challenges. There has been a unique opportunity to provide bespoke data for AI-driven infectious disease research through human challenge trials, in which volunteers are inoculated with an infectious agent under carefully controlled conditions and monitored through health, sickness, and recovery. During this time, daily biological samples can be obtained, revealing local and systemic responses to the challenge on a real-time basis, which can then be coupled to an intervention.
Academic institutions such as Imperial College London and Oxford University have championed the technique in recent years, including challenge studies of healthy volunteers with SARS-CoV-2. A beneficial impact of the high-depth data analysis enabled by modern immunological techniques has been a reduction in the number of subjects required for powered and meaningful interpretations.
Recent evidence in COVID-19 has shown that germline mutations and HLA types can profoundly influence susceptibility to severe disease. The data highlight the importance of matched genetic, transcriptome, and immunology datasets for training AI algorithms. In the absence of matched HLA data, immunological trends can be misattributed to other mechanisms or signals that may not be detectable behind the background variability that these factors create. The noise of biology will always be present, but it is the responsibility of those using AI for research to produce clean data, which minimise confounding factors and facilitate the next generation of insights.
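As a toy illustration of why matched HLA data matter, the Python sketch below uses synthetic data in which disease severity is driven by HLA type and an expression feature merely tracks HLA type; without HLA in the model, the expression signal is misattributed. All labels, values, and relationships are illustrative assumptions, not real findings.

```python
# Toy illustration: outcome is driven by HLA type; an expression feature only
# tracks HLA type. Without matched HLA data, the model misattributes the signal.
# Data and labels are synthetic and illustrative only.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 500
hla = rng.choice(["A*02", "B*07"], size=n)                       # illustrative HLA labels
severe = (np.where(hla == "A*02", 0.7, 0.2) > rng.uniform(size=n)).astype(int)
expression = rng.normal(size=n) + np.where(hla == "A*02", 1.5, 0.0)

df = pd.DataFrame({"expression": expression, "hla": hla})

without_hla = LogisticRegression().fit(df[["expression"]], severe)
with_hla = LogisticRegression().fit(
    pd.get_dummies(df, columns=["hla"], drop_first=True), severe
)

# With HLA included, the expression coefficient shrinks towards zero instead of
# soaking up variation that actually belongs to HLA type.
print("expression coefficient, HLA unknown:", round(without_hla.coef_[0][0], 2))
print("expression coefficient, HLA matched:", round(with_hla.coef_[0][0], 2))
```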
As a scientific community, we have great confidence in the ability of AI to lead the next generation of interventions in the war on infectious disease. However, while AI will not replace the requirement for basic research, our ability to integrate cutting-edge AI analyses with clinical datasets will speed up the development of novel interventions that can improve patient outcomes.
The potential for using AI analysis of biological data to quickly and cost-effectively identify many more interesting and efficacious drug candidates for infectious diseases with serious unmet needs is both very real and very exciting.
Liam Tremble PhD, Project Manager, R&D Operations at Poolbeg Pharma plc (London AIM: POLB) – a clinical stage infectious disease pharmaceutical company, which aims to develop multiple products faster and more cost effectively than the conventional biotech model. Liam, an immunologist, has worked at hVIVO, part of Open Orphan PLC, a provider of human challenge trials, with a focus on strategic engagement to enhance recruitment for clinical trials. Prior to that, he was a researcher at the Cork Cancer Centre, Republic of Ireland. He completed his doctoral degree at University College Cork on the role of tumour-associated macrophages in melanoma. liam.tremble@poolbegpharma.com