For as long as I can recall in my years working in healthcare and technology, one question has reared its head more than any other.
‘Is my data right for predictive analytics?’
More recently, we have been in discussion with teams in several countries looking to adopt our predictive solutions, with a focus on mortality, outcomes and related measures. We noticed that regardless of which end of the globe we were speaking to, the challenges were the same. Data quality was the key recurring concern, and with that question came another: what can be done?
Yet a weak or unreliable data infrastructure has repercussions that reach far beyond the inability to implement predictive analytics. Reliable, timely and accurate information is integral to the delivery of patient care, and every decision, whether clinical, managerial or financial, needs to be grounded in insights of the highest quality. Against this backdrop, it feels apt to share the work we have been developing with our partners around data assurance and quality: work intended to move them towards robust data governance, a compelling data strategy and, most importantly, reliable data assurance.
As organisations continue to accelerate their digital health maturity agendas, we are seeing a strong correlation between the need for advanced predictive algorithms (AI and ML) and the quality, integrity and overall internal assurance of data. Working with the digital health team at Royal Berkshire NHS Trust and a number of other strategic global partners and thought leaders, we observe these trends with growing frequency across multiple verticals in healthcare, life sciences and related sectors.
The need for Data Assurance Programmes is greater now than ever. Some programmes comprise a comprehensive end-to-end review of data input, workflows and processes, system suitability and integration with the EPR, data warehousing, post-processing and reporting, providing assurance to leadership teams and to external agencies and regulators. The NHS sees 1 million patients every 36 hours and processes millions of patients' records across thousands of GP surgeries, outpatient clinics, emergency departments and other patient contact settings. One can only imagine the extent of the data quality and assurance challenges such a system faces, with each Trust needing to provide itself with the assurance necessary to deliver excellent clinical care, patient flow and outcomes.
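To make the data-input stage of such a review concrete, here is a minimal, illustrative sketch in Python of the kind of record-level quality check a programme like this might automate. The record structure, field names and rules are hypothetical examples for illustration, not a prescription; a real programme would apply far broader rule sets across workflows, warehousing and reporting.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record structure for illustration only.
@dataclass
class PatientRecord:
    nhs_number: str | None
    date_of_birth: date | None
    admission_date: date | None
    discharge_date: date | None

def check_record(record: PatientRecord) -> list[str]:
    """Return the data-quality issues found in a single record."""
    issues: list[str] = []
    # Completeness: key identifiers must be present.
    if not record.nhs_number:
        issues.append("missing NHS number")
    elif not (record.nhs_number.isdigit() and len(record.nhs_number) == 10):
        # Validity: an NHS number is ten digits (the tenth is a check digit).
        issues.append("malformed NHS number")
    if record.date_of_birth is None:
        issues.append("missing date of birth")
    # Consistency: discharge cannot precede admission.
    if (record.admission_date and record.discharge_date
            and record.discharge_date < record.admission_date):
        issues.append("discharge date precedes admission date")
    return issues

# Example: flag a record for review before it feeds reporting or predictive models.
record = PatientRecord("12345", date(1980, 5, 1), date(2024, 3, 2), date(2024, 3, 1))
print(check_record(record))
# ['malformed NHS number', 'discharge date precedes admission date']
```

Checks of this kind are deliberately simple; their value lies in running them systematically at the point of data input, so that issues are caught before they propagate into warehousing, reporting and downstream analytics.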
Historically, attention to the quality, integrity and assurance of healthcare data focused on billing and payments. This was followed naturally by national benchmarking, particularly around case mix, patient safety and outcome measures such as mortality. Those of us who have been working in the data trenches of the NHS for some time will surely recall the Robert Francis Mid Staffs reports, followed by the Sir Bruce Keogh reviews, in which data formed an essential part of the backbone of protecting patients from harm. The data integrity and assurance needed then is just as vital today, if not more so.
As we advance our various digital agendas, data assurance will remain a key pillar in unlocking value for organisations and, more importantly, for patient care. It will be key, then, for providers, and for us as a system, to consider what an effective data assurance programme might look like, and to open conversations with other Trusts and industry experts focused on tackling these challenges.
To offer your thoughts and take part in our day of round-table discussions and workshops with key industry leaders and NHS Trusts, please click here to register your interest for D&D’s Healthcare Data Assurance Summit.