The 12 days of digital patient safety

A blog by Clive Flashman, Chief Digital Officer

  • 20th December 2021

A couple of weeks ago, I presented some of the ideas I’ve had around digital clinical safety. This was seasonally branded ‘The 12 Days of Digital Patient Safety’. The 12 issues on my list were:

  1. AI – regulation, ethics and testing.
  2. Patient safety not built into the innovation process (co-design and co-production with patients is required).
  3. Patient safety (in use) not effectively built into the digital health compliance systems.
  4. Poor user experience (design).
  5. The safety of medical devices, e.g. remote hacking.
  6. Privacy and consent around data.
  7. Fragmentation of patient records and data.
  8. Lack of interoperability.
  9. Cybersecurity.
  10. Patient digital and health literacy.
  11. Clinician attitudes and knowledge of digital technologies.
  12. The barriers to EHR integration (and poor use of patient-generated data).

There was only time on the webinar to cover points 2, 3, 6 and 10; I hope that we can have further sessions in 2022 where we can discuss the others.

Patient safety not built into the innovation process

Most digital healthcare-related startup founders are understandably focused on developing their business ideas, finding and establishing their new markets, and commercialising their offerings. Inevitably, they often lack specialist clinical safety advice at this early stage, and so do not necessarily consider the patient safety implications of their new solutions.

Once the offering is ready to come to market, it has to pass a range of compliance and other hurdles (more on those later). For many founders, this may be the first time that they stop to think about the safety implications of what they are proposing to implement. I think those who support founders, such as the NHS England Clinical Entrepreneurs Programme and the AHSNs, should raise safety at a much earlier stage.

If founders do not place patient safety at the heart of their innovations, then it will always be an afterthought, a bolt-on. Instead, it should be one of the key priorities that helps to drive innovation forward, placing the safety of the patient at the heart of the solutions created for them.

Patient safety (in use) not effectively built into the digital health compliance systems

There are a number of digital health-related hurdles that founders ought to address before they can be confident of bringing a safe and robust offering to the market. In no particular order, these are:

• Digital Technology Assessment Criteria (DTAC) from NHSX (details here)
• The Evidence Standards Framework from NICE (details here)
• Medical Device Regulations from the MHRA (details here)
• DCB 0129 and DCB 0160 from NHS Digital (details here)
• Data Security and Protection Toolkit (DSPT) from NHS Digital (details here)

There are others, such as OWASP, but the ones above are the fundamental compliance tools that all digital health startups ought to be considering, and indeed addressing, before they come to market at scale.

You can see that these requirements come from four different national bodies, that the questionnaires and processes overlap somewhat, and that there is no linkage between them at all. Innovators are understandably confused about what they should do, in what order, and how much dealing with this maze of compliance will cost them.

Even with this multi-layered, multi-agency approach, it is obvious to me that something vital is missing – the ability to monitor the safety of a digital health solution once it has been used by a large enough group of patients. This is (in theory) covered by DCB 0160, but very few purchasing healthcare organisations complete the monitoring required by this standard after the new digital technology has been implemented with patients.

That is exactly what ought to happen, and if DCB 0160 doesn’t make that as explicit as it needs to be, then perhaps a new DCB should be issued that mandates the sort of post-market surveillance that pharma companies regularly carry out for their products, and that the MHRA monitors through the Yellow Card scheme.

Privacy and consent around data

Many of the Apps that we use on a day-to-day basis are free of charge. We download them quite happily from an App store, tick that we agree to the (30 pages of) terms and conditions without reading them, and then use the App in blissful ignorance of what is being done with the data we upload for greater insight into our own health and wellness.

This data is valuable. It is the gold dust that App developers look for and that others are keen to buy from them. It might interest you to know that the average selling price of a stolen credit card number on the dark web is $7; a stolen medical record sells for $42. The price difference is telling.

Occasionally, there are moves by the Department of Health and Social Care to collect and reuse more data; in the last few years these have been met with strong opposition from the public, such as the debacle around care.data.

However, when using an App, many of us want it to know enough about us so that it can personalise the advice and recommendations that it gives us. The critical word in that sentence is ‘enough’. Who decides what is just enough information? The App developer? The user?

This suggests that there is a continuum related to the gathering of user data, from complete privacy (minimum data collected) at one end, to full user contextualisation (maximum data collected) at the other. For the App to really understand the user’s situation and context, it needs to gather lots of data to enable it to personalise its outputs and make them as accurate as possible for each user. In itself, that is a reasonable use of our data, but what happens if the App developer then wants to reuse or sell on that data for purposes that the user is unaware of? It happens, and it shouldn’t.

I believe that App developers should create a very simple list that shows how a user’s data will be processed and used, and for what purposes, in an easy-to-read format. This should be totally transparent to the user when they look to register for the digital solution.
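As a purely illustrative sketch of what I mean (the field names, wording and rendering below are my own assumptions, not any existing NHS standard or App store format), such a list could be a small, structured declaration that the App shows on its registration screen:

```python
# A hypothetical, illustrative data-use summary: each entry states what is
# collected, why, and whether it is shared. Field names are assumptions only.
DATA_USE_SUMMARY = [
    {"data": "Age and sex", "purpose": "personalise your health advice", "shared": "No"},
    {"data": "Symptom diary entries", "purpose": "track your condition over time", "shared": "No"},
    {"data": "Anonymised usage statistics", "purpose": "improve the App", "shared": "Yes - analytics provider"},
]

def print_summary(summary):
    """Render the data-use list in plain English before registration."""
    print("Before you register, here is how we use your data:")
    for item in summary:
        print(f"- We collect {item['data'].lower()} to {item['purpose']}."
              f" Shared with third parties: {item['shared']}.")

if __name__ == "__main__":
    print_summary(DATA_USE_SUMMARY)
```

The point of keeping the structure this simple is that the same short list can be shown to the user, audited by a regulator and checked against what the App actually does.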

Patient digital and health literacy

Digital health solutions have the power to transform the way that healthcare services are delivered globally. However, many people are excluded from using them because they are digitally illiterate, have health literacy issues, or both.

10% of adults in the UK do not have access to the internet. Most health and care Apps are used via a smartphone, and it may surprise you to learn that 17% of the UK adult population do not own one.

Even those people who can access the internet, and the health and care Apps available through it, may lack the basic digital skills to use them effectively. The Tech Partnership has defined a basic digital skills framework:

1. Managing information: using a search engine to look for information, finding a website visited before, or downloading or saving a photo found online.
2. Communicating: sending a personal message via email or an online messaging service, and carefully making comments and sharing information online.
3. Transacting: buying items or services from a website, and buying and installing Apps on a device.
4. Problem-solving: verifying sources of information online, or solving a problem with a device or digital service using online help.
5. Creating: completing online application forms including personal details, or creating something new from existing online images, music or video.

Health literacy can be defined as the ‘ability to engage with health information and services’. This incorporates language, literacy and numeracy skills that are used in health settings. For managing health, this includes the ability to access, understand, evaluate, use and navigate health and social care information and services.

It may shock you to read that in England, 42% of working-age adults are unable to understand and make use of everyday health information, rising to 61% when numeracy skills are also required for comprehension.

Why is this important?

People with low health literacy are 1.5 to 3 times more likely than the general population to experience increased hospitalisation or death, and are more likely to have depression. This is a significant proportion of the population and an issue that really cannot be ignored.

You may be interested to know that the reading age required for this article is 18 years. The average reading age of the UK population is 9 years – that is, the average adult has achieved the reading ability normally expected of a 9-year-old. For comparison, The Guardian newspaper has a reading age of 14 and The Sun has a reading age of 8. I clearly need to simplify my writing style!

For any digital health and care solutions that are made available to the general public, I would urge founders to check the readability of the text they use. If it is not set at around a reading age of 11, you are creating an inherent barrier to access which may limit the take-up of your offering.
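As a quick sketch of how a founder might do this check, here is one way using the third-party Python package textstat (installed with `pip install textstat`). It estimates a US school grade from standard readability formulas; adding roughly five to the grade gives an approximate reading age. The sample text and the threshold of 11 are illustrative only:

```python
# A minimal readability check using the third-party 'textstat' package.
# Flesch-Kincaid returns a US school grade; adding ~5 gives an
# approximate reading age in years (a rough rule of thumb).
import textstat

def reading_age(text: str) -> float:
    """Approximate reading age in years for the given text."""
    grade = textstat.flesch_kincaid_grade(text)
    return grade + 5  # rough conversion from US grade to reading age

sample = (
    "This app helps you keep track of your symptoms. "
    "Write a short note each day. Your nurse can read your notes."
)

age = reading_age(sample)
print(f"Estimated reading age: {age:.0f} years")
if age > 11:
    print("Consider shorter sentences and simpler words.")
```

Running a check like this over every user-facing screen before release is a cheap way to catch text that quietly excludes a large part of the population.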
