Just over two years ago, on 25 May 2018, the UK government brought the Data Protection Act into force across the country. This was the latest step in a long series of EU and UK government-driven attempts to improve data privacy, in response to fears over companies' ability to collect large amounts of personal data.
In the field of healthcare research and epidemiology, the implications of this new legislation have certainly been felt. GDPR training is now compulsory for all those who will be in contact with healthcare data, with further recommendations that staff be refreshed on their GDPR responsibilities every year. Furthermore, a breach of GDPR can lead to fines of up to £17.8 million (or 4% of annual global turnover, whichever is greater) for the company responsible, with further fines possible for the individual(s) responsible.
The world of data protection even 20 years ago, when the internet was in its infancy, was completely different. Much data was collected and stored in filing cabinets, photocopied for distribution and carried by employees on public transport. Forty years ago, junior statisticians were allowed to browse old hospital case notes without supervision, and non-commercial medical researchers could freely access relevant patient records. Today, access to case notes requires lengthy communication with NHS trusts and NHS Digital, data-sharing plans must be established well in advance with organisations such as Public Health England (PHE), and every clinical trial requires a rigorously described data management plan.
Lack of trust or lack of transparency?
Professor Peto argues that the regularly updated MRC guidelines on medical research are 'no longer based on the assumption that medical researchers are trustworthy'. What Professor Peto describes has been explored in sociological research before: an increasingly managerialist approach within healthcare, with a focus on governance, monitoring and auditing, leads to healthcare professionals and researchers feeling that they are no longer trusted.
However, we need to consider other essential questions about trust here, such as whether the general public trust organisations to use their personal data in safe and responsible ways. The role of companies, especially social media platforms, in collecting and using data to produce new forms of personalised advertising has created data-driven trust deficits between the public and any organisation, medical or otherwise, that seeks to use their personal data, especially given the ever-growing list of data breaches, some of which receive high-profile media coverage. While the law makes special provisions for the use of data in medical research, such trust deficits have made the general public more sceptical about allowing access to personal data.
This trust deficit is also driven in large part by a lack of transparency over what exactly happens to the data that are collected. In the medical sector, many are rightly worried about how some of the most personal and sensitive data a person has (effectively their medical issues and general health status) could be used by other companies. In the US, for example, the drug company Pfizer is already using electronic medical records to target the sale of new treatments to patients. Companies such as Google have sought to make deals with major healthcare providers in order to access patient health records, names and addresses without informing patients.
On the flip side, however, there are movements such as use MY data, a data-sharing advocacy group made up of patients, which aims to highlight the positive impact that responsible use of healthcare data can make. Similarly, Understanding Patient Data works to improve trust between patients and those who use their data, bringing transparency, accountability and public involvement to the way patient data are used. Such movements could be key to improving the way data use is perceived within healthcare.
Time & effort
This trust deficit with the public has meant that the process of acquiring data has become significantly longer and more demanding, which in turn increases the time researchers must spend obtaining it. However, sometimes new processes need to be incorporated into research to reflect societal changes. At the time, researchers may have felt these changes to be a slight against their trustworthiness and personal judgement as researchers, and an unjust barrier between research and better healthcare outcomes. Yet it is easy to see why it was not safe to assume that the personal judgement and good intentions of researchers could single-handedly form the basis for protecting the interests and rights of medical research participants.
And indeed, although many researchers are motivated by the desire to do good research that has a positive impact on their fellow human beings, they can also face less altruistic pressures to perform their work, which can compromise how safely and rigorously they handle data. One major problem we in the field of medical statistics have observed during the COVID-19 outbreak is an increase in the number of poorly reviewed articles, often pushed out at breakneck speed in order to be at the cutting edge of pandemic research. Nor has this been a single instance: both The Lancet and the New England Journal of Medicine have retracted articles on the effects of hydroxychloroquine and blood pressure medications, respectively, in COVID-19 patients. The common denominator in both of these trials was the data source they used, which was found to have major inconsistencies and was thus deemed unreliable. This fundamental error illustrates just how imperfect the research process can be when driven by incentives such as speed and impact. When such events make the headlines, it is not surprising that public trust is eroded, or that people want appropriate processes and oversight in place to protect everyone from these sorts of failings of the research endeavour.
What do you think? Let us know!
The views expressed are those of the author. Posting of the blog does not signify that the Cancer Prevention Group endorse those views or opinions.