Abstract
Despite strong support for health research in the UK, many people currently opt not to share their health data because they do not trust that their data will be handled appropriately. This lack of trust is widely acknowledged but what is missing from the literature is a theoretical understanding of what underlies this lack of trust. This article presents an original, empirically-based account of the source of this lack of trust based on interviews conducted with those who were unwilling to share their GP data with NHS Digital’s General Practice Data for Planning and Research (GPDPR) programme. The participants told us that they did not feel the data was being treated as ‘my data’. In this article, we explore what this concept means to our participants and argue that those responsible for sharing health data need to demonstrate commitment to patients’ values and interests as well as data security.
Background
Health data is frequently presented as an invaluable resource, with the potential to transform our health services and, indeed, our lives. 1 However, despite widespread public support for sharing health data for research, 2 a significant proportion of the population opt not to share their health data for research purposes. 3 The small-sample study presented in this article provides a snapshot of the views of those with concerns about sharing their health data. It supports the conclusions of others that trust is a key issue in debates around health data sharing 4 and goes beyond existing findings by making an original link between this lack of trust and citizens’ attempts to assert control over their health data through the opt-out system.
We further present an original, empirically-based account of the source of this lack of trust for some people, which is the failure in relevant regulation to understand and account for what people mean when they talk about ‘my data’. We argue that in order to be worthy of citizens’ trust, the governance of health data must be designed on the basis of a sound understanding of ‘my data’.
Focusing on theories of property to conceptualise what ‘my data’ means is understandable, as the term can reasonably, at first glance, be assumed to be rooted in property concerns. Indeed, the concept of property captures ‘my data’ at one level, in that it accounts for the control that people want to have over their data. However, what this article shows is that, on further examination of the reasons underlying people’s views, property provides a very thin account of ‘my data’ and an insufficient framework for fostering trust in data sharing. We therefore reject the property account and present an empirically informed conceptualisation of the deep connection that some people have with their health data and the range of values and interests engaged when their health data is used for research purposes. Current efforts to gain citizens’ trust in health data fail to do this and therefore risk perpetuating the mistrust engendered by previous attempts to access health data for research at scale. This could further increase the scale of the opt-out from sharing health data for research purposes, making the dataset less comprehensive and representative, increasing health inequalities, and preventing the potential benefits of health data research from being realised.
The trigger for our study was an attempt in England to increase access to patient data for research. 5 In May 2021, NHS Digital 6 announced that they would collect patients’ primary care or ‘General Practice’ (GP) data so that it could be used in a non-identifying form in medical planning and research in their GP Data for Planning and Research (GPDPR) programme. Patients who did not want their data to be included in the programme had until 23 June 2021 (subsequently postponed to 1 September 2021) to register an opt-out by printing, completing, and returning a form to their GP surgery. 7 By June 2021, over a million people had registered an opt-out in this way, 8 prompting NHS Digital to delay the programme to provide more time to speak with people about their concerns. 9
In response to this, as part of our GP Data Trust pilot study, we carried out interviews to investigate why people opted out of sharing their GP data. We also wanted to explore the nature of the concerns people had and how law and regulation might respond in order to alleviate those concerns and enable research using health data to be as inclusive and comprehensive as possible. 10 Without an understanding of the barriers to patients choosing to share their health data, attempts to facilitate this through regulation are likely to remain ineffective. 11 It is well established that despite in-principle support for sharing, people have concerns about doing so. 12 What is missing from the literature is an empirically based and theoretically informed understanding of the nature of the concerns of those who currently opt out of sharing their health data and what underlies the significance of these concerns. This article fills this empirical and conceptual gap.
All participants in our interviews expressed a desire to contribute their data to medical planning and research but had reluctantly opted out of doing so due to a lack of trust that their data would be managed in a way that was acceptable to them. This lack of trust is a recognised and persistent problem in many aspects of public life, including in health data sharing, 13 despite policy makers’ goal of regaining public trust. 14 Wolfensberger and Wrigley’s account of trust in healthcare, which we adopt here, distinguishes two aspects of trust: trust in competence (the person has the required skills and knowledge) and trust in commitment (the person is committed to protecting the trustor’s values and interests). 15 Efforts to regain trust in health data sharing have so far been focussed on indicating trustworthiness in relation to competence, that is, that data will be kept secure, privacy will be protected, and data-protection laws will be complied with. For example, the emerging development of Secure Data Environments (SDEs) in England 16 is based on a structure whereby authorised researchers can access a defined dataset and carry out their analyses without the data leaving the security of the platform, minimising the possibility of data leakage or theft. 17 However, our data 18 shows that while trust in competence is important to patients, it is unlikely to be sufficient to convince many of those who are currently opting out of sharing their health data to reverse this decision. 19
This is because trust goes beyond competence: Wolfensberger and Wrigley’s second strand is an expectation that the person you trust is committed to you and has concern for your interests and wishes. This rests on expectations about the trustee’s value orientation, their ethics, integrity, and motives. 20 In the context of healthcare, it is an expectation that a doctor will respect their patients’ values, even if they clash with the doctor’s own values, so that the doctor acts in a way the patient approves of. 21 Trust in commitment is particularly important in a healthcare context as the decisions healthcare professionals make require an understanding of patients as individuals. 22 Trusting your doctor goes beyond trusting that they will be proficient and possess necessary skills; it also involves trusting that they are committed to acting in your interests and in line with your wishes, values, and priorities. 23 In order to demonstrate an equivalent commitment in the health data–sharing context, those responsible for sharing health data must display an understanding of how health data is connected to patients’ interests and values. As Muller et al have argued, this is an important part of the social licence for health data research. 24 The existing literature also emphasises the context-dependent nature of trust, and the need to be clear about who is (or is not) doing the trusting. 25 This article adds to the existing literature by providing empirical evidence of the role that commitment to patients’ values and interests plays in their decisions about opting out of health data sharing in the UK. The analysis of what underlies its significance for these individuals provides new and critical insights for policy and regulation of this activity in the research context.
As we go on to establish, the concept of ‘my data’, as used by the participants in our study, cannot be fully explained by a property account. 26 They described a sense that their health data both recorded and shaped their embodied existence and, in so doing, became part of their identity and individuality. 27 Seen in this way, health data becomes an extra-corporeal part of the individual, a constituent part of them, intimately connected with what it means to be (that) human. This sense of ‘constitutive belonging’ rather than mere ownership of health data survives anonymisation and causes patients to have particular ethical concerns regarding the uses to which their data is put. Understanding health data as ‘my data’ in this constitutive sense indicates that, in addition to data security, the governance of health data sharing has a role to play in respecting the interests and priorities of data providers.
We argue that a lack of trust is a significant barrier to achieving the potential beneficial outcomes of health data sharing. It stems from health data not being treated as ‘my data’, which, our participants revealed, can have a specific meaning in this context. This understanding of ‘my data’ needs to be reflected in regulatory frameworks overseeing the sharing of health data for research in order for them to achieve their promise. In the Methods section, we outline the methods used in the study. The findings are then presented in the Results and Discussion section, where we explore the key theme identified in the analysis: the connection between trust and treating health data as ‘my data’. There, we also present our conceptual understanding of people’s relationship with their health data based on our participants’ contributions. We conclude with the argument that in order to restore trust in health data sharing, regulation of this activity must have commitment to respecting the values and wishes of data providers at its core. It is hoped that such an approach will enable more people to share their health data for research purposes in line with their wishes and facilitate their expressed desire to contribute to improvements in healthcare through health data research.
Methods
In this study, we undertook qualitative semi-structured interviews with patients 28 and stakeholders such as campaign groups and patient representatives who work to improve data sharing from a patient perspective. The purpose of interviews in qualitative research has been defined as ‘. . . obtaining descriptions of the life world of the interviewee in order to interpret the meaning of the described phenomena’. 29 Interviews were chosen as a data-gathering tool because they permit an in-depth exploration of participants’ experiences by which researchers can seek to understand the participants’ perspective and meanings. 30 We explored the concerns some patients have about sharing their health data and their reasons for opting out of the GPDPR programme, as well as the features they would like to see in a system overseeing the sharing of their health data. Adopting a semi-structured format allowed us to develop a novel insight into what it meant to patients to share their health data for research and planning purposes and to opt out of sharing it in the proposed way, as well as providing leeway for following up on aspects the interviewer deemed important. 31 While our aim was to reflect on the views of patients, we also gathered the views of professionals and stakeholders, including their personal views, and interpreted what they said in terms of its implications for how patient health data sharing should be regulated.
Participants and Procedure
Semi-structured interviews were conducted with patients who had opted out of the GPDPR programme (n = 10), GP practice staff (n = 2), and relevant stakeholders (n = 6). Participants were recruited from three key sources: respondents to an online survey of patients’ views on health data sharing who indicated a willingness to be interviewed; the researchers’ professional contacts; and interested groups who promoted the project to their members. Forty-five respondents to the patient survey indicated that they had opted out of the GPDPR programme and would be willing to take part in an interview. Six of those respondents were interviewed. A further four patients who responded to advertisements via Twitter and promotion by interested groups were interviewed. Interested stakeholders were identified through professional contacts with those working in the area of health data, as were GP practice staff. 32 The sample was largely a convenience sample based on who responded to the surveys and advertisements. 33 No demographic criteria were applied as the demographic distribution of those who opted out of the GPDPR programme is not known, and so the study could not be designed to be statistically representative of the group. This is a limitation of the study, as it only captures the views of those who volunteered to be interviewed, and it is possible that those with particular concerns about their data being used for research would also not volunteer to be interviewed. Therefore, the study cannot represent the entirety of the reasons why people opted not to share their health data for research purposes but does provide an insight into some of those reasons. A non-representative sample is in keeping with the Grounded Theory approach to analysis. 34 Each interview lasted approximately 1 hour.
The study received ethical approval from the University of Manchester Research Ethics Committee through its Proportionate Review and was conducted in accordance with the University of Manchester Research Ethics Policy.
In the interviews, we asked patients to describe how they were informed about the GPDPR programme and the steps they took to opt out. Patients were also asked to reflect on the reasons why they opted out and the features they would like to see in a system for sharing their health data. The aim was to interpret what opting out meant for these patients and the concerns this reflected. Professionals were asked about their experience of the GPDPR programme, whether they had any concerns over the sharing of GP data, what features they would like to see in a system for sharing patient GP data, and their personal views on the GPDPR programme. Recruitment ceased once data saturation was established (no new codes emerged over two consecutive interviews).
Analysis
Interviews were audio recorded with permission from the participants, and transcripts were produced from the recordings. NVivo software (QSR International) was used to facilitate analysis. A grounded theory approach was used to carry out an inductive thematic analysis using the transcripts, identifying and coding emergent themes. 35 This approach was chosen in order to be as open as possible to what participants were telling us, including their experiences and feelings, rather than searching for evidence of a pre-selected theory. It allowed us to gather views in the absence of influence from existing theories and policy approaches on health data gathering for research, and to explore what, if anything, is missing from them from a health data subject’s perspective.
A comparative analysis between different instances of a code and between different thematic codes was conducted to reaffirm or adjust subsequent coding. In this way, the coding structure was iteratively refined and verified. Memos were used to record thoughts about the emerging codes and the relationships between them. The data and emergent codes were discussed within the research team to conduct interim analysis, and codes were grouped into common themes that explained the observations from the data using mind maps. In this way, a core theme or theoretical framework was developed to explain what it meant for people to opt out of sharing their health data.
Results and Discussion
In this section, we begin by outlining a summary of our findings before exploring the data in more depth. In particular, we consider the meaning of ‘my data’ and develop the concept further.
Summary
The core concept identified by participants in the study was the connection between trust and the concept of ‘my data’. Participants did not trust the NHS as a whole to handle their data appropriately. This lack of trust was linked to past failures such as the care.data scheme, 36 personal experiences of data being mishandled, and a mistrust of commercial organisations seeking to make a profit from health data gathered by the NHS. Similar concerns have also been expressed in relation to the involvement of the American company, Palantir, in the running of the NHS Federated Data Platform 37 and regarding reports that Biobank shared patient data with insurance companies despite indicating that they would not do so. 38 However, the main reason for the lack of trust in the case of the GPDPR programme was the perception, arising from the way the programme was presented, that the data was not being treated as belonging to patients or as ‘my data’.
Participants told us that they had received very little information about the GPDPR programme. Some participants heard about it in the press or through social media, and those who subsequently undertook their own research found very little information. In particular, there were few details about how the data would be safeguarded, with what organisations it would be shared, who would be responsible for it, and what the anticipated benefits of the programme would be. Interviewees found that GP staff were commonly unaware of the details of the programme and were not able to answer questions. The short timeframe to register an opt-out – patients were only given 1 month to opt out before data collection would begin – led to feelings of being pressured and rushed. This, combined with the opt-out rather than opt-in nature of the decision, caused participants to feel that they were not given the opportunity to make an informed, considered decision, with an appropriate level of control. Rather, it appeared that the system assumed an entitlement on behalf of regulators and researchers to access and use data without interaction with its subjects. This was important in developing the sense that health data was not being seen or treated as ‘my data’. It was this sense that their connection to the data had not been understood or respected that underpinned the reported lack of trust in the GPDPR programme and, ultimately, led to many people’s decision to opt out of sharing their health data.
In addition to explaining why participants opted out of sharing their health data in the GPDPR programme, the core concept of the connection between trust and what it means to be ‘my data’ also explains the features that would indicate trustworthiness in health data sharing to our participants: a trustworthy system for health data sharing would recognise health data as ‘my data’.
Patients are not alone in seeing the data as ‘my data’ – researchers use similar language when talking about the data their research is based on. 39 However, its use appears to represent different claims about the relationship of the agent to the data in different contexts. As has been recognised in the context of data sharing between researchers, understanding the meaning that underpins the concept of ‘my data’ is crucial if we are to tackle the barriers to health data sharing. 40
We now explore the data from our study in more depth, examining in section 3(b) what it tells us about what ‘my data’ means from a patient’s perspective, and in section 3(c), how health data sharing can be regulated to reflect its particular status.
‘My data’
Patient ‘ownership’ of data is frequently acknowledged and assumed in debates around health data sharing. For example, the then UK Under-Secretary for Public Health, Jo Churchill, explained the need for a pause in the GPDPR programme to the House of Commons stating that:
Patient data is of course owned by the patient and we are absolutely determined to take people with us on this journey. 41
However, as explained below, the meaning of ‘ownership’ of data is not as straightforward as this statement implies. The concept of ‘my data’ is much more than a shorthand for referring to the data about an individual. Rather, for patients, it is used to encapsulate the relationship they have with that data, what it means to them, and the rights they have over it. 42 Regulating without a nuanced understanding of this relationship will lead to missed opportunities to build and maintain trust in health data sharing. In the next section, we analyse the tempting conceptualisation of health data as property and argue that a regulatory approach that adopts this conceptualisation of ‘my data’ perpetuates pre-existing misunderstandings in a way that puts public trust at risk.
‘My data’, my property?
One common interpretation of ‘my data’ is a sense of property ownership. 43 This interpretation is disputed, with some arguing that data is not capable of being property, 44 but do our participants see ‘my data’ as their property?
Honoré describes 11 constituent rights and duties of the ownership of property:
. . . the right to possess, the right to use, the right to manage, the right to the income of the thing, the right to the capital, the right to security, the rights or incidents of transmissibility and absence of term, the prohibition of harmful use, liability to execution, and the incident of residuarity. 45
Ownership of property can thus be seen as a set of legal rights such as being able to control, give away, sell, or destroy the property. Not all of these strands are required to establish ownership, but consideration of them can help resolve competing ownership claims. 46
Central to our participants’ concept of ‘my data’ is the strand of Honoré’s bundle relating to the right to manage (or control) the data. 47 In their view, they had the right to decide whether their health data could be shared for planning and research purposes. The need to give individuals a choice over the sharing of their health data is widely acknowledged, 48 and the GPDPR programme did give individuals the option to opt out of sharing their health data. However, our data shows that the one-off, binary choice offered was not sufficient to gain the trust of our participants. As we show below, by failing to give patients the information, time, and support they needed to make the decision, the programme did not respect the ‘right to manage’ aspect of ownership, and this reflects a deeper disregard for patients’ connection to ‘my data’.
Right to manage
The view that patients have the right to manage their health data is recognised in law. For example, the UK General Data Protection Regulation (UK GDPR) 49 gives patients rights of access, rectification, erasure, processing restriction, data portability, objection, and rights related to automated decision-making including profiling in respect of their personal information. These rights to control the extent to which others can access and use the data are what Harris refers to as ‘trespassory rules’ 50 and could support the view that health data (being an important subset of personal information) 51 is the property of the patient. The existence of such trespassory rules does not prove that a property interest exists (indeed they may have been granted precisely because there is no property interest conferring such rights), but they do indicate that an individual has some form of connection to the data, and that connection may or may not be one of property.
However, these trespassory rules are limited; patient consent to the processing of health data is not always required. 52 Notably, the requirements of the UK GDPR and the common law duty of confidentiality only apply to identifiable personal information, which is data relating to an individual who could be identified either from the data itself or in combination with other data accessible to that organisation. 53 As anonymised data lies outside of these protections, patients do not have the same legal rights to manage their anonymised data.
Even where there is no legal requirement for patient consent, there might be other reasons for giving patients the opportunity to manage their data. The fact that patients were given the opportunity to opt out of their GP data being included in the GPDPR programme is an acknowledgement that patient support for research, as well as the relationship of trust between patients and healthcare professionals, would be put at risk if patients’ wishes regarding their health data were ignored. This is because patients commonly expect to have the right to control what is done with their data, and even if they accept that there are limitations on this right, they expect to have that right respected wherever possible. In this way, even when patients have no blanket legal right to restrict use of their data in line with their consent, there are still good moral and pragmatic reasons to offer easily accessible opportunities for patients to give it. Failing to do so risks diminishing the trust patients have in their healthcare professionals. This concern was expressed by our participants. For example:
139 (patient): . . . because you can’t have a situation where somebody won’t go and talk to a GP . . . the GP is the start of everything in this country and they think, I’m not going to go to the GP because he’ll write something in there and they’ll know about it. You’ve only got to look at what’s happened with what was it, the . . . was it hospitals were asked to report on people that they thought they were going to be migrants or something . . . and everybody kicked back and said, no, I don’t think we want that.
Respecting patients’ interests in making decisions about their data means giving them the opportunity to make an informed, considered decision about the sharing of their health data. Our participants told us that they did not feel that this was the case with the GPDPR programme. There was a strong feeling from the interviewees that not enough information had been provided by NHS Digital about who their data would be shared with or processed by, what their data would be used for, and who would benefit from it. 54
130 (stakeholder): I guess my concerns were kind of twofold really. I mean, one of which is that the governance around it felt quite obscure. It didn’t really feel very transparent. It was difficult to find specific information on the website about how our data was going to be used, what it could be used for and what it couldn’t be used for. And I say that as somebody with some inside knowledge. I mean, at this point in time, I know somebody who sits on one of the panels . . . a GP that sits on one of the panels that actually reviews applications for data. But even so, I just kind of feel like there was very little clarity about the way decisions are made about what data to share and with whom and for what purposes. I think that there needed to be a wider conversation about that before committing to sharing the data in this way.

135 (patient): So I think the fact that it was done not very clearly, not very openly, not informing people that this was happening, at a time when I felt particular mistrust in our government . . . that, I thought, actually, I don’t feel comfortable with sharing my personal data because I don’t know what’s happening to it.
Some participants undertook their own research into the proposals in the GPDPR programme but found the information unclear and still felt as though they were not being told all the facts. This was compounded by a lack of training for GPs and practice staff regarding the GPDPR programme:
101 (patient): I don’t think anybody from the organising system ever went into a GP practice and stood there to imagine the process of how it would happen at the front desk.
This lack of information led to people looking to other sources for information, including friends and family members who work in the NHS. Many of those on the ‘inside’ of the system expressed concerns about the GPDPR, and this had a ripple effect, causing their friends and family to opt out. One participant described asking GP staff she knew about their personal view of the programme as her ‘litmus test’ for whether she should opt out or not.
Participants were concerned that if they did not opt out, their data would be shared with companies, possibly with commercial aims, whose ethos the participants disagreed with; that they would be harmed, for example, by insurance companies using the information to increase their premiums; or that society would be harmed by commercial companies profiteering to the detriment of the NHS. The lack of information they received concerning the GPDPR programme increased these fears.
A dearth of information combined with the fact that the opt-out decision assumed an entitlement of others to access their data led to suspicion on the part of our participants about the motives of the proposed data gathering. This, together with the short time people had to opt out before data collection would start, caused many of the participants to feel that something was being hidden from them; that there was secrecy for a reason:
119 (patient): . . . because my back was already up and because it’s been rushed and because, you know, I started to feel probably a bit of paranoia coming in here, because I was thinking, well, hang on a minute, you’re making me do this quickly, you’re making me opt out, you’re going to do it anyway, you’re not telling . . . you’re making it really complicated, you’re moving it from anonymised to pseudonymised and I’ve no idea who then can unravel it. Yes, I can opt out but actually my data doesn’t come with me, so I just got . . . think a bit of paranoia set in and I just thought, no, I want to know now, because of that, I want to know where it’s going, what organisations are going to use it
Some participants referred to it as their health data being accessed ‘by stealth’ or in an ‘underhand’ way. These suspicions were linked to a feeling that the GPDPR programme did not treat the data as ‘my data’ because insufficient respect had been shown for the decision-making authority over their data to which patients felt entitled:
112 (patient): Because, totally full blank, they didn’t ask me, they didn’t say anything about it and suddenly it’s like, oh we’re going to use all your information for this, this and this. And I was like, no, you’re not, you ask me nicely. That’s as point blank as it goes. I did it the same with when they wanted to use my body parts if I died, the thingy. So, I just went, no, you’re not, because it’s taking away my liberty. Taking away my control. Taking away my empowerment, because they can just do what they like with it. [. . .] I don’t want to be told, yeah, we’re just going to use your information, you’ve got nothing to. . .you can’t do nothing about it.

112 (patient): I haven’t been even asked. . . . Because I have a right . . . I don’t belong to the doctor, I belong to myself, so I’m a person. So it’s like them coming in, getting my arm and taking some blood out of me.

139 (patient): I remember it had a deadline and again the deadline didn’t seem to me to be anything that I now would call informed consent. That’s a phrase I’ve learnt later but it was almost, we’re taking your data, you’ve got no say in it and tough.
It is important to note that the GPDPR programme met the relevant legal requirements; the NHS can legally process health data as part of a task necessary for reasons of public interest in the area of public health without obtaining patient consent, 55 and even if consent had been required, the law does not specify what information, time frame, or safeguards are required for it to be a valid consent. 56 Therefore, it is clear that for these participants, compliance with the extant regulation of health data sharing (demonstrating trustworthiness in relation to competence) 57 is not sufficient to ensure that their decision-making authority is appropriately respected. 58
The lack of information they received, the short timeframe they were given to opt out, and the opt-out basis of the decision all mean that what patients perceived as their right to make an informed, considered decision was not respected. These were the reasons for which they felt their health data was not treated as ‘my data’.
Returning to Honoré’s bundle of rights, while our participants considered their health data to be ‘my data’ in terms of the right to manage the data, they did not consider it to be ‘my data’ on the basis of the right to income.
Right to income
For almost all the participants, the feeling that it was ‘my data’ was not linked to a right to be compensated for the value of that data. The monetary value inherent in health data was seen as belonging to society; participants felt that any benefits achieved with the data should be available to the public on fair terms, rather than simply making a profit for commercial organisations.
119 (patient): Because this is really valuable stuff and where’s it going, and it’s my data at the end of the day, so I don’t necessarily expect to be given £50 in ten years’ time for my contribution, but I want to know it’s been put back into the NHS or going into somewhere, but none of that was clear.
There is an evident acceptance that sharing health data for research is unlikely to benefit the individual directly; health research often takes many years to produce results, 59 and even then the benefits accrue only to those affected by the condition. However, participants were keen to contribute to this societal benefit, particularly where they understood what others could be spared from facing. This is seen in the desire to contribute to research into conditions they, or a family member, had suffered from, in order to prevent others from suffering in the same way. This innate sense of solidarity and desire to further the common good is part of the underlying principles of the NHS itself, stemming from its funding through general taxation. 60 It is an important motivation for patients to contribute their health data to research and planning and so should be facilitated, rather than being diminished by the regulatory framework surrounding health data sharing.
While the potential value of the data was seen as a public or civic asset, the data itself was still viewed as belonging to the individual patient, in the sense that they felt that they had, or should have, a right to control its use. This could support the interpretation of ‘my data’ as a claim to property ownership. As we now go on to consider, however, there are important reasons why the concept of property is insufficient to account for what our participants mean by ‘my data’.
Problems with the property account
Honoré’s strands of property ownership do not apply to data in a straightforward way because of its intangible nature. Data can be reproduced without being diminished, and that reproduction can be unlimited and potentially unknown to the data subject or data holder. For example, while the use of my Tesco clubcard lets Tesco know how many packets of chilli peanuts I have bought in the last year, peanut suppliers, shopping analysts, and public health researchers could also have that information without any change in my supermarket records or any other sign that would be obvious to me. However, this does not preclude data from being considered property in a number of contexts: for example, in the purchase of a commercial enterprise, data is commonly treated as property, and the law of intellectual property protects data as property. 61 In other words, subject to various practical constraints and regulatory requirements, data can be considered property whenever the law decides to treat it as such. 62 We therefore turn to the question of whether this approach would reflect the views of our participants about their data.
One way in which the law currently frustrates respect for our participants’ concept of ‘my data’ is its exclusion of anonymised data from their control. As explained earlier, a claim by patients that health data was their property could be based on the trespassory controls they have over that data. However, individuals do not have the same legal right to control information about themselves once it has been sufficiently anonymised. If the individual to whom the data relates cannot be re-identified using reasonably available means, the data is not considered confidential information and so can be processed without the data subject’s consent. 63 Within the legal framework then, as patients do not have the same trespassory controls over anonymous data about them, it is not the property of the patient in the same way that their confidential health data is.
However, for many of our participants, the fact that the data would be anonymised did not weaken the connection of the data to the individual.
106 (patient): It comes down to, regardless of the anonymisation, my data is my data and I feel that even if it’s anonymised, I still have a right to say how it’s used and where it goes. I’m a firm believer in that. In the same way that when I give my money to a bank, I can say to them, you’re not putting it in that fund, you can put it in that fund. Even though when it goes into that fund, it is effectively anonymous, there is a link to it, to identify it as mine but the people who get the money don’t know it comes from me. And the same way here that, you know, my data has to . . . even though it’s anonymised, you have to have a link back to the fact that it’s me.

139 (patient): I think the way I looked at it was they tried to make it that the data belonged to the NHS and therefore we can do anything with it and it’s not, it’s mine and the only person who has a say about that is me.
Therefore, anonymisation of data does not signify an end to it being ‘my data’; for them, the intrinsic connection to the data remains. 64 This is something that is not reflected in the current regulatory approach. The GPDPR opt-out enabled patients to stop their primary care data being shared with NHS Digital, and the NHS national data opt-out enables patients to stop their confidential information (that identifies them) being used for research and planning purposes, but neither opt-out would enable patients to prevent their data being used once it has been de-identified. 65 Therefore, patients are not given control in law over anonymised data about them.
Another difficulty is that property is often associated with the labour that has produced it. As Montgomery has argued, this would support a claim of ownership by the health services, to a greater extent than a claim by patients. 66

Although the patient provides the information and any samples, the health professional gathers the information and uses their expertise and skill to make it intelligible and useful. 67
As Liddell et al put it:
Both the patient and the professional are necessary, but not sufficient, to generate health information. Without the patient, there is no information at all; but without professionals there is negligible or no health information. 68
Therefore, if the right of patients to control their health data is founded on the data being their property, that right could be curtailed by the property claims of health professionals. Thus, a property claim may not give patients the control they want to have over their health data.
A further potential challenge in relying on a property account of ‘my data’ is that, under the principles of property law, property can be transferred, and so a patient could give away or sell their data, and it would cease to be theirs. 69 Patients could be pressured into doing so or simply change their minds at a later date without any way to regain control of the data. However, this contradicts what our participants told us about their relationship to their health data. For them, the right to control the data about them was inalienable, an enduring connection with the data. In the context of property in body parts, Laurie addresses this point by suggesting that legal restrictions could and should be put in place to prohibit the waiver of such property rights. 70 This proposition has some merit in relation to health data in reflecting what our participants see as an unbreakable connection to the data. However, there would still be something missing from a property account of ‘my data’. Such restrictions on property rights might reflect the relationship individuals have with other personal data, but this would not be sufficient to reflect our participants’ relationship with their health data. In the next section, we set out the ways in which our participants saw their health data as distinctively important and deeply connected to their own identity.
The nature and implications of ‘my data’
The findings of our empirical work show that, when applied to the regulation of health data sharing for research, the property model perpetuates pre-existing failures to understand how people relate to their own health data. As we now discuss, if trust in the system is to be engendered, it is critically important that relevant regulation is formulated to reflect understanding of the ways in which people see and relate to their health data.
Particular nature of health data
One of the questions that participants were asked was whether they felt differently about sharing their health data, and GP data in particular, compared to other forms of data about them. It was clear that health data was seen as particularly personal, and GP data especially so:
111 (patient): No, they’re totally different. Your health data, whether it be in primary or secondary care is one side. The other side, your data, if you subscribe to Nectar or Tesco’s or whatever that’s another side of research, I think. But I think your health data is very personal to you, which is why I would want additional protections there.

137 (patient): health data is probably the most personal data that exists for me.
GP data was commonly seen as containing a fuller picture of an individual than other health data such as from secondary care.
151 (GP practice staff): I think it’s probably the most comprehensive source of information that you’ve got on a patient because you have the secondary care . . . you know, you should have the secondary care information pulled into it. It should be the, kind of . . . you know, the person’s general primary care . . . things from here there and everywhere are pulled in, whereas other things will give you snapshots.

177 (stakeholder): So, GP data is different because your GP . . . if you go to a hospital, you’re under a specialist, they treat your left thumb, they treat your toe, they treat your broken bone, they treat whatever organ it is that’s damaged. And they don’t really care about the rest of your body, they just like enough of it not to be bleeding all over the carpet. Your GP cares about everything. So your GP has the whole . . . they have a lifetime history. Your GP record follows you from practice to practice. It’s full history, they find out what happened in the hospital. You generally tell your GP things that you don’t tell other people. Mental health particularly, especially if you’re not in chronic mental health conditions.
This highly personal nature of GP data and the connection to an individual’s unique experience of the world indicate that this is not ‘my data’ in the sense of property ownership but in a much deeper sense, reflecting a connection to identity and individuality. As we now go on to discuss, this is what Floridi refers to as constitutive belonging. 71
Constitutive belonging
The information provided by our participants supports the view that data relating to individuals, and particularly health data, is part of our identity and individuality – the data that exists about us expresses and influences who we are. This is not just in the eyes of others but also in our own eyes. 72 Health data interprets and changes how we interact with the world, with ourselves, and with others.
As Floridi puts it:
‘Your’ in ‘your information’ is not the same as ‘your’ as in ‘your car’ but rather the same ‘your’ as in ‘your body’, ‘your feeling’, ‘your memories’, ‘your ideas’, ‘your choices’, and so forth. It expresses a sense of constitutive belonging, not of external ownership, a sense in which your body, your feeling, and your information are part of you but are not your (legal) possessions. 73
The connection signified by ‘my data’ is closer to ‘my body’ or ‘my memories’ than it is to ‘my car’; the data is part of us and our sense of self. 74 The parallel Floridi draws to body parts has been seen in the data presented earlier, with Participant 112, for example, comparing her data being shared in the GPDPR programme to someone taking blood out of her arm.
Not all personal data, or data about you, is constitutive data. Names, National Insurance Numbers, and PINs are arbitrary, and while they are about you, they do not constitute you. For Floridi, constitutive data is data that you make yourself through your interactions in the world, such as your intimate beliefs or unique emotional involvement. 75 This explains why there is a particularly strong sense of health data being ‘my data’. The very personal nature of health data, and GP data in particular, telling of an individual’s bodily experience of the world, produced through interactions in a relationship of trust, makes this sense of constitutive belonging particularly strong. 76 However, it is also possible that health data obtained without the individual patient exercising her agency, such as a diagnosis based on genetic testing of a family member, is equally considered constitutive, because it may still tell what it means to be that individual and inform how she experiences the world. 77
This tells us that it is not the role of the individual in producing the data that makes it constitutive; it is the use of that data by that individual to reflect on her identity that does so. 78 On this view, any personal data could be constitutive if individuals view it as forming a part of their identity. The highly personal nature of health data that tells of, and impacts on, an individual’s life experience as an embodied being makes it particularly likely to be seen as constitutive. For those reflecting on how to establish a trusted system of health data sharing, the key here is not what individuals should consider to be constitutive health data, but what individuals do consider to be constitutive health data. It is a further reason why it is important for patients to have some level of control over their health data on an ongoing basis; only they can determine what they view as part of ‘my data’ in a constitutive sense.
A concept of ‘my data’ based on constitutive belonging explains why our participants felt that it was still ‘my data’ after it had been anonymised; if your kidney is removed, there is still a strong sense in which it remains ‘your kidney’ even if it is not distinguishable from another kidney (or perhaps even if it is placed in another person’s body). 79 Similarly, if a memory from your life was made into a film, it would still be your story even if no one else knew that it was yours. This understanding poses a clear challenge, as well as opportunity, for regulation of health data sharing – providing mechanisms to reflect and respect people’s connection with their data can be complex, 80 yet doing so can contribute to greater public confidence in and contributions to research using health data. As we now set out, a concept of ‘my data’ based on constitutive belonging also explains why our participants wanted to have control over the purposes their health data is used for.
Ethical concerns, moral complicity, and dignity
One of the reasons that participants wanted to control the use of their data (and felt that they had a right to do so) was that they were concerned about the purposes for which their data might be used. This was expressed by some as a concern that their data would be accessed by companies, possibly with commercial aims, whose ethos the participants disagreed with, and by others as a concern that it would be used for purposes they were not comfortable with, or found unethical. This was linked to concerns that they, as an individual, might be harmed by misuse of the information, such as by increased insurance premiums or being discriminated against because of their mental health, as well as concerns that the NHS would not receive fair value for the data. But there was something more to these concerns for some participants. Some were troubled by what their data would be used for even where that data had been anonymised, and so there was no direct risk to them as individuals. Furthermore, there was a concern that the use of the data would go against their religious or ethical beliefs, separate to whether any harm would result. One participant expressed this as a concern that their health data would be treated as lacking humanity and without any connection to an individual’s beliefs:
139 (patient): So should you then go back to the beginning and go and ask those people can we use your data again but it’s now for this and give them a choice to say, actually no, I’m not interested in that or maybe it’s against my religious beliefs or whatever, things that I hold rather than just treating it as a set of data? There is a real thing that’s sat behind it and part of the problem that data has is it doesn’t carry any of that nuance. It doesn’t carry any of that humanity or stuff behind it so we have to be . . . well, for me there are certain things that I would never let my data be used for even in a secondary or a tertiary use.
Once the concept of ‘my data’ is understood as an expression of constitutive belonging, it is not surprising that patients would be concerned about the ethics of the ends to which their data is put. Similar concerns have been expressed in relation to the use of human tissue samples in research. 81
The use of individuals’ health data is connected to respect for them as persons; the data is not simply a thing, but a part of who they are. As Floridi explains:
The self-constituting interpretation suggests that your informational sphere and your personal identity are co-referential, or two sides of the same coin. There is no difference because ‘you are your information’, so anything done to your information is done to you, not to your belongings. 82
Therefore, if a person’s health data is used for purposes she finds ethically objectionable, she herself is being used for purposes she finds ethically objectionable. This can cause feelings of moral complicity in that wrongdoing. 83
Complicity is described by Kutz as the connection of an individual to harms and wrongs mediated by other agents. 84 In his discussion of the facilitation of harmful enterprises, Kutz draws on the example of a gun seller who sells a gun to someone who has indicated that he will use it in an armed robbery; the seller is complicit in the resulting harm because even though the individual would have been able to purchase a gun elsewhere, the seller’s actions made a difference to the commission of the crime and form part of the explanation of how the event came about. 85 In the case of an individual’s health data being used for unethical purposes, the sense of moral complicity could come from her actions (knowing about the possibility of unethical use) or from knowing that a part of her had been involved in the enterprise by others. Consider these possible scenarios:
- The individual shares her data knowing that it will be used in this way;
- She shares her data not knowing that it will be used in this way;
- She does not share her data, but it is used anyway.
In the first two scenarios, the complicity relates to a sense that her actions have contributed to the harm (whether she knew that they would or not), but in the third scenario, the complicity is connected to a feeling that she is connected to the unethical purpose because part of her has been put to that use. Using a person in this way without her consent is a violation of her human dignity. 86
Protecting human dignity means protecting what makes us human, which for this purpose we take to be the ability to make choices about who we are and what we do with our lives, including the uses to which our data is put. As Floridi puts it:
Our dignity rests in being able to be the masters of our own journeys, and keep our identities and our choices open. Any technology or policy that tends to fix and mould such openness risks dehumanising us . . . 87
Similar observations have been made in relation to the use of body materials left over after diagnostic procedures:
Each mature person should be the author of his or her own life. Each person has values, plans, aspirations, and feelings about how that life should go. People have values which may collide with research goals [. . .]. To ask a person’s permission to do something to that person is to involve her actively and to give her the opportunity to make a project a part of her plans. When we involve people in our projects without their consent we use them as a means to our own ends. 88
It is this threat to human dignity that underlies the regulation of research involving human tissues and is part of why families involved in the retained organ scandals in the UK, for example, were so distressed at learning that organs from their babies had been retained and used without their knowledge or consent. 89
It is hardly surprising then that individuals would feel complicit where their data is used for unethical purposes. After all, this is the other side of feeling good when their data is used for positive ends, a feeling that is itself used to encourage patients to share their data. This sense of being part of an endeavour by contributing data was expressed by one of our participants:
112 (patient): But it’s . . . I want to make a difference. It’s like the starfish story, you might be that one little person who just puts that starfish back into the water, but it’s making a difference to that person. I want to make a difference to the world, but it’s got to matter, it’s got to be meaningful. If I’m going to contribute, let my data be used, it’s got to be meaningful to me.
To facilitate this positive sense of moral complicity in endeavours which individuals support and avoid a negative sense of moral complicity in endeavours they consider unethical, individuals must be kept informed about the use of their health data and given the opportunity to object to that use. 90
While granting patients (appropriately constructed) property rights over their data could ensure that patients are given a say over the future uses of their data, the concept of property does not reflect this sense of constitutive belonging or the potential for moral complicity. Our participants care about how their data is used in a different way compared to their care about how their car is used. Therefore, while it may be convenient to adopt the familiar language of property when talking about health data, the concept of proprietary ownership appears inadequate to reflect the nuanced views of our participants when they talk about ‘my data’. 91 This, we suggest, is why simply equating health data to other forms of property (for instance, reassuring patients that a Secure Data Environment (SDE) is guaranteed) is unlikely to gain the type of trust needed to persuade those who currently opt out of sharing their health data to think again.
The current policy focus of ensuring anonymisation and avoiding potential harms from identification of the data subject (such as financial penalties, unwanted marketing, and discrimination) does not reflect the constitutive view of health data with its strong (and enduring) connection to the values and beliefs of an individual. 92 Ethical concerns and the potential for feelings of moral complicity are a fundamentally different type of harm. Those responsible for the regulation of health data sharing need to understand and address these issues if they are to gain the trust of those who currently opt not to share their health data for research purposes.
Conclusion
The central tenet of this article is that if we want more people to be able to trust in and contribute their data to health research and planning, we must listen to those who currently feel unable to do so, with a view to understanding the barriers that prevent them. A lack of trust has long been recognised as a significant barrier to achieving the many benefits that sharing health data can offer. In a speech to the NHS Innovation Expo in Manchester in 2015, the then Secretary of State for Health said:
Exciting though this all is, we will throw away these opportunities if the public do not believe they can trust us to look after their personal medical data securely. The NHS has not yet won the public’s trust in an area that is vital for the future of patient care. 93
Our data shows that a lack of trust in data security is only part of the problem; there is also a lack of trust that the data, and by extension patients, will be treated with dignity and respect.
If patients trusted the NHS to handle their data securely, this would be trust in the competence of the NHS; that they had sufficient procedures and safeguards in place to ensure that patients would not be identified from the data and that there would be no unauthorised access to the data. However, this is unlikely to provide reasons for patients to trust that the NHS is (and those with whom health data is shared are) committed to protecting patients’ values and interests. Trusting that an agent will act in a particular way involves not only trust in the ability or competence of the agent but also expectations regarding the agent’s value orientation: their ethics, integrity, and motives. 94 It is hard to see how patients could expect the NHS to have appropriate values, ethics, and motives if it does not demonstrate an understanding of what health data means to patients.
One of the features of health data-sharing is that it is difficult, if not impossible, to know in advance all the future uses to which the data might be put. This makes the expectation regarding the value orientation of the agent particularly important. It is not simply a case of patients trusting the NHS to carry out certain actions but, rather, a longer-term and more onerous expectation that someone will evaluate future situations in line with patients’ ethical standpoints. This requires confidence that the organisation managing the data understands the significance patients attach to that data.
Our research found that the participants in our study opted out of sharing their health data in the GPDPR programme because they did not feel that the data was treated as ‘my data’. By this they meant that what they saw as their right to make decisions about their health data was not respected by the one-off, binary, opt-out choice, offered with little accompanying information and a short timeframe for action. Crucially, the GPDPR opt-out did not give individuals the opportunity to express their wishes about the purposes for which their data was used and so did not demonstrate a commitment to patients’ values and interests. This right to control that participants described does not stem from a property concept of ‘my data’, as has previously been assumed, but rather from a sense that someone’s health data forms part of their identity regardless of whether it has been anonymised or can be linked back to them. In this way, our study offers empirical support for Floridi’s view of the connection between individuals and their data as one of constitutive belonging and demonstrates the significance of this for programmes seeking to encourage health data sharing.
Understanding ‘my data’ in this constitutive sense explains why, for some patients, trust in competence to keep data secure would not be sufficient to enable them to share their health data. Equally, if not more important, is an ability to trust in the commitment of the agent to protect the patient’s data dignity by not using the data, including anonymous data, in ways she would object to. This constituent part of trust has been lacking in previous attempts to restore trust in health data sharing through regulation. It is hoped that the contribution of our participants in elucidating what has been missing will lead to a regulatory framework that attends to their concerns and enables more people to share their health data for research purposes in line with their wishes and values. 95
Footnotes
Acknowledgements
The authors would like to thank the participants in the GP Data Trust project for giving their time and sharing their views and experiences with us. Thanks also go to Nigel Burns for the invaluable role he played in supporting the project.
Author’s Note
Caroline A.B. Redhead is now affiliated with Manchester Metropolitan University, UK.
Author contributions
CB conducted the interviews, carried out the initial analysis of the data, drafted the manuscript, and incorporated revisions. JA, SH, CR, SD, and JC contributed to the analysis of the data, checking the themes identified, and reviewing the manuscript. SH advised on the design of the project. SD was the principal investigator for the project and designed the project. All authors approved the final manuscript.
Data availability
The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.
Declaration of conflicting interests
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Funding
The authors disclosed receipt of the following financial support for the research, authorship, and/or publication of this article: This work was produced as part of a project funded by the Data Trusts Initiative.
Ethical considerations
The study received ethical approval from the University of Manchester Research Ethics Committee through its Proportionate Review process (Ref: 2022-15218-25912) and was conducted in accordance with the University of Manchester Research Ethics Policy.
Consent to participate
The consent to participate of all interview participants was recorded in an audio recording by the interviewer reading out a consent form, as approved by the University of Manchester Ethics Committee.
Consent for publication
All interview participants gave their consent for publication by agreeing to the statement: ‘I agree that any data collected may be published in anonymous form in academic books, reports or journals’. This consent was recorded by the interviewer.
1.
B. Goldacre and J. Morley, ‘Better, Broader, Safer: Using Health Data for Research and Analysis. A Review Commissioned by the Secretary of State for Health and Social Care’, Department of Health and Social Care, 2022, available at https://www.gov.uk/government/publications/better-broader-safer-using-health-data-for-research-and-analysis (accessed 20 May 2025); A. Darzi, ‘Independent Investigation of the NHS in England’, Department of Health and Social Care, 2024, available at https://www.gov.uk/government/publications/independent-investigation-of-the-nhs-in-england (accessed 20 May 2025); C. Sudlow, ‘Uniting the UK’s Health Data: A Huge Opportunity for Society’, 2024, available at (accessed 19 March 2025).
2.
4.
T-P. van Staa, B. Goldacre, I. Buchan, and L. Smeeth, ‘Big Health Data: The Need to Earn Public Trust’, British Medical Journal 354 (2016), i3636; M. Sheehan, P. Friesen, and A. Balmer, ‘Trust, Trustworthiness and Sharing Patient Data for Research’, Journal of Medical Ethics 47 (2021); Department of Health and Social Care, ‘A Guide to Good Practice for Digital and Data-Driven Health Technologies’, 2021, available at https://www.gov.uk/government/publications/code-of-conduct-for-data-driven-health-and-care-technology/initial-code-of-conduct-for-data-driven-health-and-care-technology (accessed 19 March 2025); The Academy of Medical Sciences, ‘Our Data-Driven Future in Healthcare: People and Partnerships at the Heart of Health Related Technologies’, 2018, available at (accessed 19 March 2025).
5.
We focus on England and at times the UK as that is the jurisdiction of the attempted data-sharing programme, but our findings about people’s relationship to their data might easily apply elsewhere; they are reflected, for example, in European approaches to incorporating transparency measures in data-sharing regulation, such as the European Commission’s Proposal on the European Health Data Space (EHDS),
(accessed 19 March 2025).
6.
Now part of NHS England.
7.
The process was confusing and complicated, which led to some people being unsure about whether or not they had opted out. The existence of other opt-outs relating to health data sharing and the NHS App added to the confusion. See General Practice Data Trust (GPDT) Pilot Study: Report on Patient Focus Groups (2023), available at
.
8.
9.
NHS England, ‘General Practice Data for Planning and Research’, available at https://digital.nhs.uk/data-and-information/data-collections-and-data-sets/data-collections/general-practice-data-for-planning-and-research/about-the-gpdpr-programme (accessed 19 March 2025). NHS England states that it is currently working with the British Medical Association (BMA), Royal College of General Practitioners (RCGP), and the National Data Guardian (NDG), as part of the GP Data Check and Challenge Advisory Group, to redesign the GPDPR programme. NHS England, ‘Data and Clinical Record Sharing’, available at
(accessed 19 March 2025).
10.
The question of how a data trust could address these concerns is considered by the authors in a separate article. See C. A. Redhead, C. Bowden, J. Ainsworth, N. Burns, J. Cunningham, S. Holm, and S. Devaney, ‘Unlocking the Promise of UK Health Data: Considering the Case for a Charitable GP Data Trust’, Medical Law Review 33(1) (2025), fwae043.
11.
W. G. van Panhuis, P. Paul, C. Emerson, J. Grefenstette, R. Wilder, A. J. Herbst, D. Heymann, and D. S. Burke, ‘A Systematic Review of Barriers to Data Sharing in Public Health’, BMC Public Health 14 (2014), p. 1144.
12.
National Data Guardian, National Data Guardian 2022–2023 Report, 2024, available at https://www.gov.uk/government/publications/national-data-guardian-2022-2023-report/national-data-guardian-2022-2023-report (accessed 20 May 2025); M. Aitken, J. de St. Jorre, C. Pagliari, R. Jepson, and S. Cunningham-Burley, ‘Public Responses to the Sharing and Linkage of Health Data for Research Purposes: A Systematic Review and Thematic Synthesis of Qualitative Studies’, BMC Medical Ethics 17(1) (2016), p. 73; Understanding Patient Data, ‘GP Record Data: Public Perspectives and Information Needs’, January 2025, available at
(accessed 24 September 2025).
13.
S. Ghafur, J. Van Dael, M. Leis, A. Darzi, and A. Sheikh, ‘Public Perceptions on Data Sharing: Key Insights from the UK and the USA’, The Lancet Digital Health 2(9) (2020), e444-e446; J. Meszaros and C-H. Ho, ‘Building Trust and Transparency? Challenges of the Opt-Out System and the Secondary Use of Health Data in England’, Medical Law International 19(2) (2019), pp. 159–181.
14.
15.
M. Wolfensberger and A. Wrigley, Trust in Medicine: Its Nature, Justification, Significance and Decline (Cambridge: Cambridge University Press, 2021), pp. 125 and 136–137.
16.
17.
This is the focus of the solutions proposed in The Goldacre Review, Goldacre and Morley, ‘Better, Broader, Safer’.
18.
We use the term ‘our data’ to refer to the data we have collected as part of the study. The meaning researchers attach to this term is discussed in A. Sorbie, W. Gueddana, G. Laurie, and D. Townend, ‘Examining the Power of the Social Imaginary through Competing Narratives of Data Ownership in Health Research’, Journal of Law and the Biosciences 8(2) (2021), lsaa068.
19.
As Liddell et al. argue, ‘data science platforms cannot afford to be merely “technical,” but need also to account for legal and ethical issues that surround data development and use’. K. Liddell, D. A. Simon, and A. Lucassen, ‘Patient Data Ownership: Who Owns Your Health?’ Journal of Law and the Biosciences 8(2) (2021), lsab023. See also M. Graham, R. Milne, P. Fitzsimmons, and M. Sheehan, ‘Trust and the Goldacre Review: Why Trusted Research Environments Are Not about Trust’, Journal of Medical Ethics 49(10) (2023), pp. 670–673.
20.
L. Gilson, ‘Trust in Health Care: Theoretical Perspectives and Research Needs’, Journal of Health Organization and Management 20(5) (2006), pp. 359–375; Wolfensberger and Wrigley, Trust in Medicine.
21.
Wolfensberger and Wrigley, Trust in Medicine.
22.
As highlighted in the patient-centred approach to healthcare championed in Montgomery v Lanarkshire Health Board [2015] UKSC 11.
23.
Wolfensberger and Wrigley, Trust in Medicine.
24.
S. Muller, S. Kalkman, G. J. M. W. van Thiel, M. Mostert, and J. J. M. van Delden, ‘The Social Licence for Data-Intensive Health Research: Towards Co-Creation, Public Value and Trust’, BMC Medical Ethics 22(5) (2021), p. 110.
25.
O. O’Neill, ‘Trust, Trustworthiness, and Accountability’, in N. Morris and D. Vines, eds., Capital Failure: Rebuilding Trust in Financial Services (Oxford: Oxford University Press, 2014); D. Resnik, ‘Scientific Research and the Public Trust’, Science and Engineering Ethics 17(3) (2011), pp. 399–409; Z. Sheikh and K. Hoeyer, ‘“That Is Why I Have Trust”: Unpacking What “Trust” Means to Participants in International Genetic Research in Pakistan and Denmark’, Medicine, Health Care and Philosophy 21 (2018), pp. 169–179.
26.
Liddell et al., ‘Patient Data Ownership’; G. Laurie, ‘Privacy and Property? Multi-Level Strategies for Protecting Personal Interests in Genetic Material’, Policy Research Initiative, Canada, 2003.
27.
The role of health information in constructing our narrative identities is discussed in E. Postan, ‘Defining Ourselves: Personal Bioinformation as a Tool of Narrative Self-Conception’, Journal of Bioethical Inquiry, 13(1) (2016), pp. 133–151.
28.
By patients we mean those who have had information about their health recorded by any health provider or other agency recording health-related information. This term includes those who may not be currently receiving any health advice or treatment.
29.
S. Kvale and S. Brinkmann, Interviews (London: Sage, 2015), p. 6.
30.
U. Flick, Doing Grounded Theory (London: Sage, 2018), chap. 2.
31.
P. Leavy, ed., The Oxford Handbook of Qualitative Research (Oxford: Oxford University Press, 2014), p. 437.
32.
We had hoped to speak to more GPs but had a disappointing response to our requests for interviews, most likely due to the pressures on GPs’ time and scheduling.
33.
A. Galloway, ‘Non-Probability Sampling’, in K. Kempf-Leonard, ed., Encyclopedia of Social Measurement (New York: Elsevier, 2005), pp. 859–864.
34.
Flick, Doing Grounded Theory.
35.
B. G. Glaser and A. L. Strauss, The Discovery of Grounded Theory: Strategies for Qualitative Research (Somerset: Taylor & Francis Group, 1999); Flick, Doing Grounded Theory.
36.
Care.data was the previous attempt by the NHS to centralise patient data. See S. Sterckx, V. Rakic, J. Cockbain, and P. Borry, ‘You Hoped We Would Sleep Walk Into Accepting the Collection of Our Data: Controversies Surrounding the UK care.data Scheme and their Wider Relevance for Biomedical Research’, Medicine, Health Care and Philosophy 19(2) (2016), pp. 177–190.
37.
Editorial, ‘The Guardian View on the NHS and Palantir: The Case for This Data Deal Looks Weak’, The Guardian, 28 November, 2023.
38.
S. Das, ‘Private UK Health Data Donated for Medical Research Shared with Insurance Companies’, The Guardian, 12 November 2023.
39.
Sorbie et al., ‘The Power of the Social Imaginary’.
40.
Op. cit.
41.
42.
Sorbie et al., ‘The Power of the Social Imaginary’.
43.
See Liddell et al., ‘Patient Data Ownership’ and Laurie, ‘Privacy and Property?’
44.
45.
A. M. Honoré, ‘Ownership’, in A. G. Guest, ed., Oxford Essays in Jurisprudence (Oxford: Oxford University Press, 1961), p. 107.
46.
There have been a variety of claims to ownership of health data; see Liddell et al., ‘Patient Data Ownership’ and J. Montgomery, ‘Data Sharing and the Idea of Ownership’, The New Bioethics 23(1) (2017), pp. 81–86.
47.
Space does not permit a consideration of all strands of the ownership bundle. We will instead focus on the right to manage and the right to income strands, as our participants told us that these were important to them.
49.
United Kingdom General Data Protection Regulation, Assimilated Regulation (EU) 2016/679.
50.
J. W. Harris, ‘Who Owns My Body?’ Oxford Journal of Legal Studies 16(1) (1996), pp. 55–84.
51.
UK GDPR, at Art 9(1).
52.
See UK GDPR at Art 6(1) and Art 9.
53.
See UK GDPR at Recital 26, and R v Department of Health ex parte Source Informatics Ltd [2000] 1 All ER 786.
54.
A similar failing has been noted in relation to the previous attempt to capture primary care data for planning and research, care.data; P. Carter, G. T. Laurie, and M. Dixon-Woods, ‘The Social Licence for Research: Why care.data Ran into Trouble’, Journal of Medical Ethics 41(5) (2015), pp. 404–409.
55.
UK GDPR, at Art 9(2).
56.
However, for the processing of identifiable personal information (not anonymised data), the principles of the UK GDPR would still apply, requiring that information be made available regarding how patient data is being used.
57.
Wolfensberger and Wrigley, Trust in Medicine.
58.
This supports the concept of a social licence for health data research. See Carter et al., ‘The Social Licence for Research’ and Muller et al., ‘The Social Licence for Data-Intensive Health Research’.
59.
Z. S. Morris, S. Wooding, and J. Grant, ‘The Answer Is 17 Years, What Is the Question: Understanding Time Lags in Translational Research’, Journal of the Royal Society of Medicine 104(12) (2011), pp. 510–520; S. R. Hanney, S. Castle-Clarke, J. Grant, S. Guthrie, C. Henshall, J. Mestre-Ferrandiz, M. Pistollato, A. Pollitt, J. Sussex, and S. Wooding, ‘How Long Does Biomedical Research Take? Studying the Time Taken between Biomedical and Health Research and Its Translation into Products, Policy, and Practice’, Health Research Policy and Systems 13(1) (2015).
60.
61.
Liddell et al., ‘Patient Data Ownership’.
62.
For a thorough examination of whether health data can and should be considered property, see Liddell et al., ‘Patient Data Ownership’.
63.
UK GDPR at Art 4(1), and R v Department of Health, ex parte Source Informatics Ltd [1999] EWCA Civ J1221-65.
64.
A. Clarke, A. Mitchell, and C. Abraham, ‘Understanding Donation Experiences of Unspecified (Altruistic) Kidney Donors’, British Journal of Health Psychology 19(2) (2014), pp. 393–408; S. Parry, ‘(Re)Constructing Embryos in Stem Cell Research: Exploring the Meaning of Embryos for People Involved in Fertility Treatments’, Social Science & Medicine 62(10) (2006), pp. 2349–2359; Carter et al., ‘The Social Licence for Research’.
66.
Montgomery, ‘Data Sharing and the Idea of Ownership’.
67.
Op. cit.
68.
Liddell et al., ‘Patient Data Ownership’.
69.
Montgomery, ‘Data Sharing and the Idea of Ownership’.
70.
Laurie, ‘Privacy and Property?’
71.
L. Floridi, The Fourth Revolution: How the Infosphere Is Reshaping Human Reality (Oxford: Oxford University Press, 2014).
72.
E. Postan, Embodied Narratives: Protecting Identity Interests through Ethical Governance of Bioinformation (Cambridge: Cambridge University Press, 2022).
73.
Floridi, The Fourth Revolution.
74.
Op. cit.
75.
Op. cit.
76.
In contrast, this was not the view expressed about biosamples by participants in one study: K. S. Steinsbekk, L. O. Ursin, J. A. Skolbekken, and B. Solberg, ‘We’re Not in It for the Money: Lay People’s Moral Intuitions on Commercial Use of “their” Biobank’, Medicine, Health Care and Philosophy 16(2) (2013), pp. 151–162.
77.
Unexpectedly receiving information about one’s genetic heritage can have a significant impact on one’s sense of self; C. A. B. Redhead and L. Frith, ‘Donor Conception, Direct-to-Consumer Genetic Testing, Choices, and Procedural Justice: An Argument for Reform of the Human Fertilisation and Embryology Act 1990’, Medical Law Review 32(4) (2024), pp. 505–529; L. Gilman, C. Redhead, N. Hudson, M. Fox, P. Nordqvist, F. MacCallum, J. Kirkman-Brown, and L. Frith, ‘Direct-to-Consumer Genetic Testing and the Changing Landscape of Gamete Donor Conception: Key Issues for Practitioners and Stakeholders’, Reproductive Biomedicine Online 48(1) (2024), p. 103421.
78.
Our thanks to Emily Postan for raising this point in response to a presentation of an earlier version of this article.
79.
For a discussion of the concepts of ownership, gift-giving, and alienability, in relation to organ donation, see K. Zeiler, ‘Neither Property Right nor Heroic Gift, neither Sacrifice nor Aporia: The Benefit of the Theoretical Lens of Sharing in Donation Ethics’, Medicine, Health Care and Philosophy 17 (2014), pp. 171–181.
80.
C. A. Redhead, C. Bowden, J. Ainsworth, N. Burns, J. Cunningham, S. Holm, and S. Devaney, ‘Unlocking the Promise of UK Health Data: Considering the Case for a Charitable GP Data Trust’, Medical Law Review 33(1) (2025), fwae043.
81.
C. Lewis, M. Clotworthy, S. Hilton, C. Magee, M. J. Robertson, L. J. Stubbins, and J. Corfield, ‘Public Views on the Donation and Use of Human Biological Samples in Biomedical Research: A Mixed Methods Study’, BMJ Open 3(8) (2013), e003056.
82.
Floridi, The Fourth Revolution.
83.
S. Sterckx and J. Cockbain, ‘The UK National Health Service’s “Innovation Agenda”: Lessons on Commercialisation and Trust’, Medical Law Review 22(2) (2014), pp. 221–237; C. McMillan, E. Dove, G. Laurie, E. Postan, N. Sethi, and A. Sorbie, ‘Beyond Categorisation: Refining the Relationship between Subjects and Objects in Health Research Regulation’, Law, Innovation and Technology 13(1) (2021), pp. 194–222.
84.
C. Kutz, Complicity: Ethics and Law for a Collective Age (Cambridge: Cambridge University Press, 2000).
85.
Op. cit.
86.
A. de Hingh, ‘Some Reflections on Dignity as an Alternative Legal Concept in Data Protection Regulation’, German Law Journal 19(5) (2018), pp. 1269–1290.
87.
L. Floridi, ‘On Human Dignity as a Foundation for the Right to Privacy’, Philosophy and Technology 29(4) (2016), pp. 307–312.
88.
P. J. van Diest and J. Savulescu, ‘No Consent Should Be Needed for Using Leftover Body Material for Scientific Purposes’, British Medical Journal 325(7365) (2002), pp. 648–651.
89.
V. M. Sheach Leith, ‘Consent and Nothing but Consent? The Organ Retention Scandal’, Sociology of Health and Illness 29(7) (2007), pp. 1023–1042; E. Hanna and G. Robert, ‘Ethics of Limb Disposal: Dignity and the Medical Waste Stockpiling Scandal’, Journal of Medical Ethics 45(9) (2019), pp. 575–578.
90.
Sterckx and Cockbain, ‘The UK National Health Service’s “Innovation Agenda”’. This is a requirement of the UK GDPR, under Article 5(1)(b) for the processing of personal information (not anonymous data).
91.
Delacroix and Lawrence similarly argue that ownership is unlikely to provide the level of control wished for and is a poor answer to the vulnerabilities at stake. S. Delacroix and N. D. Lawrence, ‘Bottom-Up Data Trusts: Disturbing the “One Size Fits All” Approach to Data Governance’, International Data Privacy Law 9(4) (2019), pp. 236–252.
92.
A similar argument based on potential narrative identity interests and reputation-related interests is made in Kutz, Complicity, p. 208.
93.
Secretary of State for Health, Speech at NHS Innovation Expo, September 2015.
94.
Wolfensberger and Wrigley, Trust in Medicine; Gilson, ‘Trust in Health Care’.
95.
For consideration of the forms such a framework might take, see Redhead et al., ‘Unlocking the Promise of UK Health Data’.
