The COVID-19 pandemic has arrived at a time of massive availability of ICT tools and technologies. As a result, all over the world, programmers, engineers, computer scientists and data specialists are working alongside medical researchers and epidemiologists to develop tools that can help track the spread of the virus, slow its growth, and support the vulnerable. It would be an irresponsible government that did not wish to exploit all available resources at its disposal to protect its population and support global efforts to counter the negative effects - both economic and medical - of the pandemic.

However, there is a risk that the urgency of the current crisis leads to rushed technological development that neglects the ethical safeguards and responsible approaches needed to ensure any new tools act in the best interests of society and its citizens. In recent years, public concerns have emerged over the ubiquity of personal data collection and digital surveillance in our daily lives, alongside the potential for the misuse of personal data. We can see reasons to express similar concerns over some of the current technological responses to COVID-19.

Two examples:

NHSX, the government unit for digital transformation and implementation in the NHS, is working on an app for tracking symptoms in the UK population. It is just one of many organisations across the globe working on such systems. The proposed NHSX app will use Bluetooth on smartphones to record a user’s close-proximity contacts with other app users. Each user monitors their own symptoms and, on reporting a self-diagnosis to the app, is asked to self-isolate within their household. The user’s close-proximity contacts are anonymously notified and advised to self-isolate. The University of Oxford team that shared an epidemiological model to help configure the app states that around 55% of the total UK population would need to use the app to suppress the epidemic (combined with the shielding of the over-70s).
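The full technical design of the NHSX app has not been published, so the sketch below should be read as an assumption-laden illustration, in Python, of the general Bluetooth proximity-logging pattern described above: each phone broadcasts short-lived random identifiers and keeps a local, timestamped log of the identifiers it observes nearby, so that contacts can later be notified without names or locations being exchanged. The class, the constants and the rotation and retention periods are illustrative choices, not NHSX’s.

```python
import os
import time
from dataclasses import dataclass, field

ROTATION_SECONDS = 15 * 60       # assumed: rotate the broadcast identifier every 15 minutes
RETENTION_SECONDS = 21 * 86400   # assumed: keep contact records for 21 days, then discard

@dataclass
class ContactLogger:
    """Illustrative on-device store of anonymous proximity contacts."""
    current_id: bytes = field(default_factory=lambda: os.urandom(16))
    last_rotation: float = field(default_factory=time.time)
    observed: list = field(default_factory=list)   # (timestamp, remote_id) pairs, kept locally

    def broadcast_id(self) -> bytes:
        """Return the identifier this device would advertise over Bluetooth."""
        now = time.time()
        if now - self.last_rotation > ROTATION_SECONDS:
            self.current_id = os.urandom(16)       # fresh random ID; old ones cannot be linked to it
            self.last_rotation = now
        return self.current_id

    def record_sighting(self, remote_id: bytes) -> None:
        """Store an identifier seen nearby, with a timestamp, on the device only."""
        self.observed.append((time.time(), remote_id))

    def prune(self) -> None:
        """Discard contact records older than the retention window."""
        cutoff = time.time() - RETENTION_SECONDS
        self.observed = [(t, rid) for (t, rid) in self.observed if t >= cutoff]
```

Whether such a log remains on the handset or is uploaded to a central server is exactly the kind of design decision whose consequences are discussed below.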

UK citizens are being encouraged to register themselves on a gov.uk site if they are extremely medically vulnerable. Being placed on this register can give them access to supplies and other resources to help meet their needs. The UK government has been handing over some of the information gathered from the register, and other sources, to major supermarkets, helping them identify whom to prioritise for grocery deliveries. Although the gov.uk site does not mention that the information individuals provide may be used in this way, the passing on of data in this kind of emergency situation is permissible within data protection law. The supermarkets will be expected to comply with strict data security standards while they hold the data and to delete it once it is no longer needed.

Each of these innovations can be seen as a practical solution to a vital and time-critical need, and both are governed by data protection regulations and processes of ethical oversight. However, there are also reasons for caution. Privacy is about more than just the secure storage of data. Inferences can be made about us from the information we share; even after supermarkets have deleted the information received from the government, they will hold information about citizens’ purchasing habits and delivery records during the pandemic that will enable them to draw significant conclusions about their health and social status. Similarly, data collected about us by a contact-tracing app might reveal where we are going and who we are with, enabling inferences to be drawn about our interpersonal, political and religious behaviours.

A further, broader, concern is that the development of technological responses to COVID-19 rests on an assumption that individuals within a population share the same levels of digital access and digital literacy. Even in a wealthy country such as the UK this is not the case. Diverting a huge amount of resources and attention to technological solutions risks leaving behind those who cannot or do not access the internet regularly, or who cannot afford smartphones, as well as distracting from other possible solutions. For instance, if much of our response to COVID-19 rests on the success of a contact-tracing app, we may leave a significant number of people unprotected - as well as creating new data protection risks for smartphone users who lack knowledge about privacy settings.

For these reasons, some professionals concerned with upholding citizens’ rights beyond the immediate emergency have begun work on legal safeguards and protections. Others are sounding a note of alarm about technological responses to social and medical challenges and pointing to alternative solutions - such as the contact-tracing app being developed by researchers at MIT and allied universities, which does not record a user’s location or share information about the user with the government.
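The snippet below is not the MIT team’s actual protocol; it is a minimal sketch, under assumed details, of how the decentralised idea referred to above can work in principle: only users who report a diagnosis publish their recent identifiers, and every other phone compares its own local log against that published list, so neither location data nor the full contact graph ever leaves the handset. The toy data and names are purely illustrative.

```python
from typing import List, Tuple

def exposure_check(local_log: List[Tuple[float, bytes]],
                   published_ids: List[bytes]) -> List[Tuple[float, bytes]]:
    """Illustrative on-device matching: compare identifiers this phone has observed
    (stored locally) against identifiers voluntarily published by users who
    reported a diagnosis. Only the result exists on, and is shown to, this device."""
    infected = set(published_ids)
    return [(ts, rid) for (ts, rid) in local_log if rid in infected]

# Toy data standing in for a real local contact log and a real published list.
local_log = [(1_586_000_000.0, b"id-aaa"), (1_586_003_600.0, b"id-bbb")]
published_ids = [b"id-bbb", b"id-ccc"]

if exposure_check(local_log, published_ids):
    print("Possible exposure: advise the user to self-isolate.")  # no data leaves the phone
```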

Responsible Research and Innovation (RRI)

Whilst the urgency of the situation requires a rapid response, we must also be diligent about the introduction of technologies that can have a huge impact on entire populations and undermine personal privacy, equality of access to resources, and the inclusion of those the technology does not reach.

One method for trying to ensure better outcomes for technology is responsible research and innovation (RRI), which provides approaches and methods for tackling challenges such as this. It specifically seeks to align scientific approaches with the needs of society, guiding the design and implementation of technologies through a series of careful considerations that need to be carried out iteratively as needs and context develop.

i) The first of these is to anticipate. This does not seek to predict the future, but rather to prepare for it. For example, in the context of combating the spread of the virus, it may be deemed necessary to collect much more data on citizens, or to share sensitive data (e.g. as healthcare providers have shared vulnerable patients’ data with supermarkets to ensure they can receive home deliveries). But we need to think ahead to how that data itself is vulnerable: How will it be protected? How will people’s privacy be respected? What will happen to that data eventually? As a society, how do we respond to exhortations to spy on and report our neighbours - and if we decide that this, too, is necessary, how could we come back from that? The government may well be lacking an exit plan for the pandemic; however, gaining quick fixes at the cost of damaging citizens’ fundamental rights can bring serious consequences.

ii) It is therefore necessary to reflect, carefully, on issues including the approaches taken in other countries and the long-term implications of what we are planning to do. If Google and Apple join forces to install contact-tracing apps on people’s phones without their consent, when does that update get uninstalled? It is easy to think of ways in which information about where citizens have been and who they have been with could be hacked or misused. The trade-off between surveillance and public health has been the central concern here. Several teams have attempted to address this balance by experimenting with a decentralised approach, allowing individuals to preserve their individual rights and avoiding a centralised collection of data. However, lessons from countries like Singapore demonstrate that applying technology to COVID-19 can be much more complex than it first appears - a lack of uptake can invalidate the whole approach (a rough calculation after this list shows how quickly coverage falls as uptake drops). Technologies may bring unforeseen consequences; and technologies alone cannot provide all the answers to combat this pandemic.

iii) It is therefore also necessary to engage - to hold discussions with those affected and to consult those who have expertise, not just in the technological arena, but also those who can offer insight into how interventions might affect particular groups or have undesired effects.

iv) Finally, as we gather all these insights - how to reduce negative effects as far as possible, how to manage risks - through consultation with publics and experts, we can act to develop the technology more responsibly.
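To make the uptake concern in point ii concrete: a voluntary app can only register a contact when both people involved have it installed, so under the simple (and assumed) condition that adoption is independent of who meets whom, the fraction of contacts the app can see falls roughly with the square of the adoption rate. A rough, illustrative calculation:

```python
# Rough, illustrative arithmetic: fraction of person-to-person contacts a voluntary
# app can detect, assuming both parties must have it installed and adoption is
# independent of contact patterns. This ignores many real-world factors.
for uptake in (0.25, 0.40, 0.55, 0.80):
    coverage = uptake ** 2
    print(f"uptake {uptake:.0%} -> roughly {coverage:.0%} of contacts detectable")
```

On these simplifying assumptions, even the 55% uptake cited above would leave most contacts invisible to the app, which is why uptake is treated as a make-or-break factor.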

An RRI approach to COVID-19

In the case of the current pandemic, an RRI approach might therefore look something like this:

i) We need to safeguard the vulnerable and protect people’s data and privacy, so we must anticipate what negative outcomes there might be from rolling out technologies like contact tracing, and how to ameliorate or balance those risks during and after the pandemic.

ii) We need to reflect on the societies that essentially surveil their citizens and consider carefully whether that has proven to be effective and an acceptable infringement of their basic rights to privacy and autonomy. There are ways in which contact tracing, for example, can be carried out without infringing on citizens’ rights to privacy, even though such an approach would face the same critical challenge of reaching sufficient uptake to be effective.

iii) We need to consult not just with experts in “molecular evolution, epidemiology, clinical science and practice, modelling emerging infectious diseases, behavioural science, statistics, virology and microbiology” but also those who understand how people live under lockdown and the societal impact of pandemics, including educators, ethicists, sociologists, historians, lawyers, charities, non-governmental organisations, healthcare providers and psychologists, as well as citizens themselves.

iv) We need to act - this is unquestionable - but it must be done with due consideration.

A Responsible Research and Innovation approach would therefore seem to be extremely valuable in trying to ensure that the use of technology in response to a societal need is carried out thoughtfully and with regard for its immediate, medium- and long-term implications. It is therefore extremely concerning to see how little attention is being given to engaging with citizens, hearing their concerns and discussing their legitimate worries. Policymakers do not have a good record of allowing public concerns to influence policy, despite repeated demands that they do so, and business in the UK Parliament is still disrupted, so there is little opportunity for democratic challenge to government action.

There is a clear and present need for questions such as those above to be addressed transparently, fully and with accountability. We are calling on the government and technology providers to ‘show their working’, so that in this and other democratically accountable nations, citizens can see that their long-term needs are understood and respected. These decisions are not being taken just for today: how we act in this emergency will have long-term consequences, for future pandemics and for society.