Tracking, the collection of data about user behaviour, is widespread on the web. For this reason, the idea of a “Do-Not-Track” (DNT) setting emerged a little more than a decade ago, in 2009. This setting gives users a simple way to accept or reject online tracking on websites. Despite great ambitions, DNT has failed, whilst the practice of tracking continues. I want to explore why, but also how DNT has nonetheless managed to change user privacy for the better.
Over the past few weeks, digital contact tracing has entered centre stage as a crucial element in strategies for exiting COVID-19 lockdown. But in order for digital contact tracing to succeed, people need to understand what it is, how it works, and the key challenges that need to be overcome. This short blog post is one attempt to do these three things, and to provide links to other articles for more information.
Digital contact tracing is simply the name given to a new kind of application that uses people’s mobile phones to keep track of whom they’ve been in close proximity to in the recent past. It relies on the assumption that most people carry a smartphone with them, and uses this assumption to translate the problem of keeping track of people to keeping track of their smartphones instead.
One naive approach might be to keep track of all the places a person has been, and to find others who were in the same places at the same time. But this approach poses significant privacy concerns, because you can tell a lot about people based on where they go, and how long they linger in each place. It also captures more data than necessary: note that, in order to determine that you’ve been near someone, you don’t necessarily need to know where you were when that happened.
How it Works: The DP3T/Apple-Google Approach
Instead, the approach being rolled out by Apple and Google is inspired by a technique from a multi-institution academic project called DP3T, which I have had the pleasure of being slightly involved in. It consists of having phones sense and identify each other using Bluetooth Low Energy (BLE), a variation of the Bluetooth wireless technology now commonly used to connect wireless headphones, keyboards and mice. BLE is, in some sense, ideal for this kind of application: as its name suggests, it consumes little power (important for those battery-hungry phones), and is designed for low-bandwidth “discovery” applications, like the Apple iBeacons used in some retail shops and public spaces.
Here is the basic idea of how it works:
Smartphones start acting as beacons, announcing themselves to other smartphones nearby using BLE.
At the same time, they start listening for other nearby beaconing smartphones, noting down each smartphone they have seen in a little private “contact” diary.
When people become sick, they are instructed to get a test (or self-test), and to inform a public health authority digitally through an app. The health authority, in turn, confirms the case, and relays it to everyone else’s smartphones as a confirmed infection.
Every time the health authority announces a newly infected patient, each smartphone checks its private “contact diary” for a match. If there is one, its owner has been near a person who has declared themselves sick, implying they might be infected and should probably self-isolate. If not, they are assumed to be safe.
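The steps above can be sketched in a few lines of code. This is a deliberately simplified illustration, not the actual Apple-Google or DP3T implementation: the `Phone` class, its fixed `beacon_id`, and the plain set intersection are all assumptions made for clarity (real systems rotate identifiers frequently, as discussed below).

```python
import secrets

class Phone:
    """Toy model of a smartphone in a BLE contact-tracing scheme."""

    def __init__(self):
        # Hypothetical identifier this phone broadcasts over BLE.
        # (Real schemes rotate these every few minutes for privacy.)
        self.beacon_id = secrets.token_hex(8)
        # The private "contact diary": beacons heard nearby.
        self.contact_diary = set()

    def observe(self, other):
        """Record another phone's beacon when it comes into BLE range."""
        self.contact_diary.add(other.beacon_id)

    def check_exposure(self, reported_ids):
        """Match the health authority's list of identifiers reported as
        infected against the diary kept privately on this phone."""
        return bool(self.contact_diary & set(reported_ids))

# Usage: Alice's phone hears Bob's beacon; later, Bob reports sick.
alice, bob, carol = Phone(), Phone(), Phone()
alice.observe(bob)

reported = [bob.beacon_id]  # relayed by the health authority
print(alice.check_exposure(reported))  # True  -> Alice should self-isolate
print(carol.check_exposure(reported))  # False -> Carol assumed safe
```

Note that the matching happens on each phone, against its own diary; the diary itself is never uploaded anywhere.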
This is a simple description of the basic technique, but in practice there’s a bit more to it. One of the crucial details omitted in this description is how people’s privacy is preserved in the process. We’ll get back to this later.
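To give a flavour of the omitted privacy detail: in DP3T-style designs, phones do not broadcast a fixed identifier but short-lived “ephemeral IDs” derived from a secret day key, and a sick user uploads only the day keys for their infectious period. The sketch below is a simplification of that idea: the hash-based key rotation follows the DP3T design outline, but the HMAC-over-a-counter derivation stands in for the protocol’s actual PRF/PRG construction, and the key sizes and counts are illustrative assumptions.

```python
import hashlib
import hmac

def next_day_key(sk: bytes) -> bytes:
    # Rotate the secret day key by hashing it, so past keys
    # cannot be recovered from a current one.
    return hashlib.sha256(sk).digest()

def ephemeral_ids(sk: bytes, n: int = 4) -> list[bytes]:
    # Derive n short-lived broadcast identifiers from the day key.
    # (Simplified: HMAC over a counter instead of DP3T's PRF + stream cipher.)
    return [
        hmac.new(sk, f"ephid-{i}".encode(), hashlib.sha256).digest()[:16]
        for i in range(n)
    ]

sk_day0 = b"\x00" * 32          # illustrative initial secret key
ids_day0 = ephemeral_ids(sk_day0)  # what the phone broadcasts today
sk_day1 = next_day_key(sk_day0)    # tomorrow's key

# A sick user uploads only the day keys for infectious days; everyone
# else re-derives the ephemeral IDs locally and matches them against
# their own contact diary, so raw contact data never leaves the phone.
```

Because anyone holding a day key can re-derive its ephemeral IDs, publishing the keys of confirmed cases is enough for every phone to check for exposure locally.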
Main Track Papers
‘I Just Want to Hack Myself to Not Get Distracted’: Evaluating Design Interventions for Self-Control on Facebook ( Ulrik Lyngs, Kai Lukoff, Petr Slovak, William Seymour, Helena Webb, Marina Jirotka, Max Van Kleek and Nigel Shadbolt)
Beyond being the world’s largest social network, Facebook is for many also one of its greatest sources of digital distraction. For students, problematic use has been associated with negative effects on academic achievement and general wellbeing. To understand what strategies could help users regain control, we investigated how simple interventions to the Facebook UI affect behaviour and perceived control.
Informing the Design of Privacy-Empowering Tools for the Connected Home (William Seymour, Martin Krämer, Reuben Binns and Max Van Kleek)
Connected devices in the home represent a potentially grave new privacy threat due to their unfettered access to the most personal spaces in people’s lives. Prior work has shown that despite concerns about such devices, people often lack sufficient awareness, understanding, or means of taking effective action. To explore the potential for new tools that support such needs directly we developed Aretha, a privacy assistant technology probe that combines a network disaggregator, personal tutor, and firewall, to empower end-users with both the knowledge and mechanisms to control disclosures from their homes.
Late Breaking Work
Further Exploring Communal Technology Use in the Home (Martin Krämer, Ulrik Lyngs, Helena Webb and Ivan Flechais)
Factoring User Experience into the Security and Privacy Design of Smart Home Devices: A Case Study (George Chalhoub, Ivan Flechais, Norbert Nthala, Ruba Abu-Salma and Elie Tom)
Does Siri Have a Soul? Exploring Voice Assistants Through Shinto Design Fictions (William Seymour and Max Van Kleek)
Student Research Competition and Doctoral Consortium
A Design Philosophy for Agents in the Smart Home (William Seymour)
Workshops and Workshop Papers
Human-Centered Approaches to Fair and Responsible AI (Min Kyung Lee, Nina Grgić-Hlača, Michael Carl Tschantz, Reuben Binns, Adrian Weller, Michelle Carney and Kori Inkpen)
The ReDD Workshop: A Template for Supporting People in Regaining Control Over Digital Device Use (Ulrik Lyngs, Maureen Freed, Kai Lukoff, Max Van Kleek and Petr Slovak)
Responsibility and Privacy: Caring for a Dependent in a Digital Age (Martin Krämer, William Seymour and Ivan Flechais)
The COVID-19 pandemic occurs at a time of massive availability of ICT tools and technologies. As a result, all over the world, programmers, engineers, computer scientists and data specialists are working alongside medical researchers and epidemiologists to develop tools that can help track the spread of the virus, minimise its growth, and support the vulnerable. It would be an irresponsible government that did not wish to exploit all the resources at its disposal to protect its population and support global efforts to counter the negative effects, both economic and medical, of the pandemic.
However, there is a risk that the urgency of the current crisis leads to rushed technological development, neglecting the ethical safeguards and responsible approaches needed to ensure that any new tools act in the best interests of society and its citizens. In recent years, public concerns have emerged over the ubiquity of personal data collection and digital surveillance in our daily lives, alongside the potential for the misuse of personal data. We can see reasons to express similar concerns over some of the current technological responses to COVID-19.
Adrienne Hart is Artistic Director of Neon Dance, an internationally renowned dance company. In 2019 she approached the University of Oxford’s TORCH programme hoping to find a researcher that could support the company’s latest collaborative work “Prehension Blooms”. The work, due to premiere in 2021, aims to integrate swarm robotics in a live performance context.
Following a successful bid for a TORCH Theatres Seed Fund award, Adrienne is now collaborating with Helena Webb, Senior Researcher at the Department of Computer Science, and “Prehension Blooms” will feed into the ongoing project RoboTIPS - Developing Responsible Robots for the Digital Economy. RoboTIPS is a 5-year research study led by Marina Jirotka, Professor of Human Centred Computing, that seeks to foster practices for responsibility in the development and use of social robots.
In this conversation Helena discovers more about Neon Dance’s research and development project and how “Prehension Blooms” will connect to RoboTIPS.
Since Monday 23 March 2020, most schools in the UK have been closed in response to the COVID-19 outbreak in Europe. Most parents, aside from juggling the daunting prospect of home-schooling with their work, are also overwhelmed by their children’s need to stay connected with friends.
This is a particular challenge for parents of primary-school-age children, who had not been extensively exposed to independent online social communication until the COVID-19 crisis. Under this stressful and rapidly developing situation, many parents have handed their children general-purpose video-conferencing applications or social media platforms as a quick and easy solution. This article highlights several things parents should look out for when facilitating their young children’s first online chat or social communication, to keep their children safe and make home schooling a happy and rewarding time.
Viewpoint: This article contains the viewpoints of the author and is not intended as a research document.
Even before the COVID-19 crisis, there was great concern about the effects of misinformation online. Much of the debate around misinformation, its potential harms, and what to do about it, has centred on specific events where the tangible effects of misinformation could be measured: for instance, elections past and future, or the UK Referendum.
The COVID-19 crisis presents a new event around which misinformation can have other significant effects, both tangible and intangible. It can influence people’s behaviours at a time when cooperation and compliance with public health and governmental restrictions can mean the difference between success and failure of an urgent public health intervention to slow or stop an epidemic.
The COVID-19 crisis has meant the sudden and urgent need for physical distancing while simultaneously maintaining, or even increasing, the need to communicate, coordinate and collaborate effectively.
This combination of factors has meant many have been adopting new “remote working” tools in droves. Many are forced to use the tools that their organisations have chosen for them; but others have the opportunity to choose the tools that will both serve their needs and protect them long term. But which tools are the most secure, and most privacy respecting?
At the recent 2020 MPLS Impact Awards, a Commendation Certificate was awarded to HCC DPhil student Ulrik Lyngs for his work on the ‘Reducing Digital Distraction’ (ReDD) workshop. The MPLS Impact awards aim to
recognise and reward researchers at all career stages, for research that has had, or will have significant social or economic impact.
The ReDD workshop is developed by Ulrik Lyngs in collaboration with Maureen Freed, Deputy Head of Counselling at the University of Oxford. The counselling service works one-to-one with nearly 3,000 students each year. An increasing proportion of these students report strongly conflicting feelings about their digital devices, because having these devices ever-present and switched on often compromises their ability to tune out distractions and be wholly and productively immersed in academic work.
The ReDD workshop aims to help students struggling in this domain. In the workshop, students reflect on their use of, struggles with, and goals for digital technology, and are provided with concrete tools and strategies. In the process, the workshops also generate valuable data for advancing the frontiers of digital wellbeing research.
Sir Nigel Shadbolt launched the new Institute for Ethics in AI at Oxford, a research institute dedicated to the interdisciplinary study of the ethical challenges introduced by emerging AI technologies.
He is tasked with overseeing the development of the new research centre, which aims to tackle the fundamental challenges of the 21st century.
The Schwarzman Centre was made possible by a generous donation from Stephen A. Schwarzman to the University of Oxford.
The new research institute is expected to open in 2024, adjacent to the current Mathematical Institute.
We look forward to working with the Institute, and helping shape its research agenda around human centred systems and AI.
The purpose of the Doctoral Prize scheme is to
help retain the best students receiving EPSRC funding in research careers to develop them beyond the end of the PhD to help launch a successful career in research, and to increase the impact of the PhD in terms of publications, KT [Knowledge Transfer] and outreach.
Under the Doctoral Prize, Ulrik will
- Continue dissemination of his DPhil work (including at the CHI 2020 conference),
- Expand collaborations with other research groups (including Human Centered Design & Engineering, University of Washington, Seattle, and Google’s Digital Wellbeing team), and
- Continue collaboration with Oxford University’s Counselling Service on the Reducing Digital Distraction workshops, which provide practical guidance for students who struggle to manage their relationship with digital devices such as smartphones and laptops.
How shall we better protect children online? And what do Chinese parents think about their children’s online safety? To answer these questions, a new report presents findings from an online survey of 593 Chinese parents with children aged 6-10. While Chinese parents showed some level of privacy concern, their primary concerns were still around inappropriate content and screen time. Online short-video platforms (e.g. TikTok) play an important role in Chinese young children’s daily lives; however, many of these apps are not age-appropriate for children.
As the generation growing up at the frontier of the IoT, children’s daily activities are steadily shifting from ‘offline’ to ‘online’. Both the amount of information collected about them and its value have been continuously increasing, and with them the risk of children’s privacy being compromised or improperly exploited. China is now home to 169 million Internet users under the age of 18, and 89.5% of children under 13 are reported to have access to the Internet. While mobile phones remain the main way teenagers go online (92%), tablets (37.4%) and smart TVs (46.7%) are among their most frequently used devices, and are used more by teenagers than by other age groups. Alongside the rapid growth in Chinese children’s online adoption, there have been growing concerns: among under-18s, 30.3% have been exposed to inappropriate content and 15.6% have experienced online bullying. However, privacy-related risks have received little attention or discussion.
On the 27th of October, researchers from the Human Centred Computing team were at the Oxford Science and Ideas Festival, talking to members of the public about privacy, security, and digital wellbeing.
As well as answering questions about how people use technology, the group solicited feedback from families about how happy they were with their use of technology. Responses included hopes and fears around everything from overuse of smartphones and managing passwords, to more futuristic questions about robots in the home.