by: Jun Zhao

 
03 Jul 2020

The increasing reach and pervasiveness of AI and algorithms in everyday life raises pressing social and ethical issues for individuals and communities. Recent public debates, including those around track-and-trace for the pandemic and the fight for a fairer society, have made critical thinking about our society and its technology innovations more urgent than ever. Under these circumstances, we hear a clear need from Oxford researchers and students for a platform to voice and exchange concerns, and to reflect and act on the urgent need for more inclusive and responsible algorithms and technologies.

The first virtual ReEnTrust Responsible Technology workshop was a direct response to this call for action. Thanks to the support of OxAI, the Oxford Business Network for Technology and the EPSRC ReEnTrust project, we welcomed 18 participants from five different Oxford faculties and departments, including Oxford CS, International Development, Law, Geography and, in particular, the Said Business School.

by: Konrad Kollnig

 
31 May 2020

Tracking, the collection of data about user behaviour, is widespread on the web. In response, the idea of a “Do-Not-Track” (DNT) setting emerged a little more than a decade ago, in 2009. This setting gives users a simple way to accept or reject online tracking on websites. Despite great ambitions, DNT has failed, whilst the practice of tracking continues. I want to explore with you why it failed, but also how DNT has nonetheless managed to change user privacy for the better.
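In practice, DNT works by having the browser attach a single HTTP request header, DNT: 1, to every request, which websites may then honour or ignore. The sketch below is my own minimal illustration of how a server could respect that header; the function name and the header handling are assumptions for illustration, not part of the DNT specification or any real site's code.

```python
# A minimal, hypothetical sketch of a server honouring "DNT: 1".
# Whether a site actually honours the header is voluntary -- one key
# reason DNT failed in practice.

def should_track(request_headers: dict) -> bool:
    """Return False if the client has sent the Do-Not-Track header."""
    # HTTP header names are case-insensitive, so normalise them first.
    headers = {name.lower(): value for name, value in request_headers.items()}
    return headers.get("dnt") != "1"

# A browser with DNT enabled attaches "DNT: 1" to every request.
print(should_track({"DNT": "1"}))  # False -> skip analytics and tracking
print(should_track({}))            # True  -> no preference expressed
```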

by: Max Van Kleek

 
21 May 2020

Over the past few weeks, digital contact tracing has entered centre stage as a crucial element in strategies for exiting COVID-19 lockdown. But in order for digital contact tracing to succeed, people need to understand what it is, how it works, and the key challenges that need to be overcome. This short blog post is one attempt to do these three things, and to provide links to other articles for more information.

Digital contact tracing is simply the name given to a new kind of application that uses people’s mobile phones to keep track of whom they’ve been in close proximity to in the recent past. It relies on the assumption that most people carry a smartphone with them, and uses this assumption to translate the problem of keeping track of people to keeping track of their smartphones instead.
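As a rough illustration of the underlying idea (my own simplified sketch, not the design of any particular national app), each phone broadcasts short-lived random identifiers over Bluetooth and records the identifiers it hears nearby; if a user later tests positive, their identifiers can be shared so that other phones can check locally whether they were ever in close proximity.

```python
# A simplified, hypothetical sketch of proximity-based contact tracing.
import secrets
import time

class ContactTracer:
    def __init__(self):
        self.my_ids = []   # rotating identifiers this phone has broadcast
        self.seen = []     # (identifier, timestamp) pairs heard nearby

    def new_broadcast_id(self) -> str:
        """Generate a fresh random identifier to broadcast; identifiers are
        rotated regularly so individual phones cannot be tracked over time."""
        rid = secrets.token_hex(8)
        self.my_ids.append(rid)
        return rid

    def record_nearby(self, heard_id: str) -> None:
        """Store an identifier received from a nearby phone."""
        self.seen.append((heard_id, time.time()))

    def check_exposure(self, infected_ids: set) -> bool:
        """Compare stored contacts against identifiers voluntarily shared
        by users who later tested positive."""
        return any(rid in infected_ids for rid, _ in self.seen)

# Usage: two phones exchange identifiers while in proximity.
alice, bob = ContactTracer(), ContactTracer()
bob.record_nearby(alice.new_broadcast_id())
print(bob.check_exposure(set(alice.my_ids)))  # True -> Bob was near Alice
```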

by: William Seymour

 
05 May 2020

Main Track Papers

‘I Just Want to Hack Myself to Not Get Distracted’: Evaluating Design Interventions for Self-Control on Facebook (Ulrik Lyngs, Kai Lukoff, Petr Slovak, William Seymour, Helena Webb, Marina Jirotka, Max Van Kleek and Nigel Shadbolt)

Beyond being the world’s largest social network, Facebook is for many also one of its greatest sources of digital distraction. For students, problematic use has been associated with negative effects on academic achievement and general wellbeing. To understand what strategies could help users regain control, we investigated how simple interventions to the Facebook UI affect behaviour and perceived control.

Informing the Design of Privacy-Empowering Tools for the Connected Home (William Seymour, Martin Krämer, Reuben Binns and Max Van Kleek)

Connected devices in the home represent a potentially grave new privacy threat due to their unfettered access to the most personal spaces in people’s lives. Prior work has shown that despite concerns about such devices, people often lack sufficient awareness, understanding, or means of taking effective action. To explore the potential for new tools that support such needs directly, we developed Aretha, a privacy assistant technology probe that combines a network disaggregator, personal tutor, and firewall to empower end-users with both the knowledge and mechanisms to control disclosures from their homes.
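To give a flavour of the kind of mechanism such a tool exposes (this is my own simplified sketch, not Aretha's actual implementation), the probe observes which external destinations each home device contacts, surfaces those disclosures to the user, and lets them block destinations they are uncomfortable with:

```python
# A hypothetical sketch of a disaggregator-plus-firewall loop for the
# connected home; names and destinations are illustrative only.
blocked_destinations = set()

def on_outbound_connection(device: str, destination: str) -> bool:
    """Called for each outbound connection observed on the home network.
    Returns True if the connection should be allowed."""
    if destination in blocked_destinations:
        print(f"Blocked:  {device} -> {destination}")
        return False
    # The "personal tutor" element: surface the disclosure to the user
    # so they can understand it and decide whether to block it.
    print(f"Observed: {device} -> {destination}")
    return True

# Example: the user decides to block one destination after seeing it.
on_outbound_connection("smart-speaker", "analytics.example.com")
blocked_destinations.add("analytics.example.com")
on_outbound_connection("smart-speaker", "analytics.example.com")
```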

The COVID-19 pandemic has arrived at a time when ICT tools and technologies are more widely available than ever. As a result, all over the world, programmers, engineers, computer scientists and data specialists are working alongside medical researchers and epidemiologists to develop tools that can help track the spread of the virus, minimise its growth, and support the vulnerable. It would be an irresponsible government that did not wish to exploit all available resources at its disposal to protect its population and support global efforts to counter the pandemic's negative effects, both economic and medical.

However, there is a risk that the urgency of the current crisis leads to rushed technological development that neglects ethical safeguards and the responsible approaches needed to ensure that any new tools act in the best interests of society and its citizens. In recent years, public concerns have emerged over the ubiquity of personal data collection and digital surveillance in our daily lives, alongside the potential for the misuse of personal data. We see reasons to express similar concerns over some of the current technological responses to COVID-19.

by: Helena Webb

 
14 Apr 2020

Adrienne Hart is Artistic Director of Neon Dance, an internationally renowned dance company. In 2019 she approached the University of Oxford’s TORCH programme hoping to find a researcher that could support the company’s latest collaborative work “Prehension Blooms”. The work, due to premiere in 2021, aims to integrate swarm robotics in a live performance context.

Following a successful bid for a TORCH Theatres Seed Fund award, Adrienne is now collaborating with Helena Webb, Senior Researcher at the Department of Computer Science, and “Prehension Blooms” will feed into the ongoing project RoboTIPS - Developing Responsible Robots for the Digital Economy. RoboTIPS is a 5-year research study led by Marina Jirotka, Professor of Human Centred Computing, that seeks to foster practices for responsibility in the development and use of social robots.

In this conversation Helena discovers more about Neon Dance’s research and development project and how “Prehension Blooms” will connect to RoboTIPS.

by: Jun Zhao

 
13 Apr 2020

Since Monday 23 March 2020, most schools in the UK have been closed in response to the COVID-19 outbreak in Europe. Most parents, aside from juggling the daunting prospect of home-schooling with their work, are also overwhelmed by their children’s need to stay connected with friends.

This is a particular challenge for parents of primary school age children, who had not been extensively exposed to independent online social communication before the COVID-19 crisis. In this stressful and rapidly developing situation, many parents have handed their children general-purpose video conferencing applications or social media platforms as a quick and easy solution. This article highlights several things that parents should look out for when facilitating their young children’s first online chat or social communication, to keep their children safe and make home schooling a happy and rewarding time.

by: Max Van Kleek

 
29 Mar 2020

Viewpoint > This article contains the viewpoints of the author and is not intended as a research document

Even before the COVID-19 crisis, there was great concern about the effects of misinformation online. Much of the debate around misinformation, its potential harms, and what to do about it, has centred on specific events where the tangible effects of misinformation could be measured: for instance, elections past and future, or the UK Referendum.

The COVID-19 crisis presents a new event around which misinformation can have other significant effects, both tangible and intangible. It can influence people’s behaviour at a time when cooperation and compliance with public health measures and governmental restrictions can mean the difference between success and failure in an urgent public health intervention to slow or stop an epidemic.

The COVID-19 crisis has created a sudden and urgent need for physical distancing, while the need to communicate, coordinate and collaborate effectively has remained the same or even increased.

This combination of factors has meant that many people have adopted new “remote working” tools in droves. Many are forced to use the tools that their organisations have chosen for them, but others have the opportunity to choose tools that will both serve their needs and protect them in the long term. But which tools are the most secure and the most privacy-respecting?

12 Feb 2020

At the recent 2020 MPLS Impact Awards, a Commendation Certificate was awarded to HCC DPhil student Ulrik Lyngs for his work on the ‘Reducing Digital Distraction’ (ReDD) workshop. The MPLS Impact awards aim to

recognise and reward researchers at all career stages, for research that has had, or will have significant social or economic impact.

The ReDD workshop was developed by Ulrik Lyngs in collaboration with Maureen Freed, Deputy Head of Counselling at the University of Oxford. The counselling service works one-to-one with nearly 3,000 students each year. An increasing proportion of these students report strongly conflicting feelings about their digital devices, because having these devices ever-present and switched on often compromises their ability to tune out distractions and be wholly and productively immersed in academic work.

The ReDD workshops aim to help students struggling in this domain. In the workshops, students reflect on their use of, struggles with, and goals for digital technology, and are provided with concrete tools and strategies. In the process, the workshops also generate valuable data for advancing the frontiers of digital wellbeing research.