Dr Lyngs has been awarded a Carlsberg Foundation Oxford Visiting Fellowship

Under his two-year Oxford Visiting Fellowship, Dr Lyngs will continue his research on digital self-control with Nigel Shadbolt and the HCC team.

by: Jun Zhao

02 Dec 2020

We are very excited that our HCC member Dr Ulrik Lyngs has received a Carlsberg Foundation Visiting Fellowship at the University of Oxford. Dr Lyngs completed his DPhil with HCC last year. His research focuses on design patterns that support self-control during use of smartphones and laptops.

The fellowship will bring Dr Lyngs back to Oxford in February 2021 to continue his research, in an ambitious collaboration between HCC and the University of Oxford Counselling Service, as well as external collaborators at the University of Washington's Department of Human Centered Design & Engineering and the University of Copenhagen's Department of Computer Science and Center for Social Data Science.

by: Jun Zhao

02 Oct 2020

HCC is delighted to welcome Prof. Reuben Binns, who has commenced his post as Associate Professor of Human Centred Computing on 1 October.

Reuben Binns is an Associate Professor of Human Centred Computing, working between computer science, law, and philosophy, focusing on data protection, machine learning, and the regulation of and by technology. Between 2018 and 2020, he was a Postdoctoral Research Fellow in AI at the Information Commissioner’s Office, addressing AI/ML and data protection. He joined the Department of Computer Science at the University of Oxford as a postdoctoral researcher in 2015. He received his Ph.D. in Web Science from the University of Southampton in 2015.

HCC welcomes four new DPhil students


by: Jun Zhao

02 Oct 2020

We are very excited to welcome four new DPhil students joining us in Michaelmas 2020.

Just like the rest of our HCC members, these new students all come from different backgrounds and pursue different paths toward a better digital society. We can’t wait to see where their research will lead!

Welcome to Oxford!

Human Centred Computing & Safety Reading Group

Join our new reading group on safety, security, and privacy in HCC.

by: Martin J. Kraemer

30 Sep 2020

A weekly reading group on human centred computing with a focus on safety, security, and privacy. The group will include close readings of academic papers and presentations of work in progress, particularly work applying interdisciplinary approaches to socio-technical subjects. Topics will therefore range from ‘technical’ areas like machine learning to qualitative or ‘social’ subjects like the home as a research setting.

The group will meet on Fridays at 2pm BST via Teams; all are welcome to join. Please email julia.slupska@cybersecurity.ox.ac.uk to be added to our Teams channel.

HCC receives funding from the Oxford Martin School


by: Jun Zhao

02 Sep 2020

The Oxford Martin School has awarded a multi-million-pound research grant to HCC academics, led by Prof Sir Nigel Shadbolt, Prof Sir Tim Berners-Lee, Prof. Max Van Kleek, Prof. Reuben Binns and Dr. Jun Zhao, to build the next generation of the World Wide Web and to promote human autonomy and self-determination in digital societies.

Thirty years ago, the World Wide Web launched as an open, common, universal infrastructure that anyone with a computer and a modem could use to communicate, publish and access information. In recent years, however, it has radically diverged from the values upon which it was founded, and it is now dominated by a number of platform companies, whose business models and services generate huge profits.

by: Helena Webb

28 Jul 2020

HCC project ‘Digital Wildfire’ has won a University of Oxford Vice Chancellor’s Innovation Award. These awards recognise high-quality research-led innovation across the University. Our project was Highly Commended in the category of Policy Engagement.

Social media platforms such as Facebook, Instagram, Snapchat and Twitter are a hugely popular feature of modern life as they enable users to share content, news and ideas with many others around the world. Unfortunately, these same capabilities allow the spread of ‘digital wildfires’ in which harmful content spreads rapidly online and damages individuals, groups and even entire communities. Digital wildfire events are becoming more and more common; for instance, we are all familiar with malicious campaigns against individuals, hate speech against demographic groups, and worries over the spread of fake news and conspiracy theories online.

Our Digital Wildfire project was a collaboration between Oxford and the Universities of Warwick, Cardiff and De Montfort. It was led by Professor Marina Jirotka and involved HCC member Helena Webb. We conducted various activities to investigate how digital wildfires spread on social media, what kinds of harm they cause, and what actions can be taken to limit or even prevent their damage.

by: Jun Zhao

03 Jul 2020

The increasing reach and pervasiveness of AI and algorithms in everyday life raise pressing social and ethical issues for individuals and communities. Recent developments, including pandemic track-and-trace and the fight for a fairer society, have made critical thinking about our society and its technologies more urgent than ever. Against this backdrop, we have heard a clear need from Oxford researchers and students for a platform to voice and exchange concerns, and to reflect and act on the urgent need for more inclusive and responsible algorithms and technologies.

The first virtual ReEnTrust Responsible Technology workshop was a direct response to this call to action. Thanks to the support of OxAI, the Oxford Business Network for Technology and the EPSRC ReEnTrust project, we welcomed 18 participants from 5 different Oxford faculties and departments, including Oxford CS, International Development, Law, Geography and particularly the Saïd Business School.

by: Konrad Kollnig

31 May 2020

Tracking, the collection of data about user behaviour, is widespread on the web. For this reason, the idea of a “Do-Not-Track” (DNT) setting emerged a little more than a decade ago, in 2009. This system gives users a simple choice to accept or reject online tracking on websites. Despite great ambitions, DNT has failed, whilst the practice of tracking continues. I want to explore with you why, but also how DNT has nonetheless managed to change user privacy for the better.
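As a concrete illustration (not from the original post): the DNT signal is simply an HTTP request header the browser attaches to every request, which a server may inspect and honour. A minimal Python sketch, with the hypothetical helper name `tracking_opted_out` chosen for this example:

```python
def tracking_opted_out(headers):
    """Return True if the request's DNT header signals an opt-out of tracking.

    The DNT header carries "1" (do not track) or "0" (tracking allowed);
    an absent header means the user expressed no preference.
    """
    return headers.get("DNT") == "1"

# A browser with DNT enabled sends "DNT: 1" with each request.
assert tracking_opted_out({"DNT": "1"}) is True
assert tracking_opted_out({"DNT": "0"}) is False
assert tracking_opted_out({}) is False
```

Note that honouring the header is entirely voluntary on the server's side, which is central to why DNT failed in practice.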

by: Max Van Kleek

21 May 2020

Over the past few weeks, digital contact tracing has entered centre stage as a crucial element in strategies for exiting COVID-19 lockdown. But in order for digital contact tracing to succeed, people need to understand what it is, how it works, and the key challenges that need to be overcome. This short blog post is one attempt to do these three things, and to provide links to other articles for more information.

Digital contact tracing is simply the name given to a new kind of application that uses people’s mobile phones to keep track of whom they’ve been in close proximity to in the recent past. It relies on the assumption that most people carry a smartphone with them, and uses this assumption to translate the problem of keeping track of people to keeping track of their smartphones instead.
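To make the mechanism concrete, here is a toy sketch (my own illustration, not from the original post) of the decentralised approach many such apps use: each phone broadcasts short-lived random tokens over Bluetooth and keeps a local log of tokens it hears nearby; if a user tests positive, their own tokens are published, and every other phone checks its local log for matches. The class name `Phone` and its methods are hypothetical.

```python
import os

class Phone:
    """Toy model of decentralised digital contact tracing (illustrative only)."""

    def __init__(self):
        self.own_tokens = []   # rotating random tokens this phone has broadcast
        self.heard = set()     # tokens received from phones that came nearby

    def new_token(self):
        # A fresh random identifier, unlinkable to the user's identity.
        token = os.urandom(16).hex()
        self.own_tokens.append(token)
        return token

    def receive(self, token):
        # Called when another phone's broadcast is heard in close proximity.
        self.heard.add(token)

    def exposed(self, published_tokens):
        # A match means this phone was near an infected user's phone.
        return any(t in self.heard for t in published_tokens)

# Two phones meet: Alice's phone hears Bob's current token.
alice, bob = Phone(), Phone()
alice.receive(bob.new_token())

# Bob tests positive and publishes his tokens; Alice's phone finds a match.
assert alice.exposed(bob.own_tokens) is True
```

The key property of this design is that the matching happens locally on each phone, so no central authority learns who met whom.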

by: William Seymour

05 May 2020

Main Track Papers

‘I Just Want to Hack Myself to Not Get Distracted’: Evaluating Design Interventions for Self-Control on Facebook ( Ulrik Lyngs, Kai Lukoff, Petr Slovak, William Seymour, Helena Webb, Marina Jirotka, Max Van Kleek and Nigel Shadbolt)

Beyond being the world’s largest social network, Facebook is for many also one of its greatest sources of digital distraction. For students, problematic use has been associated with negative effects on academic achievement and general wellbeing. To understand what strategies could help users regain control, we investigated how simple interventions to the Facebook UI affect behaviour and perceived control.

Informing the Design of Privacy-Empowering Tools for the Connected Home (William Seymour, Martin Krämer, Reuben Binns and Max Van Kleek)

Connected devices in the home represent a potentially grave new privacy threat due to their unfettered access to the most personal spaces in people’s lives. Prior work has shown that despite concerns about such devices, people often lack sufficient awareness, understanding, or means of taking effective action. To explore the potential for new tools that support such needs directly, we developed Aretha, a privacy assistant technology probe that combines a network disaggregator, personal tutor, and firewall, to empower end-users with both the knowledge and mechanisms to control disclosures from their homes.