GDPR: from checklists to meaningful change

Author: Sarah Gold, CEO, Projects by IF

GDPR could lead to new products and services designed around digital rights. There are exciting opportunities for organisations that think about data ethics and digital rights now. In this blog, Sarah Gold talks about what Projects by IF has learnt from exploring some of these opportunities.

The new rights that GDPR brings could transform the relationships people have with the organisations they rely on.

Organisations can build services with digital rights baked into the design, so they’re not hidden away in privacy policies and control panels that no one accesses. The potential for change is big and exciting.

To get to this point, companies need to move beyond tweaking their terms and conditions and start redesigning services around these new digital rights. That also means solving important design and technology problems along the way.

At IF, we've been looking into these new opportunities. Here’s a summary of three things we’ve learnt:

Design for how people really use services

One of the new rights people have under GDPR is ‘data portability,’ the right to move data about them between services. It could make switching services much simpler.

However, data is often about more than one person. For example, a phone bill describes a bill payer, their friends, family members and colleagues. What happens when two people with conflicting rights over data cannot agree to move it?

Through our research into data portability with the Open Data Institute, we found that services need to be designed to consider the needs and relationships of multiple people from the outset. Services should support people’s existing relationships, which means finding new ways to ask for consent that reflect how people really use a service.
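To make that concrete, here’s a rough sketch in TypeScript of how a service might record everyone a piece of data is about, and only let a portability request go ahead once each of them has agreed. The names and structure here are illustrative assumptions, not taken from our research or any real service:

```typescript
// Illustrative sketch: a record that is "about" more than one person,
// and a portability check that respects everyone's rights over it.

type Party = { id: string; name: string };

interface SharedRecord {
  id: string;
  description: string;                       // e.g. "March phone bill"
  dataSubjects: Party[];                     // everyone the record describes, not just the account holder
  portabilityConsent: Map<string, boolean>;  // partyId -> has this person agreed to the export?
}

// An export only proceeds when all data subjects have agreed;
// otherwise the service needs a conflict-resolution step, not a silent transfer.
function canExport(record: SharedRecord): boolean {
  return record.dataSubjects.every(
    (p) => record.portabilityConsent.get(p.id) === true
  );
}

const bill: SharedRecord = {
  id: "bill-2018-03",
  description: "March phone bill",
  dataSubjects: [
    { id: "a", name: "Bill payer" },
    { id: "b", name: "Family member" },
  ],
  portabilityConsent: new Map([
    ["a", true],
    ["b", false],
  ]),
};

console.log(canExport(bill)); // false: the family member has not agreed yet
```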

Ensure automated decisions can be understood

Digital products and services are increasingly making decisions about people in sensitive areas of their lives - whether that’s getting a job, managing finances or choosing which school to send their children to. Under GDPR, people have the right to understand how an automated decision has been made and to challenge that decision. We believe it’s also important to consider the role of wider society in helping people understand how services work.

Last year, we developed a prototype of a fictional benefits service to explore the challenges, and limits, of making automated decisions understandable to people. This prototype was created purely for research - it’s not a vision for how we think a benefits service should run.

The prototype demonstrated how people should be able to see, within the context of the service itself, whether a decision has been made by a human or a machine. It also explored how people could challenge a decision, or choose to challenge the decision as part of a wider group, perhaps supported by a third party organisation. We also suggested how a government inquiry might investigate when things go wrong.
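As an illustration only (the names below are hypothetical, not taken from our prototype), this is the kind of information a service could attach to each decision so that the interface can show who or what made it, explain it in plain language, and record any challenge:

```typescript
// Illustrative sketch of a decision record that supports GDPR-style
// transparency and challenge. Field names are made up for this example.

type DecisionMaker = "human" | "automated";

interface DecisionRecord {
  id: string;
  outcome: string;             // e.g. "benefit claim rejected"
  madeBy: DecisionMaker;       // shown to the person within the service itself
  explanation: string;         // plain-language reasons behind the decision
  challenge?: {
    raisedBy: string;          // the individual, or a third party acting for a wider group
    status: "open" | "upheld" | "overturned";
  };
}

// Turn a decision record into something a person could actually read.
function describe(decision: DecisionRecord): string {
  const who =
    decision.madeBy === "automated" ? "an automated system" : "a member of staff";
  return `${decision.outcome}, decided by ${who}. ${decision.explanation}`;
}

const example: DecisionRecord = {
  id: "claim-141",
  outcome: "Benefit claim rejected",
  madeBy: "automated",
  explanation: "Reported income was above the eligibility threshold.",
};

console.log(describe(example));
```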

We’ve also recently started a project with the London School of Economics to create a framework that product teams can use to help people understand automated decisions. Through this work, we will continue to explore how services can help people exercise their new rights in relation to automated decision-making.

Don’t hide rights, make them part of the service

To date, most companies have relied on ‘terms of service’ that few people read or understand to form the legal agreement between a person and the service. These terms are not good enough and won’t meet GDPR’s stricter conditions for consent. People don’t have the time, or sometimes the skills, to read and understand them. So how do we help people understand how data about them will be used?

We believe it starts with explaining what data is being collected, and how it will be used, at the point of use. It also means asking for permission in a way that people can understand.

A recent prototype we developed, called AutoSwap, helps illustrate how this might work. AutoSwap is a fictional service that automatically switches a person’s mobile phone provider based on criteria such as cost, privacy policy or signal strength. AutoSwap demonstrates how services can explain to people, in clear and accessible ways, how data about them will be used.
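As a rough sketch of the idea, assuming a service along the lines of AutoSwap (the field names below are ours for illustration, not from the prototype), each data request could be described at the point of use, one purpose at a time:

```typescript
// Illustrative sketch: describing a data request at the point of use
// rather than burying it in a long policy document.

interface DataRequest {
  data: string;          // what is being collected
  purpose: string;       // how it will be used, in plain language
  sharedWith: string[];  // who else will see it
  optional: boolean;     // can the person decline and still use the service?
}

// Each request is shown at the moment the data is needed,
// and permission is asked for one purpose at a time.
const requests: DataRequest[] = [
  {
    data: "Your last three months of phone bills",
    purpose: "To work out whether a cheaper tariff would cover your usage",
    sharedWith: ["The mobile providers you ask us to compare"],
    optional: false,
  },
  {
    data: "Your home postcode",
    purpose: "To check signal strength for each provider in your area",
    sharedWith: [],
    optional: true,
  },
];

for (const r of requests) {
  console.log(`${r.data}: ${r.purpose}${r.optional ? " (optional)" : ""}`);
}
```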

How to get there

Forward-thinking organisations can lead the way in solving GDPR’s challenges and showing what good practice looks like. To do that, organisations need to understand how they gather data and have better conversations about what they’re using it for and why. That’s why we’ve started developing data ethics toolkits, which help them do just that. Organisations also need to upskill product teams to make more ethical decisions and embed digital rights into their services. We’ve been sharing what we know about this in our open-access catalogue of design patterns for data sharing.

With more personal data being collected, and more parts of our lives becoming connected to the digital world, the ethics of how data is used will only become more important over the next few years. We need courageous organisations to consider digital rights and data ethics now. They’ll be the organisations that have trust as their differentiator.