“Hey Alexa, stop sharing my data!”

Our homes are becoming ever more connected, with ownership of smart devices more than doubling in the last two years. The same PwC report, published earlier this year, reveals that almost 40% of people now enter the connected home market via smart entertainment devices. In this Comment piece, Martin Head, Programme Director at Corsham Institute, argues that, as integration grows, the data gleaned from our conversations at home and from our interactions with the domestic Internet of Things (IoT) must be subject to a far more developed and robust culture of data ethics.

PwC's 'Disrupting Utilities' survey estimates that the market for smart devices will be worth £10.8bn in 2019, whilst techUK have recently reported that use of smart speakers doubled between 2017 and 2018, and that the number of households owning more than three smart home products has grown by a quarter since 2017.

As Ronan O’Regan, PwC’s Digital Utilities lead, said when the report was launched: “While smart home assistants are relatively new to the market, we believe they could potentially be the ‘glue’ towards wider adoption. You could say they are having an ‘iPhone effect’ in the market”.

The development of connected technology is still in its infancy, with some devices offering solutions seemingly still in search of a problem (Bluetooth kettles, anyone?). The real scope and ultimate power of connecting our homes in an integrated way are still a long way from being realised.

There are huge opportunities in how we might use the technology to support and protect people; however, these devices generate vast quantities of personal data – a fact that may not be fully understood by many users. A deeper understanding of data privacy, and the ability to build a trusted relationship with the providers and uses of the technology, are therefore needed. As techUK’s Sue Daley wrote in a piece for the Observatory for a Connected Society on AI and ethics, “It is our job to continue to build the culture of data trust and confidence needed to ensure technology remains a force for good”.

The relevance and power of all the personal data gathered from a connected home will rapidly take on new dimensions as numerous devices are integrated. PwC, launching their recent report, said that “tech giants are blurring lines and breaking down barriers, creating innovative products that capture data to provide differentiating insights, novel solutions and a seamless user experience”, but recognised that trust in suppliers “could become a major battleground for traditional players over the next few years”.

For most consumers, the journey to understanding the meaning and full implications of sharing their personal data is only just beginning. Corsham Institute’s Your Data, Your Rights survey earlier this year showed that while 60% of respondents cared a lot about the use of their personal data, only 18% knew a lot about its collection. The recent techUK ‘Connected Homes’ report, meanwhile, shows that personal privacy, cited by 23% of consumers, was the second-highest barrier to buying connected home products (after cost).

Further, while there is an official definition of personal data in the 2018 Data Protection Act [Chapter 12, Part 1, section 3(2)] as “any information relating to an identified or identifiable living individual… in particular by reference to (a) an identifier such as a name, an identification number, location data or an online identifier, or (b) one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of the individual”, what we are prepared to share is not a fixed concept: it will differ between age groups and other demographics, and according to the value we perceive we receive from each device or app.

At a recent workshop held at the Digital Catapult by a multi-university, PETRAS-funded project on the ‘Value of Personal Data in the Internet of Things’, the ‘privacy paradox’ was highlighted: the tension between individuals’ stated desire to maintain privacy and their willingness to share their data online, while it was recognised that ‘little is understood about the value consumers place on keeping their data private’. The project’s work to date includes findings that individuals perceive a lack of choice about whether to share their personal data, care deeply about protecting it, and are even willing to pay to do so.

The debate over data ethics is fundamental, and it becomes ever more urgent as our homes grow more connected and devices could be listening to our everyday conversations and drawing conclusions about our preferences, opinions and shopping habits. The challenge is not to find more appropriate business models for using the data, nor to make people pay for the protection of the personal data generated about them; it is that a culture of data ethics urgently needs to develop as rapidly as the technology itself, growing hand in hand with it.

Individual citizens need to own and control third-party access to their personal data; further still, once access has been granted, they need ongoing choices about how it is used, and the ability to withdraw that permission if circumstances change – a sketch of what such a permission might look like follows below. This is an area in which Corsham Institute will be working with partners to develop ethical frameworks and community-led test beds, to understand in greater depth what ever more connected homes will mean for all of us.
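To make that idea a little more concrete, here is a minimal sketch in Python of what a revocable, purpose-specific consent record could look like. The names and fields are purely illustrative assumptions, not a description of any existing system or provider; a real consent-management service would need far more (identity, auditing, granular purposes, secure storage).

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    """An illustrative record of one data-sharing permission (hypothetical fields)."""
    device: str                      # e.g. "smart speaker"
    recipient: str                   # the third party granted access
    purpose: str                     # the specific use the owner agreed to
    granted_at: datetime
    revoked_at: Optional[datetime] = None

    def is_active(self) -> bool:
        # Permission stands only while it has not been withdrawn.
        return self.revoked_at is None

    def revoke(self) -> None:
        """Withdraw permission; the record is kept as an audit trail."""
        if self.is_active():
            self.revoked_at = datetime.now(timezone.utc)


# Example: grant access for one purpose, then withdraw it when circumstances change.
consent = ConsentRecord(
    device="smart speaker",
    recipient="example-analytics-provider",
    purpose="improving voice recognition",
    granted_at=datetime.now(timezone.utc),
)
consent.revoke()
print(consent.is_active())  # False: the owner has withdrawn permission
```

Even at this level of simplicity, the key point stands: permission is tied to a specific recipient and purpose, and withdrawing it is a first-class action rather than an afterthought.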

If Alexa and Cortana are to become invited guests in our living rooms, or even virtual members of the family, they must serve the real interests of their owners and work for us, not use their all-too-attractive functionality as a cover story for a massive data-mining exercise by those who market them.