Connected toys: opportunities, risks and challenges

Author: Erik Silfversten, Analyst, RAND Europe

Christmas is one of the most prominent UK holidays and the average Brit spends between £350 and £500 on gifts each year. Many of those gifts will be for children, with children aged nine and under receiving on average £350 worth of toys annually. In recent years, toys have become more sophisticated, computerised and connected, but have at the same time increasingly been subject to privacy and security concerns. How can we know that this year’s connected Christmas gifts are actually safe?

The connected home becomes the connected toy

Families are increasingly using so-called smart or connected devices to facilitate everyday tasks and services. You can turn off the light using your smartphone, ask your chosen voice assistant for the latest football scores, or have your fridge notify you when it is time to re-stock your favourite breakfast juice. Similarly, children are increasingly engaging with smart or connected toys that can listen, talk and interact. As technology transforms everyday items, it is only logical that children’s toys will become more and more computerised and connected to the Internet.

In addition to children engaging with devices and technology originally designed for adults, such as Alexa, Siri or the Google Assistant, there are a number of smart and connected products specifically designed and marketed for children. These include:

  • Smart and connected toys such as interactive dolls and robots (e.g. My Friend Cayla, Hello Barbie, i-Que).

  • Children’s smartwatches equipped with apps, GPS, Internet connectivity, and microphones (e.g. Gator Watch, Tinitell, Xplora).

  • Learning and development products such as the Starling, a wearable word counter that shows how many words a baby is exposed to, assisting parents with their baby’s language development.

  • Safety and security products such as the Seal SwimSafe, a smart wristband that alerts the parents if the wristband is submerged in water, removed or moves out of range of the parent’s sensor.

Smart toys and connected learning, development and safety products offer a range of features and opportunities for interactive play and education. Smart toys typically connect to a phone or a tablet via Bluetooth, or directly to the Internet, and use voice-recognition technologies that allow the toys to answer children’s questions. Some toys use pre-recorded phrases, while others can query Internet resources (such as Wikipedia) and hold basic conversations with children.
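The question-answering flow described above can be sketched in a few lines of code. This is a hypothetical illustration, not any real product's implementation: the speech recognition step and the online lookup are mocked with simple dictionaries, standing in for a cloud speech-to-text service and a resource such as Wikipedia.

```python
# Illustrative sketch of a smart-toy question pipeline (hypothetical):
# audio is converted to text, matched against canned phrases first, and
# unmatched questions fall through to an online lookup (mocked here).

CANNED_REPLIES = {
    "hello": "Hi there! Do you want to play?",
    "goodbye": "Bye! See you soon!",
}

# Stand-in for an external knowledge service the toy would query online.
MOCK_ONLINE_LOOKUP = {
    "what is the moon": "The Moon is Earth's only natural satellite.",
}

def recognise_speech(audio: bytes) -> str:
    """Stand-in for a cloud speech-to-text call; real toys typically
    transmit the recording to an external processing company."""
    return audio.decode("utf-8").strip().lower().rstrip("?")

def answer(audio: bytes) -> str:
    question = recognise_speech(audio)
    if question in CANNED_REPLIES:        # pre-recorded phrase
        return CANNED_REPLIES[question]
    if question in MOCK_ONLINE_LOOKUP:    # internet query
        return MOCK_ONLINE_LOOKUP[question]
    return "I don't know that one. Ask me something else!"

print(answer(b"Hello"))
print(answer(b"What is the Moon?"))
```

Note that even this minimal flow shows why privacy questions arise: the child's recording leaves the toy at the very first step, before any answer is produced.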

While these types of products and services provide many potential benefits, they also present several challenges and concerns – particularly in relation to security and privacy, but also in relation to unintended consequences and impacts on children’s behaviour.

Security, privacy and behaviour: a brave new world of connected toys

When toys make the transition from being an analogue play tool to being a child’s interactive friend with the ability to collect personal data and provide external information, it becomes necessary to ensure that parents are informed and have confidence in the toys’ security and privacy safeguards. Smart toys are already able to engage in real-time tracking of children, direct communication with children, and storage of personal data including names, photos and voice recordings. As such, consumers could reasonably expect strong safeguards and security measures to already be in place.

However, recent reviews of smart toys have prompted concerns, and many of these types of devices have been found to lack sufficient security measures. Parents have been urged to boycott popular toy manufacturer VTech following a serious hacking incident, Germany recently banned several types of children’s smartwatches due to privacy concerns, and recent reviews from the Norwegian Consumer Council found that several of the Internet-connected smart toys tested failed “miserably when it comes to safeguarding basic consumer rights, security, and privacy”. These flaws cover a wide range of issues including embedded security, terms and privacy agreements, and data security.

Researchers have found that many connected and smart devices for children lack basic security features and embedded security, which may have significant privacy implications. This lack of security may for example mean that a user can take control of a device using just a mobile phone, making it possible to communicate with and listen through the device without having physical access to it.
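The pairing flaw described above can be illustrated with a toy model (all names here are hypothetical, and real Bluetooth pairing is considerably more involved): a device that accepts any connection request without authentication will talk to a stranger's phone just as readily as to a parent's.

```python
# Toy model of the reported flaw: a device that pairs with anyone,
# contrasted with one that checks an approved-device list.

class InsecureToy:
    """Simulated toy that pairs with any caller -- no PIN, no allow-list."""
    def pair(self, caller_id: str) -> bool:
        return True  # accepts every pairing request

class SaferToy:
    """Simulated toy that only pairs with devices a parent has approved."""
    def __init__(self, approved: set):
        self.approved = approved

    def pair(self, caller_id: str) -> bool:
        return caller_id in self.approved

stranger = "unknown-phone-01"
insecure = InsecureToy()
safer = SaferToy(approved={"parent-phone"})

print(insecure.pair(stranger))  # True: a stranger can connect and listen
print(safer.pair(stranger))     # False: unapproved device is refused
```

The design point is simply that authentication has to be built in at the device level; once an attacker can pair, every downstream feature (microphone, speaker, stored data) is exposed.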

Children are vulnerable consumers who often cannot understand exactly how a smart toy works and what implications it may have. For example, a study on children’s engagement with smart toys revealed that many children did not realise that the toys were recording or that the recordings of their interactions with the toy were available to their parents. Clear user agreements and terms of use are therefore essential for parents to ensure that children understand their toys and that parents have sufficient control over what data is collected and how it is used. However, many smart toys have unclear or missing terms that do not ensure parental consent, do not notify parents if the terms change and do not make it clear what personal data is collected, transmitted and stored.

The review of smart toys by the Norwegian Consumer Council also found questionable or insecure data protection practices. In addition to lacking clear terms, many smart toys transmitted personal data to servers outside of Europe (sometimes even without encryption), shared personal data with third parties for marketing purposes, and used personal data for purposes other than those originally stated. In many toys capable of voice recognition, all communication between the child and the toy is transmitted to external data processing companies (which may reside outside of the EU).
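Why unencrypted transmission matters can be shown in a few lines. The sketch below is a deliberately simplified illustration, not production cryptography: a one-time XOR keystream stands in for the TLS encryption a well-designed toy would use, and the point is only that an eavesdropper on the network path can read a plaintext payload but not an encrypted one.

```python
# Illustration (NOT real-world cryptography) of plaintext vs encrypted
# transmission: wire bytes sent in the clear are readable by anyone on
# the path; encrypted bytes are opaque without the key.

import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # The same XOR operation both encrypts and decrypts;
    # the key must be at least as long as the data.
    return bytes(b ^ k for b, k in zip(data, key))

recording = b"child said: what is the moon"
key = os.urandom(len(recording))  # one-time random keystream

plaintext_on_wire = recording                    # what some toys sent
encrypted_on_wire = xor_cipher(recording, key)   # what they should send

print(b"child" in plaintext_on_wire)  # True: readable in transit
# The encrypted bytes reveal nothing useful without the key, while the
# legitimate recipient (holding the key) recovers the recording exactly:
print(xor_cipher(encrypted_on_wire, key) == recording)  # True
```

In practice this is what transport encryption such as TLS provides; the finding above was that some toys skipped it entirely.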

Smart and connected toys have also prompted concerns about the effects they may have on young children’s behaviour. With speech recognition and the ability to hold conversations, inquisitive children may find themselves exposed to a smart toy whose opinions and values, if unsupervised, may be undesirable and beyond parental control. The My Friend Cayla doll, predominantly marketed to young girls, showed a tendency to talk about playtime, flowers and cooking, while the i-Que robot, predominantly marketed to young boys, showed a preference for scientific facts and silly jokes.

The same study found that some words were banned or filtered by the toys, including “homosexual”, “bisexual”, “lesbian”, “atheism” and “LGBT”. Independent researchers have also shown that it is possible to manipulate smart toys into engaging in non-standard conversation, including the use of explicit words.

Shared responsibility: consumers, manufacturers and regulation

Connected and smart toys, as well as other devices designed for children, will most likely increase in popularity in the future. While parents can take proactive steps to assist their child in the safe use of connected toys and consumers can demand more security-minded products, manufacturers should also seek to design more secure products with embedded privacy and security safeguards. Regulation may have a role to play in achieving more secure products and services as well. In the US, the Children’s Online Privacy Protection Rule has been extended to smart toys, while in the EU, the GDPR will most likely have a positive effect on smart toy privacy and data security.