‘Who knows what about me?’
This week the Children’s Commissioner published a new report, which touches on important issues about how children need greater protection in today’s digital world.
The ‘Who knows what about me?’ report explains that ‘data footprints begin from the very moment when their parents proudly upload that first baby photo to social media’. Further, it identifies that ‘[o]n average, by the age of 13, parents have posted 1300 photos and videos of their child to social media’. The report expresses concern about children’s privacy but also considers that there could be significant implications for children when they become adults. It suggests that children and parents need to be much more aware of what information they share on the internet and should consider the consequences.
The report makes a number of recommendations:
- Schools should have a responsibility to start educating their pupils about the importance of guarding personal information;
- Companies producing apps, toys and other products aimed at children should be more transparent about any trackers capturing information about children;
- The new Centre for Data Ethics and Innovation should undertake a programme of work specifically focused on children;
- The Government should consider refining data protection legislation if necessary;
- Companies should state their terms and conditions using language children can understand;
- The Government should consider introducing an obligation on those using automated decision-making to be more transparent about the algorithms they use and the data fed into these algorithms, where data collected from under 18s is used;
- There should be a statutory duty of care governing relationships between social media companies and the audiences they target.
Similar concerns have been expressed by research undertaken at the Centre for Information Rights at the University of Winchester. (Transparency Project member Emma Nottingham has discussed some of the issues on Voice FM Radio.)
The research concerns children who have grown up in the age of social media, who are referred to as ‘generation tagged’. The research expresses concern that children can appear on social media because of the actions of others, such as parents posting photographs on a Facebook or Instagram page, or even opening a Twitter account for their baby. It also considers the risks posed to children who feature in fly-on-the-wall reality documentaries on broadcast media, and explores how they risk becoming the target of comment on social media beyond their immediate friends and family, particularly in light of the use of hashtags, which broadcasters have increasingly used in conjunction with television programmes as a way of encouraging interactive tweeting during broadcast.
The research identifies significant issues with both the relevant law and the oversight processes relating to images of and information about young children on broadcast and social media. Neither data protection law nor the tort of misuse of private information seems to deal with the fundamental question of whether the children should have been so exposed, instead relying to a large extent on the consent of parents. Further, it suggests that there is a future risk of emotional harm.
Part of the research culminated in a report entitled How can we Protect Generation Tagged?, which followed a consultation in which participants discussed the impact of broadcast and social media on the privacy and best interests of young children. Eight recommendations were made, listed below:
- Young children should have a privacy right independent from their parents’ privacy expectations.
- More open discussion is needed around the digital social norm that accepts the objectification of young children, the posting of negative comments and images where it might reasonably be expected that the child would not agree, yet requires a best interests test to be applied in offline settings such as health and education.
- More research into the impact of broadcast media exposure of young children is needed to understand what effect it has on them, both positive and negative. Once these effects are more fully understood, actions can be taken to reduce any potential harm.
- There is a need for more consistency in terms of compliance and regulation between regulated broadcasters and non-mainstream digital media/social media. The introduction of a Children’s Digital Ombudsman could provide a way for children’s interests to be better represented in relation to all forms of digital publications.
- ‘Controller hosts’ (such as Facebook, YouTube and Twitter) and ‘independent intermediaries’ (such as Google) (see Directive 2000/31/EC, Arts 12–15) should have a duty of care to consider young children’s privacy and best interests in their operations.
- Settings on social media services (e.g. Facebook and Twitter) should be privacy-respecting by default when images or information about young children are concerned. Potentially, it should be possible to require that warnings be shown where social media systems detect that a person intends to post images of young children without these privacy settings enabled.
- There should be a limitation on the extent to which information and images relating to a young child can be copied, re-contextualised and re-shown in a different context to the original post or publication.
- There should be more education for both children and parents about the impact of ‘sharenting’ and the level of personal information they are potentially exposing by doing this. Clarity is required as to which body should have overall responsibility for such educational programmes.
Duty of Care Campaign
There is now an increasing number of media reports highlighting concerns about children and the online world. For instance, in June 2018, The Telegraph launched its ‘Duty of Care’ campaign, calling for a statutory duty of care to be placed upon all social media and online gaming companies, breach of which could result in legal action. In light of such discussion, in addition to the research being undertaken, it is likely that in the near future legal and ethical frameworks will be improved so that children benefit from greater protection.
(See also Marion Oswald, Senior Fellow at the Centre for Information Rights, University of Winchester, on the British and Irish Law Education and Technology Association (BILETA) consultation run by the Centre for Information Rights here in 2017.)