I felt hugely honoured when, a couple of years ago, the Council of Europe invited me and two of my brilliant colleagues, Professor Sonia Livingstone of the LSE and Professor Eva Lievens of the University of Ghent, to act as expert advisers to the Committee of Ministers in drawing up recommendations on “children’s rights in the digital environment”.
These recommendations do not have legal force but, given the diverse nature of the Council of Europe's membership, and the thoroughness and professionalism of its processes, they do represent an important advance in the creation of new international norms. New norms can, in time, be reflected in new laws, but long before that they start to shape and reflect changing attitudes, typically being rooted in current best practice among industry leaders. For that reason I am very proud to have been associated with the final outcome document which appeared on Monday.
I won’t try to summarise the whole thing here. It is all definitely worth looking at. I will, however, pick out a few of what were, for me, key developments.
Overview
Member States should:
- review their legislation, policies and practice to ensure that they are in line with the recommendations, principles and further guidance set out in the appendix of this recommendation, promote their implementation in all relevant areas and evaluate the effectiveness of the measures taken at regular intervals, with the participation of relevant stakeholders; …
- require business enterprises to meet their responsibility to respect the rights of the child in the digital environment and to take implementing measures, and encourage them to co-operate with relevant State stakeholders, civil society organisations and children, taking into account relevant international and European standards and guidance; …
Here are some of the major points from the appendix referred to:
Major emphasis on children’s rights
The appendix details each of the principal elements of the UNCRC (the UN Convention on the Rights of the Child) and goes on to put the issue of access squarely on the agenda.
Access
- Access to and use of the digital environment is important for the realisation of children’s rights and fundamental freedoms, for their inclusion, education, participation and for maintaining family and social relationships. Where children do not have access to the digital environment or where this access is limited as a result of poor connectivity, their ability to fully exercise their human rights may be affected.
- States should ensure that access to the digital environment is provided in educational and other care settings for children. Specific measures should be taken for children in vulnerable situations, in particular children living in alternative care, children deprived of liberty or whose parents are deprived of liberty, children in the context of international migration, children in street situations and children in rural communities. In particular, States should require online service providers to ensure that their services are accessible by children with disabilities.
- Connectivity and access to devices, services and content should be accompanied by appropriate education and literacy measures, including those which address gender stereotypes or social norms that could limit children’s access and use of technology.
- States should ensure that terms and conditions that are associated with the use of a device which can connect to the internet or that apply to the provision of online services or content are accessible, fair, transparent, intelligible, available in the child’s language and formulated in clear, child-friendly and age-appropriate language where relevant.
Data, age and age verification
- Recognising that personal data can be processed to the benefit of children, States should take measures to ensure that children’s personal data is processed fairly, lawfully, accurately and securely, for specific purposes and with the free, explicit, informed and unambiguous consent of the children and/or their parents, carer or legal representative, or in accordance with another legitimate basis laid down by law. The data minimisation principle should be respected, meaning that the personal data processing should be adequate, relevant and not excessive in relation to the purposes for which they are processed.
- Where States take measures to decide upon an age at which children are considered to be capable of consenting to the processing of personal data, their rights, views, best interests and evolving capacities must be taken into consideration. This should be monitored and evaluated while taking into account children’s actual understanding of data collection practices and technological developments. When children are below that age and parental consent is required, States should require that reasonable efforts are made to verify that consent is given by the parent or legal representative of the child.
- In relation to the processing of children’s personal data, States should implement, or require relevant stakeholders to implement, privacy-by-default settings and privacy-by-design measures, taking into account the best interests of the child. Such measures should integrate strong safeguards for the right to privacy and data protection into devices and services.
- Specific measures and policies should be adopted to protect infants from premature exposure to the digital environment due to limited benefits with respect to their particular physical, psychological, social and stimulation needs.
- States should require the use of effective systems of age-verification to ensure children are protected from products, services and content in the digital environment which are legally restricted with reference to specific ages, using methods that are consistent with the principles of data minimisation. (emphasis added)
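The pairing of age verification with data minimisation in that last point has a concrete technical meaning: the service should learn only whether the user clears the age threshold, not their date of birth or identity. A minimal sketch of that idea, assuming a hypothetical verifier that receives a birth date from a trusted identity provider and releases only a boolean attribute (all names here are illustrative, not drawn from the recommendation):

```python
from datetime import date

AGE_THRESHOLD = 18  # assumed legal restriction age for the illustration


def is_over_threshold(birth_date: date, today: date,
                      threshold: int = AGE_THRESHOLD) -> bool:
    """Return only a yes/no attribute; the exact birth date is discarded.

    This is the data-minimisation point: the relying service never sees
    the underlying date, only the boolean result.
    """
    years = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return years >= threshold


# The relying service would receive just this boolean (e.g. as a signed
# attribute), never the identity document itself.
print(is_over_threshold(date(2000, 1, 1), date(2025, 1, 1)))  # True
print(is_over_threshold(date(2010, 6, 1), date(2025, 1, 1)))  # False
```

In a real deployment the boolean would be conveyed as a cryptographically signed attribute or zero-knowledge proof rather than computed by the service itself; the sketch only shows what "minimised" output looks like.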
- States should take measures to ensure that children are protected from commercial exploitation in the digital environment, including exposure to age-inappropriate forms of advertising and marketing. This includes ensuring that business enterprises do not engage in unfair commercial practices towards children, requiring that digital advertising and marketing towards children is clearly distinguishable to them as such, and requiring all relevant stakeholders to limit the processing of children’s personal data for commercial purposes.
Protection and safety
- Taking into account the development of new technologies, children have the right to be protected from all forms of violence, exploitation and abuse in the digital environment. Any protective measures should take into consideration the best interests and evolving capacities of the child and not unduly restrict the exercise of other rights.
- There are a number of areas of concern for children's healthy development and well-being which may arise in connection with the digital environment, including, but not limited to, risks of harm from:
– sexual exploitation and abuse, solicitation for sexual purposes (grooming), online recruitment of children for the commission of criminal offences, for participation in extremist political or religious movements or for trafficking purposes (contact risks);
– the degrading and stereotyped portrayal and over-sexualisation of women and children in particular; the portrayal and glorification of violence and self-harm, in particular suicides; demeaning, discriminatory or racist expressions or apologia for such conduct; advertising, adult content (content risks);
– bullying, stalking and other forms of harassment, non-consensual dissemination of sexual images, extortion, hate speech, hacking, gambling, illegal downloading or other intellectual property infringements, commercial exploitation (conduct risks);
– excessive use, sleep deprivation and physical harm (health risks).
Child sexual abuse material
- Mindful of available technologies and without prejudice to the principles of liability of internet intermediaries and their exemption from general monitoring obligations, States should require business enterprises to take reasonable, proportionate and effective measures to ensure that their networks or online services are not misused for criminal or other unlawful purposes in ways which may harm children, for example in relation to the production, distribution, provision of access to, advertising of or storage of child sexual abuse material or other forms of online child abuse.
- States should require relevant business enterprises to apply hash lists with a view to ensuring that their networks are not being misused to store or distribute child sexual abuse images.
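The hash-list mechanism mentioned above works by comparing a digest of each uploaded file against a list of digests of known illegal images. A minimal sketch of the matching step, assuming a plain SHA-256 blocklist (real deployments use robust perceptual hashes such as Microsoft's PhotoDNA, with lists curated by bodies like the IWF or NCMEC, so survivable edits to an image are also caught; the names and placeholder data below are purely illustrative):

```python
import hashlib


def sha256_hex(data: bytes) -> str:
    """Digest of the file contents; the matching key for the blocklist."""
    return hashlib.sha256(data).hexdigest()


# Hypothetical blocklist of known-bad file hashes (placeholder content).
KNOWN_HASHES = {sha256_hex(b"known-bad-file-placeholder")}


def should_block(uploaded: bytes, blocklist: set = KNOWN_HASHES) -> bool:
    """Check an upload against the hash list before it is stored or shared."""
    return sha256_hex(uploaded) in blocklist


print(should_block(b"known-bad-file-placeholder"))  # True
print(should_block(b"ordinary holiday photo"))      # False
```

The key operational property is that the platform never needs to hold the illegal images themselves, only their digests, and matching is a constant-time set lookup per upload.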
- States should require that business enterprises and other relevant stakeholders promptly take all necessary steps to secure the availability of metadata concerning any child sexual exploitation and abuse material found on local servers, make them available to law-enforcement authorities, remove these materials and, pending their removal, restrict access to such materials found on servers outside of their jurisdiction.
Risks and impacts on the rights of the child
- States should require business enterprises and other stakeholders to undertake due diligence in order to identify, prevent and mitigate their impact on the rights of the child in the digital environment.
- States should require business enterprises to perform regular child-rights risk assessments in relation to digital technologies, products, services and policies and to demonstrate that they are taking reasonable and proportionate measures to manage and mitigate such risks.
- States should encourage business enterprises to develop, apply and regularly review and evaluate child-oriented industry policies, standards and codes of conduct to maximise opportunities and address risks in the digital environment.
- Recognising that parents, carers and others may rely on an online service’s stated terms and conditions of service as a guide to the suitability of that service for their child, being mindful of available technologies and without prejudice to the liability of internet intermediaries, States should require business enterprises to take reasonable, proportionate and effective measures to ensure that their terms and conditions of service are enforced. (emphasis added)
Domain names for country code top level domains
- When awarding a contract or license to an entity to become the registry for a country code top-level domain, States should include clear requirements to have due regard to the best interests of children. Such requirements should cover, for example, a clear prohibition by the registry of the registration or use of any domain name which advertises or suggests that child sexual abuse material may be available on any domain within the registry’s purview and the establishment by the registry of mechanisms to ensure this policy is enforced, including by registrars and registrants. The same requirements should apply to the registration of generic top-level domains.
- Where a registrant proposes to establish or renew a site or service targeted at children or used by children in substantial numbers within their country code domain, States should ensure that the registry or other competent authority requires registrants to put in place appropriate child-protection policies. This may include, for example, requiring that neither the registrant nor anyone employed by the registrant in connection with delivering the service or in managing any data generated by the service has been convicted of acts of sexual exploitation or sexual abuse of children or other relevant offences.
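The prohibition on domain names that advertise such material implies a screening step at registration time. A minimal sketch of what a registry-side check might look like, assuming a simple substring match against a term list (the term list and function names here are placeholders of my own; a production system would use curated lists, variant matching and human review, not a one-line filter):

```python
# Placeholder prohibited terms; a real registry would maintain a curated,
# regularly updated list with linguistic variants.
PROHIBITED_TERMS = ("prohibitedterm1", "prohibitedterm2")


def registration_allowed(domain_label: str) -> bool:
    """Reject labels that plainly contain a prohibited term.

    Hyphens are stripped first so that trivial obfuscation
    ("pro-hibited-term1") does not defeat the check.
    """
    label = domain_label.lower().replace("-", "")
    return not any(term in label for term in PROHIBITED_TERMS)


print(registration_allowed("family-photos"))          # True
print(registration_allowed("PROHIBITED-TERM1-site"))  # False
```

Automated screening of this kind can only be a first filter; the recommendation's requirement that the policy be "enforced, including by registrars and registrants" points to complaint and takedown mechanisms behind it.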
Let’s hope these practices are implemented by every government in the world in respect of their own country codes, and that the practice is taken up by other registries as well as by ICANN itself.