IV. AI and children's rights
Have you ever thought about how much AI impacts children? They are exposed to algorithms at home, at school, and at play. Algorithms shape the environments in which they live, the services they have access to, and how they spend their time. Children play with interactive smart toys, watch videos recommended by algorithms, use voice commands to control their phones, and play with image manipulation algorithms on social media.
The presence of AI in children's lives raises many questions. Is it acceptable to use recommendation algorithms with children, or to provide an interactive toy if the child cannot understand that they are dealing with a computer? How should parents be advised on the possible impact of AI-based toys on a child's cognitive development? What should children learn about AI in schools in order to have a sufficient understanding of the technology around them? At what point should a child be given the right to decide about consent for themselves? And how long should their data be stored?
As UNICEF and other organisations emphasize, we must pay specific attention to children as AI technology evolves, so that child-specific rights and needs are recognized. The potential impact of artificial intelligence on children deserves special attention, given children's heightened vulnerabilities and the numerous roles that artificial intelligence will play throughout the lifespan of individuals born in the 21st century.
However, the current international framework that protects children's rights does not explicitly address many of the issues raised by the development and use of artificial intelligence. Instead, it identifies several rights that may be implicated by these technologies, such as the rights to privacy, to education, to play, and to non-discrimination, and thus provides a starting point for any analysis of how children's rights may be positively or negatively affected by new technologies.