Children’s rights to privacy, safety & freedom of expression
09 December 2016 - A workshop in Guadalajara, Mexico
>> MODERATOR: Please come to the table. We'll have more interaction if you come closer to us. Thank you.
Hello, everybody. Welcome to workshop 5. For streaming purposes, we need ‑‑ if you ask something and want to participate, please talk loud and close to the microphone. If you are on the second row, wait until the microphone arrives to you. So we will start in four minutes. And that's when we go on streaming. Thank you.
>> MODERATOR: Hello everybody. Welcome to the workshop on children's rights and privacy, safety and freedom of expression. I'm Jutta Croll. I'm glad to see so many faces around the table, and the young faces as well, because we talk about children's rights, and that shouldn't be done only by adults but by the children, who are the experts on themselves. The idea of having a session at the IGF came up shortly after I had read the report produced by Sonia Livingstone and her colleagues showing that one in three users of the Internet worldwide is a child.
In the terms of the U.N. Convention on the Rights of the Child, that means under the age of 18. Then I had a close look at the Convention to see which paragraphs and articles are related to the Internet. In the end I came to the conclusion that more or less everything is related to the Internet, because the Internet is related to the whole life of children who are now growing up in this world. That was the moment when I thought it would be good to have a session at the IGF where we talk about children's rights and what they mean in a digital world.
I'm very honored to have a very good, prominent panel here. I'd like to introduce you to all of the speakers. We will start with Sonia Livingstone, who is a professor and heads the research team that produced the EU Kids Online studies, the recently published Global Kids Online report and, as mentioned before, the One in Three report. To my left is Abhilash Nair. He will represent the legal perspective. He's an expert on human rights and the legal requirements and policy issues around online safety. I hope I got that right.
Then I'm very happy to have Ellen Blackler from the Walt Disney Company, representing the industry perspective. The company has both products and services for children, and she knows what children make of them when they use them.
Then I have Arda Gerkens, who is president of INHOPE, the international association of hotlines where people report harmful content. She manages the Dutch hotline and has been a senator in the Netherlands.
So coming to the NGOs: sitting to my right is the founder of an organization called Rudi International. He will explain in more detail what the organization does in the Democratic Republic of the Congo. He is also a representative of the Youth Internet forum. He stepped in on very short notice because a colleague from another NGO from the African continent couldn't make it and wasn't available for a remote presentation. You may have seen the name Khan on the announcement; he is not here. Happily, we have you.
Last but not least, Marie‑Laure Lemineur, who works on the sexual exploitation of children online at ECPAT International. She has long‑standing experience in the field of exploitation and abuse, and I'm very happy to have you here.
I just want to inform you that this session should be very interactive, so we will start with short statements from the panelists, three to five minutes only, and afterwards we will do an exercise called an appreciative inquiry session. I will explain a little more what that means and how you can engage in the debate afterwards. We've drawn up some aspects that the panelists might address in their initial statements. Sonia, the floor is yours. Thank you.
>> SONIA LIVINGSTONE: We weren't sure which of us would start. Let me kick off, and then you come in, Abhilash. Okay. Jutta, thank you very much. There are many things to say on this kind of wide agenda that you've painted. I'll begin with our One in Three report.
So in this report we tried to get a sense of the obvious question: how many children are using the Internet around the world? This led us into an inquiry into the data and a very unsettling discovery: for the most part, countries are not even measuring how many children are online, because they do household surveys of what's available in the home, or they survey adults aged 16 or 18 and over.
So in one sense we don't actually know the most basic fact, but we are piecing things together and estimate that one in three children in the world is already online, and this is set to grow very fast in middle and then low income countries. Of those already online, one in three users is a child under 18. Even from the statistics we have, that gives you a sense of the scale of the rights issues that arise in relation to children.
Then I think a lot of these conversations are very Global North. As soon as I hear anyone referring to their own children, I know we've lost the plot in terms of the diversity and range of ways in which young people go online. We talk a lot as if children go online at home with parents there, but in many parts of the world children go online primarily on a mobile phone and could be anywhere, or go online in a cyber cafe where there is zero regulation or oversight of their safety.
Outside the Global North there are a lot of issues of connectivity and access, which are crucial in terms of inequalities, crucial in terms of what children might do or sacrifice to gain connectivity. There are also lots of ways in which technology, phones or connectivity are shared, so that it's not obvious who exactly is the user and who exactly is generating the data that might be used.
So context matters crucially. The other thing, whether in the Global North or the Global South: I think we should have transcended the notion of the digital native by now, and we should erase it from our minds. Young people show a lot of facility with functionality and keenness to be on trend and know the latest, but critical, informational and privacy literacies are lacking, and I'm not criticizing young people here.
They're lacking among the entire adult population as well as the youth population. These are major challenges. We have a lot of evidence on what people don't know, and not even the kind of evidence one might hope for in terms of trends of increasing understanding.
I think people are increasingly aware of the risks of using their technologies, but I don't think they're necessarily increasingly savvy in grasping who has their data and how it's used or shared. So in addition to the diversity of contexts of use and of who is around the child when they're going online, we have to put that together with different contexts of thinking about privacy.
So the child has the right to privacy, Article 16 of the Convention on the Rights of the Child. That seems straightforward. But privacy from parents is different from privacy from peers, which is what young people are concerned about. It is completely different from the conversations about privacy at school, where there are growing concerns about how states, schools, health services and so on connect and use children's data, and completely different again from the question of how commercial services are collecting, or profiting from I should say more accurately, children's data and possibly infringing their rights.
Yet in our call for multi‑stakeholder solutions, we kind of put them all together, as if the schools will teach the kids what they need to know, the parents will make sure it's implemented, the companies will offer the right services, and it will all come together neatly. Actually, if you ask where a child's data is in relation to these different kinds of actors, it's just getting really crazily complicated.
It's not that there's a kind of evil actor out to exploit; a lot of different sectors are pursuing different kinds of interests. Parents are anxious. Peers are fun‑loving and will share thoughtlessly. Schools are trying to track learning, but at the same time they collect a lot of data they don't need.
Companies are trying to run businesses. The result is a very diverse array of ways in which kids' data is collected and increasingly intersected, and I presume the next ten years are going to see a lot of intersections of data sets in ways that are quite mind‑blowing, and it's hard to imagine children are going to be informed about them. Is that enough?
>> MODERATOR: Thank you. I think it's enough for the moment. Thank you so much, because I know you could talk hours, and it would be interesting.
>> SONIA LIVINGSTONE: Five minutes is my natural unit.
>> MODERATOR: Thank you very much. To Abhilash.
>> ABHILASH NAIR: Thank you. I will talk about the issue of the age of consent of children to set the scene. When you look at international instruments and domestic legislation that define a person as a child, you will see that there is hardly any consensus as to who is a child. Internationally, any person under 18 is recognized as a child under the U.N. Convention on the Rights of the Child.
But if you look at domestic legislation in countries, you will see different ages depending on three different factors. The first is context. An example: in the U.K., 16 is the age of consent for sex, whereas the age of consent for posing for a nude photograph is 18, and that's in line with many other countries. Conduct is another factor.
An example would be the age of criminal liability, which in the U.K. is 10. So if an 11‑year‑old child murders somebody, which doesn't happen very often, they can be prosecuted as criminally liable.
Culture is another influencing factor. For example, in some countries the age of consent is lower than 16, and in some countries it is higher than 16.
So setting a definite age in defining who is a child is problematic, but it gets even more complicated when it comes to the Internet, because cyberspace does not recognize boundaries; it transcends physical spaces. As a result, in some countries you need to be 18 or over to gamble online, the age to purchase alcohol can range from 16 to 21 depending on where you are, and there are also normative standards.
It is not necessarily law everywhere that you need to be 13 to be on a social networking site; that is law in one particular country, but it has become a normative standard in the rest of the world.
So this is getting even more interesting with the increasing use of gadgets and devices.
If you were here earlier, you would know that there are newer ways. The way children use the Internet is no longer only a laptop or a mobile phone. There are many ways. Some have devices and toys that actually collect, store and distribute data. So it is quite important that we start thinking about what might be an appropriate age for children to be using or sharing information on the Internet through various media and devices, and who has responsibility for ensuring that children are safe.
If you look at instruments like the U.N. Convention on the Rights of the Child, it sets out certain rights for children: the right to privacy we talked about already, the right to free expression. There is also an obligation on the state, society and parents to ensure the safety and welfare of children. But the rights are not absolute, and the provisions are not precise enough.
The trick is to find the balance between ensuring safety for children on the one hand and upholding the rights of children on the other. What can we do to achieve that? Do we need more legislation? Would that help? Do we need a multi‑stakeholder approach? That is what I really want to discuss after I finish my opening statement.
I actually had an interesting conversation with Larry this morning about using Lawrence Lessig's model, the modalities of regulation, to find a solution. The state can legislate as much as it likes: laws, for example, requiring companies to ensure that the default is set to maximum privacy rather than minimum privacy. But would that on its own help if people don't understand how to use it in an appropriate fashion?
So norms can probably help. We can shape norms by educating parents and children at school, for example. There is also the architecture of the Internet, where you actually design products in a way that provides maximum privacy by default, although companies tend to set minimum privacy as the default for various reasons.
I also thought about markets: incentivizing companies through taxation, for example imposing a lower tax regime on products that are child friendly or privacy friendly. These are initial thoughts. I just wanted to set the scene so we can have a discussion, a more meaningful discussion, about how to take these things forward.
>> MODERATOR: Wonderful. Thank you, Abhilash. You built the bridge to hand over to Ellen Blackler, who represents the industry perspective and who might have a view on incentives for companies versus more legislation. I think these are both topics you could pick up, and the other aspects as well. Thank you, Ellen.
>> ELLEN BLACKLER: Hello. Thank you everyone for coming today. I think we've really touched on two buckets of things that are related but actually pretty different. One is the legal frameworks and the rules that we present to users, and the other is the guidance we give parents. I'll say more about that. Talking about the legal frameworks in relation to parenting: we learn from parents that they also ignore the rules.
A strict rule is not helpful to them. That's what parents tell us, even about things like movie ratings. They don't want only to know the rating; in the U.S. we have PG‑13, and that alone is not helpful if you have a teenager. What parents tell us is they want to know why something got that rating. Was it bad language? Violence? Sexual content? Then they can decide what is right for their kid. So we hear all the time from parents that slapping a number on it can't be the end of the story.
I think Facebook has had the same experience, where the research shows not only that kids under 13 are on Facebook, but that they're there with their parents' knowledge and consent. In some circumstances the parents have helped them do it, have enabled them to be online. So sometimes we get into this battle about the age, and it's a little bit about legal liability and not about the user.
So you need a legal framework. I'm not suggesting we don't. You need those kinds of rules to move forward, but I don't think we should kid ourselves into thinking that they actually respond to what is needed in the marketplace.
Then on the parenting side: what Sonia is doing is really interesting work on teasing out what it is that kids are doing and what kind of guidance parents should get. When you think about the offline world, where we've had a little more historical experience, we give guidance to parents: here's how you tell if your child is ready for kindergarten, here's how you tell if they can have their own bank account.
It's helpful to think about using the learnings from the cognitive sciences to work out what we want to tell parents about whether their kids are ready for social media, because what parents tell us is that firm rules are both not helpful to them and not practical in the marketplace. So I think that is something the child advocacy community, from a rights perspective, can really help sort through, which is a little bit of a different question than the legal rules.
Lastly, I'll pick up on a theme from a session the other day where I think you, or maybe someone else, talked about safety by design. That was a really helpful idea when we first started to get into privacy regulation: encouraging people to do privacy by design, to understand the privacy implications and design their products for them. I think using that same language is also going to be very helpful to new industries that come into the fold with interconnected products, and also for helping people understand and articulate what they should be thinking of.
When you say to an engineer, use a safety by design perspective, that is not helpful unless there's more guidance. An area I always find to be a helpful precedent is disability access, where the law said to the engineers, you need to make these products accessible to people with disabilities, and then they needed help to understand what that meant. I think we're in the same place: we can press product manufacturers to use safety by design, and then they need help understanding what that means.
I think doing all this from a rights perspective is very new from a business point of view. Thinking about it from a child's rights perspective, as opposed to protecting yourself from liability driven by parents, is a really different way of thinking about it. The industry is going to need our collective help to do that properly.
>> MODERATOR: Thank you, Ellen. That was a very good statement. I'll use it as an advertisement for the work we have done in Germany. I've put some brochures on the table there in which the safety by design concept is explained a little more, and that's exactly what we found when we interviewed industry representatives and engineers. They told us, okay, we understand the concept, but we need to know how we should do it.
That's the parallel to the accessibility topic: when the idea first came up, the engineers did not know how to make a website or a product accessible, but it's a process of learning to understand. So thank you so much for that.
So now, Arda, it's your turn.
>> ARDA GERKENS: I would like to talk more about parenting, because we also have a helpline and a hotline. We have parents calling us, most of the time with problems, right? They always call us after the fact, and they want to know: what went wrong? How do we proceed from here? More importantly, we'd love to have parents who can engage with their children about their online world before the fact.
We always advise them to talk with their children: how was your day at school, or on Facebook? What did you do in Minecraft? Was it fun? We think that's very important.
Speaking of this, there's a Dutch initiative called Safe Traffic Netherlands. It's a voluntary organization, an initiative run by parents and schools together. All children in primary school get a diploma for being in traffic and for riding their bikes. We have a lot of bikes in the Netherlands.
These are parents who are engaged with this whole traffic question. They also make traffic plans around the school and talk to the parents about what is safe and what's not. If you look at the bikes: a very small child of 4 years old probably has training wheels and a helmet on. In other countries you have to keep the helmet on even when you're an adult, but in my country at some age you stop with the helmet and the training wheels. It's just like the Internet. I don't think you can tell kids, this is the age you can go onto the Internet. You will probably be on the Internet, as we saw yesterday, already as a baby.
It's also very hard to say you can't be on Facebook until you're 16, because social media is very broad. Many kids are in WhatsApp groups at 8 or 9 years old. The problem is we have parents who don't know, and there's so much out there.
Let's be honest: we don't know what can go wrong until something goes wrong. So I would advocate getting parents engaged, like in the Safe Traffic initiative in the Netherlands, together with the teachers and with university scientists, to talk about the way to lead your kids onto the Internet and get them a diploma for it.
>> MODERATOR: That was actually my last remark.
>> ARDA GERKENS: I'd love them to get a diploma when they finish primary school, so they learn the basic skills of being on the Internet, and the parents will also know what can go wrong so they can guide them through this digital age.
>> MODERATOR: Thank you, Arda, for your comments. I think you also built a very good bridge, because we are now turning to the NGO sector. We've already heard about the situation here; it might be different in his country and on a different continent, as to what role parents can play. NGOs are always expected to step in and help families, so what is your perspective?
>> SPEAKER: Thank you so much. I'm glad to be here today. Thank you for inviting me to be on the panel. It's good to see Marie and Larry in the audience, because they saw me first, I think, in 2011, when I was speaking on the same issues at my first IGF. A lot has been going on. I'm from the Democratic Republic of Congo and my name is (indiscernible). I'd like to jump to the issue of parents, because it's really good to hear different perspectives. We have different situations in Africa.
Parenting is sometimes about telling kids at what age they may go online. But we are facing situations where the kids are online before their parents. The parents know nothing about the Internet, so there's no way for them to tell the kids what to do or at what age to access the Internet. Most of the time the kids are online before their parents know anything about the Internet.
Young people are really using the Internet, and most of them are on social media, namely Facebook and WhatsApp and probably Instagram, through mobile phones most of the time. Not everyone can afford a computer, and we don't generally have WiFi in our homes, right? So if you don't own a computer, you rely only on a mobile phone to access the Internet. I run a small NGO, and we work with children. We do mostly humanitarian work, but we also have a part where we engage young people, and especially children, on Internet‑related topics, mainly on Internet safety tools.
Earlier this year I also worked for UNICEF, dealing with communications. UNICEF is one of those organizations giving a voice to young people, to children, to express themselves. We need to say that for children to be able to express themselves, they not only need the space to speak, they need to be taught how to say whatever they have to say.
If you don't give them space and opportunities to speak, then you are not encouraging them to speak. And when you encourage them to speak, that means also teaching them how to use the online tools. If you want them to express themselves using online tools, there's a need for literacy for young people so they are able to use the Internet.
Regarding gender inequality, this is a very big issue in Africa. In a family with one boy and one girl, the parents easily give a smartphone to the boy and nothing, or at best a feature phone, to the girl. Do you see what I mean? Men are more encouraged, and more entitled, to access the Internet than women, and this is one of the issues that needs to be solved.
I think we can continue the discussion, but that would be maybe quite a few points to start with.
>> MODERATOR: Thank you for your starting point and also for pointing out the gender inequalities. We are aware we face them everywhere. It's a big topic at the IGF, but I think you have shown that it's a very special situation on the African continent with gender issues. Marie‑Laure, you have experience from many continents, and you can go on now.
>> MARIE‑LAURE LEMINEUR: Well, my remarks are about the question of whether there is an adequate age of consent for social media. I have a problem now: I'm the last one, and most of what I was going to say has been said. Forgive me if I'm repetitive; maybe a couple of my comments can complement what Abhilash said. I was thinking that we should somehow reflect on the concept of the age of consent.
It's a very subtle concept influenced by societal factors, gender, culture. If you think about the age of sexual consent in Latin America, for example, some penal codes had 11 or 12 years old for girls and an older age of sexual consent for boys. That's one example. It also changes over time.
In France you had to be 21 and now it's 18. So it's a subjective notion. Also, I was wondering, what is an adequate age of consent? What does "adequate" mean?
Again, if you think of children as end users, their personality traits, the family environment and the cultural context where they're based all come together. Children of the same age reach different levels of maturity, and that is going to impact how they perceive and react to dangers, and how resilient they are when they go through some sort of situation where they're being harmed. Having said that, somehow we still need those legal standards.
We need them as a society. We know there is the famous 13‑year‑old threshold that comes from a U.S. federal law, but from my perspective this type of standard is useless unless we force businesses, the owners of those platforms, to have an age verification system.
Plus, legal consequences need to be linked to noncompliance. Unless we have those businesses implementing age verification, what Ellen describes will keep on happening: users themselves tricking the system, sometimes with their own parents. The research and statistics show there's a high proportion of 10 to 12‑year‑old and sometimes younger users on social platforms, and the BBC in the U.K. released a statistic that 75% of 10 to 12‑year‑olds use social platforms.
So do we want to be pragmatic and accept the reality that children are using those platforms younger and younger, and the age is decreasing? It's a legitimate use in the sense that they also exercise their rights on those platforms. So here we have the tension between different rights and needs, and maybe the way to go is to be very pragmatic and remember that laws reflect what a society is.
So why not be a bit provocative and say, okay, the trend is that children are using those platforms younger and younger and exercising their rights there. That's the reality. There are business needs too. So why not lower the legal age, but put legal mechanisms in place, very strict ones of course, combined with very strict sanctions in case of noncompliance? Then we can combine the interests and have a win‑win situation, if that's possible. I don't know. I'm just putting it on the table. That would be my take on the topic.
>> MODERATOR: Thank you so much. I think you have pretty much opened the floor for the debate that will follow. We had the idea to do that in the form of an appreciative inquiry session, and I just want to briefly explain how that will go. First of all, I would like to mention that we take an appreciative approach to the whole topic, and I think with the U.N. Convention on the Rights of the Child, we have a positive approach to the rights of children.
We should now consider whether this is also true for the process of digitalization, and what it means for the rights of the child. Not only talking about how the rights are infringed, but really highlighting that children are exercising their rights when they use social media; regardless of whether they're allowed to do so or not, they're exercising their rights.
In an appreciative inquiry session, we start with the first step, which is appreciating and valuing the best of what is, and I've already made a start because I very much appreciate the U.N. Convention on the Rights of the Child. The second step is envisioning what might be. The third is engaging in dialogue about what should be, which is more looking into the future. Afterwards we will collect ideas on how we can innovate what will be in the future.
So let's go to the first step, then: appreciating and valuing the best of what is. I would like to ask you all around the table: what do you appreciate about the situation right now? We've been talking about the age of consent to use social media, and that point was triggered because in Europe the General Data Protection Regulation is now coming, which is somehow questioning the current rule of 13, because national governments can decide for their country on an age between 13 and 16.
Any nation could set its own age, but that would end up in a situation where children in the Netherlands, for example, are allowed to use the Internet without the consent of parents at the age of 13, the U.K. sets 16, and France might have 14, because that's all possible. Let's have a look at that situation in the face of what the law already sets. It's a fact that children are younger and younger when they use social media, and they are using it to exercise their rights. So that's to open up the discussion. Larry, please.
>> AUDIENCE: I don't understand how it's possible for any country that has signed this Convention, or any continent or body, to pass a law like the GDPR. I'll read from Article 13: the child shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information through any media of the child's choice. I'm skipping around.
But the bottom line is that it's very, very explicit that the child has a right to freedom of expression and freedom to seek information through any media of choice. Now, obviously, Facebook and social media weren't around when the Convention was first ratified, but it is very clear, just as digital media wasn't around when the United States Constitution was written, yet we apply it to digital media.
It baffles me that they have that provision in the law, frankly. Obviously, I understand the reason why we want to protect children against manipulation by commercial entities and marketing, but as we've talked about in previous IGFs, where John Carr and I co‑chaired a number of sessions, when protection gets to the point where it inhibits the child's rights, we need to look at that protection and figure out how it can be modified so that their rights are, in fact, enabled. The fact of the matter is that social media is the way people of all ages express themselves.
It is the modern town square, so to speak, and whether it's a child wanting to participate in a family gathering or a child wanting to explore their sexuality or their politics, they should have rights independent of their parents. Again, I'm baffled that this law passed. I'm frankly not particularly enthralled with the Children's Online Privacy Protection Act in the United States either. I applaud the intention.
I don't know if Kathryn Montgomery, who co‑authored it, is here, but I don't applaud the unintended consequences of limiting children's access to free speech.
>> MODERATOR: Thank you for that comment. I think you have not been appreciative but more critical.
>> AUDIENCE: Let me appreciate the motivation to protect children from being exploited by marketers. I do appreciate that. I'd like to be able to appreciate that in a way that didn't violate their rights.
>> MODERATOR: Ellen Blackler and her neighbor to the right.
>> ELLEN BLACKLER: I'm all for criticism and excited to get to that point. I'm so excited by how we're now talking about this ‑‑ there's so much agreement on the benefit to children of using the technology. We used to come to meetings and say the worst thing that can happen to kids is to be kept away from the technology, because that's the future. It's exciting to see that shift, and I appreciate that.
>> AUDIENCE: Thank you. Mauricio from Mexico. It's very interesting to hear that we're more concerned about how to regulate the child's age of consent than about accepting the facts. The fact is that children are more intelligent than most adults about getting onto social media. I think that we should not undervalue their intelligence, and we should accept the facts.
In that sense, I think the GDPR is a good path towards understanding children's ages, and we should move forward to consider their capacity when deciding the age for going onto social media. What we need to understand is that we need to go with them and not treat some topics as taboo anymore. They need to understand the risks and benefits of social media, and I agree in the sense that we need to change all this, and maybe consider a lower age for their consent to use the media.
In Latin America, in that sense, this kind of legislation is still in its infancy, and we need to take a closer approach to it so as to reach harmonization of standards and these kinds of rules for children's consent. Thank you.
>> MODERATOR: Thank you for that statement. Any questions from the floor on the statements? I would really like to invite the young people to tell us: what age do you think would be appropriate to go onto social media, to make use of social media without the consent of your parents? Here, and over there, then.
>> AUDIENCE: It works. I don't think an age is appropriate, actually, just given the fact that, as Larry also mentioned, children have to have the right to be online to express themselves. I guess there has to be another solution for this.
I mentioned in another session that I guess it would be better just to delete the data at the age of 18, and then start over again, rather than banning them from all these platforms and websites of social media. That's better than banning all these children from the web.
>> MODERATOR: Thank you.
I know Sonia has something to say to that, to the blank sheet at the age of 18. Will you do that directly, and then we go to the other young speakers?
>> SONIA LIVINGSTONE: A quick point. One question is whether children's data, posted when they were under 18, should be retained forever; the other is at what age companies may target advertising at you. That's a different question from the question of at what age young people should be allowed to participate. The trouble is they're tied together in the legislation.
>> MODERATOR: Thank you, Sonia. Now on the right side. It's your turn.
>> AUDIENCE: I'm Natalie, age 15, from Hong Kong. As a teenager, I think there should be no specific age regulations, just a requirement of the parents' permission. I think that social media and the Internet have advantages, too, and provide education to a child. Children have their right to learn and to share or express their feelings, too. While we are considering how to protect them, we should also balance their rights with how much we protect them.
A coin has two sides. I think we might not want to stop children using the Internet in order to protect them. That reflects the real world. We can't avoid all the damage. So we should let them experience and educate them, not ban their freedom of expression or their right to learn.
When it comes to the possibility of banning social media, I think, as a kid or teenager: my peers have had Facebook since they were eight years old. While Facebook requires you to be a certain age or above, they simply lie. They can type in fake birthdays to get the service. So I think this is not a good alternative plan.
I'm for having limitations for children. For example, blocking information about sex and violence on social media, and not collecting their private information until they're 18 years old.
After 18 years old, they can choose whether to continue using social media or not. Thank you.
>> MODERATOR: Thank you for that statement. I have one more question back to you. You said your friends were using Facebook at 8. Did you think they were ready to do so? Did they get enough guidance or education for doing so?
>> AUDIENCE: I don't think so, but anyone can simply lie, because you can't identify that they're underage, yes. It is not appropriate to do so.
>> AUDIENCE: I think this is an interesting question. When I heard everyone commenting, I wondered: do I have an opinion on this, or do I have a personal opinion on this? It varies across the areas you talk about. Parents are not always so ready and able. I'm talking about parents who are digitally illiterate and not able to understand how to even help you better use social media.
I know for Facebook it was good to say, you need to be this age to use Facebook. But experience has proven people like to give fake dates, as you say. It shows that children, or young people, are really, really eager, and they're trying to find ways to express themselves.
They feel like they have more ‑‑ I don't know how to put it. They feel like they're ready to go on social media, because sometimes they're not being given other platforms where they can express themselves. They say: since our parents don't provide us spaces to speak, and we're not part of other groups ‑‑ UNICEF creates groups so children have space to speak and share experiences, but there are children who don't have access to those platforms ‑‑ they want to try to use social media as another way to express themselves, which would be interacting with their peers.
>> SPEAKER: I would like to react to something you said about the parents not being aware, and I think it's a good point. The notion of having parental consent should be challenged, because in ‑‑ I would say in most regions of the world, parents do not have a sense of what's going on, nor do they have the awareness or skills or ability to consent to something they don't understand.
I know that's, you know ‑‑ I mean, it's the role of the guardians and parents to guide the kids, but how do we deal with that? I don't know the answer. That's the reality. How can you agree to something you don't understand as a parent? You know what I mean?
>> SPEAKER: I think it's important to focus on the parents first and make sure they can help their kids. And once they're able to help their kids ‑‑ I don't know that there's an appropriate age for children. It's different from country to country. In some places children are already able to use the Internet at age 8. In my country some parents say, we don't give children a mobile phone until they're 18.
Those children use their friends' mobile phones. When you put up a barrier, they find ways to overcome the barrier. Don't put it as a challenge to them; try to talk to them and say, here's what you have on the Internet, please behave like this. There's no way you can really, really completely control what someone is doing online. It's not easy.
>> MODERATOR: We have already stepped into envisioning what might be, and we're also engaged in dialogue, so we're going quickly through the appreciative inquiry session. I have Susie Hargreaves on the list and then Sonia.
>> AUDIENCE: Thank you very much. I'm Susie Hargreaves. I want to pick up on the issue of privacy, safety, and freedom of expression in relation to self‑generated images, so the older age group. In the U.K. the age of consent is 16, but if you take a picture of yourself and you're 17, you can be criminalized for it.
You can actually be charged, and you can actually be put on the sex offenders' register, even though you're over the age of consent. I think people take a common sense approach to this, because the reality is nobody wants to criminalize 17‑year‑olds.
There is a big argument about people choosing to take those pictures and use them as a form of expression. We have an ongoing discussion and debate about it ‑‑ I mean, the majority of our work is about very, very young children. We talk about ways to protect the privacy of those older children: first by verifying they're under 18 so the images can be removed if they want them removed, and secondly by respecting what has become quite common practice.
I don't have any answers. I just want to share it's an ongoing debate. Thank you.
>> MODERATOR: Thank you, Susie. Sonia, can you step in there?
>> SONIA LIVINGSTONE: I want to clarify that I don't think anyone wants to stop children expressing themselves on the Internet. I don't think anyone is trying to regulate in order to stop children from doing that. There's the question about protecting their safety, but the regulation we're talking about, both the GDPR and the Children's Online Privacy Protection Act, is designed to stop companies monetizing the way in which children express themselves online. What we really want to move to, and this could be the point, is for companies to provide services for children without monetizing them. That's what we don't have.
Yes, we can educate parents, but there are limits to that. Yes, we can advise children, but there are limits to that. What we don't have is companies, by and large ‑‑ I'm sitting opposite one that does find a way to provide services for children ‑‑ but most of the services we're talking about will not provide those services if they can't monetize the children; they just stop providing them.
>> MODERATOR: I give the floor to Ellen who wants to respond directly and Arda.
>> ELLEN BLACKLER: This is the central problem. COPPA is a good example. It was about the collection of data from children in a world with no social networking. We don't want this data collected from children, and at the time it was used really only for advertising. So think about 13 in that context. That probably is a fine age to say you shouldn't collect data from children for commercial purposes. But it had the effect of providing a disincentive to companies to create children's experiences designed for them.
At Disney, we do, but we collect no data. There is virtually ‑‑ I think what we have seen is that there's no commercial opportunity to create a platform for children's expression that is legal. We don't want it to be monetized with the children's data, and I don't think anybody has found the business model to create that. So the ability to provide incentives for companies to create experiences designed for children is the trick, and I think, you know, it's very difficult.
The reason you see so many children on Facebook ‑‑ I suspect that many children would use something other than Facebook if such a thing were available to them. So how do we make such a thing available to them? We haven't been able to find a legal framework that both protects them and allows for a commercial production of this content.
>> ARDA GERKENS: If you look at the industry, I see three kinds of companies. The ones that know about it and do good ‑‑ you're good. LEGO has learned their lesson and is doing a good job now. Then you have companies that do know but don't do good, because they want to collect the data and use it and sell it. The third, and this is a really big group, are the ones that just don't know. Let me give you an example.
In the Netherlands, one of our colleagues at the hotline has a friend who started a fashion site, and the idea is that people can comment on the fashion and upload pictures and everything. You can imagine that in no time teenagers will be all over that platform. My colleague was talking to her about it and said, did you think of the fact that teenagers might start posting and chatting and all the rest that comes with it?
She said, no, actually, I didn't. I can't blame her for it, because it's the downside of a world we'd rather not think of; we want the world to be nice and safe and free. All the time we talk about privacy and data protection, and I think it needs to be within the education of the technical sector, and it's not: if you are educated as a website builder, you don't learn about the laws, and you should at least know a little bit of them, so that when you build a site you can say to your customer, did you think of this? If you collect this data, you need to comply with the data protection laws and you need a safe and secure connection. At this point that's not happening, and it's worrisome.
>> MODERATOR: Thank you for the statement that companies might not know. But safety by design would mean they should consider, from the first idea of designing and developing a product or a service, what would happen if this service were used by this age group or that age group, and how to protect them. I have Maureen and Patty and ‑‑ can you give me your name again?
>> AUDIENCE: Building on what has been said: we had an interesting session yesterday about the Internet of Things and children's rights. We're talking about social platforms collecting data; yesterday we talked about toys. There are all types of devices collecting data from end users who will be children.
There's something called Teddy the Guardian, which was mentioned yesterday. I don't know if all of you here in the room are familiar with it.
Apparently it's a toy ‑‑ I only heard about it yesterday ‑‑ that collects data around the health of the child and transmits it. It allows the parents to monitor health aspects of the child. So think about what will happen if those companies do not do what Ellen was saying.
I mean, they're not transparent. They actually use this data to make money out of it, you know, analyzing it, the metadata, et cetera. There is huge potential here for violating the rights of children and misusing this data.
>> NAIR ABHILASH: I wanted to add to that. I think it's all fine to have safety by design and everything else, but we need to be conscious of the fact that obtaining consent needs to be much more explicit. There's informed consent, and parents and children don't really fully appreciate or realize the implications of the amount of personal data they share. It's not rocket science to draft it in a way normal people understand. It shouldn't take a lawyer to interpret it ‑‑ and I would be an exception to that, right?
>> MODERATOR: Thank you. It's Patty and then it's ‑‑ you give the floor first to you? Okay.
>> AUDIENCE: I think nobody has talked about ‑‑ I'm grateful you're here from Disney, and I don't mean you're the only company. We're not talking about normalization and critical thinking, and what we're talking about is keeping our children safe and keeping everyone safe. Because Disney is here ‑‑ I mean, to me Walt Disney was a company you could rely on, that was safe, and then it's become ‑‑ there's a normalization of their stars acting very sexy and such, and I think that's the problem.
It's that the norms have shifted. What was at one point seen as not normal, we now see young children exposed to on Walt Disney and those shows, without people going and saying, this is not right. That is critical thinking. And the floodgates have opened. You know, people used to be afraid of having their kids watch TV. Some parents don't let their kids watch TV. Some parents do.
Now we have the Internet. Everyone is exposed to so much, and it keeps going rapidly; no one knows what the right of a child is. Is it the right to be able to watch whatever and do whatever you want on the Internet, or is it the right of the child to be taught critical thinking and what is normal in regard to your own family? Industry needs to start thinking about, yes, maybe they're doing things ‑‑ what is this affecting? Maybe they have blocks, but maybe even watching Walt Disney isn't the best thing for your child.
What is the right of a child and any parent? I mean and that's my point.
>> MODERATOR: I think it's good you brought us back to the core topic of the discussion. She wants to react directly, and then you get the floor.
>> MARIE‑LAURE LEMINEUR: Very briefly. We heard about rights and we haven't mentioned the words responsibility and obligations by end users and all the stakeholders, you know. Just a very quick point.
>> NAIR ABHILASH: Companies have to ‑‑
>> SPEAKER: The lady said Disney is not collecting any data. But the My Friend Cayla doll had to be taken off the market because of Norwegian research, which discovered that it's still collecting data and that Disney is analyzing it. It also wants to talk about Disney movies instead of other things that are of more daily use to children. So the problem, I guess, is that we can't trust some companies with the responsibility to do so.
>> ELLEN BLACKLER: I'm glad you raised that. The Cayla doll got a lot of attention. You'll be glad to know that the Cayla doll has no relationship with us; those messages were not on our behalf and did not have our permission. We'll probably be taking some action against them about that.
We have no licensing relationship with them, we collect no data, and we did not give them permission to use our content in that doll.
>> MODERATOR: Thank you for making that clear. Another comment from that side.
>> AUDIENCE: Is somebody from Facebook or any network here?
>> MODERATOR: I don't think somebody from Facebook is here in the room.
>> AUDIENCE: This is the problem. We're speaking about this, and we're here with a good company that looks out for the children. I think we have to press them to be at the table. We need them, and the government should ask them to do something. The law lags behind what is happening.
What is happening is, children everywhere in the world ‑‑ in Latin America, I think, we are not so tied to the law, so parents accept that children under 13 are on Facebook, and it's no problem for them. I think we have to sit down with them. I don't know where or how, because when I see the Facebook people, they say, we haven't got children below 13, because look, we haven't got any.
Then when we run workshops and say there are children everywhere who have a network and a Facebook account, they say yes. So what do we do with that?
>> MODERATOR: Thank you for that question. I will try to forward that to Facebook after the session.
Should we consider just lowering the threshold in order to accept the fact that children are going onto Facebook and other services as well, rather than keeping age thresholds they don't accept? They lie, as you said before, about their age, and sometimes they put themselves in more danger by lying about their age than if they were honest, because there are measures in operation to keep children of a certain age more protected than those over 18. If children lie about their age, they don't get those measures of protection.
It might be good to lower the threshold, so that, with sanctions, you have those who accept it. I have Larry and a woman. Larry, you have the floor ‑‑ or maybe the woman first and then you.
>> AUDIENCE: I'm from Mexico, and I wanted to talk about something related to what you're saying about the age of the kids. I talk with parents and kids about technology all the time, and one of the most common things I find is that it's all about information and literacy.
They do want to have information. They're worried. I mean, fear, I think, is the main motivation for parents today, and when you talk to them about empowerment and really playing the role of the parent, they really want to play it on the positive side, and they grasp that in a very positive way. However, they do need guidelines.
They need a bunch of them, because rules are one thing, but guidelines are super important, and you have to tie those together with what's expected of them. Coming back to the age point: one thing that I think is very hard, too, is that in digital, in social media, you have very different social media today. Kids go not to one social network but to an average of, I think, six to seven, correct? Some of them are social media where kids are present; some of them are general-audience social media.
I think it's very hard to think about the age; it's better if we think about what maturity they should have, establish that maturity, and base different ages on the different types of dynamics conducted in those social media. That's what I wanted to say.
>> MODERATOR: Thank you. Larry.
>> AUDIENCE: I don't work for Facebook, but I'm on their safety advisory board, which means I serve in an advisory manner; they don't have to listen to me. They have some protections for users between 13 and 18 that are different from the protections they apply to people over 18, so they already have a precedent for some level of differentiation. For example, for someone under 18, location is off by default. It's off and not on. There are some.
I'd like to go back to a point Sonia made that is very important: to distinguish between access to social media and the ability of social media companies to monetize or otherwise collect data from children, or to, you know, analyze and ‑‑ what's the term ‑‑ profile children.
It strikes me that if legislation is appropriate, it would figure out a way ‑‑ and I don't know exactly how it's done ‑‑ for companies to still make money from their child subscribers without collecting data.
For example, correct me if I'm wrong, but when you watch Disney television programs in America, you watch advertising. So there is plenty of precedent for advertising to children. At least in the United States we have rules on television as to how you can do that. For example, you can't combine the ads and the characters in the program in such a way that it's impossible to distinguish between the program and the advertising. It's not always perfect, but at least we thought that through to some extent.
Again, perhaps there may be a way to focus on data collection rather than requiring parental permission for social media. One would not argue that a 13‑year‑old should have to have parental permission to read "The New York Times." That's absurd. Yet that's what happens in social media, and what does and will happen under COPPA and the GDPR.
I think Sonia made a good point that legislators ought to think through the difference between access and profiling and perhaps create different rules. One last comment: a wise person years ago said that if privacy protections are good for children, they're good for adults as well.
>> MODERATOR: I see you, and Marie, I see you. We have a remote participant. I give the floor to the remote participant. Do we hear? Does it work? She needs a microphone. So do we read it out, or do we have the person on the screen? Okay. Please go ahead.
>> SPEAKER: This is a question by Rodrigo from Brazil, to Sonia: when children use news and entertainment portals regulated by algorithms, shouldn't we talk about a risk to identity itself and a violation of self‑determination? How do we connect the basic CRC right of having some freedom to regulate one's own identity with a kind of intellectual safety? To this I would add the mediated self‑construction, when users are alienated from algorithm functions.
>> MODERATOR: Very difficult question. Sonia, could you go on that?
>> SONIA LIVINGSTONE: Yeah, I think this question is getting at something really fascinating. Here we have our own identities and think we know who we are, and then we have our algorithmic identities ‑‑ who the Internet thinks we are from all the likes and websites we visited and all the things we posted and the things others say about us, and the particular ways the algorithms have connected that up. There is a kind of double of us, a shadow us out there.
That, sadly, is the currency that determines what advertising we get, what access we get, perhaps in the future what kind of prices we're going to be offered for things, and it is not auditable or accountable by us. We can ask Facebook what data they keep about us, but we can't access the whole thing. There's a shadow of us that has an identity linked to us which we cannot audit or correct or alter, but it's going to have consequences. That is happening for children and for all of us, and I think it's an extraordinary next challenge. Thank you for the question.
>> MODERATOR: Thank you so much, Sonia. We have another question from the floor.
>> AUDIENCE: Hello. I'm Ian from Hong Kong, and I'm 16. I heard many discussions on whether we should restrict children from using the Internet or social media. I appreciate the effort that different stakeholders have put in to protect children, but is it done in an appropriate way? I think that children will eventually use the Internet and Facebook or other social media. If we simply restrict them from using these, they won't know why they shouldn't use them. They won't know why companies collect data and sell it.
Is that how we should educate them? Also, is it possible to issue a law that restricts the companies, and maybe governments, from selling the data to make money, and restricts them so that the data they collect can only be used for improving their own service? Thank you.
>> MODERATOR: I think everyone wants that, but there's nothing in it for the companies. If the companies can't monetize the data, can't sell the data or target advertising according to your data, they will have no incentive to provide you with any services at all. That's the problem. They'll just say, you know, why should we let you into our services at all? Yeah. Answer.
>> ELLEN BLACKLER: I want to build on this idea that Larry put on the table. COPPA is a bit of a mess, and it was developed before social networking. It says you can't collect data for these reasons, so there's a list of reasons you can't collect data for.
That list could be better drawn so that, for instance, you could allow data to be collected, which is what happens when you post something on your Facebook page, but then not allow tha