California protected children online in a way every state should follow

On September 15, California Governor Gavin Newsom (D) signed the California Age-Appropriate Design Code Act, which passed the state Senate unanimously in late August despite protests from the tech industry.

Modeled after the UK Children’s Code that went into effect last year, the California law protects children’s privacy and well-being online by requiring companies to assess the impact on children of any product or service that is designed for, or likely to be accessed by, children.

The law will go into effect on July 1, 2024, after which companies that violate it may be subject to fines of up to $7,500 per affected child. While that may sound small, similar legislation in the European Union enabled the Irish Data Protection Commission to fine Meta $400 million over Instagram’s handling of children’s data. (Under the new law, the California Attorney General would impose the fines.)

California’s Age-Appropriate Design Code Act defines a child as any person under the age of 18, whereas the Children’s Online Privacy Protection Act (COPPA) of 1998 applies only to children under 13.

COPPA has codified the protection of children’s data and prohibits “unfair or deceptive acts or practices in connection with the collection, use and/or disclosure of personal information from and about children on the Internet.”

The new California law goes further. It requires the highest privacy settings to be the default for young users, and it requires companies to “give a clear signal” to let kids know when their location is being tracked.

Jim Steyer, founder and CEO of Common Sense Media, one of the bill’s main sponsors, told The Bharat Express News, “This is a very important win for children and families.”

The law sides firmly with children’s safety over profit, stating, “If a conflict arises between commercial interests and the interests of children, companies should prioritize the privacy, safety and well-being of children over commercial interests.”

In a 2019 interview with The New York Times, Baroness Beeban Kidron, chief architect of the UK Children’s Code, elaborated on her meetings with tech executives.

“The main thing they ask me is, ‘Do you really expect companies to give up profits by restricting the data they collect about children?’” Her response? “Of course I do! Of course everyone should do that.”

“If a conflict arises between commercial interests and the interests of children, companies should prioritize the privacy, safety and well-being of children over commercial interests.”

– California Age-Appropriate Design Code Act

How does the Age-Appropriate Design Code Act protect children online?

Parents are increasingly concerned about the excessive time children spend online, the lure of platforms with autoplay and other addictive features, and the exposure of children to content that promotes dangerous behavior such as self-harm and eating disorders.

The Age-Appropriate Design Code Act requires companies to write a “Data Protection Impact Assessment” for every new product or service, which describes how children’s data may be used and whether this use could cause harm.

“In principle, [companies] should look at whether their product design has exposed children and teens to harmful content, or allows harmful contact by others, or uses malicious algorithms,” said Steyer.

Under the law, Steyer explained, YouTube could still make video recommendations, for example. The difference is that it would have less data to draw on when making those recommendations. Companies would also be responsible for assessing whether their algorithms amplify harmful content, and for taking action if they do.

Haley Hinkle, policy counsel at Fairplay, an organization “dedicated to ending marketing to children,” told The Bharat Express News that by mandating an impact assessment, big tech companies “will be responsible for assessing the impact their algorithms will have on children before they offer a product or new design feature to the public.”

Hinkle continued, “This is critical in shifting the burden of safety onto the platforms themselves, and away from families who don’t have the time or resources to decipher endless pages of privacy policies and settings options.”

Under the law, a company may not “collect, sell, share or retain” a young person’s information unless it is necessary for the app or platform to provide its service. The law directs companies to “estimate the age of underage users with a reasonable degree of certainty,” or simply to extend the same data protections to all users.

“You can’t profile a child or a teenager by default unless the company has taken appropriate precautions,” Steyer said. “And you can’t collect accurate geolocation information by default.”

While the law’s scope is limited to California, there is hope it could lead to more far-reaching reforms, since some companies changed their practices worldwide before the Children’s Code took effect in the UK. Instagram, for example, made teens’ accounts private by default and stopped adults from sending direct messages to kids who don’t follow them. The age cutoff varies by country, however: it’s 18 in the UK and “certain countries” but 16 elsewhere in the world, according to the company’s statement announcing the changes.

While it’s uncertain whether Instagram will now raise this age limit to 18 in California, the Age-Appropriate Design Code Act does require businesses to consider “the unique needs of different age groups” and developmental stages, which the law defines as: “0 to 5 years of age or ‘pre-literacy and early literacy,’ 6 to 9 years of age or ‘primary school years,’ 10 to 12 years or ‘transitional years,’ 13 to 15 years or ‘early teens,’ and 16 to 17 years or ‘approaching adulthood.’”

“Child development and social media are not optimally aligned.”

– Devorah Heitner

What are the biggest threats to children online?

Some threats to children come from large, impersonal companies that collect data to subject them to targeted advertisements, or to profile them with targeted content that can promote dangerous behavior, such as eating disorders.

Other threats come from people your child knows in person, or even from your child themselves.

Devorah Heitner, author of “Screenwise: Helping Kids Thrive (and Survive) in Their Digital World,” told The Bharat Express News that in addition to “interpersonal harm from people they know,” such as cyberbullying, “there are ways they can jeopardize their own reputation.”

“What you share when you are 12 can stay with you for a very long time,” Heitner explained.

While no law can prevent a child from posting something they probably shouldn’t, the Age-Appropriate Design Code Act does require companies to “consider the unique needs of different age groups,” setting a precedent that children and teenagers are at developmental stages different from adults’ and require different protections.

“Child development and social media are not optimally aligned,” Heitner noted.

Parents don’t have to wait for California’s new law to take effect or for big tech companies to change their practices. There are things you can do now to increase your child’s online privacy and security.

Hinkle suggests keeping children off social media until they are at least 13. To do this, she says, it can help to communicate with the parents of your child’s friends, since the presence of peers is social media’s biggest draw for most children.

Once they have social media accounts, Hinkle suggests “reviewing the settings with your child and explaining why you want to use the most protective settings.” These include disabling location data, opting for private accounts, and disabling contact with strangers.

Heitner advocates an approach she calls “mentoring over monitoring.” Because safety settings can only do so much, and because kids are so good at finding workarounds, she believes the best defense is an ongoing conversation with your child about their online habits and the impact their actions can have, both on themselves and on others.

Your kids will come across harmful content during their time online. You want them to feel comfortable telling you about it or, when appropriate, reporting it.

When it comes to examining their own behavior, children need to know that you are open to discussion and not quick to judge. Heitner suggests using expressions such as: “I know you’re a good friend, but if you post that, it might not sound like it.”

Children need to understand that what they post can be misinterpreted and why they should always think before posting, especially if they are angry.

It’s a delicate balance: respecting how important your child’s online life is to them while teaching them that social media “can make you feel awful, and that [companies] take advantage of the time you spend there,” Heitner says.

The goal for parents should be to make kids aware of these issues and to “get kids to buy into a healthy skepticism” about big tech, Heitner said.

In addition to the resources available from Common Sense Media, Steyer recommends that parents use Apple’s privacy settings, which Common Sense Media helped develop.

He also suggested that parents model healthy habits in their own media consumption.

“If you spend all your time [there] yourself, what message does that send to your child?”
