Facebook for 6-year-olds?
5|12|17
Facebook says it built Messenger Kids, a new version of its popular communications app with parental controls, to help safeguard pre-teens who may be using unauthorized and unsupervised social-media accounts. Critics think Facebook is targeting children as young as 6 to hook them on its services.
Facebook’s goal is to “push down the age” at which it’s acceptable for kids to be on social media, says Josh Golin, executive director of the Campaign for a Commercial-Free Childhood. Golin says 11- and 12-year-olds who already have a Facebook account, probably because they lied about their age, might find the animated emojis and GIFs of Messenger Kids “too babyish,” and are unlikely to convert to the new app.
Facebook launched Messenger Kids for 6-to-12-year-olds in the US on Monday, saying it took extraordinary care and precautions. The company said its 100-person team building apps for teens and kids consulted with parent groups, advocates, and childhood-development experts during the 18-month development process, and that the app reflects their concerns. Parents download Messenger Kids on their child’s device and set up the account after verifying their identity by logging into Facebook. Since kids cannot be found in search, parents must initiate and respond to friend requests.
Facebook says Messenger Kids will not display ads, nor collect data on kids for advertising purposes. Kids’ accounts will not automatically be rolled into Facebook accounts once they turn 13.
Nonetheless, advocates focused on marketing to children expressed concerns. The company will collect the content of children’s messages, the photos they send, the features they use on the app, and information about the device they use. Facebook says it will use this information to improve the app and will share it “within the family of companies that are part of Facebook,” as well as with outside companies that provide customer support, analysis, and technical infrastructure.
“It’s all that squishy language that we normally see in privacy policies,” says Golin. “It seems to give Facebook a lot of wiggle room to share this information.” He says Facebook should be clearer about the outsiders with which it may share data.
In response to questions from WIRED, a spokesperson for Facebook said: “It’s important to remember that Messenger Kids does not have ads and we don’t use the data for advertising. This provision about sharing information with vendors from the privacy policy is for things like providing infrastructure to deliver messages.”
Kristen Strader, campaign coordinator for the nonprofit group Public Citizen, says Facebook has shown in the past that it cannot be trusted with young users’ data, pointing to a leaked Facebook report from May that promised advertisers the ability to track teen emotions, such as insecurity, in real time. “Their response was just that they will not do similar experiments in the future,” says Strader. At the time, advocacy groups asked for a copy of the report, but Facebook declined.
Tech companies have pushed aggressively to target younger users, a strategy that began in earnest in 2015 when Google launched YouTube Kids, which includes advertising. Parents create an account for their child through Google’s Family Link, a product that helps parents monitor screen time. Family Link is also used by parents who want to set up an account for their kid on Google Home, which is then matched to the child’s voice.
“There is no way a company can really close its doors to kids anymore,” says Jeffrey Chester, executive director of the Center for Digital Democracy. “By openly commercializing young children’s digital media use, Google has lowered the bar,” he says, pointing to what toy company Mattel described as “an eight-figure deal” that it signed with YouTube in August.
Chester says services such as YouTube Kids and Messenger Kids are designed to capture the attention, and affinity, of the youngest users. “If they are weaned on Google and Facebook, you have socialized them to use your service when they become an adult,” he says. “On the one hand it’s diabolical and on the other hand it’s how corporations work.”
In past years, tech companies avoided targeting younger users because of the Children’s Online Privacy Protection Act (COPPA), a law that requires parental permission to collect data on children under 13. But “the weakness of COPPA is that you can do a lot of things if you get parental permission,” says Golin. In the past six months, new apps marketed as parent helpers have launched. “What they’re saying is this is a great way for parents to have control; what they are getting is parental permission,” says Golin.
Several child-focused nonprofit groups endorsed Facebook’s approach, including ConnectSafely and the Family Online Safety Institute (FOSI). Both groups have received funding from Facebook.
A Facebook spokesperson says, “We have long-standing relationships with some of these groups and we’ve been transparent about those relationships.” The spokesperson says many backers of Facebook’s approach, including Kristelle Lavallee of the Center on Media and Child Health, and Dr. Kevin Clark of George Mason University’s Center for Digital Media Innovation and Diversity, do not receive support from Facebook.
Source: www.wired.com