YouTube, TikTok, Snap Resist Facebook Comparison on Kids Privacy

(Bloomberg) -- Social media companies YouTube, TikTok and Snap sought to distance themselves from rival Facebook Inc. Tuesday as lawmakers pressed for legislation to codify privacy protections for kids and teens on their platforms.

The executives appeared at a Senate committee hearing a day after a consortium of 17 news outlets, including Bloomberg, published dozens of articles based on troves of leaked Facebook data that detailed how the company prioritized profits over the safety of users -- particularly teenagers -- on its products.

The Senate Commerce Committee’s consumer protection panel, led by Connecticut Democrat Richard Blumenthal and Tennessee Republican Marsha Blackburn, examined efforts by Alphabet Inc.’s YouTube, ByteDance Ltd’s TikTok and Snap Inc. to protect the privacy of children and teenagers online.

Blumenthal said that there’s bipartisan urgency to move forward with legislation on regulating these companies.

“Whether it’s Facebook or your companies, in various ways I think you have shown that we can’t trust big tech to police itself,” he said.

Blumenthal dismissed the current company policies that the executives said protect young people and teens, saying “there’s no way to hold you accountable under current law.”

He said tech companies should not rely on parents to protect their children’s privacy on their platforms; protective features need to be built in.

“Being different from Facebook is not a defense. What we want is not a race to the bottom, but really a race to the top,” Blumenthal said.

The witnesses included Michael Beckerman, TikTok’s vice president and head of public policy for the Americas, Jennifer Stout, Snap’s president of global public policy, and Leslie Miller, YouTube’s vice president of government affairs and public policy.

Emphasis on Safety

Blumenthal and Blackburn’s subcommittee previously heard from Facebook whistle-blower Frances Haugen, the former product manager who leaked documents to the committee and the U.S. Securities and Exchange Commission. Haugen highlighted how Facebook’s engagement-based algorithms lead harmful content to become viral on the platform. She said these algorithms particularly affect teenage girls who already have negative views of their bodies.

The three social media companies attempted to set themselves apart from Facebook in their approach to online safety. The hearing marked TikTok and Snap’s first appearance before Congress.

Last week, Blumenthal separately invited Facebook CEO Mark Zuckerberg to testify before the subcommittee in a future hearing.

Snap emphasized that one of its strongest privacy protections is that it only allows users ages 13 and up, and has no plans to market to kids under 13. The registration process fails for individuals under the age of 13 who attempt to sign up.

“We make no effort -- and have no plans -- to market to children,” Stout told the committee. “We want snapchatters to be connected to the people they’re connected to in real life,” she added, differentiating the platform from Facebook.

Stout said that regulation alone won’t solve the challenges surrounding privacy online. “Technology companies must take responsibility and actively protect the communities they serve,” she said.

TikTok highlighted specific actions it’s taken in recent years to protect children’s safety, including disabling the direct messaging feature for users under age 16. The company also restricts users from sending certain videos, photos and website links, and allows only videos that have been approved through content moderation.

According to Beckerman’s testimony, TikTok removed 11 million suspected underage accounts between April and June 2021. But the company acknowledged the challenges it faces.

“We do know trust must be earned, and we’re seeking to earn trust through a higher level of action, transparency and accountability, as well as the humility to learn and improve,” Beckerman said.

Blackburn raised concerns about data collected by TikTok and whether it’s shared with the government of China, where parent company ByteDance is based. She said that despite vague assurances, TikTok “has not alleviated my concerns in the slightest.”

TikTok said it stores its data outside of China, including in Singapore and the U.S. “We do not share information with the Chinese government,” Beckerman said at the hearing.

YouTube’s Miller told the panel that YouTube Kids, created in 2015, provides parents with tools to control and customize the app for children. Miller said that kids under 13 who aren’t in a parental “supervised experience” are not allowed on YouTube. The company doesn’t allow personalized advertisements on YouTube Kids or in the supervised experience.

Miller said the company has removed nearly 1.8 million videos from April to June 2021 for violations of the company’s child safety policies.

Efforts to Legislate

Blumenthal asked if the three social media platforms would support his legislation -- known as the EARN IT Act -- which would make tech companies liable for child sexual abuse material on their platforms. The Senate Judiciary Committee unanimously advanced the measure last Congress, and Blumenthal said he plans to reintroduce it in the coming weeks.

All three platforms said they supported the intentions and goals of the proposed legislation, but stopped short of endorsing the measure itself.

“This is the talk that we’ve seen again, and again and again, that you support the goal. That’s meaningless unless you support the legislation,” Blumenthal said. “I ask that each and every one of you support the EARN IT Act.”

Blumenthal and Massachusetts Democrat Ed Markey asked the companies if they supported updating the Children’s Online Privacy Protection Act, which was enacted in 1998, years before the launch of the social media companies. The law currently restricts collection of personal information of children under age 13. The legislation would expand the protections to age 16. The bill has bipartisan support from Republican Senators Bill Cassidy of Louisiana and Cynthia Lummis of Wyoming.

TikTok’s Beckerman said he supports reforming COPPA, but said Congress should go further in setting uniform age verification standards for all platforms to follow.

Markey also discussed his legislation to prohibit certain manipulative marketing practices geared toward online users under the age of 16, including banning auto-play features and algorithms that amplify violent and dangerous content. That bill has no Republican cosponsors to date.

©2021 Bloomberg L.P.