Senators came to the Instagram hearing armed with their teenaged finstas

Senators created accounts posing as teens, and they had a lot of questions about what they saw.


Instagram’s top executive, Adam Mosseri, spent more than two hours being grilled by the Senate about the app’s safety policies and its impact on teens’ mental health. Unfortunately for Mosseri, members of the subcommittee on Consumer Protection, Product Safety, and Data Security came to the hearing armed with fresh anecdotes from their own finstas.

During the hearing, Mosseri’s first time appearing before Congress, multiple senators revealed that they and their staffs had created fresh Instagram accounts disguised as teenagers. They all said that the app had steered them toward content that was inappropriate for young users, including “anorexia coaches” and other content related to self-harm.

The staff of one lawmaker, Tennessee Senator Marsha Blackburn, managed to uncover a significant bug in one of Instagram’s teen safety features. Blackburn said that her staff created a fresh account as a 15-year-old girl, but that the account defaulted to public, not private. Instagram said in July that teens younger than 16 signing up for the first time would be defaulted to private accounts.


“While Instagram is touting all these safety measures, they aren't even making sure that the safety measures are in effect for me,” Blackburn said. Mosseri later confirmed that the company had mistakenly not enabled the private default settings for new accounts created on the web. “We will correct that quickly,” he said.

Blackburn wasn’t the only senator who came to the hearing prepared with questions about what they saw on a staff-created finsta. Connecticut Senator Richard Blumenthal said that his staff had created a new account posing as a 13-year-old just days before. He said that after following “a few accounts promoting eating disorders,” “within an hour, all of our recommendations promoted pro-anorexia and eating disorder content.” He later added that a search for self-harm content turned up results so graphic he didn’t feel he could describe them.

It was a notable shift from the cringeworthy moment at a September hearing, when Blumenthal clumsily pushed Facebook’s Head of Safety, Antigone Davis, on whether she could “commit to ending finsta.” This time, Blumenthal pressed Mosseri on whether he would commit to ending work on Instagram Kids entirely — Mosseri did not — and whether he would make more data about Instagram’s algorithms available to researchers outside the company.

“I can commit to you today that we will provide meaningful access to data so that third party researchers can design their own studies and make their own conclusions about the effects of well being on young people,” Mosseri said. He later added that Instagram is working on giving users the option for a chronological feed.

Utah Senator Mike Lee also shared his experience creating an Instagram account for a fictitious 13-year-old girl. He described how the recommendations in the account’s Explore page changed after following just one account. “The Explore page yielded fairly benign results at first,” he said. “We followed the first account that was recommended by Instagram, which happened to be a very famous female celebrity. After following that account, we went back to the Explore page and the content quickly changed.

“Why did following Instagram’s top recommended account for a 13-year-old girl cause our Explore page to go from showing really innocuous things, like hairstyling videos, to content that promotes body dysmorphia, sexualization of women and content otherwise unsuitable for a 13-year-old girl?”

Mosseri replied that, according to the company’s Community Standards Enforcement report, content promoting eating disorders accounts for “roughly five out of every 10,000 things viewed.” Lee didn’t buy it. “It went dark fast,” he said. “It was not five in 1,000 or five in 10,000, it was rampant.”

While much of what the senators described was similar to what journalists and others have reported experiencing on Instagram, the exchanges were telling because they underscored a point that’s been raised by whistleblower Frances Haugen and others studying the company: that Facebook often uses deceptive statistics to mask its problems, and that the sheer size of the platform means that even relatively low amounts of harmful content can have an outsize impact on users.

It also indicated just how seriously lawmakers are taking the issue of teen safety and social media. While previous hearings with big tech executives have often veered wildly off topic, senators stayed relatively focused on the issues. And it was clear that there was bipartisan agreement on the need for Instagram to disclose more information about its platform to the public and to researchers.

“I would support federal legislation around the transparency of data, or the access to data from researchers, and around the prevalence of content problems on the platform,” Mosseri said. But Blumenthal pushed back, saying Instagram’s previous actions haven’t gone far enough.

“The kinds of baby steps that you've suggested so far, very respectfully, are underwhelming,” Blumenthal said at the close of the hearing. “I think you will sense on this committee, pretty strong determination to do something well beyond what you've indicated you have in mind. That's the reason that I think self-policing based on trust is no longer a viable solution.”