Senators came to the Instagram hearing armed with their teenaged finstas
Instagram’s top executive, Adam Mosseri, spent more than two hours being grilled by the Senate about the app’s safety policies and its impact on teens’ mental health. Unfortunately for Mosseri, members of the Subcommittee on Consumer Protection, Product Safety, and Data Security came to the hearing armed with fresh anecdotes from their own finstas.
During the hearing, Mosseri’s first appearance before Congress, multiple senators revealed that they and their staffs had created fresh Instagram accounts disguised as teenagers. They all said that the app had steered them toward content inappropriate for young users, including “anorexia coaches” and other material related to self-harm.
The staff of one lawmaker, Tennessee Senator Marsha Blackburn, managed to uncover a significant bug in one of Instagram’s teen safety features. Blackburn said that her staff created a fresh account as a 15-year-old girl, but that the account defaulted to public, not private. Instagram has said that teens younger than 16 signing up for the first time are defaulted to private accounts.
“While Instagram is touting all these safety measures, they aren’t even making sure that the safety measures are in effect for me,” Blackburn said. Mosseri later confirmed that the company had mistakenly not enabled the private default settings for new accounts created on the web. “We will correct that quickly,” he said.
Blackburn wasn’t the only senator who came to the hearing prepared with questions about what they saw on a staff-created finsta. Connecticut Senator Richard Blumenthal said that his staff had created a new account posing as a 13-year-old just days before. He said that after following “a few accounts promoting eating disorders,” “within an hour, all of our recommendations promoted pro-anorexia and eating disorder content.” He later added that a search for self-harm content turned up results so graphic he didn’t feel he could describe them.
It was a notable shift from a September hearing, when Blumenthal clumsily pushed Facebook’s Head of Safety, Antigone Davis, on whether she could “commit to ending finsta.” This time, Blumenthal pressed Mosseri on whether he would commit to ending work on Instagram Kids entirely — Mosseri did not — and whether he would make more data about Instagram’s algorithms available to researchers outside the company.
“I can commit to you today that we will provide meaningful access to data so that third-party researchers can design their own studies and make their own conclusions about the effects of well-being on young people,” Mosseri said. He later added that Instagram is working on giving users the option for a chronological feed.
Utah Senator Mike Lee also shared his experience creating an Instagram account for a fictitious 13-year-old girl. He described how the recommendations in the account’s Explore page changed after following just one account. “The Explore page yielded fairly benign results at first,” he said. “We followed the first account that was recommended by Instagram, which happened to be a very famous female celebrity. After following that account, we went back to the Explore page and the content quickly changed.
“Why did following Instagram’s top recommended account for a 13-year-old girl cause our Explore page to go from showing really innocuous things, like hairstyling videos, to content that promotes body dysmorphia, the sexualization of women, and content otherwise unsuitable for a 13-year-old girl?”
Mosseri replied that, according to the company’s Community Standards Enforcement report, content promoting eating disorders accounts for “roughly five out of every 10,000 things viewed.” Lee didn’t buy it. “It went dark fast,” he said. “It was not five in 1,000 or five in 10,000, it was rampant.”
While much of what the senators described was similar to what journalists and others have reported experiencing on Instagram, the exchanges were telling because they underscored a point that’s been raised by whistleblower Frances Haugen and others studying the company: that the prevalence statistics Facebook often cites can mask its problems, and that the sheer size of the platform means that even relatively low rates of harmful content can have an outsize impact on users.
It also indicated just how seriously lawmakers are taking the issue of teen safety and social media. While previous hearings with big tech executives have often veered wildly off topic, senators stayed relatively focused on the issues. And it was clear that there was bipartisan agreement on the need for Instagram to disclose more information about its platform to the public and to researchers.
“I would support federal legislation around the transparency of data, or the access to data from researchers, and around the prevalence of content problems on the platform,” Mosseri said. But Blumenthal pushed back, saying Instagram’s previous actions haven’t gone far enough.
“The kinds of baby steps that you’ve suggested so far, very respectfully, are underwhelming,” Blumenthal said at the close of the hearing. “I think you will sense on this committee, pretty strong determination to do something well beyond what you’ve indicated you have in mind. That’s the reason that I think self-policing based on trust is no longer a viable solution.”