The head of Instagram faced a grilling from US lawmakers on Wednesday over how the platform protects its youngest users, an appearance that comes amid intensifying criticism of Instagram’s impact on children and young adults.
In opening statements, Senator Richard Blumenthal promised to be “ruthless” in the hearing, saying “the time for self-policing and self-regulation is over”.
“Self-policing depends on trust, and the trust is gone,” he said. “The magnitude of these problems requires both broad solutions and accountability, which has been lacking so far.”
Instagram executive Adam Mosseri, appearing before the Senate commerce committee’s consumer protection panel, defended the platform and called on lawmakers to create an industry body to better regulate social media.
Mosseri also attempted to shift blame to the wider industry, saying that “keeping young people safe online is not just about one company” and adding that more young people use other apps, including the video platforms TikTok and YouTube.
“We all want teens to be safe online,” Mosseri said in opening statements. “The internet isn’t going away, and I believe there’s important work that we can do together – industry and policymakers – to raise the standards across the internet to better serve and protect young people.”
He called for an industry body to address “how to verify age, how to design age-appropriate experiences, and how to build parental controls” on apps. He suggested targeting the protections offered by Section 230, a federal law that shields platforms from legal liability for what users post on them.
The hearing comes as Instagram and its parent company, Meta Platforms (formerly Facebook), face global criticism over the ways their services affect the mental health, body image and online safety of younger users after the release of internal documents from former employee and whistleblower Frances Haugen.
Those papers, published by the Wall Street Journal and handed over to Congress, revealed the company’s own internal research showed Instagram negatively affected the mental health of teens, particularly regarding body image issues.
Lawmakers also pressed Mosseri to release more of the internal research referenced in those papers, including a presentation about anorexia and suicidal thoughts among teens. Mosseri committed to better transparency but said that specific presentation was likely deleted due to data retention laws.
Other senators had strong words for Mosseri, raising grave examples of harms done to children through the Instagram platform. Senator Maria Cantwell told Mosseri about a constituent who said her young daughter was groomed by adults on Instagram, lured into sex trafficking and taken across state lines for prostitution.
She challenged Mosseri over Instagram’s terms and conditions, which state that a child’s only legal recourse over such incidents would be arbitration – a closed process in which matters are settled quietly with no judge, jury or option to appeal.
“That story is terrifying,” Mosseri said. “We try to be as public as we can about how well we do on difficult problems like that one, and we believe that there should be industry standards, there should be industry wide accountability, and that the best way to do that is federal legislation, which is specifically what I’m proposing today.”
Senator Marsha Blackburn asked Mosseri to speak directly to parents whose children have been hurt, or hurt themselves, as a result of their Instagram use. “You have broken these children’s lives, and you have broken these parents’ hearts,” she said.
Meanwhile, Blumenthal, the Democratic senator and chair of the panel, asked on Wednesday that Instagram permanently scrap its development of a platform for children, which the company previously suspended amid growing opposition. Mosseri declined to commit to a permanent stop, but said any related projects would require parental consent.
Lawmakers are increasingly pushing for greater accountability. In November, a bipartisan coalition of US state attorneys general said it had opened an inquiry into Meta for promoting Instagram to children despite potential harms. And in September, US lawmakers grilled Facebook’s head of safety, Antigone Davis, about the impacts of the company’s products on children.
Ahead of Wednesday’s hearing, Instagram said it would be stricter about the types of content it recommends to teens and would nudge young users toward different areas if they dwell on one topic for a long time.
In a blogpost published on Tuesday, the social media service announced it was switching off the ability for people to tag or mention teens who do not follow them on the app and would enable teen users to bulk delete their content and previous likes and comments.
In the blogpost, Mosseri also said Instagram was exploring controls to limit potentially harmful or sensitive material, was working on parental control tools and was launching in certain countries a “Take a Break” feature, which reminds people to pause briefly after they have used the app for a set amount of time.
Blumenthal called the company’s product announcement “baby steps”.
“They are more a PR gambit than real action – done within hours of the CEO testifying – more to distract than to really solve the problem,” he told Politico.
Blackburn criticized the company’s product announcement as “hollow”, saying in a statement: “Meta is attempting to shift attention from their mistakes by rolling out parental guides, use timers and content control features that consumers should have had all along.”
Mosseri also said in Wednesday’s hearing that the platform may reintroduce a chronological feed in 2022, a departure from the activity-driven algorithm it currently uses.
While lawmakers appeared ready to take concrete steps toward better social media policies, activists remained wary.
For years the company has offered “empty promises and half-baked safety measures”, said Josh Golin, executive director of children’s safety organization Fairplay.
“The bottom line is this: Instagram’s advertising business is harming children, and nothing meaningful has been done to change that,” he said. “It’s clear that self-regulation will not work. Congress must act now and regulate big tech to protect children.”