In response to mounting criticism over inadequate safeguards for teenagers, Meta Platforms Inc.'s (NASDAQ: META) Instagram is extending its "Limits" feature to enhance user protection against online harassment.
What Happened: Instagram's "Limits" feature, initially created to help creators manage harassment campaigns, is now available to all users, reported The Verge.
This feature lets users mute comments and direct messages from all accounts except those on their close friends list. Accounts outside the close friends list can still interact with a user's posts, but while "Limits" is enabled, those interactions are hidden from the user's view. The muted accounts are not notified that their content is hidden, and the user can still choose to view it at any time.
The "Limits" feature can be activated for up to four weeks at a time, with the option to extend. It can be enabled via the "Limited interactions" option in the user profile settings.
Instagram's decision to expand the "Limits" feature comes amid increased scrutiny from the U.S. government over the safety of its young users.
Earlier this year, Meta introduced a feature that prevents adults from messaging minors by default on Instagram and Facebook and took steps to hide content related to suicide and eating disorders from teenagers on both platforms.
Why It Matters: Instagram's move to expand the "Limits" feature follows a series of actions taken by Meta to address concerns about child safety on its platforms.
In January, Meta tightened privacy controls and CEO Mark Zuckerberg testified on child online safety.
However, in February, Meta again faced criticism over child exploitation enabled by subscription tools on Facebook and Instagram, and in April the company took steps against the sextortion of teenage users.