Instagram has announced that it is testing a new feature that lets users clear content recommended to them.
The ‘Reset’ tool is intended to make the platform safer for teenagers.
The tool, which is expected to roll out globally soon, wipes a user’s feed of algorithmically suggested content, although Instagram says recommendations will gradually begin to personalize again over time.
The UK media regulator, Ofcom, welcomed the announcement but stressed that more needs to be done to make social media safe.
“It is encouraging to see Instagram implementing these changes prior to the enforcement of regulations, and we will continue to advocate for companies to enhance the protection and empowerment of their users,” it said in a statement.
An intense international debate is under way over how to protect young people online, with Australia recently proposing a ban on social media for those under 16.
Meta, Instagram’s parent company, said the new tool will be available to all users, including those with teen accounts, allowing them to reset their recommendations in just a few taps.
“We aim to ensure that all Instagram users, particularly teens, have safe, positive, and age-appropriate experiences, and that they find their time spent on Instagram to be worthwhile,” Meta said in a blog post announcing the feature.
Users who want to refresh their feeds can select “reset suggested content” from the “content preferences” menu.
They will then be prompted to decide whether to unfollow the accounts whose posts appear most frequently in their feed.
Under the incoming UK online safety rules, companies such as Meta will have three months to assess the risks of illegal content on their platforms and must take steps to mitigate them.
Separately, the regulator is expected to finalize its Children’s Safety codes of practice by April 2025, which will require companies to give children greater control over the content they see on social media.