This article was originally published by YR Media.
A new Instagram feature allows users to filter “sensitive content” out of their Explore page.
The social media platform’s Sensitive Content Control setting offers three levels for filtering posts out of the Explore tab: “Limit” (Instagram’s default setting), “Allow” (users see more “sensitive” content) and “Limit Even More” (users see the least “sensitive” content).
Users can select the level they are comfortable with under the settings tab of their Instagram profile [Settings > Account > Sensitive Content Control].
It is still unclear how Instagram determines what gets marked as sensitive and what does not. What makes this situation more complex is that Facebook, the parent company of Instagram, uses online bots to moderate posts, but Instagram hasn’t stated whether it uses bots to sort through its content.
The Sensitive Content Control setting follows earlier features, announced by Instagram in 2016 and 2019, that let users disable comments on their posts and restrict other accounts. According to the press release, the intention behind rolling out this new feature is to give users more choice and “another way to make Instagram work better.”
However, Instagram has drawn a lot of fire from influencers over the consequences of this feature. Because the setting is turned on by default, they argue, it limits their reach within the communities they serve.
Despite the backlash, Instagram has continued experimenting with its Explore page. On July 27, the platform announced that posts from accounts with a history of “potentially suspicious behavior” will not appear on the Explore page for young people’s accounts.
These latest moves are the platform’s attempts to make the app a safer place. As these updates continue to roll out, time will tell if Instagram has found its solution.