Instagram is halting its practice of recommending content that skirts the boundaries of its own guidelines, amid accusations that the image-sharing platform has been complicit in exposing children to harmful content via its Explore section.
Borderline cases that fall just short of crossing the line will no longer be given the benefit of the doubt; instead, they will be contained by blocking them from recommendations in the Explore feed and hashtag browsing.
In this way, posts that fall within a grey area of inappropriateness will find it much harder to gain traction, appearing only in the personal feeds of members who already follow the account that posted them.
Instagram says the material to be shunned going forward includes sexually suggestive posts as well as violent and graphic content. Anything deemed spam or misinformation will also fall foul of the clampdown.
In a blog post outlining the changes Instagram wrote: “We have begun reducing the spread of posts that are inappropriate but do not go against Instagram’s Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages. For example, a sexually suggestive post will still appear in Feed if you follow the account that posts it, but this type of content may not appear for the broader community in Explore or hashtag pages.”
Instagram's hand has been forced by a series of high-profile scandals in which children were exposed to harmful content. The case of Molly Russell, a British teenager who died by suicide after viewing graphic content on the platform, has attracted particular attention.