Last week, Ofcom announced that it would be granted new powers to regulate advertising on social media. Considering the first Facebook ad went out in 2005, this has been a very long time coming.
The government plans to set the direction of the regulations and will allow Ofcom to draw up and adapt the details. For this to work smoothly, regulators will need a much deeper understanding of the nuances of each platform, as well as smooth collaboration between all parties involved.
Before this ruling was announced, platforms were tasked with self-policing the organic and paid content they hosted. Facebook, reportedly, wasn't too strict about policing ads in its earlier days.
Kevin Colleran, one of Facebook's first advertising salespeople, said: "I do not remember any kind of review process in the early days (other than standard content, grammar, etc). Later on, when the ads did start being placed closer to the content of the page (rather than on the sides where the banners/flyers were placed) there may have been more strict policies put into place in an effort to protect the user experience."
Platforms have also relied on their users to flag and report comments and content that go against the murky community guidelines each had set up for itself. Placing the burden on content creators and content consumers, rather than on the platforms themselves, seems illogical.
With this approach, platforms take a back seat on regulation, which has meant that many advertisers got away with a lot until their content was flagged or deemed to violate the guidelines. In theory, these new regulations should establish a uniform set of rules for all platforms to comply with. Social media advertisers will be forced to act more responsibly: there will be more consistency and transparency across platforms about what they should and shouldn't do, and a greater chance of a penalty if their content violates these policies.
Regulations like Ofcom’s force platforms to become more compliant and wake up to the effects they’re having on users. Facebook’s Mark Zuckerberg has been in Germany this week attending a security conference, where he spoke about his approach to regulating “harmful” content on Facebook. He argues that it’s important to treat social media platforms differently from newspapers and existing media, but not to the extent that these platforms take a hands-off approach where data is simply hosted without them taking any responsibility for it.
Zuckerberg’s middle-ground approach sounds like a good idea on paper, but again, there are questions to be asked. He’s said he’d like to balance “promoting innovation and research” with users’ privacy and security online. This grey area allows platforms like Facebook to set and interpret their own rules around how privacy is handled. As is customary with middle-ground approaches like these, though, there’s going to be a teething period — something Ofcom will need to keep in mind when regulating social platforms’ content.
While this teething period may take time, it’s a welcome change that regulations around social advertising aren’t just being explored, but are on the road to being enforced. Content creators and brands will be forced to be more mindful of the content they put out and the negative impact it could have.