
How real human design can help overcome fake news

By Sarah Plant and John Chadfield, Creative Director and Head of Product Strategy respectively

December 6, 2017 | 6 min read

Fake news has seeped into every part of our lives. It’s everywhere.

Photo by Karsten Knoefler, used under a Creative Commons license.

Increasingly, we, the audience, struggle to tell the difference between what’s real and what’s fake. Without the tools to know the truth, how can we find reliable sources for our daily information?

Digital media titans, including Google, Facebook, Twitter and YouTube, have a responsibility to actively monitor content and should be held ethically accountable for the impact of fake news and content surfacing on their platforms. But the same goes for the design world and beyond. We must all accept responsibility and acknowledge our power.

In White October’s latest Insights Sessions talk, guest Alan Rusbridger, former editor-in-chief of the Guardian, said: "We’re living in a world that we can’t tell what is true and what is untrue, a truly terrifying world."

Fake news isn’t a new concept. It was present in the coverage of the Hillsborough disaster and, further back, in the Harlem riot of 1935, sparked by false rumours that a Puerto Rican-born teenager had been beaten to death. It has always existed, but it is now amplified by social media and therefore poses a greater threat to brands, corporations and society itself. The combination of ever-smarter algorithms and regulators struggling to catch up means the problem is bigger than we realise.

The good news is that key players like Google and Facebook have announced they will start to show more information in search results to help distinguish fact from fiction. It’s a promising first step, but only the beginning.

In this new world of algorithmic content and AI, designers and technologists have both an ethical responsibility and the power to surface the truth (as journalists once did). There is an opportunity for our industry to be a force for change in the current landscape. Ethical, responsible design can shape the future by giving audiences the right information, allowing them to understand the source of the content and decide whether to trust what they read and watch. Giving them back freedom of choice.

Human-centred service design must be at the core of how algorithmic information is presented, used to focus ruthlessly on managing users’ expectations and signalling trust. The impact of powerful algorithms is intensely complicated, deep-rooted and hard to measure.

An essay by writer and artist James Bridle, ‘Something is wrong on the internet’, exposes the silent, damaging effects that fake content and algorithms are having on children’s cartoons and videos on YouTube. On the surface, fake videos of Peppa Pig (among others) appear harmless alongside the official branded content. Yet these fake versions include inappropriate and disturbing scenes, and algorithmic recommendation pushes them to the top of YouTube’s results, exposing unregulated content to babies and children. We need to better understand how this unseen, underlying data and information changes the human experience when a person engages with content.

How data is presented by algorithms is as important as the algorithms themselves. Design is fundamental to connecting data and human experience: by surfacing the information behind the content, it allows the audience to understand more deeply what they’re consuming. Josh Clark, founder of the New York design studio Big Medium, has argued that the design and presentation of data is just as important as the underlying algorithm, and that algorithmic interfaces are a huge part of our future, so getting their design right is critical. We couldn’t agree more. The impact of design in turning machine-learning data into a positive human experience shouldn’t be underestimated.

One powerful way to do this is sentiment analysis: technology that identifies whether a writer’s attitude towards a particular topic is positive, negative or neutral. Combined with responsible design that intentionally exposes and highlights ‘unseen’ data, it can signal and surface the underlying emotion or bias in a piece of content. By making that intent visible, the audience can make a more informed decision about the content. In the same way the blue Twitter verified badge clearly marks an account of public interest as authentic, designers can create a visual structure that makes this information accessible.
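
As a rough illustration of what this looks like under the hood, here is a minimal sentiment-analysis sketch in Python using NLTK’s off-the-shelf VADER analyser. The headline is invented, and this is a sketch of the general technique rather than any platform’s actual implementation.

```python
# A minimal sentiment-analysis sketch using NLTK's VADER analyser.
# The headline is invented for illustration.
import nltk
from nltk.sentiment import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # one-off lexicon download

analyser = SentimentIntensityAnalyzer()
headline = "SHOCKING: 'experts' CAUGHT hiding the truth!"
scores = analyser.polarity_scores(headline)

# polarity_scores returns the proportions of negative, neutral and positive
# wording, plus a 'compound' score from -1 (most negative) to +1 (most positive).
print(scores)
```

Surfacing that compound score to readers, rather than leaving it buried in the machinery, is where the design opportunity lies.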

This approach of well-sequenced, visible data allows audiences to take proactive ownership of how they interpret the content that surrounds them every day, rather than sleep-walking into an uninformed hate storm. In addition to data transparency, content creators and publishers can also be rewarded and promoted on the quality and honesty of the content they publish. In a practical sense, visual indicators, distinguished by colours and labels, can act as signposts within search rankings, while sentiment tags with custom categories and topics could be surfaced at the top of an article.
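
To sketch what such a signpost might look like in practice, the snippet below maps a sentiment score to a label and colour for a tag shown above an article. The thresholds, labels and colours are our own illustrative assumptions, not an established standard.

```python
# A hypothetical mapping from a sentiment score to a visual signpost.
# Thresholds, labels and colours are illustrative assumptions only.

def sentiment_badge(compound: float) -> tuple[str, str]:
    """Map a compound sentiment score (-1 to 1) to a (label, colour) tag."""
    if compound >= 0.35:
        return "Positive tone", "green"
    if compound <= -0.35:
        return "Strongly negative tone", "red"
    return "Neutral tone", "grey"

label, colour = sentiment_badge(-0.5)
print(f"{label} ({colour})")  # e.g. rendered as a coloured tag above the headline
```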

People deserve to see the whole picture, with surfaced information, structured content and visual cues to aid decision-making around the content they engage with. With new technology emerging daily, we need to get better at designing experiences that manage user expectations. From working within machine capabilities to understanding how machine overconfidence can distort the data it presents, we have a responsibility to apply our own learning to these machines too. Let’s be more human.

This piece was co-authored by White October’s creative director Sarah Plant and head of product strategy John Chadfield.
