Now Google has 'Stories' too: AI-curated news among 20th-anniversary search revamp


With Google celebrating its 20th anniversary on Monday, the company released a slew of new features to make its search more visual. It has clearly been watching the competition, with features familiar from social media networks becoming part of its search infrastructure.

After marking the milestone with a San Francisco event, Google rolled out new features, one particularly reminiscent of Snapchat Stories and another reminding some commenters of Pinterest's unique selling point.

Earlier today, The Drum explored the legacy of Google, formed by graduates Larry Page and Sergey Brin. Senior marketing figures pitched in on how it created one of the world's largest ad networks and defined how we find information. Google also used the anniversary to outline what the next 20 years will look like.

Cathy Edwards, director of engineering for Google Images, opened up about the new features on the Google blog.

AMP Stories

Google is in fine company in ramping up a Stories feature. Snapchat was first to market: its app let users upload a video or photo narrative for their followers, a format that was short-form, consumable and unique. Zuckerberg's Instagram and Facebook then faced accusations of lifting the feature outright. Now Google is continuing to build up AMP and is encouraging publishers to use the format.

AMP was introduced by Google in 2016 as a means of helping publishers increase their page load speeds. Later that year, it was expanded to encompass any website that would benefit from it. As the format has evolved over the last few years, Google has appeared to pull ahead of rivals Facebook and even Apple News in the news game. Still, Google purports that AMP "was created as an open source initiative", and that openness is what supposedly gives it an edge over rivals, according to Madhav Chinnappa, Google’s director of strategic relations for news and publishers.

There are more than 2bn AMP pages from publishers, and that number is reportedly growing every day. On top of this base, Google has plans to make the experience more visual.

It has said it is using artificial intelligence and machine learning to help automate AMP Stories.

Publisher content will be pulled into news hubs on search, with images and video overlaid with text from features and articles. For now, select publishers can construct Stories themselves; unlike all the other ‘Stories’ products on the market, though, the AI will also be able to assemble them from trending stories.

Here’s what AMP Stories looked like at launch in February. A few publishers have experimented with the format since.
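For the curious, the Stories that publishers hand-build are ordinary AMP documents assembled from story-specific elements. Here is a minimal, abbreviated sketch based on the open-source AMP project's documented format (the title, publisher name, image paths and page id are placeholders, and AMP's required boilerplate styles are omitted for brevity):

```html
<!doctype html>
<html amp>
<head>
  <meta charset="utf-8">
  <meta name="viewport" content="width=device-width">
  <!-- The AMP runtime plus the amp-story extension script -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <script async custom-element="amp-story"
          src="https://cdn.ampproject.org/v0/amp-story-1.0.js"></script>
  <title>Example story</title>
  <link rel="canonical" href="story.html">
  <!-- AMP's required boilerplate <style> blocks omitted for brevity -->
</head>
<body>
  <!-- Publisher metadata is declared on the amp-story root element -->
  <amp-story standalone
             title="Example story"
             publisher="Example Publisher"
             publisher-logo-src="logo.png"
             poster-portrait-src="poster.jpg">
    <!-- Each full-screen page stacks grid layers: media below, text above -->
    <amp-story-page id="cover">
      <amp-story-grid-layer template="fill">
        <amp-img src="cover.jpg" width="720" height="1280"
                 layout="responsive"></amp-img>
      </amp-story-grid-layer>
      <amp-story-grid-layer template="vertical">
        <h1>Headline overlaid on the image</h1>
      </amp-story-grid-layer>
    </amp-story-page>
  </amp-story>
</body>
</html>
```

The layered structure is what produces the tap-through, text-over-media look the format shares with Snapchat and Instagram Stories.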

Google's Edwards said it is a “more visual way to get information from search and news”.

The feature is kicking off with a focus on celebrity and athlete news, where there is high consumer interest and a rolling library of rich media uploaded daily.

Edwards added: “This format lets you easily tap to the articles for more information and provides a new way to discover content from the web.”

Whether publishers will embrace yet another ‘Story’ on the market remains to be seen. BBC News managers have tested the feature and published their findings on Medium.

Featured videos

The search giant is also experimenting with video previews to make search results more visual. It gives the example of a web user searching for ‘Zion National Park’ to scope out a potential hike. Once the term is searched, a YouTube video exploring the park is automatically pulled in under a new tab called Featured Videos.

It said: “Using computer vision, we’re now able to deeply understand the content of a video and help you quickly find the most useful information in a new experience called featured videos.” It said the feature will primarily be tested with landmarks and points of interest for now.

Google Image revamp

On the back of Google Lens, the company has upped the image-recognition capabilities of the service. It said people are already using the tool to identify and find pieces of clothing, landmarks and dog breeds (which surreptitiously trains the AI).

Google Lens's AI technology analyzes images and detects objects of interest within them. If you select one of these objects, Lens will show you relevant images, many of which link to product pages so you can continue your search or buy the item that interests you.

“Using your finger on a mobile device screen, Lens will also let you 'draw' on any part of an image, even if it’s not preselected by Lens, to trigger related results and dive even deeper on what’s in your image,” said Edwards.

Peter Wallace, commercial director of computer vision company GumGum, said: "These major changes to Google’s search experience seem like a natural evolution in line with consumer trends that we’ve been seeing for some time. Consumers of today have different expectations and have become accustomed to scrolling and consuming visual content. Image-based communication has arguably become more common than person-to-person interaction, and Google is attempting to overhaul search so that people can navigate search results as easily as they can do on visual social networks such as Instagram, Snapchat and Pinterest. Pinterest has been investing in AI-powered visual search for some time, so an announcement of this kind from Google was to be expected.

"Platforms are investing heavily in both voice and image-based search solutions as they try to keep up with changing consumer habits where speed and personalization are key.

"It’s clear that computer vision is about to become paramount to marketing strategies. Not only will search advertising formats need to evolve as the platform becomes more visual, but so too will targeting strategies. Marketers will need to change the way they approach consumers – over the years, display has become something of a dirty word in advertising, with consumers increasingly driven to using ad blockers and advertisers increasingly questioning whether these formats can actually deliver any real value. Similarly, when last view attribution became perceived to be more important than audience behaviour in driving sales, creativity also became neglected as a result."

Wallace concluded: "But with news that the search marketing landscape is about to become more visual there could be renewed opportunity for marketers to investigate more creative and accurate targeting techniques."

There’s been some feedback on the new features from members of the media on Twitter, including Ken Yeung, tech editor of Flipboard; Chris Moran, editor of strategic projects at The Guardian; Karissa Bell, senior tech reporter at Mashable; and Doug MacMillan, Wall Street Journal reporter.
