Facebook probably has more data than any other company in the world, barring Google. We’ve known for a while that it has been analysing text to improve users’ experiences on the platform. This is what it has been doing with its ‘M’ personal assistant in Messenger, whose beta phase was powered by both real people and AI. It now seems that it is ready to move to a more advanced stage.
Facebook’s whole journey has been about understanding users’ data. While Google is an expert on the world’s data, Facebook is very much an expert on data from its own members. When we all joined Facebook, if you can remember that far back, we were asked to tick boxes to say what we were interested in, plus join or follow groups that reflected our interests, and put in other details including our address. Facebook was able to take this data, which was easy to analyse, and put us into boxes of football fans, film fans, people who live in south London, and so on.
The like button made this even simpler, as Facebook could see which pages people liked across the web. Meanwhile the emergence of mobile as a dominant platform meant that Facebook could see what other apps we had on our phones, and more. The use of emoji has made it easier for Facebook to see what sort of things make us happy or sad, and all these steps have led to a radical shift: from us consciously telling Facebook things about ourselves, as if we were filling in a survey, to Facebook observing us acting as ourselves and recording our actions.
DeepText is the next level of this. Analysing text for meaning is extremely hard – think of all the crazy autocorrects we suffer on our phones (SwiftKey once corrected ‘Karl Marx’ to ‘Karl Mary’ in one of my texts…) – but it seems that Facebook is making real progress. DeepText can apparently derive context and meaning, and in multiple languages. If, as Facebook says, it can work out when people are asking for a taxi, or selling items, or whatever, from the words they would naturally say to their friends, then it will be able to offer services intuitively to users in a useful, rather than irrelevant, manner.
Google has long been experimenting with AI and trying to use it to put context into what we search for, a process that has been accelerated, we’re told, by its acquisition of DeepMind (nothing to do with DeepText). For example, Google will now offer different results to someone searching for ‘pizza’ at lunchtime on mobile over 3G (a restaurant) than it would to someone searching on Wi-Fi from home in the evening (a takeaway).
Understanding of text could give Facebook a real advantage, allowing it to build up a deeper knowledge of its users, so that advertising could become even more targeted and relevant. We’d no longer need to type searches into a box for the service to deduce that we were interested in flights or hotels, and this power could potentially make Facebook an even more essential choice for advertisers.
Dan Calladine is head of media futures at Carat Global