The biggest Watson at the 2017 Masters Tournament might not be Bubba

IBM Watson will be creating highlight reels for the 2017 Masters.

Sportscaster Jim Nantz might have some competition in anchoring tournament coverage at the 2018 Masters.

That’s because IBM Research and IBM iX, the company’s business strategy and experience design arm, have developed Cognitive Highlights, a proof of concept that auto-curates individual shot highlights from live video streams at the 2017 golf tournament, speeding up the video production process for highlight packages.

In other words, supercomputer Watson will be producing highlight reels for the Masters, which kicks off Thursday (April 6).

According to IBM, this first-of-a-kind system extracts what it determines to be exciting moments from live video streams based on video, audio and text cognitive computing techniques.

“The system uses sophisticated computer vision algorithms to understand the content of the video, such as the detection of a player celebrating, or the start of a golf swing based on TV graphics,” IBM said. “In addition, it analyzes the audio to detect crowd cheering [or] commentator excitement, and converts speech to text to find words and expressions related to exciting moments as well as the tone of the conversation.”

And, IBM said, to its knowledge this is the first system to integrate such so-called excitement factors using cognitive computing methods for ranking purposes.
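IBM has not published how Cognitive Highlights actually combines these signals, but the ranking idea it describes can be sketched in a few lines. Everything below is hypothetical: the factor names, the weights and the scores are illustrative stand-ins, not IBM's model.

```python
# Illustrative sketch only: combine per-modality "excitement factors"
# into a single score and rank clips by it. Names, weights and scores
# are hypothetical, not IBM's actual Cognitive Highlights model.

def excitement_score(clip, weights=None):
    """Weighted combination of a clip's per-modality excitement factors."""
    if weights is None:
        weights = {
            "visual": 0.40,  # e.g. player-celebration detection
            "audio": 0.35,   # e.g. crowd cheering, commentator excitement
            "text": 0.25,    # e.g. speech-to-text keywords and tone
        }
    return sum(weights[k] * clip[k] for k in weights)

def rank_highlights(clips, top_n=3):
    """Return the top-N clips ordered by combined excitement score."""
    return sorted(clips, key=excitement_score, reverse=True)[:top_n]

clips = [
    {"id": "hole-16-ace", "visual": 0.9, "audio": 0.95, "text": 0.8},
    {"id": "routine-par", "visual": 0.2, "audio": 0.10, "text": 0.15},
    {"id": "long-birdie-putt", "visual": 0.7, "audio": 0.80, "text": 0.6},
]

for clip in rank_highlights(clips, top_n=2):
    print(clip["id"], round(excitement_score(clip), 4))
```

A real system would compute each factor with its own detector (computer vision, audio analysis, speech-to-text), then fuse them roughly like this before cutting the top-ranked clips into a reel.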

What’s more, IBM said this will enable text-based searches like “show me all highlights of player X during the tournament” or personalized highlights based on favorite players, but did not specify where or when it will happen.

However, a representative noted Cognitive Highlights is a proof of concept this year, so it won't be available on the Masters website or app. Instead, whatever IBM learns from 2017 will be applied to the 2018 Masters.

“We are also evaluating automatically generating highlights for tennis matches such as Wimbledon and the US Open, where there are 18 matches in progress simultaneously during the early rounds,” the representative added.

The 2017 Masters will also feature a Watson-enabled Cognitive Room in the media center at Augusta National Golf Club, which hosts the tournament.

IBM said media and VIPs can interact with this room, change the experience and ask scoring-related questions such as “Who has the longest drive?” and “Who are the leaders?”

“Media can go in there and get an immersive visualization of different bits of data based on the tournament,” the representative said. “It's another way to get your information and gives you a glimpse into what Watson is capable of.”

Watson will also assist the Masters editorial team with transcribing interviews and captioning VOD content, helping to deliver Masters content to fans more quickly.

“The ability to quickly review and retrieve segments by player or hole can speed the process for creating and sharing highlights with fans eager to see the latest action,” IBM said. “Such a system could also increase the scale and scope of videos produced to include highlights for every player or players grouped by country for greater fan personalization.”

IBM said it has worked with Augusta National Golf Club for more than 20 years. IBM designs and powers the Masters apps, as well as Masters.com.

New technology in the Masters digital experience this year includes the ability to continually view action or highlights and to keep the leaderboard visible no matter where users navigate on Masters.com or within the Masters apps. It also includes Track 3.0, enhanced capabilities on Masters.com and within the apps that let users follow their favorite golfers, keep tabs on a particular grouping or stay fixated on a particular hole, IBM said.

The representative said Watson has previously been integrated into the US Open tennis tournament. The Watson Speech to Text API listened to VOD clips of player interviews and tournament action and automatically generated subtitles and transcripts for videos published to the US Open website and other digital platforms. In addition, the Watson Visual Recognition API analyzed every photo taken by USTA photographers to identify photo subjects faster, accelerating publication of photos across the USTA's digital environment.
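The subtitle step in that pipeline boils down to turning timestamped transcript segments, the kind of output a speech-to-text service like Watson's returns, into a standard caption file. The sketch below is illustrative, not IBM's actual pipeline: the segment data and helper names are hypothetical, and the output is the widely used SubRip (SRT) format.

```python
# Illustrative sketch: render timestamped transcript segments as
# SubRip (SRT) captions. Segment data and helper names are hypothetical.

def srt_timestamp(seconds):
    """Format a time in seconds as an SRT timestamp: HH:MM:SS,mmm."""
    ms = int(round(seconds * 1000))
    h, rem = divmod(ms, 3_600_000)
    m, rem = divmod(rem, 60_000)
    s, ms = divmod(rem, 1000)
    return f"{h:02d}:{m:02d}:{s:02d},{ms:03d}"

def to_srt(segments):
    """Render (start_sec, end_sec, text) tuples as an SRT document."""
    blocks = []
    for i, (start, end, text) in enumerate(segments, start=1):
        blocks.append(
            f"{i}\n{srt_timestamp(start)} --> {srt_timestamp(end)}\n{text}\n"
        )
    return "\n".join(blocks)

segments = [
    (0.0, 2.5, "Welcome back to Augusta National."),
    (2.5, 5.0, "A birdie putt here at the sixteenth."),
]
print(to_srt(segments))
```

The value of automating this is purely throughput: once the transcript carries timestamps, captioning every VOD clip becomes a formatting pass rather than a manual editing job.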
