Microsoft Unveils Video and Contextual Targeting Features
Company shows off a number of contextual targeting technologies for video and other digital content at its DEMO presentation to the ad community.
Microsoft yesterday unveiled new contextual analysis and ad targeting options for online video content. The video-based ad offerings are part of a flock of digital marketing innovations on display at the fourth annual adLabs Demo Fest at Microsoft’s headquarters.
The Contextual Ads for Video offering uses speech recognition technology to translate the audio from online clips into text, which can then be matched to specific marketing offers. Another product, Intelligent Bug Ads, carefully places small clickable ad overlays within video streams.
Other technologies presented included Air Wave, which uses Microsoft’s Surface touch screen technology for interactive ads in places like malls or amusement parks; Visual Product Browsing, a product matching tool that guides shoppers to products similar to what they’ve already viewed; and new gizmos to help marketers avoid unsuitable content and select search keywords based on user search and content behaviors.
“Online advertising has been centered around keywords for too long,” said Tarek Najm, an engineer for Microsoft’s advertising and business intelligence systems, adding that the “next wave of advertising is going to use new algorithms and technologies” that display ads based on consumer intent.
One of Microsoft’s demos featured a streaming episode of the Charlie Rose Show. The clip was run through a speech analysis algorithm that scanned the audio from the program and translated it into text. The text was then queried against a database of keywords, and relevant ads were displayed in the right-hand column at precise times during the program. Other firms, including Scanscout and Blinkx, have promised similar capabilities.
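In outline, the demo's pipeline is: speech recognition produces a timestamped transcript, the transcript is matched against an advertiser keyword table, and each hit schedules an ad at that moment in the stream. A minimal sketch of the matching step, with all names (`match_ads`, `KEYWORD_ADS`) and the sample data invented for illustration rather than taken from Microsoft's system:

```python
# Hypothetical keyword table: transcript term -> advertiser.
KEYWORD_ADS = {
    "car": "Audi",
    "finance": "Fidelity",
}

def match_ads(transcript):
    """transcript: list of (seconds, text) pairs, as a speech
    recognizer might emit. Returns (seconds, advertiser) pairs
    telling the player when to show which ad."""
    schedule = []
    for seconds, text in transcript:
        for keyword, advertiser in KEYWORD_ADS.items():
            if keyword in text.lower():
                schedule.append((seconds, advertiser))
    return schedule

# Toy transcript of two moments in the program.
demo = [(12.0, "The new car market is shifting"),
        (47.5, "Let's talk about personal finance")]
print(match_ads(demo))  # [(12.0, 'Audi'), (47.5, 'Fidelity')]
```

A production system would obviously need fuzzy matching, phrase-level context, and disambiguation between speakers, which is where the refinements described below come in.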
Microsoft said the mechanism can differentiate between speakers and even conversations, parse the context of what’s being said, and use that data to further refine the ad placement.
Another demo depicted technology to flag and score Web pages based on traits that may offend brand advertisers or mention them in a critical context. Marketers might be tempted to use the feature to leverage a competitor’s negative press, but James Colburn, a group marketing manager for adCenter, said that wouldn’t be an issue.
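The flag-and-score idea can be pictured as a weighted term lookup: pages containing terms a brand advertiser may find unsuitable accumulate a risk score, and pages above a threshold are flagged. The term list, weights, and threshold below are assumptions for illustration only, not Microsoft's actual model:

```python
# Illustrative brand-safety term weights (assumed, not Microsoft's).
RISKY_TERMS = {"recall": 3, "lawsuit": 5, "crash": 4}

def score_page(text, threshold=4):
    """Return (risk_score, flagged) for a page's text."""
    words = text.lower().split()
    score = sum(weight for term, weight in RISKY_TERMS.items()
                if term in words)
    return score, score >= threshold

print(score_page("Automaker faces lawsuit after recall"))  # (8, True)
```

A real classifier would also need the contextual parsing the article mentions, since a brand name appearing near negative terms is not the same as being their subject.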
“We wouldn’t allow that to happen,” he said. “We’ll have rules that would prevent situations like that.”
Microsoft’s smart overlay ad technology puts it in the company of Google’s YouTube, VideoEgg and other video ad networks, though the format has some important differences from those firms’ offerings. It works by scanning a given video and displaying tiny clickable ads in the corner, top, or side of the frame. The innovation, Microsoft said yesterday, lies in making the bugs as unobtrusive as possible. A demo clip contained twenty seconds or so of unchanging blue sky above a racetrack with little or no movement. AdLab’s algorithm automatically noted the lack of motion and gently faded an ad for Audi into the unused space. The ad faded out when the camera angle changed and the sky left the frame.
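The motion-detection idea behind that placement can be sketched very simply: compare successive frames, and if a region stays nearly unchanged long enough, treat it as a candidate overlay slot. The toy frame format (grids of brightness values) and the tolerance value here are illustrative assumptions, not a description of AdLab's algorithm:

```python
def region_is_static(frames, row_range, col_range, tolerance=2):
    """frames: list of 2-D brightness grids. Returns True if every
    pixel in the given region changes by at most `tolerance`
    between consecutive frames."""
    for prev, curr in zip(frames, frames[1:]):
        for r in range(*row_range):
            for c in range(*col_range):
                if abs(curr[r][c] - prev[r][c]) > tolerance:
                    return False
    return True

# Toy clip: static "sky" in rows 0-1, moving "track" in row 2.
frames = [
    [[200, 200, 200], [200, 200, 200], [10, 20, 30]],
    [[201, 200, 199], [200, 201, 200], [90, 10, 70]],
    [[200, 199, 200], [201, 200, 200], [40, 80, 15]],
]
print(region_is_static(frames, (0, 2), (0, 3)))  # True: fade the ad in here
print(region_is_static(frames, (2, 3), (0, 3)))  # False: too much motion
```

When the static check starts failing, as it does once the camera cuts away, the overlay would be faded back out.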
Like the rest of the components, these ad bugs can be combined with the other technologies above. The real question nobody asked was how any of it would stack up against the advances Yahoo! has made; though no one said the name, Yahoo! was the elephant in the room.