Adobe recently announced new options within its Adobe Analytics Cloud that let brands analyze devices driven by voice commands, including Siri, Cortana, Google Home, and the popular Amazon Echo.
The system will use machine learning to parse the context of voice interactions. A more accurate picture of how people use these devices will help brands tailor their marketing and improve their services.
The director of product management at Adobe Analytics Cloud, Colin Morris, said, “Understanding the organic thought process of humans and being able to analyze how people reason and think through their actions is a fascinating way to get closer to analyzing a real brand-consumer relationship.”
So-called ‘zero-touch’ devices that respond exclusively to voice are growing at a rapid pace. By some estimates, more than 2 billion such devices will be in use by 2020, and growth will continue. This category includes not only voice-activated devices, but also those that use biometrics, movement, and sensors.
Voice is an increasingly important way to interact with customers, with sales through voice-enabled devices rising by 39% in the past year.
Morris went on to comment, “Brands struggle when they don’t have a visual representation. Some will not know how to make that connection with consumers, but the data will provide an explanation.”
This new feature from Adobe should go a long way toward helping brands and marketers capitalize on the convenience of voice-activated devices.