Here are four predictions for how quality metrics, and the ways we act on them, will reshape OTT.
Unified dashboards will tell us what data is meaningful and what’s not.
The massive volume of data generated by OTT is both a blessing and a curse. Providers can find out almost anything about their audiences and their behavior—but only if they can locate the data.
It’s even more difficult if they’re looking for meaningful correlations between metrics. “Which metrics predict higher abandonment rates?” shouldn’t be a hard question to answer, but all too often it is.
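Once the relevant data does sit in a single session-level table, the question becomes straightforward. The sketch below is a minimal illustration, using made-up column names rather than any standard schema, that ranks candidate metrics by how strongly they correlate with abandonment:

```python
import pandas as pd

# Hypothetical session-level table; the column names are illustrative,
# not an industry-standard schema.
sessions = pd.DataFrame({
    "startup_time_s":   [1.2, 4.8, 0.9, 6.1, 2.3],
    "rebuffer_ratio":   [0.00, 0.08, 0.01, 0.12, 0.02],
    "avg_bitrate_kbps": [4500, 1800, 5200, 1500, 3900],
    "abandoned":        [0, 1, 0, 1, 0],
})

# Rank candidate metrics by how strongly they correlate with abandonment.
correlations = (
    sessions.corr(numeric_only=True)["abandoned"]
    .drop("abandoned")
    .sort_values(key=abs, ascending=False)
)
print(correlations)
```

The hard part isn’t the analysis; it’s producing that single table in the first place.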
The first step to fixing this problem is consolidation.
Right now, different metrics are available in different places, from different providers in the supply chain, in many different forms. Increasingly, though, companies will be able to bring data from all those sources together in a single place.
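In practice, consolidation amounts to joining each supplier’s session-level records on a shared key. Here is a minimal sketch, using hypothetical CDN-log and player-event exports purely for illustration:

```python
import pandas as pd

# Hypothetical exports from two points in the supply chain,
# keyed on a shared session identifier.
cdn_log = pd.DataFrame({
    "session_id": ["a1", "a2", "a3"],
    "edge_throughput_kbps": [5200, 1900, 4100],
    "cdn": ["cdn-east", "cdn-west", "cdn-east"],
})
player_events = pd.DataFrame({
    "session_id": ["a1", "a2", "a3"],
    "dwell_time_s": [1840, 310, 2650],
    "abandoned": [0, 1, 0],
})

# One consolidated, session-level view is the foundation a unified
# dashboard can run correlation checks against automatically.
unified = cdn_log.merge(player_events, on="session_id", how="inner")
print(unified)
```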
In the next three to five years, these unified metrics dashboards will become increasingly powerful and will identify relevant correlations automatically, making it easier for OTT providers to give users what they need.
We’ll settle on a standard definition of what “quality” means.
Right now, there’s a loose consensus among OTT providers and vendors around which metrics measure “quality.” Statistics like dwell time and abandonment rates can give a pretty good idea of the end user’s viewing experience.
But there’s no set algorithm for how those metrics generate a quality score. That’s going to change, making it that much easier to measure—and optimize for—quality.
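To make the gap concrete, here is one way such a score could be assembled from those metrics. It is a hypothetical formula, not an industry algorithm; the weights and normalizations are placeholders that a real standard would need to pin down:

```python
def quality_score(dwell_time_s: float,
                  rebuffer_ratio: float,
                  startup_time_s: float,
                  abandoned: bool) -> float:
    """Illustrative, non-standard quality score on a 0-100 scale.

    The weights and normalizations below are arbitrary placeholders;
    an industry standard would define these precisely.
    """
    engagement = min(dwell_time_s / 1800.0, 1.0)        # cap credit at 30 minutes
    smoothness = max(1.0 - 10.0 * rebuffer_ratio, 0.0)  # punish rebuffering heavily
    startup    = max(1.0 - startup_time_s / 10.0, 0.0)  # a 10 s startup scores zero
    completion = 0.0 if abandoned else 1.0
    return 100.0 * (0.4 * engagement + 0.3 * smoothness
                    + 0.2 * startup + 0.1 * completion)

# Example: a long session with minor rebuffering and a fast start.
print(round(quality_score(2400, 0.01, 1.5, abandoned=False), 1))  # 94.0
```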
Creating an industry-wide quality standard might have another benefit. Right now, the easiest way for OTT providers to filter data is to ask, “Which of these data points represents a problem that needs fixing?”
But to win out in the increasingly crowded OTT space, providers are going to have to proactively optimize video instead of just troubleshooting problems as they arise. Simplifying quality measurements is the first step toward this goal.
Different platforms and devices will standardize the metrics that they share.
Right now, OTT metrics are fragmented across different device vendors. An Android phone, an Xbox, and an iOS device will each expose a totally different set of events and elements to the provider. Providers can try to blend these metrics to get as consistent a picture as they can, but it’s never going to be perfect.
For the industry to reach its full potential, a standard set of metrics will need to emerge across all devices. It’s the only way we’re going to get to a single, simple quality score that remains meaningful across platforms.
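In the meantime, providers bridge the gap with normalization layers of their own. A minimal sketch, with hypothetical per-platform event names chosen purely for illustration:

```python
from typing import Optional

# Hypothetical raw event names emitted by different platforms for the
# same underlying occurrence; the mappings are illustrative only.
EVENT_MAP = {
    "android": {"onBuffering": "rebuffer_start", "onReady": "playback_start"},
    "xbox":    {"StreamStalled": "rebuffer_start", "MediaOpened": "playback_start"},
    "ios":     {"playbackStalled": "rebuffer_start", "readyToPlay": "playback_start"},
}

def normalize(platform: str, raw_event: str) -> Optional[str]:
    """Map a platform-specific event to a common, comparable name."""
    return EVENT_MAP.get(platform, {}).get(raw_event)

print(normalize("xbox", "StreamStalled"))  # rebuffer_start
print(normalize("ios", "readyToPlay"))     # playback_start
```

Every provider maintaining its own version of a table like this is exactly the duplication a shared standard would eliminate.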
In the next few years, device vendors will begin to work together to standardize their data offerings. In the end, it’ll be the only way for them to remain competitive in the marketplace.
OTT providers will take action on relevant data automatically.
Once this infrastructure is in place, the OTT industry will be able to take the final step: automating how it acts on data to improve quality. Even with the industry in its current, fragmented state, some OTT providers are already doing this.
Hulu, for instance, uses client data to dynamically assign traffic to one of multiple content delivery network (CDN) vendors based on which one is likely to deliver the best viewing experience.
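The details of Hulu’s system aren’t public, so the sketch below only illustrates the general pattern: steer each new session toward whichever CDN recent client measurements favor. The names, data, and selection rule are assumptions for illustration, not anyone’s production logic:

```python
from statistics import mean

# Hypothetical recent client-side throughput samples per CDN (kbps);
# in practice these would come from player beacons, broken out by
# region, ISP, device type, and so on.
recent_throughput = {
    "cdn_a": [5200, 4800, 5100],
    "cdn_b": [3900, 4200, 2600],
}

def pick_cdn(samples: dict) -> str:
    """Route the next session to the CDN with the best recent average."""
    return max(samples, key=lambda cdn: mean(samples[cdn]))

print(pick_cdn(recent_throughput))  # cdn_a
```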
As other OTT providers improve their analytics capabilities, adopt industry-wide quality standards, and receive ever more consistent data from device vendors, more of them will be able to follow suit.
Instead of being bogged down in technical considerations that would take a trained engineer days to sort out, providers will be able to optimize video streams with the click of a button. And they’ll be able to get back to doing what they do best: creating great video content.
Hippo Reads research network contributed to this column.