Op-Ed: Changing the Game – How Second Screens Are Transforming the In-Stadium Experience

By incorporating ultra-low latency (ULL) streaming into the stadium experience, every seat in the house can become an immersive one.

The sports broadcasting industry is hungry for new ways to monetize and drive engagement amongst fans, and we are on the brink of a revolution in in-stadium experiences. The transformative effect latency can have on that experience is easiest to understand through a real-world example.

Latency delays don’t win championships

Say you and your best bud drive out to Indianapolis to experience the National Championship between UGA and Alabama. Your friend graduated from Alabama, and you from Georgia, but somehow you still manage to break bread together. The stadium offers a novel experience: log in on your smart device and you can stream replays, stats, and behind-the-scenes footage from any camera in the complex. UGA scores, again, and you feel the 41-year losing streak screeching to a halt. You decide to replay the touchdown to gloat to your buddy, but it takes 40 seconds to get the content. The moment's over, and you don't feel the rush quite as much as you could have. The novelty is just that, a novelty.

Now, what if that delay was less than a second? Oh, the face-rubbing glory. Oh, the opportunity to live in the moment experiencing the game in a manner you never could have conceived.

40-second latency in over-the-top streaming content is fairly common in the industry, but common doesn't win championships. As more people use second-screen devices at live events, the delay caused by latency creates frustration and leads users to disengage.

Disengagement by fans leaves a lot of money on the table. At Videon, we want to drive engagement and, in doing so, drive monetization. We do that with EdgeCaster.

Moving production workflows out of the cloud

So let's say that you are producing the National Championship. There will be 100,000 fans in the stadium, and you need second-screen viewing that performs as well as, or better than, traditional broadcast feeds.

This is where EdgeCaster comes in. Up to this point, video encoders have just been video encoders, but EdgeCaster is much more. Because it is an edge compute device with a built-in encoder, it handles the latency-inducing steps of the production workflow that normally live in the cloud. That is why we say we've moved functions like packaging from the cloud to the point of video origination: to the edge of the edge. Since EdgeCaster delivers player-ready video, the cloud has nothing left to do but deliver that video at the lowest latency possible.

How it Works

When we say that we've created an edge compute device, we mean that EdgeCaster with LiveEdge has the computing power and software to format video and send it to the cloud ready for delivery. By using technologies like chunked transfer for HTTP workflows, SRT, or WebRTC, LiveEdge® does the work that would normally cause latency in the cloud.

EdgeCaster with LiveEdge Streaming does not force you into a specific solution; like a Swiss Army knife, it enables a wide range of them. When it comes to ultra-low latency delivery, everyone has an opinion, which is why we give you the power to choose. Perhaps you want the lowest latency possible with WebRTC. Perhaps you want SRT. Or perhaps you want chunked transfer over HTTP. EdgeCaster and LiveEdge put that choice in your control, to use when and how you want. And if you really want choice, you can use multiple output formats at the same time. A Swiss Army knife with both the scissors and a blade extended might be awkward, but EdgeCaster with LiveEdge streaming WebRTC and chunked-transfer HTTP simultaneously is anything but clunky.
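To see why chunked transfer matters so much, it helps to run the numbers. The sketch below is a back-of-the-envelope comparison; the segment and chunk durations and buffer depth are illustrative assumptions, not measurements of EdgeCaster or any particular CDN:

```python
# Back-of-the-envelope latency arithmetic contrasting whole-segment delivery
# with chunked-transfer (CMAF chunk) delivery. All durations are assumed
# example values, not product figures.

def startup_latency(unit_duration_s: float, buffered_units: int) -> float:
    """Delay contributed by buffering before playback can begin.

    A player typically buffers a few delivery units before it starts playing.
    With classic segment-based HLS the unit is a whole segment; with chunked
    transfer the unit shrinks to a CMAF chunk, so the same buffer depth costs
    far less wall-clock time.
    """
    return unit_duration_s * buffered_units

# Classic HLS: 6-second segments, 3 segments buffered -> ~18 s behind live.
classic = startup_latency(6.0, 3)

# Chunked CMAF: 0.5-second chunks, same buffer depth -> ~1.5 s behind live.
chunked = startup_latency(0.5, 3)

print(f"whole segments: {classic:.1f} s, chunked transfer: {chunked:.1f} s")
```

The point of the arithmetic is simple: the player's buffer depth is fixed in units, so shrinking the delivery unit from a multi-second segment to a sub-second chunk shrinks the latency it induces by the same ratio.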

But that’s not enough.

We also make sure that the content is "cloud-friendly". That means we work with our cloud partners to get all of the kinks out of the system. A lot of work goes into optimizing the Common Media Application Format (CMAF), which has been fully integrated and tested with AWS, Fastly, Akamai, and other CDN partners that accept HLS/DASH delivery. Just as much work goes into optimizing WebRTC with partners like Phenix and Red5Pro. In short, ultra-low-latency second-screen experiences will not disappoint.
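For readers curious what chunked CMAF delivery looks like on the wire, the fragment below is an illustrative media playlist in the style of Apple's Low-Latency HLS extensions, where each segment is advertised as a series of independently fetchable parts. The filenames and durations are hypothetical, not output from any specific product:

```
#EXTM3U
#EXT-X-VERSION:9
#EXT-X-TARGETDURATION:4
#EXT-X-PART-INF:PART-TARGET=0.5
#EXT-X-MEDIA-SEQUENCE:100
#EXTINF:4.0,
segment100.m4s
#EXT-X-PART:DURATION=0.5,URI="segment101.part0.m4s",INDEPENDENT=YES
#EXT-X-PART:DURATION=0.5,URI="segment101.part1.m4s"
```

Because the parts of the newest segment are listed before the segment is complete, a player can begin fetching and rendering them immediately instead of waiting for the full four-second segment to land.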



