A Look Back at CBS Sports Digital’s Strategy for Streaming Super Bowl LIII

A multi-CDN approach ensured that contingencies were uneventful

In today’s media landscape, major networks are doing more in the OTT space than just dipping their toes in the water. CBS Sports Digital, for example, tiptoed toward the deep end of the pool with a live stream of Super Bowl LIII in Atlanta. At last week’s Streaming Media East in New York City, Taylor Busch, senior director, engineering, CBS Sports Digital, offered a thorough walkthrough of his team’s preparation, construction, and execution of its strategy, one notable for the unconventional decision to treat latency as a secondary priority.

“[Latency] is important to us,” he explained, “but the most important was the rebuffering rate. Getting the latency down was not really a goal. We wanted to make sure that we had a really good experience for all of our users.

“We are able to keep those rebuffering rates very low,” he continued. “The more aggressive that you get with latency in this kind of workflow, the more potential you’ll have for rebuffering, so we were very conservative about that.”
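
Busch doesn’t name the player or the exact buffer settings CBS Sports Digital used, but the tradeoff he describes maps to familiar player tuning knobs. The sketch below uses hls.js purely as a stand-in, with illustrative values: sitting further behind the live edge and holding a deeper forward buffer raises latency but makes rebuffering far less likely.

```typescript
// Hypothetical player setup illustrating the latency-vs-rebuffering tradeoff.
// hls.js and these values are stand-ins, not CBS Sports Digital's actual player.
import Hls from "hls.js";

const video = document.querySelector<HTMLVideoElement>("#player")!;

if (Hls.isSupported()) {
  const hls = new Hls({
    // Sit further behind the live edge (measured in target durations) than the
    // default. More media buffered ahead of the playhead means higher latency
    // but far less chance of stalling when a segment arrives late.
    liveSyncDurationCount: 6,        // default is 3
    liveMaxLatencyDurationCount: 12, // tolerate drift before jumping forward
    maxBufferLength: 60,             // aim for up to 60 s of forward buffer
    maxMaxBufferLength: 120,         // hard cap on forward buffer
  });
  hls.loadSource("https://example.com/live/master.m3u8"); // placeholder URL
  hls.attachMedia(video);
}
```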

Laying the Groundwork
Busch and his colleagues in the streaming department were well prepared and well connected before the game’s first play. With “war rooms” located in CBS Sports’ broadcast center in New York City (for signal transmission and encoding), San Francisco (for the CDN partners), and Busch’s headquarters in Fort Lauderdale, FL (for site infrastructure), a direct line of communication was established between the network’s employees and the vendors involved with the project.

“The big thing was the organization around the war room,” Busch said. “We needed to figure out how we were going to communicate and how to escalate [the speed of the stream] for people that were [behind] in a particular area; we didn’t want them to get bombarded [with a ton of plays all at once]. There was a whole lot of planning that went into this.”

The war rooms were built for more than preparation: they also served during the night’s production as a monitoring safety net for detecting issues.

“We had a ton of internal tools looking at stream alignment and real-time encoders from the logs,” he said. “We were using the APIs from the CDNs to figure out, in as real-time as possible, what they were serving. We were able to pump that [information] into our Sumo Logic [cloud-based analytics system] to aggregate and build separate dashboards with that data. Finally, we had a multiviewer, so we were able to see all of the different streams and could visually detect the obvious [problems] very easily.”
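
The specific CDN reporting APIs and the Sumo Logic setup aren’t detailed in the talk, but the pipeline Busch describes (poll each CDN, push the results into Sumo Logic, build dashboards on top) could look roughly like this sketch. Every endpoint, field name, and credential below is a placeholder, not CBS Sports Digital’s actual integration.

```typescript
// Minimal sketch of a CDN-stats polling loop feeding a Sumo Logic hosted HTTP
// source. All URLs, auth tokens, and response fields here are hypothetical.

interface CdnStats {
  cdn: string;
  requestsPerSecond: number;
  egressGbps: number;
  errorRate: number;
}

// Hypothetical per-CDN real-time reporting endpoints.
const CDN_STATS_ENDPOINTS: Record<string, string> = {
  cdnA: "https://api.cdn-a.example/v1/realtime-stats",
  cdnB: "https://api.cdn-b.example/v1/realtime-stats",
};

// Unique URL issued for a Sumo Logic hosted HTTP source (placeholder).
const SUMO_HTTP_SOURCE_URL =
  "https://collectors.sumologic.com/receiver/v1/http/XXXXXXXX";

async function pollOnce(): Promise<void> {
  for (const [cdn, url] of Object.entries(CDN_STATS_ENDPOINTS)) {
    const res = await fetch(url, { headers: { Authorization: "Bearer <token>" } });
    const body = await res.json();
    const stats: CdnStats = {
      cdn,
      requestsPerSecond: body.rps,
      egressGbps: body.egress_gbps,
      errorRate: body.error_rate,
    };
    // Sumo Logic HTTP sources accept log lines in the POST body; JSON keeps
    // the downstream dashboard queries simple.
    await fetch(SUMO_HTTP_SOURCE_URL, {
      method: "POST",
      body: JSON.stringify({ timestamp: new Date().toISOString(), ...stats }),
    });
  }
}

// Poll as close to real time as the CDN APIs allow, e.g. every 10 seconds.
setInterval(() => pollOnce().catch(console.error), 10_000);
```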

Reaching Fans in All Corners of the Country
For an event that captivates fans of all 32 teams, especially supporters of the two fortunate enough to be playing, there is virtually no room for error on the distribution front.

To get the job done, CBS Sports Digital fired up multiple CDNs to ensure that every region had access to the OTT product.

“We had the signal coming into the New York broadcast center, and we had the luxury of putting our [AWS Elemental] encoders in the same facility,” Busch said. “We also had a backup facility in [Stamford, CT], so we were able to see if there was any issue with latency or anything of that nature. This was really interesting to watch during the event to see where the traffic would derive from.”
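
How individual sessions were split across the CDNs isn’t spelled out. One common pattern, sketched hypothetically below, is a weighted pick with a per-CDN kill switch that a war-room operator can adjust if a provider degrades in a region.

```typescript
// Hypothetical multi-CDN steering: a weighted random pick over the CDNs that
// are currently enabled. Names, URLs, and weights are invented for illustration.

interface CdnConfig {
  name: string;
  manifestUrl: string; // same stream, fronted by a different CDN hostname
  weight: number;      // share of new sessions; 0 takes the CDN out of rotation
}

const CDNS: CdnConfig[] = [
  { name: "cdnA", manifestUrl: "https://cdn-a.example/live/master.m3u8", weight: 50 },
  { name: "cdnB", manifestUrl: "https://cdn-b.example/live/master.m3u8", weight: 50 },
];

function pickCdn(cdns: CdnConfig[]): CdnConfig {
  const active = cdns.filter((c) => c.weight > 0);
  const total = active.reduce((sum, c) => sum + c.weight, 0);
  let roll = Math.random() * total;
  for (const cdn of active) {
    roll -= cdn.weight;
    if (roll <= 0) return cdn;
  }
  return active[active.length - 1]; // guard against floating-point edge cases
}

// At session start, hand the chosen manifest URL to the player.
const chosen = pickCdn(CDNS);
console.log(`Starting playback from ${chosen.name}: ${chosen.manifestUrl}`);
```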

Along with the effort to reach every corner of the U.S., the network extended that same spirit of inclusivity to ad insertion. Although the experience could not mirror linear television exactly, viewers watching on the go still got a taste of the traditional broadcast.

“We went through quite a lengthy decision-making process to weigh the options,” he reported. “We decided to go through a traditional passthrough, so the ads that were on television were the ads that you saw in the stream. We ultimately did a hybrid with tracking of the digital impressions.”
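
The talk doesn’t detail the tracking mechanics, but a passthrough-plus-tracking hybrid generally means the broadcast ad plays untouched while the client fires digital impression beacons when a break is signaled. The sketch below assumes ad breaks reach the player as timed-metadata cues (for example, SCTE-35 markers translated by the encoder or packager); the cue shape and the beacon endpoint are invented for illustration.

```typescript
// Hypothetical impression tracking for a passthrough ad break: the ad itself is
// already in the stream, so the client only reports that the break was viewed.

interface AdBreakCue {
  id: string;          // ad-break identifier carried in the cue
  durationSec: number;
}

// Placeholder endpoint for the ad-tracking vendor.
const IMPRESSION_URL = "https://ads.example.com/impression";

function onAdBreakStart(cue: AdBreakCue): void {
  const params = new URLSearchParams({
    breakId: cue.id,
    duration: String(cue.durationSec),
    ts: new Date().toISOString(),
  });
  // Fire-and-forget beacon; a tracking failure must never interrupt playback.
  fetch(`${IMPRESSION_URL}?${params.toString()}`).catch(() => {
    /* swallow errors: tracking is best-effort */
  });
}

// Example wiring to whatever cue event the player emits (name is hypothetical):
// player.on("adBreakStart", onAdBreakStart);
```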

Getting Out of Harm’s Way: Diverting a Live-Streaming Issue
In live sports productions, there’s bound to be a mishap or two. On football’s biggest stage, a massive wave of concurrent viewers generates traffic that can strain every part of the delivery chain. To head off foreseeable problems, CBS Sports Digital put backup plans in place that would allow issues to be fixed without viewers ever noticing.

“We were extremely concerned about the ingest and distribution redundancy. We wanted to make sure we had enough levers to pull during the event, if there was a problem, to hopefully not have any disruption to the end user,” Busch said. “We did see at one point that there was increased latency. What we did was turn off the [U.S. West 2] region and basically fail over to U.S. East 1. They identified that some of the traffic was [going over the] public [internet], and that was causing the latency. We flipped the U.S. West 2 region back, and there was no interruption or issue to the customers.”
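
Busch doesn’t describe the tooling behind that failover, but the behavior he recounts (drain a degraded region, serve from the healthy one, then restore the drained region once it recovers) could be driven by a health-check loop along these lines. The endpoint, threshold, and polling interval below are assumptions, not CBS Sports Digital’s actual setup.

```typescript
// Sketch of latency-driven regional failover between two origin regions.
// Region names follow the quote; everything else is a placeholder.

type Region = "us-west-2" | "us-east-1";

const REGIONS: Region[] = ["us-west-2", "us-east-1"];
const LATENCY_THRESHOLD_MS = 2000; // assumed acceptable origin response time

const activeRegions = new Set<Region>(REGIONS);

async function measureOriginLatency(region: Region): Promise<number> {
  const start = Date.now();
  // Placeholder health-check URL for the region's origin/packager.
  await fetch(`https://origin.${region}.example.com/healthz`);
  return Date.now() - start;
}

async function checkAndFailOver(): Promise<void> {
  for (const region of REGIONS) {
    const latencyMs = await measureOriginLatency(region);
    if (latencyMs > LATENCY_THRESHOLD_MS && activeRegions.size > 1) {
      // Drain the slow region; traffic shifts to whatever remains active.
      activeRegions.delete(region);
      console.warn(`${region} drained (${latencyMs} ms); serving from ${[...activeRegions].join(", ")}`);
    } else if (latencyMs <= LATENCY_THRESHOLD_MS && !activeRegions.has(region)) {
      // Bring the region back into rotation once it recovers.
      activeRegions.add(region);
      console.info(`${region} restored to rotation`);
    }
  }
}

setInterval(() => checkAndFailOver().catch(console.error), 15_000);
```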

Reflecting on his team’s work, Busch noted the fruitful and successful relationship between the network and its vendors: “The most impressive part of all of this was the amazing cooperation with all of the vendors that we worked with. There were a lot of moving parts, and we really wanted to have fine-grained control over each piece of the workflow.”
