SCMS 2018: NBC Sports’ Darryl Jefferson Offers a Look at 2018 Olympics Asset Management

The operation connected seven sites, served 300+ concurrent users, and featured 40 virtual machines

Darryl Jefferson, VP, post operations and digital workflow, NBC Sports and Olympics, took to the stage at last week’s Sports Content Management & Storage Forum to explore the asset-management workflows used for the 2018 PyeongChang Winter Olympics.

For NBC Sports, it was an effort that tied together seven locations and had upwards of 300 concurrent users (784 users in total) working from those locations over a private cloud with live content files that continually grew. The seven sites were connected via four 10-Gbps circuits and featured 40 virtual machines supporting asset-management needs. File acceleration and orchestration, said Jefferson, were the cornerstone, with NBC Olympics live coast to coast for the first time ever.



“The hardest thing for us is, everything was ‘live live’,” he pointed out. “Everyone needed to see the file that was growing at all times so that they could start cutting clips, whether it was for a local news broadcast or primetime.”
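At its core, the growing-file requirement is a tail-and-ship problem: keep watching a recording that is still being written and move each newly appended chunk downstream so remote editors can start cutting before the recording finishes. The sketch below is a hypothetical Python illustration of that idea, not NBC's actual tooling; the file path and the send_chunk callback are placeholders.

```python
import time

def follow_growing_file(path, send_chunk, poll_interval=2.0, chunk_size=8 * 1024 * 1024):
    """Ship newly appended bytes of a still-growing file until growth stops."""
    offset = 0
    idle_polls = 0
    with open(path, "rb") as f:
        while idle_polls < 5:  # assume ~10 s with no new bytes means the recording ended
            f.seek(offset)
            data = f.read(chunk_size)
            if data:
                send_chunk(offset, data)  # e.g. hand off to an accelerated-transfer queue
                offset += len(data)
                idle_polls = 0
            else:
                idle_polls += 1
                time.sleep(poll_interval)
    return offset  # total bytes shipped

if __name__ == "__main__":
    total = follow_growing_file(
        "/media/growing/primetime_feed.mxf",  # hypothetical path
        lambda off, chunk: print(f"shipping {len(chunk)} bytes at offset {off}"),
    )
    print(f"done: {total} bytes transferred")
```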

The locations were Stamford, CT (four studios, five control rooms, eight off-tube rooms, digital ad insertion, and more); New York (NBC Network Commercial Control, descriptive-audio fingerprinting); Englewood Cliffs (cable-network release, long-form VOD); Denver (network DR/cable DR, station distribution); Las Vegas (iStream Planet streaming, OTT/connected delivery); and the International Broadcast Center (IBC) and a mountain venue in PyeongChang.

The overall concept was to have a local repository at the IBC in PyeongChang, housing a Harmonic MediaGrid with about 1 PB of storage and an Avid Nexis server with 348 TB of storage, the largest system ever at an Olympics. Proxy copies of content were moved via FileCatalyst so that users could make edit decisions without transferring the high-resolution material.
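As a back-of-the-envelope illustration of why the proxy-first approach matters, the sketch below compares shipping a whole event in high resolution against shipping a lightweight proxy plus only the subclip ranges actually selected. The bitrates and clip ranges are assumptions for the example, not NBC's figures.

```python
# Illustrative numbers only; bitrates and selections are assumptions, not NBC's figures.
HI_RES_MBPS = 100   # assumed mezzanine bitrate (Mb/s)
PROXY_MBPS = 2      # assumed proxy bitrate (Mb/s)

def gigabytes(seconds, mbps):
    """Convert a duration at a given bitrate to gigabytes."""
    return seconds * mbps / 8 / 1000

event_seconds = 4 * 3600                             # a four-hour event feed
selected = [(120, 195), (2400, 2520), (9000, 9180)]  # (in, out) points chosen on the proxy

full_hi_res = gigabytes(event_seconds, HI_RES_MBPS)
proxy_only = gigabytes(event_seconds, PROXY_MBPS)
selected_hi_res = sum(gigabytes(out - in_, HI_RES_MBPS) for in_, out in selected)

print(f"ship the whole event in hi-res:        {full_hi_res:.1f} GB")
print(f"ship proxy + selected hi-res subclips: {proxy_only + selected_hi_res:.1f} GB")
```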

“The philosophy was to get the content to the platform appropriate for the team. For a show that night, it might go no further than the EVS [servers],” Jefferson explained. “If it had three minutes to get to air, it would get a faster path. There was a lot of prioritization.”
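One way to picture that prioritization is a transfer queue keyed on time-to-air, where material needed in minutes jumps to a faster path. The sketch below is a hypothetical illustration; the job names, path tiers, and ten-minute cutoff are assumptions, not NBC's orchestration logic.

```python
import heapq
from datetime import datetime, timedelta

def order_transfers(jobs, now=None):
    """Return (name, path) pairs ordered so the job closest to air moves first."""
    now = now or datetime.now()
    heap = [((air_time - now).total_seconds(), name) for name, air_time in jobs]
    heapq.heapify(heap)
    ordered = []
    while heap:
        seconds_to_air, name = heapq.heappop(heap)
        path = "fast path" if seconds_to_air < 600 else "standard path"  # assumed 10-minute cutoff
        ordered.append((name, path))
    return ordered

if __name__ == "__main__":
    now = datetime.now()
    jobs = [
        ("primetime_tease.mxf", now + timedelta(minutes=3)),  # three minutes to air
        ("late_night_melt.mxf", now + timedelta(hours=6)),
        ("longform_vod.mxf", now + timedelta(days=1)),
    ]
    for name, path in order_transfers(jobs, now):
        print(f"{name} -> {path}")
```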

He related some of the lessons learned from the PyeongChang Games. The biggest was the decision to test the asset-management system by writing scripts that would replicate the expected load on the system. That process shook out several issues and allowed them to be solved prior to the start of the Games.
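A load-replication script of that kind can be as simple as spawning a few hundred concurrent simulated users against the MAM's search interface and watching error rates and latency. The sketch below is a hypothetical harness assuming an HTTP search endpoint; the URL, search terms, and counts are placeholders, not NBC's actual test scripts.

```python
import concurrent.futures
import random
import time
import urllib.parse
import urllib.request

MAM_SEARCH_URL = "http://mam.example.internal/api/search?q="  # hypothetical endpoint
TERMS = ["figure skating", "halfpipe", "biathlon", "medal ceremony"]

def simulated_user(user_id, requests_per_user=20):
    """One pretend logger/editor issuing a series of searches with think time in between."""
    latencies, errors = [], 0
    for _ in range(requests_per_user):
        start = time.time()
        try:
            query = urllib.parse.quote(random.choice(TERMS))
            urllib.request.urlopen(MAM_SEARCH_URL + query, timeout=5).read()
        except Exception:
            errors += 1
        latencies.append(time.time() - start)
        time.sleep(random.uniform(0.5, 2.0))  # think time between searches
    return latencies, errors

if __name__ == "__main__":
    concurrent_users = 300  # roughly the peak concurrency described for PyeongChang
    with concurrent.futures.ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(simulated_user, range(concurrent_users)))
    all_latencies = [l for lats, _ in results for l in lats]
    total_errors = sum(err for _, err in results)
    print(f"requests: {len(all_latencies)}  errors: {total_errors}  "
          f"avg latency: {sum(all_latencies) / len(all_latencies):.2f}s")
```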

Another lesson led to a decision to limit daily melts to around two hours rather than 10 or 12. That move also improved efficiency and made it easier to find the most relevant content.

The team also busted operations silos, which, Jefferson said, can be wasteful because they cause workflows to be duplicated.

“Don’t underestimate training,” he added. “We had 3,000 people join us in a day, so we had to get them up to speed and learn the systems.”

That massive team churned out a lot of content. A total of 3,218 hours of EDLs/subclips and clips were delivered, totaling 79.2 TB. FileCatalyst moved 3,089 hours (80.1 TB) of high-resolution material and 833 hours (24 TB) of proxy video between Stamford and the IBC, plus 1,914 hours (49.3 TB) of material between the venues and the IBC. Short-form VOD clips totaled 2,753; VOD for set-top boxes comprised 563 short-form and 212 long-form VOD.

Turnaround suites were similar in terms of equipment but included an extra monitor for a quad split of incoming feeds, a timecode clock, and the Airspeed remote-console application for controlling ingest.

The editing setup at the IBC housed 16 full edits, eight in-file ingest edits, five GFX ingest edits, eight soft edits, and eight channels of Airspeed.

One important element was that all logging took place in Stamford, with MAM logs visible from Korea. The logging process was automated but still required some human intervention.

“Oddly enough,” said Jefferson, “people wanted content further curated by a human, and we find that, as an organization, we lean on human logs beyond objective logs. Being able to focus on subjective logging about emotional events like high fives or an athlete hugging their mother is a result of the team’s being able to leverage the work that [Olympic Broadcast Services] does on the objective logging related to scoring, timing, and who is competing.”
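The split Jefferson describes, objective data from OBS alongside subjective human annotations, boils down to merging two log streams on timecode so editors can search both in one place. The sketch below is a hypothetical illustration of that merge; the entry shapes and frame rate are assumptions, not the MAM's actual schema.

```python
def tc_to_frames(tc, fps=30):
    """Convert HH:MM:SS:FF timecode to a frame count (assumed 30 fps, non-drop)."""
    h, m, s, f = (int(x) for x in tc.split(":"))
    return ((h * 60 + m) * 60 + s) * fps + f

objective = [   # assumed shape for OBS-style objective entries (scoring, timing, competitors)
    {"tc": "01:02:10:00", "source": "objective", "note": "Run 2 score posted: 94.25"},
    {"tc": "01:05:42:12", "source": "objective", "note": "Final standings confirmed"},
]
subjective = [  # assumed shape for human subjective entries logged in Stamford
    {"tc": "01:02:15:08", "source": "human", "note": "Athlete hugs mother in the stands"},
    {"tc": "01:05:50:00", "source": "human", "note": "High fives with coaching staff"},
]

# Merge both streams into one timecode-ordered log that editors can search.
merged = sorted(objective + subjective, key=lambda entry: tc_to_frames(entry["tc"]))
for entry in merged:
    print(entry["tc"], f"[{entry['source']}]", entry["note"])
```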
