CBS, MSG Present Lessons Learned at 3D Sports Summit
Story Highlights
SVG’s first-annual 3D Sports Transmission and Production Summit on May 20 was a chance for those who have completed live 3D sports productions to offer advice to the audience on best practices and pitfalls. First to take the stage were executives from CBS Sports and MSG Network, who showed footage and provided insight into what went right — and not so right — in producing college basketball and hockey, respectively, in a new dimension.
“I’m not sure anybody is an expert in 3D,” said Ken Aagaard, EVP of operations and production for CBS Sports, to begin his presentation. “However, we’re going to try to explain at least what we learned during the process of producing the Final Four.”
Location, Location, Location
Aagaard first discussed the issue of camera positions, which proved to be an overarching theme throughout the day.
“3D is basically two cameras in one, so, every time you talk about a camera, you’re talking about two,” he said. “Camera positions become really critical to this. In addition to trying to find camera positions, you’ve got to make sure that you don’t have hands in front of you when people are jumping up and down. When we did the Final Four, we were not entirely sure where to put the cameras, so we were making a lot of decisions blind.”
Aagaard walked the audience through each of the six camera positions his team used for the Final Four productions: a rehearsal at a college All-Star game the night before the semifinal games and the three games that were shown in theaters. One change he mentioned for the next time around would be to decouple his robotic backboard cameras from those used for the 2D production; in April, the cameras were mounted together.
Cardboard Stand-Ins
Camera positions were also an issue for MSG Network’s broadcast of a Rangers-Islanders game. Making the production easier was the fact that MSG owns the rights to both Rangers and Islanders broadcasts, so the network had complete control over the TV broadcast. Finding space in the arena for the two-in-one cameras, however, was still an issue.
“Seat kills was a very difficult process because Ranger games are sold out,” said Gerard Passaro, SVP of distribution and technical operations for MSG Media. “And putting the cameras lower takes out some of our better seats. We gave food vouchers and tried to move people to other locations.”
Without having seen the 3D rigs in advance, it was difficult for MSG to determine the precise angles that would work best, so the broadcast team settled for knowing that they needed to be “low and close.”
“We made some cardboard mockups and took them around the arena to determine what effect that would have on viewers, as well as to see the angles,” explained Mike Mitchell, chief engineer for MSG Network. “We didn’t want to shoot through the nets that we hang to protect patrons from pucks as well as the glass around the arena, so we had to move the slash positions in toward the blue lines.”
Through the Plexiglass
The biggest issue for MSG’s production was the vertical posts holding up the plexiglass, which are impossible to avoid. In addition, with no cameras inside the rink, every shot comes from behind the glass, which creates issues when it comes to deciding where in depth to place each shot.
“We want to put the players we’re shooting on the screen, so anything between the camera and play is coming off the screen,” said Steve Schklair, president of 3ality Digital. “If the player is on the screen, the glass is in your face, so if it’s scratched, you’ve got these scratches flowing in the middle of the audience. We can compensate, but then the players are all deep in space. Creatively, that’s not what you want to do, but technically, sometimes you have to.”
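The trade-off Schklair describes follows from simple stereo geometry: a point whose left- and right-eye images coincide (zero parallax) appears on the screen plane, while crossed (negative) parallax pushes it out toward the audience. A minimal sketch of that geometry, using an assumed 65 mm eye separation and an assumed 10 m theater viewing distance (neither figure is from MSG’s production):

```python
# Perceived depth of a stereo image point from its on-screen parallax.
# A sketch of the geometry Schklair describes, not MSG's actual pipeline.
# Assumed values: 65 mm eye separation, viewer 10 m from the screen.

EYE_SEP_MM = 65.0  # average human interocular distance (assumption)

def perceived_depth_mm(parallax_mm: float, viewing_dist_mm: float,
                       eye_sep_mm: float = EYE_SEP_MM) -> float:
    """Distance from viewer to the fused image, by similar triangles.

    parallax_mm > 0: right-eye image right of left-eye image (behind screen)
    parallax_mm = 0: object sits exactly on the screen plane
    parallax_mm < 0: crossed parallax, object floats in front of the screen
    """
    if parallax_mm >= eye_sep_mm:
        raise ValueError("parallax >= eye separation: eyes would diverge")
    return eye_sep_mm * viewing_dist_mm / (eye_sep_mm - parallax_mm)

# Put the players on the screen plane (zero parallax) and the scratched
# glass, closer to the lens, picks up negative parallax -- it floats out
# into the audience, exactly the artifact described above.
print(perceived_depth_mm(0.0, 10_000))    # 10000.0 -- on the screen plane
print(perceived_depth_mm(-30.0, 10_000))  # ~6842 -- glass ~3 m in front
```

Compensating pushes the glass back toward the screen plane, but only by dragging the players behind it, which is the creative compromise Schklair notes.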
Still, he noted that shooting through the glass was not as big an issue as he had thought — although the posts did pose a problem when cameramen panned across them.
Valuable Rehearsal
At the Final Four, CBS had the advantage of an All-Star game on Friday night, which served as a rehearsal, before three tournament games between Saturday and Monday. That schedule allowed the production team plenty of time to make changes, which proved especially valuable for the graphics team.
“We were trying to figure out where to put the graphic in the scene itself,” Aagaard noted. “It’s a little bit hard on your eye when the cheerleaders pop up if the convergence is not set for that.”
During the Friday-night test, however, CBS learned that it had put too much separation between the left eye and the right eye in its graphics, which was making viewers dizzy.
“We thought we had it right in the truck,” said Bruce Goldfeder, director of engineering for CBS Sports. “We did a pass through the switcher with DVEs and separated the graphics. But we got feedback from the theaters and worked with people there during our test time to get it just right. That was big, having people look at it and really fine-tuning it.”
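Why separation that looked right in the truck failed in theaters comes down to arithmetic: a graphic’s left/right offset is fixed in pixels on the master, so its physical size grows with the screen. A hedged sketch with illustrative numbers (the disparity and screen widths below are assumptions, not CBS’s actual settings):

```python
# A fixed pixel disparity on the master scales with physical screen width,
# so a comfortable offset on a truck monitor can be painful in a theater.
# Illustrative numbers only -- not CBS's actual graphics settings.

EYE_SEP_MM = 65.0  # positive parallax beyond this forces the eyes to diverge

def physical_parallax_mm(disparity_px: float, image_width_px: int,
                         screen_width_mm: float) -> float:
    """Convert a pixel disparity on the master to millimetres on a screen."""
    return disparity_px / image_width_px * screen_width_mm

disparity = 20  # px of left/right separation on the graphic (assumed)
truck_monitor = physical_parallax_mm(disparity, 1920, 500)      # 0.5 m monitor
theater_screen = physical_parallax_mm(disparity, 1920, 12_000)  # 12 m screen

print(f"truck:   {truck_monitor:.1f} mm")   # ~5.2 mm -- comfortable
print(f"theater: {theater_screen:.1f} mm")  # 125.0 mm -- well past 65 mm
```

That scaling is why feedback from people sitting in the actual theaters, rather than the truck, was what let CBS dial the separation in.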
2D to 3D in Four Hours or Less
Both CBS and MSG Network produced their 3D shows out of a 2D truck — CBS using NEP’s SS9 and MSG Network relying on Game Creek Video’s Yankee Clipper — and both had success with that configuration.
“We ran copper between our truck and PACE’s 3D truck, where the convergence operators were,” Goldfeder said. “Within three or four hours, we were all set up.”
Schklair added, “Fitting out a 3D OB unit from a 2D truck can be done relatively quickly. Really, the only difference is space for the convergence operators and putting enough 3D monitors in there so that the director has at least program and preview. Then, you need a fairly good router so you can be patching things all over the place, but there’s not a lot of fitting out of the truck. The integration takes a day or less.”
However, he said, finding room for those convergence operators will soon be a thing of the past: “Eight convergence operators sitting in a B unit is an inconvenience that has to go away.”
Automated Convergence?
3ality Digital is setting up some test events for an automated convergence system, which will allow the director to cut between cameras without having to rely on convergence operators to keep the subjects’ depth consistent.
“That depth is completely measurable, so it can be automated,” Schklair said. “At some point, as soon as we finish testing the software, the operators go away because the computer can keep all the depth consistent, and the stereographer will oversee it.”
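Schklair’s claim that depth “is completely measurable” amounts to this: if software can measure a subject’s disparity on each camera, the horizontal image shift a convergence operator would dial in can be computed automatically so the subject sits at the same depth across every cut. A minimal sketch of that idea; all names and values below are hypothetical, not 3ality’s software:

```python
# Minimal sketch of automated convergence as Schklair describes it:
# measure each camera's subject disparity, then compute the horizontal
# image shift that places the subject at a common target depth, so the
# director can cut freely without depth jumps.
# All names and numbers are hypothetical, not 3ality Digital's system.

def convergence_shift_px(measured_disparity_px: float,
                         target_disparity_px: float) -> float:
    """Horizontal shift between eyes needed to reach the target disparity."""
    return target_disparity_px - measured_disparity_px

# Measured subject disparities (px) on three cameras before a cut (assumed):
cameras = {"slash_left": 14.0, "center_low": -6.0, "high_wide": 3.0}
target = 0.0  # keep the subject on the screen plane across cuts

shifts = {name: convergence_shift_px(d, target) for name, d in cameras.items()}
print(shifts)  # {'slash_left': -14.0, 'center_low': 6.0, 'high_wide': -3.0}
```

In this scheme the stereographer sets the target depth and supervises, while the per-camera corrections run continuously, which is the role Schklair sketches for the system.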
He put that point at three to six months away, once testing is complete.