NAB Reflections: Adobe’s Steve Forde on How Sensei AI Fuels the Latest Creative Cloud Features
In the sports environment especially, AI enhances collaboration
Adobe unveiled a major update to its Adobe Creative Cloud platform at NAB 2018, highlighted by several new automation features powered by its Adobe Sensei AI platform. The new Color Match for Premiere Pro CC matches two shots with a single click, applying editable Lumetri adjustments from one clip to another to achieve visual consistency across the whole project. On the audio side, the new Sensei-enabled Auto Ducking feature automatically adjusts soundtrack audio around dialog, whether for a single clip or an entire Premiere Pro project. In addition to the new AI functionality, the Adobe Immersive Environment for VR/360 editing is now available in After Effects CC.
During the show, SVG sat down with Steve Forde, GM, emerging products, digital video and audio, to discuss the new Color Match and Autoducking features, how Adobe is leveraging Sensei to create the latest AI-fueled workflows, the latest VR/360-video features, and much more.
What are the big updates being announced for Premiere Pro here at the show?
The two big [new features] are Color Match and Auto Ducking. Both leverage Adobe Sensei, which is our platform for AI and machine learning.
For shot matching, [a user] can now apply a high-quality color grade across the entire timeline based on a core element within the scene. Our machine-learning algorithm had to watch a lot of video and can now make that happen. I think that’s one of the most powerful [abilities] of Creative Cloud.
A lot of our customers share information with us as the input content and collaborate. We don’t understand the nature of that content, nor do we associate that with anything the user is doing. But what we can do is algorithmically understand the content and use it to [enhance] the algorithm … to better understand things like skin tone. So the algorithm becomes smarter. Then we brought in professional color-grade artists to say, “Okay, this is what the algorithm thinks it can do; let’s critique that and teach it a little bit more.” And we [baked] that algorithm into the product.
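Adobe’s Sensei model is learned and proprietary, but the classic non-ML baseline that shot matching builds on, statistical color transfer, can be sketched in a few lines. Everything here (the function name, the per-channel mean/standard-deviation matching) is illustrative, not Adobe’s implementation:

```python
import numpy as np

def match_color(source, reference):
    """Shift each channel of `source` so its mean/std match `reference`.

    A classic statistical color-transfer baseline, not Adobe's learned
    Sensei model. Images are float arrays of shape (H, W, 3) in [0, 1].
    """
    src = source.astype(np.float64)
    ref = reference.astype(np.float64)
    out = np.empty_like(src)
    for c in range(3):
        s_mean, s_std = src[..., c].mean(), src[..., c].std()
        r_mean, r_std = ref[..., c].mean(), ref[..., c].std()
        scale = r_std / s_std if s_std > 1e-8 else 1.0
        # Re-center on the reference mean, re-spread to the reference std
        out[..., c] = (src[..., c] - s_mean) * scale + r_mean
    return np.clip(out, 0.0, 1.0)

# Example: a dark clip graded toward a brighter reference shot
rng = np.random.default_rng(0)
dark = rng.uniform(0.0, 0.4, size=(64, 64, 3))
bright = rng.uniform(0.5, 0.9, size=(64, 64, 3))
matched = match_color(dark, bright)
```

The learned version Forde describes improves on this kind of global statistic by understanding content, such as skin tone, so it can match what matters in the frame rather than the whole histogram.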
That’s the exact same thing we did with Auto Ducking. We basically had Audition listen to a lot of audio and learn in order to make the algorithm better.
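Sensei’s learned contribution to Auto Ducking is detecting dialog; the ducking itself is a gain envelope applied to the music bed. A minimal rule-based sketch, assuming a ready-made boolean dialog mask (one flag per sample) rather than Audition’s actual implementation:

```python
import numpy as np

def duck_music(music, dialog, duck_db=-12.0, fade=200):
    """Lower `music` wherever `dialog` is active, with short fades.

    A rule-based sketch of ducking, not Adobe's learned approach:
    the hard part Sensei solves is *producing* the `dialog` mask,
    which this toy version simply takes as given.
    """
    # Target gain: unity when no dialog, duck_db (as linear gain) during it
    target = np.where(dialog, 10 ** (duck_db / 20.0), 1.0)
    # Smooth the gain curve so it ramps over `fade` samples instead of jumping
    kernel = np.ones(fade) / fade
    gain = np.convolve(target, kernel, mode="same")
    return music * gain

# 1 second of 8 kHz noise "music", with dialog in the middle half
sr = 8000
music = np.random.default_rng(1).standard_normal(sr) * 0.1
dialog = np.zeros(sr, dtype=bool)
dialog[sr // 4 : 3 * sr // 4] = True
ducked = duck_music(music, dialog)
```

In the ducked region the music sits about 12 dB lower, with a short crossfade on either side so the level change isn’t audible as a jump.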
How do you see AI and machine learning impacting the broadcast market in the coming years?
I know some people worry that AI and machine learning are going to replace them one day. Actually, these [programs] are not that smart, but, when you teach them to do one thing really well, they can excel. Computers are very good at repetitive tasks, so we are focusing on getting rid of the tedious tasks that consume a lot of time. That way, we can let the user be more creative.
I think that’s the core message for what we believe artificial intelligence and machine learning bring to the table. You’re seeing these two [features] as big investments we made in our video and audio products, and I think you’ll continue to see a lot more.
Are you seeing much interest in AI-fueled features by clients in the sports-production space specifically?
We have relationships with many of the major sports brands, and one of the major machine-learning requests we’ve heard is [the ability to] create highlight reels very fast. How can you teach it to understand a goal in soccer, or the core components common to each sport, and then cognitively understand that? I think that’s an area that’s of interest for us. Does that replace anyone? Absolutely not. That just makes what they’re doing better. That is where we want to go with AI.
Adobe Team Projects has proved quite popular in the sports community. Why do you think that is, and how has use of that platform evolved since its debut?
I think the collaboration in a sports environment is huge. You’ve got a lot of different people in disparate locations working together on a similar task. So Team Projects has been very popular specifically in [that] part of the market, and it’s something that we continue to see.
But I think Team Projects is only the beginning. We’ve got some great interaction between After Effects and Premiere Pro that allows an After Effects artist to work with a Premiere Pro editor more seamlessly. And they can collaborate just by logging in.
One of the biggest things with Team Projects is, it gets rid of one of the most difficult things that people have when they collaborate: passing files around. They get into scenarios of Untitled 1, 2, 3, 4, 5 … Final-Good-1, Final-Good-2, that kind of stuff. And, in a sports environment, fast turnaround is everything. If you’re trying to figure out which project file is the good one, that’s where I think Team Projects really shines. You log in, you have context, you can go back to the conversations if you want, you can do that quick turnaround. That’s really leveraging the power of the cloud.
Adobe has added several new VR and 360-video production capabilities to the Creative Cloud over the past year. Any new features being announced at the show on that front?
We have what we call the Adobe Immersive Environment. We have put a lot of effort into After Effects for this context. Premiere’s had really good response, and people are creating 360 content. We wanted to bring that into the motion-graphics environment, especially as people are creating complicated titles and effects.
Once again, that shows the power between After Effects and Premiere Pro. But then, being able to have the After Effects artist put on the VR goggles and see what’s going on from a 360 point of view in context with what the edit may be is powerful. That was most of our investment with Immersive. They don’t have to say, “Tell me what this looks like from the goggles, because I can’t do that in After Effects; I need to be able to see that in both environments.”
Do you see the amount of VR production being done with Adobe tools increasing right now?
I think VR continues to be exploratory with the market. We’re investing in it as we see more demand. We’re really excited about the [new features], but we’ll continue to read the market and develop tools in response.
What about augmented reality? Any plans to integrate AR capabilities into Premiere Pro and other Creative Cloud tools?
I definitely think our VR work sets the stage a little bit for what can be in an AR environment. We’re not announcing much here at the show in relation to AR, but I will say, while VR is interesting, AR is really interesting. We are talking to a lot of folks, including in the sports space, and we think we’ll have more to talk about [regarding] AR in the near future.
Adobe has made an effort to integrate with third-party vendors in the broadcast space over the past few years. Why is this so important to your strategy?
It’s critical. At the end of the day, we compete heavily in this business, and our open infrastructure is probably one of our biggest competitive advantages. When you look at a system — whether that’s EVS in the sports context or our MAM partners — we now have over 250 panels in Premiere Pro and After Effects. And that’s also expanding into things like Audition, which is really great. But I think that open architecture is the reason a company like EVS can recommend to its customers that the system will work in conjunction with ours, because we’ve co-developed it.
And we have relationships in some unlikely places as well, like Avid Interplay, which integrates with Premiere Pro. Even though we compete in some areas, we also cooperate in others, so that’s a very powerful statement, I think, to the market. And sports was probably one of the key areas that led the way on that.
Over the past decade or so, Premiere Pro has gained a much larger market share of the NLE sector. How has this growth impacted your strategy, and how will you look to continue the momentum moving forward?
At the beginning of this transition, Premiere was a distant third in the market; it was an Avid and Final Cut world. The key for us was focusing on our customers. Irrespective of decisions made at other companies, we had to focus on what our customers need and how we are serving them. We took a very one-on-one approach in order to do that. We knew we had to earn the business at CBS or at ESPN and ABC. We found out what we needed to do to get the product where it needed to be, and we just made that investment over and over again. We’re still doing it.
That’s why we have more than 300 meetings [at NAB 2018]. They still come in with a laundry list, saying, “This is great, this sucks, we need this.” Then we have an engagement team of engineers to make sure that those developments get done. We work on something with our customer, and then that migrates into what the public sees in the final product. A lot of these customers probably wouldn’t have given Premiere the time of day 10 years ago [but] truly feel we’re in a partnership, and they in fact are helping us design that.
This interview has been edited for length and clarity.