Laurel Canyon Live Uses Ultimatte and ATEM Constellation 8K for Live Streaming Virtual Productions
Recently, Blackmagic Design announced that Laurel Canyon Live is building a state-of-the-art live streaming venue with Blackmagic Design gear, including Ultimatte 12 compositing processors, an ATEM Constellation 8K live production switcher, URSA Mini Pro 4.6K G2 digital cameras and more.
Laurel Canyon Live President John Ross talked to ProductionHUB about the new venue, the future of live streaming and why he decided to create this space for artists, content creators, media and entertainment companies and more to stream performances and events against dynamic virtual backgrounds in real time.
PH: Can you talk about Laurel Canyon Live? How did it get started?
John Ross: The idea of Laurel Canyon Live first came to me about two years ago, after we did a charity event in our mixing stage with Colin Hay from Men at Work. It dawned on me that this IMAX / Dolby Atmos equipped room with a cinema-scale projector was actually a very interesting place to do live concerts in the virtual space. I set about exploring how we could, with a few tweaks, turn this place into a completely unique live production space.
We had about 80% of the infrastructure in place already, being utilized for the primary purpose of this room, which is to mix motion picture soundtracks. We had a large console installation, an amazing sound system installed, and many Pro Tools systems rigged to play back. So the audio side was actually pretty sophisticated and, honestly, likely massive overkill for this new venture. We needed to add the production elements, including lighting and control, the ability to hang the lights and, finally, the ability to actually shoot the material. That is when we went about looking at the various options and acquiring the many pieces of equipment necessary. For me, it made more sense to go to one vendor that had figured out a lot of the handshaking protocols from one discipline to the other, and had a holistic approach to how all of this technology works together. There are not many choices out there aside from Blackmagic Design. They have great cameras, switching gear, recording capability, and post-production tools including DaVinci Resolve, and that was the reason we went in that direction.
In my research for putting together Laurel Canyon Live, one of the clients that we have worked with often is Gary Goetzman, who is Tom Hanks' producing partner at Playtone, and also the producer of the Rock and Roll Hall of Fame. He was nice enough to invite me to the Hall of Fame in New York, where we got a backstage tour of how the whole thing is put together. We visited the truck with the sound installation, lighting installation and video switching gear, and that is when I realized we basically had exactly that capability here already, and that I could make a permanent version of the truck in pre-existing editorial suites.
PH: What types of Blackmagic Design gear can you find in the venue? How did you decide which gear to include?
John Ross: At Laurel Canyon Live, all of the cameras that we use are Blackmagic Design cameras. We have six URSA Mini Pro 4.6K G2s with a combination of PL and EF lens mounts, and eight Micro Studio Camera 4Ks that make up the balance. The URSA Mini Pro 4.6K G2s are tethered to the machine room via fiber optic SMPTE 311 hookups using Blackmagic Camera Fiber Converters and Blackmagic Studio Fiber Converters. We have an ATEM 4 M/E Advanced Panel as a video switching surface and the ATEM Constellation 8K 40x24 12G switcher. We also have two Smart Videohub 12G 40x40 routers and a number of DaVinci Resolve Studio installs. We have the ATEM Camera Control Panel and the DaVinci Resolve Mini Panel, as well as two Ultimatte 12s with Ultimatte Smart Remote 4s. We also have a number of multi-view devices, including two Blackmagic MultiView 16s, so that the various truck functionalities, meaning the monitors inside the various disciplines such as lighting and switching, all have access to multi-view sources. Finally, we have HyperDeck Studio Pros that we use for recording some of the material.
PH: How can content creators, artists, and media companies utilize this space?
John Ross: Basically, I think of this place as a toolbox. We have a large room with a stage set up, and that room is the primary performance area, where we have the capability of staging a pretty large band and filming them with multiple cameras. For the last show, we had two jibs and two railed cameras set up. You could perform and push the show immediately out to the general public, or film the performance, do some editorial and then push it out at a later date. Essentially the entire 15,000 square foot house has been wired with tactical fiber, so not only can you use this particular room, but other areas of the house can be used as well, all of which are capable of hosting shows or housing live artists in various incarnations.
PH: Can you talk a little about the dynamic, virtual backgrounds? How do you create and choose those?
John Ross: We use Unreal Engine as a solution for creating these real-time CGI environments and backgrounds. Currently, we are using green screens as a way to key out the real aspect of what’s going on and key in these virtual backgrounds. We have some pre-designed environments, including rooms and halls, but the environments that ultimately will be used by any particular artist are part of what the individual design teams would be seeking to create. So there is no standard concept. But this is a big part of the design element: the artist and the team dictate how it will ultimately appear. It can be anything from fading in and out of a location in the middle of a song, to the entire concert taking place in the middle of a virtualized space. All of these things are possible, and all can be done live in real time.
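The keying Ross describes, pulling the performers off the green screen and compositing them over the Unreal Engine background, is done in hardware by the Ultimatte 12 in real time. As a rough illustration only, the basic compositing math can be sketched in Python with NumPy. The function name and the `threshold` heuristic here are invented for this sketch; a real keyer such as the Ultimatte generates a soft matte with edge blending and spill suppression rather than the hard mask shown:

```python
import numpy as np

def green_screen_key(fg, bg, threshold=1.3):
    """Composite a foreground frame over a background by keying out green.

    fg, bg: float arrays of shape (H, W, 3), values in [0, 1].
    A pixel is treated as green screen when its green channel exceeds
    both red and blue by the factor `threshold` (a toy heuristic).
    """
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # Hard matte: 1.0 where the pixel is foreground, 0.0 where it is green screen.
    matte = ~((g > threshold * r) & (g > threshold * b))
    matte = matte[..., np.newaxis].astype(fg.dtype)
    # Matte-weighted blend of foreground and background.
    return matte * fg + (1.0 - matte) * bg

# Toy frame: left half is pure green (screen), right half is gray (the "subject").
fg = np.zeros((2, 4, 3))
fg[:, :2] = [0.0, 1.0, 0.0]
fg[:, 2:] = [0.5, 0.5, 0.5]
bg = np.ones((2, 4, 3))  # a stand-in for the rendered virtual background
out = green_screen_key(fg, bg)
```

In the toy frame above, the green half of `out` is replaced by the background while the gray "subject" pixels pass through unchanged.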
PH: How is the venue a game-changer for the industry?
John Ross: Until now, there have essentially been two ways of consuming live entertainment, two ways a band can communicate a vision or an idea to its fan base. The first is music videos. Music videos are a medium that can create almost anything visually, but they are essentially just a snapshot in time: something that occurred at a certain moment and will always be a replay of a past, “dead” event. The second is a live performance happening in real time. This, however, comes with the limitation of only being able to do things on stage, building an illusion that requires your audience to be there in person to witness everything. Further, for the most part, when you go to larger venues the audience is pretty much watching the artist on television anyway, because they are watching the screens on either side of the stage, since they often can’t get that close to the artist.
So what we’ve done, we believe, is create another option that sits halfway between the music video and the classic live concert: a live event that borrows from the technologies currently deployed in the music video space, but in real time, with a live interactive audience witnessing the event. Combined with the fact that it is a one-off event and will not be replayed at some other point in time, the audience is part of a group witnessing a unique event, with all the jeopardy that comes with a live performance. And now that we’ve electronically tethered an audience to the artist, we have the ability to do some really interesting things with this technology that still maintain the jeopardy and uniqueness of a live, one-off environment.
PH: Do you think most concerts and performances will utilize this technology in the future?
John Ross: I believe that once the COVID situation changes and there is a return to normal life (although no one is quite sure what normal life will be), people will have become more comfortable with coexisting in a more virtualized environment, by virtue of the fact that they are beginning to decentralize from the places where they work and where they gather for entertainment. Traditional concerts and attendance will of course come back, but we believe you can enhance or amplify attendance for any particular concert by utilizing this virtualized aspect as well. All of a sudden, a 15,000 seat auditorium or arena, compared to the scale of access via the internet, becomes a fairly small venue, and the potential to touch literally millions of additional people at one time is quite appealing for an artist. Once traditional touring comes back, I think this layer of reach will be available and will enhance the concertgoing experience.
PH: How will it continue to evolve?
John Ross: The technology will always continue to evolve. Right now, H.264 is a compression standard that allows people to receive very good quality pictures with a very small footprint. It allows you to watch things on your iPhone, on televisions and so on without really stressing the hardware and software. H.265 has already been ratified; it has a better compression algorithm, supports resolutions up to 8K, and will continue to up the ante, allowing better and better pictures that deliver more of a high-quality experience. There is also work being done on H.266, with a lot of it focused on supporting AR and VR. Once we’ve made a commitment to head into this electronically tethered world and entertain via real-time streaming environments, these new technologies will make the experience exponentially better. It will be very difficult to un-ring this bell. It’s a situation where it will only get better and more compelling. The technology will become more and more accessible, and as the reach of the internet grows, more and more people will be able to experience this, and the scale will be pretty massive.
PH: When will this venue be ready to use?
John Ross: Laurel Canyon Live will be ready very, very soon. In about two months we will be live with a Dolby Atmos / Dolby Vision client interface. We’ve done four of the six tests with Dolby, with whom we are working to develop some of the back-end / user interface aspects.