Unreal Part 2 : Architecture + Film [DXY Journal]

Filming The Mandalorian in The Volume, a 75ft-diameter sound stage with LED-lined walls displaying real-time, computer-generated backdrops that shift in perspective and parallax with the facing cameras.

The DesignXY Ltd Journal, looking in depth at how our architectural practice uses digital technologies to visualise projects to a high standard, using software that is increasingly shared with the film and game industries [Part 2/4].

[7min Reading Time]

Dear @disneyplus, why are we being treated like second class citizens in the UK, having to pretend we’re so pleased for the rest of the world to be able to watch #TheMandalorian, while we have to wait until... when... the end of March 2020?

That’s a long time. The universe is a dangerous place and frankly, not all of us are going to make it, waiting to join everyone that’s already in #AGalaxyFarFarAway.

@kristofferbarry, Instagram - 6th December 2019

When I wrote that in December 2019, I had no idea that we would be under a Covid-19-induced lockdown by the time Disney+ and The Mandalorian arrived in the UK. In many ways, it worked out better – it was the first time in a long while that I can remember any level of enthusiasm from my wife and three kids to sit down together and watch something on TV in a serialised format. We enjoyed it so much that we forgave Disney; in fact, we enjoyed it so much that my wife, a mild-to-moderate Star Wars fan, said that she’d like to watch it again... so we did… over another eight Friday evenings.

It started with the film that was later re-titled Episode IV : A New Hope; like me, it was released in 1977. Although I didn’t see the film that year, all of my milestones in life run in parallel with it, and it has been hugely influential; certainly I have to attribute a significant part of my commitment to the ‘what if’ of life to the convincing world-building of Star Wars. I think I learned to draw in order to extend that world – but even today, I wish I had half the talent of illustrators like the late Ralph McQuarrie, or Doug Chiang.

If you’ve read this far in this Journal series, you’re probably hoping that there’s a point… well, I think there is. After the first season of The Mandalorian concluded, Disney Gallery : The Mandalorian appeared on Disney+, with eight parts, each dedicated to an aspect of the production of the show, from Directing (Episode 1) and Legacy (Episode 2) to Score (Episode 7), concluding with Connections (Episode 8) and the links to other explorations of the Star Wars galaxy.

It was Episode 4 (Technology), though, that really took me by surprise. Sat in the roundtable discussion, along with Jon Favreau (Executive Producer / Creator of The Mandalorian) and Kathleen Kennedy (Executive Producer / Head of Lucasfilm), were Richard Bluff (Visual Effects Supervisor), John Knoll (ILM Visual Effects Supervisor), Dave Filoni (Executive Producer / Director) and Hal Hickel (Animation Director). Jon Favreau made quite a claim in his opening statement.

“The Mandalorian is the first production ever to use real-time rendering and video wall ‘in-camera’ set extensions and effects that was again… necessity was the mother of invention because we were trying to figure out how to do the production here in the timeframe, at the budget level, but still getting the whole look that we’re used to seeing.”

Jon Favreau

At this point, I need to point out that the fact that this is a Star Wars production is almost incidental; it’s the next evolution in a process that Favreau has been developing with Disney since his remakes of The Jungle Book (2016) and The Lion King (2019). Disney and the Star Wars franchise had already utilised some of the technology described in this post: one particular sequence in Solo : A Star Wars Story offered proof of concept for one of the key elements – the use of a game engine to develop a sequence in great detail before it was actually committed to film.

What’s the difference in The Mandalorian? The main change is The Volume: a 75ft-diameter cylinder, with LED screens on its curved walls and flat ceiling. What drives the images on those LED screens is the key to why this is an entirely new way of creating content for film / TV.

On set, Baz Idoine (Director of Photography) explained the implications to the documentary crew: each camera’s position is motion-tracked, and the content on the video walls is rendered to match that camera’s viewpoint, so the perspective and parallax on the LED screens update in real-time as the camera moves.
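To make the principle concrete, here’s a minimal sketch of the idea (my own simplified illustration, not the production system): each physical pixel on the LED wall shows the virtual scene along the ray from the tracked camera position through that pixel’s real-world location, so the wall behaves like a window whose view updates as the camera moves.

```python
# Minimal sketch (hypothetical, heavily simplified) of camera-driven
# parallax on an LED wall: each wall pixel displays the virtual scene
# along the ray from the tracked camera through that pixel's physical
# position. Different camera positions give different rays, hence parallax.

from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def __sub__(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def normalised(self) -> "Vec3":
        length = (self.x ** 2 + self.y ** 2 + self.z ** 2) ** 0.5
        return Vec3(self.x / length, self.y / length, self.z / length)


def wall_pixel_ray(camera_pos: Vec3, pixel_world_pos: Vec3) -> Vec3:
    """Direction to sample the virtual scene for one LED-wall pixel."""
    return (pixel_world_pos - camera_pos).normalised()


# The same physical pixel (2.0m right, 1.5m up, 5.0m ahead) looks down a
# different scene ray from each tracked camera position - that per-frame
# re-aim is the real-time parallax described above.
for cam in (Vec3(0.0, 1.7, 0.0), Vec3(1.0, 1.7, 0.0)):
    ray = wall_pixel_ray(cam, Vec3(2.0, 1.5, 5.0))
    print(f"camera ({cam.x}, {cam.y}, {cam.z}) -> ray ({ray.x:.3f}, {ray.y:.3f}, {ray.z:.3f})")
```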

In behind-the-scenes footage from the set, a video screen comes to life, displaying the branding of four different companies whose technology is at the heart of the process: Industrial Light & Magic, Epic Games, Profile Studios and Lux Machina.

Hang on… we know there would be no Star Wars without Industrial Light & Magic; Profile Studios are involved from a performance / motion capture perspective; Lux Machina provide the LED panels and lighting. In what way are Epic Games involved with The Mandalorian?

Back at the roundtable, Kathleen Kennedy says she might struggle to explain the importance of a game engine in the production of the show; she receives this answer.

“It’s real-time, so the visuals that anybody playing a video game are looking at, are being calculated in milliseconds; so if you move right, or you move left in a scene, or you turn around and you see a view in an environment that you haven’t seen before, then it’s happening in milliseconds.”

Richard Bluff

“We’re taking that video game reality and we’re doing a cinematic experience based on it.”

Jon Favreau
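It’s worth pausing on ‘milliseconds’. A renderer that keeps pace with a moving camera has only the gap between frames to produce each image – a budget that’s easy to calculate (a quick illustrative sketch, not figures from the show):

```python
# Per-frame time budget for a real-time renderer at common frame rates:
# the image must be finished before the next frame is due.
for fps in (24, 30, 60):  # film, broadcast TV, a typical game target
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 24 fps -> 41.7 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms
```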

Epic Games is the studio behind Fortnite, which as a platform demonstrates what their game engine is capable of delivering, albeit in a highly stylised, cartoonish rendering. The origins of the company go back to 1991, as Potomac Computer Systems, which became Epic MegaGames in 1992 and simply Epic Games from 1999. After successes with Unreal and Unreal Tournament, and the popularity of Fortnite, Epic Games seem to be demonstrating with their input into The Mandalorian that their proprietary platform, Unreal Engine, is capable of more than building, compiling and rendering video games.

But what’s the point? Why is this such a paradigm shift for the TV / film industry? Within my lifetime, we’ve gone from fully practical effects in films, to the first attempts at computer graphics in Tron (1982) and Pixar’s short Luxo Jr (1986), the water effects of The Abyss (1989) and the liquid-metal effects of Terminator 2 (1991), through decades of blockbuster movies where green-screen effects have become the norm. Many of these visual effects no longer stand up to close scrutiny. We’ve come back to the realisation that the most convincing effects are those that looked real through the lens of the camera, as the action was being shot on set in the first place.

The Mandalorian represents a new version of that old-fashioned way of making films, with virtual landscapes and sets appearing on the LED screens surrounding the set and being filmed in real-time as the actors interact with their environments. This creates a more immersive and realistic experience for the actors and crew, which translates to a more believable experience for the viewer. It’s this immersive aspect that brought me back to thinking about this technology and its applications from an architectural perspective.

“What’s cool too, is now other film makers are coming through – other people that are curious about this stuff. Now that you see somebody could do it, it becomes easier to emulate, because this technology is pretty readily available, it’s nothing proprietary here – it’s game engine technology, it’s video screens, it’s positional camera data, it’s things that are kind of… it’s just combined in a way that nobody’s done it before and you also have to have an understanding of how to do it.”

Jon Favreau

I think Jon Favreau has touched on something very important. He recognised the limitations of the processes at his disposal; accepting them as they were, with their constraints of time, budget and technology, would have meant accepting a lower level of quality. Instead, he continued his exploration as an innovator in film-making – asking questions of the right people and testing potential new processes at a small scale, until a workable and satisfactory production pipeline was established.

The product in this case, the TV show, then starts to move through that pipeline, from pre-production (planning, pre-visualisation, set / costume / prop creation), to production (filming on set, musical score), and then into post-production (visual / audio mixing, colour grading, editing, etc.).

The products, in an architectural sense, are buildings. While the production pipeline for a building will be very different to that of a TV show, there are opportunities to use design technologies that are relevant across disciplines. Having a common toolset means professionals can work across a range of disciplines and share project files in their native formats, but also talk to one another with a depth of understanding of each other’s processes and workflows – meaning the end product of any collaborative project will be stronger and more cohesive.

This notion of collaboration is reinforced by the depth of the toolset. As well as Unreal Engine, there’s Quixel Megascans: a library of highly detailed materials, textures and objects (digital assets) that can be used in the Engine, but that can also be exchanged with a number of other digital platforms.
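As an illustration of how scriptable that exchange can be, here’s a minimal sketch using Unreal’s built-in Python editor scripting to import a downloaded asset into a project. The file path and destination folder are hypothetical placeholders; in practice, Quixel Bridge automates this hand-off for Megascans content.

```python
# Minimal sketch - run inside the Unreal Editor's Python console.
# Imports a downloaded asset file into the open project's content folder.
import unreal

task = unreal.AssetImportTask()
task.filename = "C:/Downloads/megascans_rock.fbx"  # hypothetical source file
task.destination_path = "/Game/Megascans/Rocks"    # hypothetical project folder
task.automated = True         # skip the interactive import dialog
task.save = True              # save the imported asset to disk
task.replace_existing = True  # overwrite a previous import, if any

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```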

We’ll look at the implications and opportunities for architectural applications in Part 4 of this Journal series, but next, we’re moving on to the reason that the Unreal Engine exists – Unreal Part 3 : Architecture + Games.

Kris


Previous

Unreal Part 1 : Architecture + Convergence [DXY Journal]

Next

Unreal Part 3 : Architecture + Games [DXY Journal]