BTS “Dunkirk VR Experience” with Matthew Lewis | Practical Magic

We’re proud to shine a light on one of the standout 360/VR productions of the last year, Save Every Breath: The Dunkirk VR Experience. Original content developed by Practical Magic, the piece takes us inside the world of acclaimed filmmaker Christopher Nolan‘s epic WWII film Dunkirk, in which 400,000 Allied soldiers are trapped on a beach in France. As the name implies, Practical Magic uses a lot of practical effects and innovative techniques. They also used the SkyBox 360/VR plugins, now integrated into Adobe CC 2018, in their production pipeline. Read our interview below with Co-founder and CEO Matthew Lewis to find out more.

Save Every Breath: The Dunkirk VR Experience

Q&A with Matthew Lewis, CEO and Co-founder, Practical Magic

Hello Matt! So tell us – what was the main goal of the project?

For “Save Every Breath: The Dunkirk VR Experience”, the goal was to use 360-degree video to immerse viewers in a brief taste of three distinct sections of the film. Dunkirk takes place on land, on the sea, and in the air – and we take the VR viewer into each of those three worlds in the VR experience. The viewer is underwater in the English Channel, then pilots an RAF Spitfire, and finally ends up on the beach, as one of the many trapped Allied soldiers. It was a fun challenge to accurately depict those three settings in VR.

The viewer is immersed in the action: underwater in the English Channel, then pilots an RAF Spitfire, and finally ends up on the beach, as one of the many trapped Allied soldiers.

The VR Experience has 3 settings – land, sea and air. How did you approach each one?

Production was fairly complex, as we don’t use 360° cameras. Instead, we use motion control systems to shoot “slices” of scenes, and put them back together in the computer later. We also try hard to shoot everything as practically as possible. So, for example, when we shot underwater, we were really shooting underwater; it wasn’t a digital effect. The water you see in the VR piece was real water, and the soldiers you see are really underwater. For the beach scene, we traveled to France, and shot live-action motion control plates on the actual beach at Dunkirk. Everything in that shot is location-accurate right down to the sand. We then came back to LA, recreated the same motion control move, and shot live-action actors against blue-screen, who were then composited into the live-action beach.

The cameras were RED EPIC Dragons shooting 6K resolution slices, up to about 24 slices per shot, which means a total source resolution of over 140K before assembly.
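As a rough sanity check on that figure, the arithmetic works out if each slice contributes its full 6K width. A minimal sketch, assuming each slice is about 6,144 pixels wide (RED’s nominal 6K sensor width, our assumption) and that slices tile side by side; in practice stitching overlaps adjacent slices, so the usable resolution is somewhat lower:

```python
# Back-of-the-envelope check of the combined source resolution.
# Assumptions (ours, not stated in the interview): each slice is a
# full ~6,144 px wide 6K frame, and slices tile with no overlap.
SLICE_WIDTH_PX = 6144    # nominal RED 6K horizontal resolution
SLICES_PER_SHOT = 24     # "up to about 24 slices per shot"

total_px = SLICE_WIDTH_PX * SLICES_PER_SHOT
print(f"Combined horizontal resolution: ~{total_px // 1000}K")  # ~147K
```

That lands a little above the quoted 140K, which fits the interview’s “over 140K before assembly,” since stitching overlap trims the final usable width.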

Practical effects were used throughout the VR experience – the actors were filmed underwater.

140K – that’s an enormous amount of resolution. Tell us about the different camera setups you used to capture that.

We used very different setups for every scene, and even different parts of every scene, but the camera was almost always a RED EPIC. For example, underwater, we put a RED EPIC into a waterproof housing and shot each angle as its own element. In plain English, we put the camera maybe 5 or 6 feet underwater, then had actors jump into the water from a platform we built overhead, falling past the camera. This let us capture natural light penetrating the surface, real people, real bubbles, and so on. We shot a few dozen elements with different actors in different positions, and used the best ones to composite the underwater scene.

The beach scene was shot differently, using a motion control system, but still with a RED EPIC camera. This was a very unusual setup, with a total crew of four people (including myself) racing a rapidly approaching tide on location in France. We used an in-house CNC machine to build a custom, lightweight motion control base that we could set up and tear down very quickly, because the tide moves so fast at Dunkirk. I think it was around 5 am when we started moving the kit out to the water line to set up, and we then shot a few passes when the sun was in the correct spot. We had to be right at the edge of the water, and the tide shifted with each take, so it was unavoidable that the system would end up underwater, which is exactly what happened. In the shot you see in the finished product, the camera was running on motion control track that led straight into the English Channel.

Everything in the beach shot is location-accurate right down to the sand. Back in LA, the live-action actors were shot against blue-screen and composited into the live-action beach.

What about the audio? It is also key for a great VR experience. How did you deal with that?

We do build a spatial mix for our 360/VR projects, and that’s definitely true for The Dunkirk VR Experience. A strong example of that is when you’re underwater and the bullets start piercing the surface, whizzing past you left and right. If you’re watching in a VR headset and you have spatial audio support, you get a massive benefit from hearing those bullets travel through your world, from the moment they pop through the surface of the water, through the bubble trail they leave behind as they pass you.

How did you assemble all of the slices? What software, and how long did that take? Have you worked this way before — shooting slices and assembling? How did you know it would work?

From a post-production perspective, this is largely uncharted territory. The software is evolving day-to-day, it’s way out on the edge of feature sets. We knew it would work because we’d done it a year earlier on Assassin’s Creed VR for FOX. We used a lot of software, the most recognizable of course being Nuke and After Effects. Nuke and the CaraVR toolset did a lot of the heavy lifting for foundational compositing, while After Effects and Skybox were used for finishing work and a few major hero effects. We had just a few weeks to perform all of the post work with a small team, working on multiple shots at a time in parallel. Meanwhile, Double Negative in London was using their 3D assets from the film to build the air scene, and they did an excellent job in the same short timeframe.

When it came time to render, we were working with a fairly large amount of source material in each frame, and performing reasonably intensive compositing tasks. Fortunately we have After Effects built into our render farm, so we were able to ‘parallelize’ the render work and get it done on time.
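Parallelizing a render like that typically comes down to frame-range splitting: each farm node renders a contiguous chunk of the shot, and the chunks concatenate back in order. A minimal sketch of that scheduling idea (the node count and frame range below are hypothetical; this is not Practical Magic’s actual farm software):

```python
# Hypothetical frame-range splitter for a render farm: divide a
# shot's frames into contiguous, near-equal chunks, one per node,
# so nodes can render in parallel.
def split_frames(first, last, nodes):
    """Return a list of (start, end) frame ranges, inclusive."""
    total = last - first + 1
    base, extra = divmod(total, nodes)
    chunks, start = [], first
    for i in range(nodes):
        size = base + (1 if i < extra else 0)
        if size == 0:          # more nodes than frames
            break
        chunks.append((start, start + size - 1))
        start += size
    return chunks

# e.g. a 1,000-frame shot spread across 8 nodes
print(split_frames(1, 1000, 8))
```

Each tuple would then map to one farm job (for After Effects, something like one `aerender` invocation per range), which is what lets the work finish in roughly one node’s share of the time.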

3D assets were used to build the air scene.

What kind of computing power did you need? 

We used Dell Precision Workstations running Adobe Creative Cloud on Windows to do many of the visual effects and most of the overall post production.

Where can people view Save Every Breath?

It was published globally on all of the major VR platforms, as well as Facebook and YouTube for mobile viewing. It was also localized in maybe 10 or 11 different languages, so more people worldwide could have a native experience.

Save Every Breath is available globally on Facebook, YouTube and every major VR platform.

Where did Mettle come into the workflow? How did it facilitate the production?

The Skybox tools were used whenever we were working in After Effects. Not just for titles and general compositing, but also for complex work like putting all of the soldiers on the beach, which was done using a combination of Skybox and Trapcode Form. The soldiers were all real live-action soldiers, but we used them as sprites in Trapcode Form to multiply them on the beach, so 20 or 30 soldiers became hundreds, and they all moved and acted organically, because they were all real people.

You already had experience with Mettle software. What are some of your earlier projects?

We’ve been using Skybox since its earliest days, I think our first big public use was on a project with Jaunt called “Escape the Living Dead VR“, and then extensively on Assassin’s Creed VR, a complex project we did for Fox in 2016.

What do you think of the recent integration of Skybox into Adobe CC? What’s the impact on creators like you?

I think it’s critical that VR tools like Mettle Skybox are integrated into Adobe Creative Cloud, and our team was entirely unsurprised to hear that Adobe had come to the same conclusion. It’s great that Skybox is part of Adobe CC now; it’s an important tool, and I’m excited to see how people start using it.

Many Thanks Matthew! We’re happy to have helped on this project and we look forward to what you do next!

Dunkirk is the highest-grossing World War II film of all time, taking $525 million worldwide. Dunkirk received praise for its screenplay, direction, musical score, and cinematography; some critics called it Nolan’s best work, and one of the greatest-ever war films. His ten films have grossed over US$4.7 billion worldwide and garnered a total of 26 Oscar nominations and seven wins.

Matthew Lewis.

Practical Magic is a “creative engineering” and full service production company working alongside the studio system in Los Angeles to build new production tools and techniques, and put them to work — telling stories in ways never before possible.
