Befores & Afters Explores the AI, VFX, and Virtual Production Behind Robert Zemeckis’ Here

December 5, 2024

befores & afters Issue 24 is an entire magazine dedicated to the visual effects, virtual production and machine learning technology behind Robert Zemeckis’ Here.

Journalist Ian Failes uncovers how this ambitious production came to life, with insights from key crew including Visual Effects Supervisor Kevin Baillie, Dimension and DNEG 360 CTO and Virtual Production Supervisor Callum Macmillan, Metaphysic's Jo Plaete, and DNEG's Alexander Seaman and Martine Bertrand.

In Here, Robert Zemeckis adapts Richard McGuire’s 2014 graphic novel, spanning more than a century from a single location. Shot at Pinewood Studios in London, the story is filmed from one fixed camera angle with a constant view out of a living room window.

Zemeckis and Baillie engaged Dimension and DNEG 360 for the second time, following our successful collaboration on Disney+’s Pinocchio (2022). For Here, we operated the LED stage and produced realtime content, visible through the living room window, spanning more than a hundred years.

The 30-foot by 15-foot LED wall (made up of ROE BP2V2 panels driven by Brompton Tessera SX40 image processors), positioned outside the living room window on set, displayed realtime environments across around 85 different eras.

"The virtual production team and the physical production teams would play off of each other to create an illusion that was exactly what Don [Burgess, Director of Photography] had in his head."

Kevin Baillie

VFX Supervisor

Dimension also took on a significant asset build and asset tracking role. One standout asset was a realtime car system made in Unreal Engine, which allowed Baillie or the first AD to use an iPad to trigger passing cars on cue as dialogue was being delivered. Beyond cars, assets included trees and foliage, the colonial house visible in much of the exterior view out the window, and weather, including blizzards, lightning storms, hailstorms, rain and wind systems.
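The article doesn't detail how the iPad cues reached the engine, but one plausible route is Unreal Engine's Remote Control HTTP API, which lets an external device call a function on an actor in the running level. The sketch below is purely illustrative, assuming that mechanism: the host address, object path, and `TriggerCarPass` function are all invented placeholders, not the production's actual setup.

```python
import json
from urllib import request

# Hypothetical sketch: cueing a car pass over Unreal Engine's Remote Control
# HTTP API (enabled via the Remote Control plugin, default port 30010).
UE_HOST = "http://192.168.1.10:30010"  # stage machine running Unreal (assumed)

def build_car_cue(object_path: str, era: str, lane: str) -> dict:
    """Build the JSON body for a Remote Control 'call function' request."""
    return {
        "objectPath": object_path,          # e.g. a car-spawner actor in the level
        "functionName": "TriggerCarPass",   # hypothetical Blueprint function
        "parameters": {"Era": era, "Lane": lane},
        "generateTransaction": True,
    }

def send_car_cue(payload: dict) -> None:
    """PUT the cue to Unreal's /remote/object/call endpoint."""
    req = request.Request(
        UE_HOST + "/remote/object/call",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="PUT",
    )
    with request.urlopen(req) as resp:
        resp.read()

# Example cue an iPad front-end might fire between lines of dialogue
payload = build_car_cue(
    "/Game/Stage/Exterior.Exterior:PersistentLevel.CarSpawner_1", "1960s", "left"
)
```

A thin iPad web app posting these requests would be enough to give the first AD a one-tap "car" button without touching the engine directly.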

We leaned on our custom build of Unreal Engine, incorporating a suite of rendering technologies and NVIDIA Ada Lovelace architecture GPUs, to orchestrate the realtime imagery on set.

The LED volume also doubled as a system for shooting matte passes, which our team dubbed the ‘disco pass’.

"We would cue the disco pass and that would be the last thing we would record for a particular scene. This would give information to Kevin, and later to DNEG, to then enable reconstruction of elements and help with transitions."

Callum Macmillan

CTO and VP Supervisor, Dimension & DNEG 360
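The exact pattern sequence used for the disco pass isn't described, but the general idea of a post-take matte pass can be sketched simply: if the wall flashes a known pattern after the take, any pixel that changes strongly between the clean frame and the flash frame is receiving light from the wall, directly or by reflection. The sketch below assumes a solid-colour flash; it is an illustration of the concept, not the production's method.

```python
import numpy as np

# Minimal sketch, assuming the LED wall flashes a known solid colour after
# the take: pixels that change strongly between the clean frame and the
# flash frame are (direct or reflected) contributions from the wall,
# giving compositors a matte for those regions.
def disco_matte(clean: np.ndarray, flash: np.ndarray,
                threshold: float = 0.1) -> np.ndarray:
    """Return a soft matte (0..1) of pixels influenced by the LED wall.

    clean, flash: float images in [0, 1], shape (H, W, 3).
    """
    diff = np.abs(flash.astype(np.float32) - clean.astype(np.float32))
    magnitude = diff.max(axis=-1)              # strongest per-channel change
    return np.clip(magnitude / threshold, 0.0, 1.0)

# Toy example: a 2x2 frame where only the top-left pixel sees the wall
clean = np.zeros((2, 2, 3), dtype=np.float32)
flash = clean.copy()
flash[0, 0] = [0.0, 0.5, 0.0]                  # wall flash reflected here
matte = disco_matte(clean, flash)
```

In practice a sequence of patterns (hence "disco") would separate different regions and reflection paths, but the differencing principle is the same.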

"We could create these mattes to enhance the background when we needed to – if we felt like we needed a little more reflection in this table – or to dim it because Bob was finding it distracting, then we had all the tools to do that."

Kevin Baillie

VFX Supervisor

We also ensured an accurate line-up between the live-action camera, the physical set, and the virtual world, aided by nLight’s HD3D realtime depth camera. Its depth maps could be combined with the main camera feed, streamed either directly into Unreal Engine on set or used in post-production via nLight’s DCC plugins.

“This depth data helped align the virtual set with the real world for continuity, ensuring everything was always lined up and capturing a realtime depth mesh for each scene for integration with additional post VFX.”

Callum Macmillan

CTO and VP Supervisor, Dimension & DNEG 360
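The "realtime depth mesh" mentioned above starts from a standard operation: back-projecting a depth map into 3D points in camera space using the camera intrinsics. The sketch below shows that step with the usual pinhole model; the intrinsics are placeholder values, and nLight's actual pipeline and plugin interfaces are not documented here.

```python
import numpy as np

# Sketch of turning a realtime depth map into a camera-space point cloud,
# the kind of data that helps line a virtual set up with the physical one.
# fx, fy, cx, cy are placeholder pinhole-camera intrinsics (assumed).
def depth_to_points(depth: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Back-project a (H, W) depth map (metres) to (H*W, 3) XYZ points."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Toy 2x2 depth map: every pixel 2 m from a unit-focal-length camera
depth = np.full((2, 2), 2.0)
points = depth_to_points(depth, fx=1.0, fy=1.0, cx=0.5, cy=0.5)
```

Triangulating such points per frame gives a depth mesh that downstream VFX can use to occlude, relight, or replace the view through the window.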

Continue reading to learn more about the virtual production, visual effects, and machine learning innovations of Here in befores & afters Issue 24.