Technical interview on the making of Avengers: Endgame by Framestore.
Enrik Pavdeja is a Compositing Supervisor at Framestore, where he led the compositing team on the Oscar-nominated Avengers: Endgame. Some of his major credits include Spectre, Star Wars: Episode VIII and Jurassic World: Fallen Kingdom, for which he was nominated for the VES (Visual Effects Society) award for Outstanding Compositing in a Photoreal Feature. He spoke with The Virtual Assist about the astonishing before-and-after of Avengers: Endgame.
Our sincere thanks to Madalina Grigorie (PR & Communications Manager, Foundry) for this exclusive interview.
Please share details of your educational background.
I did a BA in Computer Animation & Visualisation at Bournemouth University, UK. It’s the leading university in the country for Media and Visual FX, and a lot of artists in the industry have been through there.
What drove you to select the Animation and VFX industry?
It all started a long time ago, when my school sent us to a talk by Sony Computer Entertainment. They showed us how they’d made ‘Primal’ – their newest game release at the time – and I was hooked on everything digital, from the art and design to the modelling and rigging of their characters, to how they designed and built their game levels. From there I learned that they used Maya and started seeking out all the knowledge I could find. I began buying 3D World magazine and using all their demo software. Before I knew it, VFX was a thing, and it was all I was interested in.
From then on, I realised that I wanted to get into film, and once at university – having studied everything from fine art and cinematography to computer animation, mathematics and programming – I realised that I wanted to get into compositing, and one day hopefully become a compositing supervisor.
How has your journey been from Roto Artist to Compositing Supervisor?
I was very lucky to be offered a job by DNEG (Double Negative) just as I’d graduated, and then had some amazing mentors every step of the way. I quickly moved on from roto into paint while working on Inception. My colleagues, leads and supervisors at DNEG were very hands-on (Tom Luff, Scott Pritchard and Graham Page among many others), taking hours out of their day to train me in roto, paint and comp.
I started my first comp gig on the movie Paul, and from there moved on to Captain America and John Carter. Similarly, as I moved into comp, I had some great mentors again – largely the same people, but the culture was that everyone helped out. Once I became a lead, I again had some amazing mentors (Dan Snape, Marian Mavrovic and John Galloway). I’ve been very lucky to work with some of the best people in the industry, and outside of my own hard work, I owe a lot to the patience, talent and time of my peers.
Please list your recent credits:
Dr Dolittle, Avengers: Endgame, Jurassic World: Fallen Kingdom, Star Wars: Episode VIII, Doctor Strange, Teenage Mutant Ninja Turtles, Star Wars: Episode VII, Spectre, Ant-Man, Avengers: Age of Ultron.
You have worked with various leading studios. In your experience, do the pipelines differ between studios?
The CG and VFX pipelines are always different, as each company is set up in a way that caters to its own needs. The comp pipeline tends to be the most similar, as the process of how we work tends to be the same. Ultimately a pipeline is just a tool for moving content from one department to another, and I found it pretty easy to slot into each one. Once you’ve worked on one, you can figure them all out. I doubt the perfect pipeline exists, and I feel they all have their strengths and weaknesses.
We understand that every project is different, but how do you ‘think’ to crack the shot?
It all starts with the brief, and understanding the client’s vision. Reference is a huge deal. Getting a lot of reference in front of the client early on goes a long way – everything from images, art, other movies, natural elements and concept art. A lot of the time you’ll have access to the art department to help visualise the overall shot. But equally often we need to make moving versions of these, and that’s where comp can get involved and really help push a shot forward.
I usually encourage the comp team to do concept frames to get the right feeling of the shot or sequence. On quite a few films we’ve gone through key shots, and built a shot incorporating raw renders, a lot of 2D elements and broad grading to get the feeling of a sequence. From there it’s all about iterating, aligning our internal opinions with ones of the client and getting versions out that start to narrow down on the creative task.
How can one improve his/her technical and creative skills?
From a creative point of view, it’s always about understanding what you’re trying to achieve visually. A lot of the work we do is grounded in some form of reality, and more often than not there will be scans to work from. Studying art, photography and films of any genre that have beautiful lighting and cinematography will help you understand lighting, and how things really look when lit correctly. Reference is king.
From a technical point of view, going through all the educational material on the Foundry website is a good start. Opening up gizmos that others have built so that you understand how they work is always encouraged. It’ll help you learn how to develop your own. Understanding and studying how lenses and cameras work is always a bonus.
How has Foundry’s Nuke been helpful to you during your career?
Nuke has been the compositing software of choice at every studio I’ve worked at. It’s very versatile, integrates incredibly well into any pipeline, and it’s easy to create gizmos, plugins and tools for. There’s also a huge amount of educational content out there. It’s the cornerstone of VFX compositing, especially at larger facilities.
For your post production pipeline, have you developed any plugins / macros for Nuke? If yes, kindly share details.
We develop tools for Nuke all the time. They’re usually show-specific, and if they address a more general problem they often end up facility-wide. On Avengers we developed templates and gizmos for our hologram effects. We also developed pipeline tools to streamline the making of sequence contact sheets, as well as overall show templates.
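Pipeline tools like a contact-sheet builder often start as small scripting utilities. Purely as an illustration (not Framestore’s actual code – the function name and aspect default are my own assumptions), here is a minimal Python sketch of the layout maths such a tool needs: picking a rows-by-columns grid for a given shot count while keeping the sheet close to a target aspect ratio.

```python
import math

def contact_sheet_grid(num_shots, aspect=16 / 9):
    """Choose a rows x columns layout for a sequence contact sheet.

    Picks the smallest grid that fits every shot while staying close
    to the target aspect ratio of the finished sheet.
    """
    if num_shots < 1:
        raise ValueError("need at least one shot")
    # Columns grow with the square root of the shot count, biased by aspect.
    cols = max(1, round(math.sqrt(num_shots * aspect)))
    rows = math.ceil(num_shots / cols)
    return rows, cols
```

In a Nuke pipeline, a result like `contact_sheet_grid(12)` would then drive the rows/columns knobs of a ContactSheet node; the sketch keeps only the maths so it stands alone.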
Please let us know in detail about your department’s work for Avengers: Endgame.
We had a varied body of work on Avengers – everything from extensive character animation, digital suit replacement, fully digital environments, quantum time travel FX and holograms – all delivered through complex, oftentimes invisible photoreal compositing, while staying true to filmed photography.
A big part of our work centred around Smart Hulk. At Framestore we have the experience of doing many creatures in the past – and although we’d already done work on Hulk on Thor Ragnarok, Smart Hulk came with the need for us to innovate and develop new tech for our pipeline and push the envelope further.
While familiar in appearance, Smart Hulk is now a character that needs to convey emotion – emotion that our audience can relate to, feel empathy for, and understand – no differently to how Ruffalo would as a stand-in actor performing as Bruce Banner. He is now a lot more human, and the clients wanted us to capture the essence of Mark Ruffalo’s performance.
With that in mind, we ran a lot of Medusa test footage through our machine learning system – Medusa being essentially a camera rig that captures the actor performing extremely complex expressions. Based on this footage, our AI learning results and our keyframe animation tests, we started developing more animation blend shapes. We started with around 100 but ended up with about 400.
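The blend-shape idea itself is simple linear mixing: the final face is the neutral mesh plus a weighted sum of per-shape vertex offsets. A minimal, illustrative Python sketch of that maths (plain lists, nothing to do with Framestore’s actual rig code):

```python
def apply_blend_shapes(base, deltas, weights):
    """Combine a neutral face mesh with weighted blend-shape deltas.

    base    - list of (x, y, z) vertex positions for the neutral pose
    deltas  - dict mapping shape name -> per-vertex offsets from base
    weights - dict mapping shape name -> activation, typically in [0, 1]
    """
    result = [list(v) for v in base]
    for name, weight in weights.items():
        if weight == 0.0:
            continue  # inactive shapes contribute nothing
        for i, (dx, dy, dz) in enumerate(deltas[name]):
            result[i][0] += weight * dx
            result[i][1] += weight * dy
            result[i][2] += weight * dz
    return [tuple(v) for v in result]
```

With ~400 shapes, the solver’s job is essentially to find the per-frame weights that best match the captured performance; the keyframe pass then adjusts those weights by hand.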
Our very skilled animation team then ran a pass of keyframed animation to tighten up and match Smart Hulk closer to the Ruffalo scan reference. Through this process we noticed that the solve provided a good starting base for animation, and in many ways got us quite far with the more intricate secondary animation, such as muscle twitches and vibrations, skin sliding, and soft-tissue micro-movement. To get to the final performance, we still required a finessed layer of keyframe animation for the final delivery.
As we developed the character further, our animators inherited the role of actors, as we needed to push Ruffalo’s performance to be more Hulk-like – something the clients were keen to be able to easily differentiate.
In the end, Smart Hulk was incredibly detailed – with hairs, peach fuzz, pores on the skin and muscles underneath, wrinkles and eye/skin micro-motion. We did everything we could to make Hulk appear as real as the tech would let us.
Rocket is a character that we designed and built for Guardians of the Galaxy. Although we’d originally designed and built him from the ground up, this being a new movie we had to do it all over again, as he was in a new costume.
In the scans we had plate reference from the stand-in, Sean Gunn, which our paint & roto team had to painstakingly paint out in every shot. Thankfully we had a lot of clean plates shot as reference to help with this. We also had voice-acting reference from Bradley Cooper. Our animation team then used Sean Gunn’s performance and Bradley Cooper’s voice acting to deliver Rocket’s final keyframed animation performance – with no help from AI this time.
We built highly detailed suits with a range of real-life and imagined materials, and a design that appeared both practical and, of course, Avengers-worthy cool. We had to develop a bespoke costume for each character, mainly because they have different proportions, but also because some are just different. Take Rocket – a tiny raccoon – and War Machine – a hero already in a suit. Or the difference in male and female proportions between Cap and Widow. Or Smart Hulk and everyone else.
The suits started out life as the heroes’ original costumes, as it wasn’t known during shooting what the quantum suits would look like. The process would begin with a very tight body track. As we were replacing the suits from the neck down, having a very tight neck track was paramount. A lot of this was frame-by-frame tracking.
As the suits didn’t line up exactly with the hero costumes, and because they moved slightly differently due to the clothing on set, our animators had to adjust the tracks and do an animation pass to get the suits to fit better and move in a more natural way. We would then run a cloth sim, with the lighter areas of the suit stiffer – similar to the Ant-Man suit – and the darker regions a more flexible carbon-fibre material.
Our paint and roto team would go in and digitally remove the plate costumes and rebuild the heroes’ necks. This was more often than not necessary, as some of the costumes have higher collars than our quantum suit. We also had a rendered neck from lighting to help comp with the plate neck integration.
Originally the suits had no helmets, as the concept was to have a shimmery sheath as seen in Guardians 2. Eventually the clients were keen on the design of the Ant-Man helmet – from the film released just prior to this one – which in turn informed the general look of our helmet: similar in materials to the rest of the suit, inheriting the Ant-Man look.
The helmet manifestation was based on the bleeding-edge nanotech that we developed for the Iron Man suit in Infinity War. Our FX teams ran simulations of the helmet manifesting from the collar of the suit. The CG suit with the manifestation was then comped in with a 2D visor effect, which we developed in Nuke using utility passes from FX.
This is the same hangar that we’ve seen many times before in the Avengers franchise, but during shooting there was too much equipment to clear out and re-dress the set, so they had to put up a lot of greenscreens.
Having to replace and fill in a lot of the hangar with CG, we had no option but to build a photoreal CG hangar. Everything was based on carefully captured reference, and replicated through high-resolution modelling, texturing and shading. We also built the environment outside to cater to various weather requirements over multiple sequences.
In a lot of the hangar sequences, the only thing we kept from the original scans was the heroes’ faces, as much of the BG, their suits, and oftentimes their whole heads were full CG replacements.
The quantum time-travel effects started life in the van. This was the scene we see post-credits in the Ant-Man film, where Scott uses the van. Having to match the overall look, we approached this in our own way, jazzing up the overall effect along the way.
We replaced the housing of the quantum van with a newly modelled and rendered version. We then received a bunch of utility passes from FX and got to work in comp land. We re-balanced the utilities and added interactive lighting, distortions, aberrations, light glows, flares and optical effects to complete the look.
Similar to the quantum van gate, as a pivotal part of the story our heroes develop the quantum gate – which in many ways is just a bigger quantum van gate with some fancier parts added on. We went through a variety of concepts. Our compositing team worked closely with FX to quickly turn around different looks using Nuke tools and a host of utility passes. We threw everything at it at this concept phase – auroras, plasma, distortions, energy, electricity etc. In the end, the clients started to lean more towards the original quantum van gate look. They wanted it to feel like a progression of the same Ant-Man tech, but upgraded further by Bruce, Tony and Rocket.
For the time-travel effect we start with the environment, which is run through FX to create the cone stretching. This is then passed to lighting together with a wide range of utility passes. Once rendered, these passes are carefully balanced and treated in Nuke by the comp department to create the optical effect. Similar to the quantum van gate, comp applied interactive lighting, distortions, aberrations, light glows, flares and optical effects to achieve the final look.
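One of those optical treatments, chromatic aberration, is easy to illustrate outside of Nuke. The toy Python function below shifts the red and blue channels of a single scanline in opposite directions – a crude 1D stand-in for the radial channel offset a real lens-aberration node applies. The function and its parameters are hypothetical, not a real Nuke tool.

```python
def chromatic_aberration(row, shift=1):
    """Apply a simple lateral chromatic aberration to one scanline.

    row   - list of (r, g, b) pixel tuples
    shift - how many pixels to offset red left and blue right,
            clamped at the image edges
    """
    n = len(row)
    out = []
    for x in range(n):
        r = row[min(max(x - shift, 0), n - 1)][0]  # red sampled to the left
        g = row[x][1]                              # green stays put
        b = row[min(max(x + shift, 0), n - 1)][2]  # blue sampled to the right
        out.append((r, g, b))
    return out
```

A production version would shift radially from the lens centre and by subpixel amounts, but the channel-splitting idea is the same.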
A lot of our work in Asgard was Rocket animation and Thor eye treatments. Part of it was also environment extensions of the palace interiors and the Asgard exterior. We had one particular shot which was an establisher of Asgard. We reused the environment that we’d built for Ragnarok, but had to redo a lot of our layout and general lookdev (look development) for the purpose of this shot alone.
This shot was driven mainly by our environment department, and once the renders finally made their way through, it was quite a nice creative task for comp. All we had to do was make a full CG city that couldn’t possibly exist appear familiar, photographic and full of life. We achieved this by adding layers and layers of atmos, careful grading, additional elements such as ships, Asgardian birds and mist, and final optical treatments.
Another one-off environment that we built specifically for a single shot was Wakanda. It’s towards the end of the movie, where the city is celebrating Thanos’ fall. We’d not seen Wakanda at night, and in the original scan it was just the royal family on a balcony against a green screen. We built a fully digital Wakanda, threw in celebrating crowds, moving ships and a lot of supporting elements.
This was again a perfect marriage between environments and comp. We added layers of atmosphere, dressed in 2D elements and a lot of depth cueing. We ran optical treatments on light sources, added flares and, of course, did a lot of overall integration with the scan.
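Depth cueing of this kind is typically driven by a depth pass and an exponential fog falloff: the further a sample is from camera, the more of the haze colour it picks up. A minimal Python sketch of the per-pixel maths (the fog colour and density defaults are arbitrary placeholders, not production values):

```python
import math

def depth_cue(color, depth, fog_color=(0.55, 0.6, 0.7), density=0.02):
    """Fade an RGB pixel toward an atmosphere colour based on scene depth.

    color     - (r, g, b) of the rendered pixel
    depth     - distance from camera, e.g. from a depth AOV
    fog_color - the haze colour distant objects converge to
    density   - how quickly the haze builds up with distance
    """
    fog = 1.0 - math.exp(-density * depth)  # 0 at camera, -> 1 far away
    return tuple(c * (1.0 - fog) + f * fog for c, f in zip(color, fog_color))
```

In comp this is usually a handful of grade/merge nodes keyed off the depth channel; the formula above is the idea they implement.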
One last environment we worked on was Tokyo. For this shot we did in fact have a plate, so the work consisted mostly of plate augmentation. The clients had filmed a flyover of Tokyo, but as with all these shots there was a story point – in this case, that buildings in half the city were left abandoned after the snap. Between paint, roto and comp we turned off half the lights in the scan, and removed half of the moving cars, boats, people and just life in general. We then added CG clouds delivered to comp by the environment team, a lot of atmosphere and rain using our extensive 2D-element VFX stock footage library, and finally a CG Quinjet. All very carefully graded and balanced to give the right mood and feel for the shot.
We developed new-tech holograms for Avengers: Endgame. The clients wanted these to feel like familiar tech, but a progression of what’s been seen in the previous films. The holograms were a fully comp solution. The heroes were completely removed by paint, meticulously rebuilding the background. We had renders for Rocket and Captain Marvel’s suit, which we integrated into the scans. The artists would then reveal the characters back through transparency in comp. We used 3D cylinders as the basis for the hologram screens, then ran the elements through numerous noise passes while emitting Nuke particles from the rooted edges of each character. We then added localised interactive light, scan lines, flickering, flaring, aberrations and optical effects to achieve the final hologram look – a one-stop-shop Nuke solution for driving the look of these holograms.
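The scan-line and flicker treatments can be thought of as simple per-pixel gain modulation over the rebuilt character. As a toy illustration only (not the actual gizmo – every constant here is made up), the Python function below darkens alternating bands of rows and wobbles the overall brightness per frame:

```python
import math

def hologram_modulate(pixel, y, frame, line_period=4, flicker=0.1):
    """Modulate an RGB pixel with scan lines and temporal flicker.

    pixel       - (r, g, b) value of the hologram element
    y           - pixel row, used to place the scan-line bands
    frame       - current frame number, drives the flicker
    line_period - height in rows of each scan-line band
    flicker     - strength of the per-frame brightness wobble
    """
    # Darken every other band of `line_period` rows.
    scan = 0.6 if (y // line_period) % 2 else 1.0
    # A slow sine wobble stands in for temporal flicker.
    wobble = 1.0 + flicker * math.sin(frame * 1.7 + y * 0.05)
    gain = scan * wobble
    return tuple(c * gain for c in pixel)
```

In Nuke this would be an Expression or grade network driven by row position and frame, layered over the transparency-revealed character along with the glows, aberrations and particles mentioned above.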
Check out Framestore’s official video of the VFX breakdown / before and after of Avengers: Endgame.
Based on years of professional experience, what is your advice to novice and experienced artists?
For young artists aspiring to make it in the industry, I’d say study film. Learn how films are made and how shots are filmed. Learn how lighting is key to achieving a particular mood or look. Watch tons and tons of movies. Go through all the online tutorials you can find. Make your own work: go out there and film something, have a go at comping in matte paintings or any CG you can get your hands on, and work on integration.
For experienced artists I’d advise similar things. Study complicated shots and read material on how they were achieved from a technical point of view, but most importantly – from a creative point of view – study reference. It’s imperative that you have something to aim for that you can visually understand. Study films, look at Renaissance art, read photography books, and immerse yourself in creative content that may appear mundane but gives you a feel for how shadows look, how highlights react, the optical effects that happen on the lens, and VFX work done by others.
What are your future projects?
I’m currently one of the supervisors on “A Boy Called Christmas” – a film based on a famous children’s book, the tale of a hero who journeys north and becomes Father Christmas.
Our huge thanks to Enrik Pavdeja and Madalina Grigorie for such a great technical interview, with immense detail on the CG and VFX before-and-after of Avengers: Endgame.