DIGITAL MEDIA WORLD: Pirates of the Caribbean: On Stranger Tides
Adriene Hurst of Digital Media World interviews Simon Stanley-Clamp, VFX supervisor, and Michele Sciolette, head of VFX technology, about Cinesite’s work on “On Stranger Tides”
The team at Cinesite opens ‘Pirates of the Caribbean: On Stranger Tides’ with a thrilling chase through the streets of old London, led by VFX Supervisor Simon Stanley-Clamp.
Of the over 300 stereoscopic visual effects shots Cinesite created for ‘On Stranger Tides’, roughly 200 were required for the daring and comical carriage chase scene through London. In fact, according to the production’s original storyboards, the scene was intended to be even longer, but a large chunk of it was cut before the shoot as director Rob Marshall continued to change his mind.
First Cut
Starting their own planning from the original storyboards, Simon’s team took care to ask for the first edit early on because the post schedule on the sequence was very tight – it was among the first to be shot and delivered for the London location. In the meantime, the production also needed a rough cut back to check how the edit was working. “That first edit was very long – over 10 minutes,” said VFX Supervisor Simon Stanley-Clamp. “Then it was cut back a few times before arriving at the 5 1/2 or so minutes you see in the final film.
“While awaiting the first edit, we went on set to gather what we needed to start designing CG buildings and environments to replace the very large blue screens in place at the three locations in and around London. We based the CG on existing architecture, often in side streets and mews. Well in advance of the plates arriving, we built 3D assets and tested our photogrammetry techniques in csPhotoMesh, the facility’s new software, to prove that we could capture stills of a building, model from the stills and create an effective environment.”
Once they were awarded the sequence, the team spent a week surveying the film’s main London locations at Greenwich, Hampton Court Palace and Middle Temple, plus the docklands sets built at Pinewood Studios. They captured numerous lighting and texture stills, including before and during set construction at Greenwich, where Simon stayed on set each day of the three-week shoot. They also managed to do a full set survey at Middle Temple, where the shoot had to be completed in a single day to accommodate the lawyers working there. The entire London section of the film was shot over three months.
Blue Screen Streets
Because the edit was uncertain for some time, the team wasn’t able to previs but used their stills and data capture to pre-build much of the architecture they would need for the different environments. For example, Greenwich had a distinctive look for which they built up a library of a dozen building variations, approved and ready to drop in, before they actually had any footage.
Street layout came from the stills, the storyboards and a few production designs, which Simon extrapolated. This helped the team anticipate where production would shoot from. He would send back concept stills to production, showing the buildings he suggested to fill gaps and replace blue screens, sometimes with options. Few changes were made to these looks. The main criterion was simply that the buildings be completely unnoticeable and look ‘right’ for the situation.
The blue screens in some spots were enormous, in one case requiring the team to fill a full street with CG buildings. The justification was economic: the street was only needed for two or three shots, and an equivalent practical set would have been huge. Another massive blue screen was employed for the opening shot of London, passing through an arch with St Paul’s Cathedral in the distance, and others had to be replaced at either end of the Greenwich location to extend the environment into the distance.
Pre-Composites
The first plates arrived in October and November. The team helped production trim the cut down and figure out what was needed, shot by shot, by making rough pre-comps – taking still frames, putting them on cards and dropping them into position – so the editors could see the shots without blue screen and hone the looks as they liked. “It wasn’t true previs, but it wasn’t a finished comp either. The process also helped us decide what elements had to be shot,” said Simon.
“People walking through frame alone or in groups, shot on blue screen, were supplied abundantly for us to randomise in the plates, all shot in stereo. Later on, fire and smoke elements were shot in stereo as well. One critical smoke element was commissioned and shot only days before the sequence delivery. The story needed extra smoke as a neat escape route for Jack Sparrow, allowing him to swing undetected from a pub sign, but we didn’t have enough time for CG smoke development. We match moved the plate and layered up multiple bits of the practical smoke.”
“The first 10 or 12 minutes of the film’s opening London shots, coming across the water and leading into the sequence in King George’s palace before the carriage chase, were part of our award – the film essentially begins with our work,” said Simon. “We were even responsible for modelling and animating Jack’s CG cream puff for the palace dining room shots, which gets tossed around until it gets stuck on the chandelier. This chandelier was a massive, weighty practical prop controlled by giant pulleys and motors that had to support Jack. But it didn’t swing naturally, so apart from Jack’s rig removal, we needed to carry out variable respeeds to give its motion the expected look.”
Variable Respeeds
Usually, variable respeeds are frame- or vector-based for the smoothest interpolation. But in stereo, the associated clean-up must be extremely thorough and precise. They built a tool in Nuke that would do a ‘nearest’, vector or blended respeed, or a combination of any of these as required. They could feed in a QuickTime reference, get an exact match and choose whatever process worked best. Then they made sure it was performed exactly the same way in each eye.
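As a rough illustration of the idea – not Cinesite’s actual Nuke tool – a variable respeed can be thought of as a retime curve mapping each output frame to a fractional source frame, with a per-shot choice of ‘nearest’ or blended interpolation, and the same curve applied to both eyes. A minimal Python/NumPy sketch, where left_eye_frames, right_eye_frames and curve are hypothetical inputs:

    import numpy as np

    def respeed(frames, retime_curve, mode="blend"):
        # frames: sequence of float image arrays (H, W, C)
        # retime_curve: fractional source frame for each output frame
        # mode: "nearest" or "blend"
        out = []
        for src in retime_curve:
            lo = int(np.floor(src))
            hi = min(lo + 1, len(frames) - 1)
            t = src - lo
            if mode == "nearest":
                out.append(frames[hi] if t >= 0.5 else frames[lo])
            else:
                # Simple two-frame blend; a vector-based retime would warp
                # along motion vectors instead of mixing whole frames.
                out.append((1.0 - t) * frames[lo] + t * frames[hi])
        return out

    # The same curve drives both eyes so the stereo pair stays in sync.
    left_out = respeed(left_eye_frames, curve, mode="blend")
    right_out = respeed(right_eye_frames, curve, mode="blend")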
Simon feels that, at this stage in stereo development, respeeds are best kept to a minimum, especially the variable type. “Sometimes they have to come back and be redone to make sure they look right and match the cut. They can even affect the audio because of the lipsync. The editors were usually quite helpful on this,” he said.
The Stereo Hurdle
Creating CG elements for stereo 3D footage presented a significant learning and equipment hurdle for Cinesite. The production shot virtually parallel, only slightly toed-in, which allowed convergence to be set during compositing. In other words, convergence is not baked into the footage but can be pulled forward or back. The team extrapolated convergence data from the plate and, using 2D tracking, generated a stereo track and piped this into their Maya scenes.
“This way, what we rendered is an exact match to each eye. Then we can use tools in Nuke, a nudge tool, for example, to adjust and fine-tune shots so that all elements sit at their true stereo depth. But generally, rendering and tracking tasks simply take twice as long, which made the whole project take substantially longer than it would have as a 2D project. However, this was something that the production and vendors were all aware of from the outset.” Tracking was usually done with 3D Equalizer or a Nuke track to finesse elements into place.
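For illustration only – these are not Cinesite’s in-house tools – the adjustments Simon describes come down to horizontal offsets: the disparity of the same feature tracked in each eye gives the convergence data, and nudging one eye sideways shifts where an element sits in depth. A small Python/NumPy sketch with hypothetical track data:

    import numpy as np

    def disparity_from_tracks(track_left, track_right):
        # Per-frame horizontal disparity of a feature matched in both eyes,
        # given as lists of (x, y) positions; this is the kind of convergence
        # data piped into the 3D scene so renders line up per eye.
        return [xr - xl for (xl, _), (xr, _) in zip(track_left, track_right)]

    def nudge(eye_image, offset_px):
        # Shifting one eye horizontally changes where an element converges,
        # pulling it forward or pushing it back in stereo depth.
        # (Edges wrap in this toy version; a real tool would pad or crop.)
        return np.roll(eye_image, offset_px, axis=1)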
Consequently, from shoot to post, emphasis was on avoiding too much clean-up work, making sure that what got into the plate was supposed to be there. Stereo cleanup means painting things out in precisely the same way in each eye to avoid artefacts and floating masks that attract the eye. Perhaps what gave the team the most help was having their own stereographer on board, almost from the start of the shoot. As the plates turned over, he analysed them and picked up errors the team could correct themselves.
Stereographer
Simon said, “Convergence information wasn’t always immediately provided from production, causing delays, but the stereographer could advise us on the best convergence to set to allow us to go ahead. It was really helpful, especially combined with the in-house tools and using Nuke for all of the compositing. Being able to produce stereo QuickTimes for client review, for example, made a lot of difference to the workflow.”
Equipping the facility was a major but essential step. All compositors now have their own 3D monitors and Cinesite’s main theatre has been converted to Dolby 3D. A dedicated suite with a stereo viewing system was built for ‘Pirates’. “You can’t guess about the images. They all have to be checked in the same way that they will be viewed in cinemas. The stereographer needed his own suite with monitors as well. Virtually every project now has a stereo agenda or deliverable, and now we are set up for it,” said Simon.
Changing Light
As the chase sequence was shot during the northern autumn, October and November, light was not consistent from set to set. Greenwich featured flat white-grey skies, which was a good match for the classic, gritty old London look specified in the early stages. Middle Temple was shot within the same period and didn’t cause many problems, but over Pinewood’s exterior set the skies were bright and sunny almost every day. This meant hanging huge diffusers over the sets to prevent sharp shadows, and where the diffusers failed, the team had to remove shadows, re-grade some shots and replace a few backgrounds with murkier environments.
Slight rainfall at Greenwich wouldn’t have been an issue except for the stereo factor. The Pace rig the production was using has an exposed polarising mirror at the front. If rain falls on it, the drops appear in the footage as floating artefacts in the foreground. Cleaning these out in post is very time-consuming and expensive. At the shoot, such problems were sometimes handled with eccentric measures, like driving the rig backwards down the road, which gave them reversed shots but kept the rain off the mirror.
A very talented Jack Sparrow double performed many of the trickier stunts for actor Johnny Depp, and other stunt men stood in for him on specific manoeuvres. “But whenever he is recognisable as himself, it really is him – there were no face replacements for him.” Instead, stunt rigs were often used that required substantial digital removal from the stereo footage. In one case the team had to replace a building digitally to make sure the cable was totally cleaned out.
Frog Engineering
The pirates encounter poisonous frogs in the jungle, which Captain Barbossa captures in a jar. For the scene, Cinesite created and animated a full CG frog, complete in every detail – a full rig, wet eyes, in red, yellow, green and blue. “For its size, it’s highly over-engineered,” admitted Simon. “You might even miss it. It was one of the first shots we completed, based on plates from the Hawaii location. I’m always more confident when I can survey the set myself for tracking and lighting data, but since we didn’t have anyone working over there, we had to rely on the set information supplied to us.
“The animation in particular took a long time to lock down. The frog drops onto the actor’s shoulder from overhead and jumps from one side of the frame to the other – such a tiny nuance of animation, spread across seven tricky shots. Although we kept the glass jar in the plate for the composite, we still had to model a 3D jar, building the glass with depth so we could create passes to reproduce the refraction and reflection that would occur with a real glass jar.
“When you look through the jar at the frogs, they ripple and change shape accordingly. We spent months on it, starting before the shoot with about eight different designs in myriad variations, but the director wanted a simple, realistic frog. We completed four iterations, some with exaggerated limbs or other parts, but the result is virtually true to life, a real poison-dart frog with a puffed throat and lens movements in the eyes.”
Walk the Walk
Barbossa’s peg leg was an effect that appeared in most vendors’ shots, and each vendor handled it within their own awarded shots. A physical peg leg was built as a looks reference and, again, stills of it were captured on location, plus texture reference shots, measurements and a cyber scan for an exact build and texture. Simon had shots of it taken with the RED cameras just after each shoot – sometimes it appeared in close-up – to record it exactly as it had looked in the rest of the sequence, under the same lighting with all the actors and gear still in place.
The rig was straightforward. They had built a model of Barbossa, and used it for shots when he was walking around, match moving it into the shot and animating only his lower torso performing the required walk. Not surprisingly, the way the actor walks and the way he is meant to appear on screen are different. The leg was meant to be rigid from the hip and unable to bend at the knee but as Geoffrey Rush walks around in his blue sock with the tracking markers, he does bend his knee.
“The cleanup was the tricky part, often needing a bit more than just the peg portion. The animation required a few trials to lock down while making sure that leg was perfectly straight! We weren’t match moving what the actor was doing but what it would look like if he weren’t bending his knee.
No Magic
“The hardest shot was Barbossa’s first, when the camera follows him into King George’s palace hall and he is seen virtually in silhouette. This made tracking and clean up harder, and we needed to make a replacement marble floor to reveal reflections and shadows from the leg. Invisibility and realism were crucial in our shots. Our team was not dealing with magic, lasers and collapsing buildings.
“We had all lighting scenarios to deal with as well. When Jack clashes with Angelica, who is posing as him, in the sword fight in the Captain’s Daughter pub, the fire-lit interior is nearly dark. As the pair fight in the rafters overhead, they are wearing complicated rigs against the smoky ceiling behind them. To clean out the wires, we replaced the roof in CG and we also built the CG barrels you see in the background.”
Meshing & Matching
Head of visual effects technology, Michele Sciolette, led Cinesite’s efforts to build the stereo production pipeline and develop new tools to address specific challenges. These included csStereoColourMatcher, an automated tool to compensate for colour differences between stereoscopic image pairs. The environmental artists were using csPhotoMesh to rapidly build up the large CG sets.
“csPhotoMesh is photogrammetry and 3D scene reconstruction software, which was useful for building the carriage chase sequence environments,” said Michele. “It is a simple, flexible way to capture geometry. Given a set of digital images of a static scene, it produces a textured 3D mesh accurately representing the scene geometry and 3D cameras matching the original photos’ positions. The function is automatic – you just drop all the images in a directory and run the command. This kicks off a reconstruction process on our render farm resulting in a 3D mesh and camera positions ready for texturing.
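csPhotoMesh itself is proprietary, so purely as a sketch of the workflow Michele describes – every still dropped into one directory, a single command kicking off the farm reconstruction that returns a textured mesh and matching cameras – a hypothetical wrapper might look like the following, where cs_photomesh is a stand-in command name, not the real interface:

    import subprocess
    from pathlib import Path

    def reconstruct(image_dir, output_dir):
        # Gather every still in the directory and hand the whole set to one
        # reconstruction command, which writes out a textured 3D mesh plus a
        # camera matching each original photo's position.
        images = sorted(str(p) for p in Path(image_dir).iterdir()
                        if p.suffix.lower() in (".jpg", ".tif", ".exr"))
        if not images:
            raise ValueError("no stills found in " + image_dir)
        subprocess.run(["cs_photomesh", "--output", output_dir, *images],
                       check=True)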
“csStereoColourMatcher is also fully automated. Colour differences between stereoscopic image pairs can be caused, for example, by the beam splitter in the camera rig, or other factors that introduce significant colour shifts across different stereo views. It is derived from vector-based analysis, and we built it into the front end of all our compositing work for the film. It requires no user supervision and is completely integrated into the Nuke compositing system. We used it to colour balance more than 300 shots for the film.”
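The vector-based analysis behind csStereoColourMatcher isn’t detailed in the interview, but the goal – automatically removing the colour shift between the two eyes, such as the cast the beam splitter introduces – can be illustrated with a simple per-channel statistics match. This Python/NumPy sketch is an illustration of that idea only, not Cinesite’s algorithm:

    import numpy as np

    def match_eye_colour(src, ref):
        # Match the mean and spread of each colour channel in one eye (src,
        # e.g. the view through the beam splitter) to the other eye (ref),
        # compensating for the colour shift between the stereo pair.
        out = np.empty_like(src, dtype=np.float32)
        for c in range(src.shape[2]):
            s = src[..., c].astype(np.float32)
            r = ref[..., c].astype(np.float32)
            gain = r.std() / max(float(s.std()), 1e-6)
            out[..., c] = (s - s.mean()) * gain + r.mean()
        return out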