
AMERICAN CINEMATOGRAPHER • MAY 2016 • THE JUNGLE BOOK – DEMOLITION – BLOODLINE – HAP AND LEONARD – CONTAINMENT – ASC AWARDS • VOL. 97 NO. 5


An International Publication of the ASC

FEATURES

32 Welcome to the Jungle  ASC member Bill Pope employs new technologies in the virtual environments of The Jungle Book

46 Stripped Down  Yves Bélanger, CSC charts a widower's emotional journey in Demolition

58 Family, Friends and Infection  The cinematographers of the series Bloodline, Hap and Leonard and Containment detail their work

71 A Valentine to Cinematographers  A pictorial celebration of the 30th ASC Awards festivities

— VISIT WWW.THEASC.COM —

On Our Cover: The orphan boy Mowgli (Neel Sethi) faces the snake Kaa (voiced by Scarlett Johansson) in The Jungle Book, shot by Bill Pope, ASC, who combined live-action and virtual cinematography on the production. (Image courtesy of Disney Enterprises, Inc.)

DEPARTMENTS

10 Editor's Note
12 President's Desk
14 Short Takes: Everything Will Be Okay
20 Production Slate: Hardcore Henry • Too Late
86 New Products & Services
90 International Marketplace
91 Classified Ads
92 Ad Index
93 In Memoriam: Don Thorin Sr., ASC
94 Clubhouse News
96 ASC Close-Up: David Klein

MAY 2016 VOL. 97 NO. 5



Welcome to the Jungle

Bill Pope, ASC pushes the boundaries of virtual cinematography for director Jon Favreau's photo-real The Jungle Book.

By Michael Goldman

•|•

Director Jon Favreau is an ardent admirer of the 1967 animated Disney film The Jungle Book, but his new feature of the same name is more than just a re-imagining of that story. Favreau opted for an essentially unproven virtual-production methodology, and the result is an almost entirely digitally rendered and animated film that is intended to look completely photo-real. To realize this ambition, the director tapped cinematographer Bill Pope, ASC.

The film features a sole live-action actor, 13-year-old Neel Sethi, who portrays the human boy Mowgli. Only those pieces of the sets that Sethi directly interacted with are real; beyond them, all environments, and the entire cast of supporting animal characters, are CG constructs. Among the filmmakers' key collaborators was Rob Legato, ASC, who served as the film's visual-effects supervisor and second-unit director-cinematographer.

Favreau says the idea was to use a process "not unlike an animated film, in that it started with a story department, storyboards and an animatic version. But we moved on to a process more like Avatar [AC Jan. '10], whereby we created a motion-capture version of the movie and then shot the actor [in native 3D] while viewing him interacting with [animated elements] in real time. Then we took all these pieces, like a puzzle, and assembled them using the latest technology. In that sense, we were standing on the shoulders of a giant in Avatar — that was the film whose process ours most resembled, but unlike that film, ours takes place [on Earth], and everything in it had to look entirely real. We wanted to put a live-action sensibility into something that could only be done in a computer."


All of which raises the question: Is "cinematography" the proper term to describe how The Jungle Book's images were captured?

Pope contends that the methodology allowed him to make traditional cinematography decisions for each shot, but "in the digital space." He recalls, "Initially, I said I wasn't sure how to do it. I told them it involved working within a purely digital space, which is not really my forte. I suggested maybe they should get someone who has done this sort of thing before. They said, 'No one has really done this sort of thing before,' and Jon said he preferred someone who considered himself a live-action photographer but who had also been down the visual-effects road and could understand the things they were talking about. So I met with several of the key players on Jungle Book: Adam Valdez, visual-effects supervisor at MPC; Andy Jones, animation supervisor; Rob Legato, visual-effects supervisor; Joyce Cox, visual-effects producer, whom I knew from [Men in Black 3]; and Chris Glass, production designer. And talking with them, I began to see the process and how promising it could be. These are the people — along with Mark Livolsi, the editor — who advised Jon on a daily basis as we moved through the film. A visual-effects-based film like this one absolutely relies on communication among the department heads. No one can move forward unless we all move forward. So for the following months, that group of people met with Jon on a daily basis to make this movie. After I signed on, we then moved immediately into virtual production at Digital Domain."

Opposite and this page, top: Mowgli (Neel Sethi), a human boy raised in the jungle by a pack of wolves, teams up with a panther, a bear and other beasts in an attempt to find his way back to human civilization and save himself from a vicious tiger in Disney's The Jungle Book. Middle: Director Jon Favreau (left) and cinematographer Bill Pope, ASC discuss a scene. Bottom: Visual-effects supervisor and second-unit director-cinematographer Rob Legato, ASC reviews footage.

Unit photography by Glen Wilson, courtesy of Disney Enterprises, Inc.


Girish Balakrishnan, Digital Domain's lead virtual-production technical director, notes, "Digital Domain was the primary virtual-production vendor on the film, in charge of everything from supporting the virtual art department with production designer Christopher Glass, to running [Digital Domain's Lab department], to previsualization, motion capture, virtual camera, SimulCam and post-visualization."

Also key to making everything look "entirely real" would be the work of visual-effects vendors MPC, which handled all environments and most of the animals, and Weta Digital — where Dan Lemmon served as visual-effects supervisor — which created primates for an important sequence. Both companies built upon earlier work done for the recent Planet of the Apes movies and Life of Pi, in order to create realistic animals of varying species that had to perform as lead and supporting characters for an entire movie.

Top: Mowgli receives guidance from Bagheera (voiced by Ben Kingsley). Middle: Shere Khan (voiced by Idris Elba) searches for Mowgli. Bottom: Favreau reviews a scene with Sethi.


As virtual-camera layout artist John Brennan explains it, the Digital Domain team "built the sandbox where the creatives played. There's a lot to learn and reconcile in a hybrid space like the DD stage, but there were and are certain mandates — one being that virtual cinematography should be recognizable as cinematography. There are aspects of virtual production that are new and disruptive, but I don't think that's the whole picture. We worked under Rob's supervision to adapt cinematic roles and expectations, and I think [that's] another side to the story."

Thanks to new innovations in ray-tracing technology within Pixar's RenderMan software, the visual-effects artists were able to create stunningly believable CG characters and backgrounds. Given that the film will be exhibited in a limited number of theaters using the new 3D-enabled Dolby Vision laser-projection system (see sidebar, page 40), audiences will be able to scrutinize the team's work down to the smallest detail.

Although he was stepping onto technology's cutting edge and embracing a largely untested production method, Favreau stresses that his collaboration with Pope was strikingly familiar. "My cinematographer was a partner with me in the same way he would be in a live-action film," the director says. "You scout [virtual locations] together, rehearse together, and then, finally, you capture the images. Lots of times on effects films, you sit over someone's shoulder while they work in a box, designing previs. Or in animation, you do layout with an artist. Here, I incorporated all the department heads that I'm used to collaborating with on live-action movies, and I enjoyed an ongoing partnership with them."

Among the technologies required to enable such a collaboration within the virtual space was an updated SimulCam process similar to the one used on Avatar and run by the same SimulCam supervisor, Glenn Derry.

Top: Mowgli meets with Gray (voiced by Emjay Anthony). Middle: Pope captures Mowgli running in a muddy action sequence. Bottom: Pope lines up the 3D camera rig.


The SimulCam process utilizes multiple, movable OptiTrack motion-capture towers — comprising OptiTrack cameras hung onto trusses with wheels that can be maneuvered for each setup — to track a live-action camera's position. The system then uses that data to drive a virtual camera in the CG world. On the back end, the SimulCam system receives a live camera feed, tracks the camera position in 3D space, and, in real time, keys out bluescreen and composites live actors with CG characters and environments.

For The Jungle Book, this system was directly linked to a larger, newer virtual-cinematography system built around a rendering engine called Photon, used here for the first time on a feature film. It was also, Balakrishnan notes, "the first time a game engine was deeply integrated into the production pipeline and used to previsualize the entire film from start to end." Photon, as Balakrishnan explains, was developed as a collaboration between Microsoft and Digital Domain as a technology and pipeline, built on the foundation of a custom version of the Unity video-game rendering engine, specifically geared for virtual production.

This innovation allowed Pope and Favreau to make detailed cinematographic choices — including camera movement, lens choice, depth of field, framing and lighting — in every step from rough layout through final post. Legato refers to this methodology as "doubling down on the knowledge and experience we [gained] on Avatar," adding, "this is why Jungle Book looks and smells like a real film, shot on film.

"This movie is a game-changer in that sense," Legato continues. "The informed live framing, staging and newly added lighting choices, coupled with dramatically choosing the proper lens and camera moves, are now all squarely in the province of the experienced live-action cinematographer and director. The tools were used primarily to create the look and feel of a real live-action movie instead of the heightened look of an animated one. The goal was to make the audience forget what was done on a computer and what was not — to remove the separation between visual effects and cinematography."
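For readers who want a more concrete picture of the per-frame loop described above, here is a minimal, hypothetical sketch: take the tracked camera pose, render the CG world from a matching virtual camera, key the bluescreen out of the live feed, and composite the two for the on-set monitors. The function names and the toy keyer are invented for illustration; they are not Digital Domain's, OptiTrack's or Photon's actual interfaces.

```python
import numpy as np

def key_bluescreen(live_rgb, blue_threshold=0.5):
    """Very crude matte: 0 where a pixel looks like bluescreen, 1 elsewhere.
    Real keyers are far more sophisticated; this only shows the data flow."""
    r, g, b = live_rgb[..., 0], live_rgb[..., 1], live_rgb[..., 2]
    is_blue = (b > blue_threshold) & (b > r) & (b > g)
    return (~is_blue).astype(np.float32)[..., None]

def composite(live_rgb, cg_rgb, matte):
    """Simple 'over': keep the live actor where the matte is 1, CG elsewhere."""
    return matte * live_rgb + (1.0 - matte) * cg_rgb

def simulcam_frame(live_rgb, tracked_pose, render_cg):
    """One frame of the loop: render CG from the tracked camera pose, then composite."""
    cg_rgb = render_cg(tracked_pose)     # virtual camera driven by the tracked pose
    matte = key_bluescreen(live_rgb)
    return composite(live_rgb, cg_rgb, matte)

# Toy usage: a blue frame with a non-blue "actor" patch, a stand-in renderer that
# ignores the pose and returns flat green jungle, and an identity camera pose.
live = np.zeros((4, 4, 3), dtype=np.float32)
live[..., 2] = 1.0                                   # bluescreen background
live[1:3, 1:3] = [0.8, 0.6, 0.4]                     # "actor" pixels
pose = np.eye(4, dtype=np.float32)                   # tracked camera transform
render_cg = lambda p: np.full((4, 4, 3), 0.3, dtype=np.float32)
out = simulcam_frame(live, pose, render_cg)
print(out[0, 0], out[1, 1])                          # CG background vs. live actor
```

The sketch shows only the data flow; per the article, the production system did this in real time with far more capable keying and rendering, driven by the same tracking data.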


Top: Mowgli journeys through the jungle with Bagheera. Middle and bottom: The crew captures bluescreen action sequences with Sethi.


For Pope, the process began when he walked onto Digital Domain's 60'x30' motion-capture stage in Los Angeles with what he calls a "glorified iPad on a tripod" in order to "actually go on a virtual scout with Jon Favreau within that [virtual] space." That "glorified iPad" was in fact a custom carbon-fiber rig that incorporated a high-resolution OLED monitor connected to two joysticks and an arrangement of motion-capture markers, which allowed the device to be tracked throughout the stage, thereby capturing all of Pope's virtual-camera movements.

In turn, as Balakrishnan says, "the virtual camera produced an image rendered in real time on the computer — through the virtual lens of the camera and displayed on the monitor" via the Photon engine. The system's interface, including the joysticks, enabled Pope to dolly, boom and change lenses on the fly; he could also physically move the virtual-camera rig — "OLED monitor, joysticks, mo-cap markers, and carbon-fiber body as a whole," says Brennan — as if he were holding a real film camera. Favreau could also employ an Oculus Rift virtual-reality headset and Xbox controller on the stage, placing him within the volume — the space in which the digital imagery is "shot" — alongside the virtual characters and allowing him to maneuver 360 degrees as he made decisions about blocking and the placement of environmental elements. Tracking markers on the VR headset also enabled Favreau to walk through the virtual environment as he moved about the stage. Balakrishnan notes that Favreau's preference for the Xbox controller as his virtual interface was influenced by his experience as a video-game enthusiast.

Pope explains, "We would move through that space, and the art department and animators and I would all watch it, discussing if we could move this way, or work over here on this part of [the virtual set]. So, basically, we carved out a space to set the scene, and then, after the volumes were built, we would go back to the mo-cap stage and block out the scene, picking which angles we liked and so on."

Such capabilities meant the filmmakers could build a detailed animatic, which Legato describes as "a template for every shot in the film." Through this animatic, Pope could lay out shot specifics, including the choice of camera platform — Technocrane, dolly, handheld, Steadicam, etc. — and whether the shot would ultimately be captured with live action or created entirely as CGI.

Brennan further describes the workflow chronology: "The director dictated size and scale changes way back in preproduction in the main mo-cap volume at Digital Domain. He wore VR goggles or scouted with the virtual camera while an artist would scale characters up and down in real time on a computer. A change might still happen over the artist's shoulder or via notes, but many decisions were made in this context where 'feel' developed. When we shot our previs pass on the movie — still at Digital Domain mo-cap — we did so with stuntmen in mo-cap suits playing many of the movie's animals; oftentimes two stuntmen were combined into a single quadruped animal. While these stuntmen performed, along with [Neel] in a mo-cap suit, Bill and Rob would shoot rough camera ideas during the live takes, in the moment with the v-cam. Finally, after shooting the entire movie in this manner, we then moved to the virtual-cinematography volume at Los Angeles Center Studios, where all of this data had been refined and assembled into scenes, and Bill — with input and interplay with and from the director, the editor and Rob — shot the entire film in earnest, virtually with the v-cam, to stay one step ahead of the actual photography with Neel."
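As an illustrative aside, the bookkeeping behind a rig like Pope's "glorified iPad" can be pictured as a tiny per-frame update that merges the tracked transform of the physical device with joystick dolly/boom input and a chosen focal length. Everything below is an invented sketch under those assumptions, not Photon's or MotionBuilder's interface.

```python
from dataclasses import dataclass, field
import numpy as np

@dataclass
class VirtualCameraRig:
    """Merges the mo-cap-tracked pose of the handheld rig with joystick offsets."""
    focal_length_mm: float = 27.0                  # stand-in for a chosen lens
    dolly_offset: np.ndarray = field(default_factory=lambda: np.zeros(3))

    def update(self, tracked_pose, joystick_xyz, dt, speed=1.5):
        """Return a 4x4 camera transform for this frame.

        tracked_pose -- 4x4 matrix reported for the markers on the physical rig
        joystick_xyz -- stick deflection treated as dolly/truck/boom in local axes
        """
        # Integrate stick input so the operator can "dolly" beyond the stage walls.
        self.dolly_offset = self.dolly_offset + np.asarray(joystick_xyz) * speed * dt
        offset = np.eye(4)
        offset[:3, 3] = self.dolly_offset
        return tracked_pose @ offset               # physical move plus virtual dolly

# One simulated frame at 24 fps with the operator pushing the dolly stick forward.
rig = VirtualCameraRig(focal_length_mm=40.0)
pose = rig.update(np.eye(4), [0.0, 0.0, 1.0], dt=1 / 24)
print(pose[:3, 3], rig.focal_length_mm)
```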

Top: Mowgli floats down a river with Baloo (voiced by Bill Murray). Bottom: Pope and crew work within the virtual space.


Based on his virtual-camera experiences, Pope worked alongside artists — tasked with adding rough character animation to the art department's virtual sets — in Digital Domain's Lab department, laying out precisely where he wanted things placed and directing the lighting for each shot.

"The 'Lab,'" Balakrishnan notes, "is a term that originated from the Avatar days. It [consists] of a group of technical artists who take the sets developed by the virtual art department — VAD — and combine motion-capture data as well as hand-keyed animation to develop a master scene. This master scene then gets delivered to the virtual-cinematography stage, where the cinematographer can lay out 'cameras' in this virtual environment."

This "collaborative and iterative process," Balakrishnan continues, gave Pope "the flexibility to experiment with shots before committing to them on the actual live-action set." Indeed, Pope could lay out virtual dolly tracks, drive a virtual crane, and operate a virtual camera in the capture volume, and then have all of those choices replicated in the virtual-camera layout.

"Bill would work with me for quick animation changes, set adjustments and dolly-track creation, and he would work with Girish to pull focus, create effects and dial in the look of each scene," Brennan adds. "His virtual-camera lens package matched the one he would use to photograph the real actor later. He was free to pick up the [virtual] camera for handheld or attach it to anything he physically lugged into our volume, or that virtually existed in the current master scene. At the [Los Angeles Center Studios set] for the live-action shoot, he had a tracked SimulCam viewfinder available to him, with content from these master scenes. He could then shoot the actor and the virtual animals and environment in context, and compose everything accordingly."

The virtual-camera environment was particularly useful in allowing Pope to test lighting as he would on a live-action set; the cinematographer had at his disposal a custom digital light kit that included controls for things like key light, fill and bounce light, flags, gobo patterns, and soft boxes. As it had been on Avatar, the virtual-camera workflow was powered by Autodesk's MotionBuilder software, but on Jungle Book it was further linked to the Photon rendering engine. According to Balakrishnan, the addition of Photon enabled advanced rendering capabilities for lighting; depth of field; lens flares; motion blur; color grading; and various dynamic effects, including fire, rain, fog, water and smoke. All of this meant that Pope and his team could see even the subtlest virtual elements rendered instantly.

"Before Jungle Book, out-of-the-box MotionBuilder alone lacked the advanced rendering fidelity required to make these aesthetic decisions for shots," Balakrishnan explains — while making it clear that MotionBuilder is the industry-standard software used at the majority of motion-capture facilities, allowing for simple rendering/shading, and primarily enabling the virtual camera to accomplish general blocking and staging of the virtual environment and captured animation.

Top: Mowgli encounters Kaa (voiced by Scarlett Johansson). Bottom: Legato lines up the virtual camera.
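One way to picture the custom digital light kit mentioned a few paragraphs above, purely as an illustration: each virtual fixture is a small bundle of parameters the renderer consumes and that can be logged with the shot. The field names and values below are hypothetical; nothing here reflects Digital Domain's actual data model.

```python
from dataclasses import dataclass, asdict
from typing import Optional
import json

@dataclass
class VirtualLight:
    role: str                      # "key", "fill", "bounce", "negative_fill", ...
    kind: str                      # "sun", "soft_box", "flag", "gobo", ...
    intensity: float               # arbitrary render units
    color_temp_k: int
    position: tuple                # metres in the master-scene space
    gobo_pattern: Optional[str] = None   # e.g. a tree-branch breakup

# A per-shot "light kit" is then just a list of fixtures saved with the shot --
# the sort of record a later department could read back and match.
shot_kit = [
    VirtualLight("key", "sun", 1.0, 5600, (12.0, 30.0, -4.0)),
    VirtualLight("fill", "soft_box", 0.3, 5600, (-2.0, 2.5, 1.0)),
    VirtualLight("negative_fill", "flag", 0.0, 5600, (1.5, 1.8, 0.5)),
    VirtualLight("dapple", "gobo", 0.6, 5600, (0.0, 6.0, 0.0), gobo_pattern="tree_branch"),
]

print(json.dumps([asdict(light) for light in shot_kit], indent=2))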



•|• Jungle Book Shines in Dolby Vision •|•

This past January, director Jon Favreau presented selected clips from The Jungle Book at Disney's El Capitan Theatre in Hollywood. As invited journalists filed out of the event, many commented on the way the clips had been exhibited: in 3D, using the Dolby Vision laser-projection system. It was clearly a brighter 3D experience with better dynamic range than what the attendees were used to seeing.

And that crowd was lucky to see it. At the time of Favreau's presentation, the El Capitan Theatre was the only American venue where the 3D-enabled Dolby Vision laser projector was up and running on a permanent basis. Hollywood's Dolby Theatre was configured to incorporate the 3D system on an as-needed basis for premieres of major films — and, as of this writing, Dolby officials have said that four more 3D installations are operating in Europe and that a deal is in place to bring 100 to China over the next five years. Most importantly, though, 15 AMC Prime cinemas in the U.S. are preparing to debut the system in concert with The Jungle Book's arrival.

Although Star Wars: The Force Awakens and a few other films had previously screened in this new format, The Jungle Book will be seen in more cinemas in 3D-enabled Dolby Vision than any movie to date.

Other companies are pursuing laser, as well. Imax has rolled out its own large-format system, known as Imax Laser 3D, which has been installed in selected venues around the world. Furthermore, projector manufacturers — including Christie, with whom Dolby partners to build Dolby Vision laser projectors — are in the process of pursuing other flavors across the industry. All told, the initiative to roll out Dolby's system to multiple cinemas with The Jungle Book's release gives significant teeth to the argument that the time has arrived for high-end laser projection to definitively prove itself as a superior consumer 3D-viewing format.

"The new generation of projection technology is going to be laser, simply because you can push more light out of it than you can with xenon-based systems," says Stuart Bowling, director of content and creative relations for Dolby. "But in our case, we also redesigned an optical path of the projector to overcome challenges with regard to being able to produce a very inky black compared to what [most 3D systems] offer today. We partnered with Christie to build the Dolby Vision projector, combining the laser-projection light source, redesigning optical heads and some of the processes inside the projector to get us to a native contrast inside the projector of a million to one [while] producing 20 stops of dynamic range.

"And for 3D," Bowling continues, "our projectors are powered by a 6P laser, which means three primary colors per projector that are tuned specifically for our dichroic filters [in the lenses on] Dolby 3D glasses [that the audience wears]. That means there is no filtering whatsoever inside the projector — no light lost — and that helps us achieve 14 foot-lamberts, bringing 3D up to 2D light levels, meaning more light [and] more color saturation. Combine that with the improved contrast ratio and you have significant improvement over earlier generations."
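A quick editorial check of the arithmetic behind those two figures (the only Dolby numbers here are the ones Bowling quotes): a stop is a doubling of light, so a native contrast ratio of a million to one works out to

$$\log_2(1{,}000{,}000) \approx 19.93 \text{ stops},$$

which rounds to the 20 stops he cites.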

Now that regulatory issues involved with installing laser projectors in public venues are no longer inhibiting rollout, the primary reason deployment is currently limited is the significant expense involved in properly retrofitting a theater for a laser system. Nevertheless, Bowling states, "Laser is now deploying; it will take several years. You always run into that with advancements in technology: There is a high cost to jump in and be an early adopter. But it's really [those early adopters] who help get the ball rolling, and that in turn helps lower the cost for implementation. Until then, you will find laser projection typically in bigger auditoriums — what we call 'premium large format' auditoriums."

Such venues, Favreau suggests, would be great places to see The Jungle Book and learn firsthand how laser projection might change the 3D exhibition experience.

"We're helping to introduce this technology to North America," the director says. "This movie was meant to be a rich experience, and new innovations and new technology have always driven audiences to theaters if you combined them with great stories. To see this movie exhibited in 3D with extended dynamic range gives the story a very lifelike quality, which is what we wanted. We wanted audiences to think they were seeing something real. In that regard, this can help push our medium forward."

— Michael Goldman



"With the dynamic streaming architecture of Photon," he continues, "creatives are now able to see multiple rendered views in the virtual world, with MotionBuilder acting as the hub controlling and recording the entire scene. Bill Pope, using his custom virtual-camera system, was able to not only direct the framing of characters in shots — either live in a motion-capture volume with actors, or virtually with animated characters — but could also direct how dappled light through trees fell on a character, live-controlling the depth of field, seeing how motion blur might affect an action sequence, and tonally dictating the overall mood of a shot.

"With each shot captured in the virtual-camera volume," Balakrishnan continues, "Bill sat down with one of our Photon artists and hand-lit the shot in the computer with the custom digital light kit we developed for him. With these tools, he was able to approach the live-action shoot understanding how a 10-foot bear might sit next to a 4-foot kid."
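Balakrishnan's description of MotionBuilder acting as the hub while Photon draws multiple rendered views suggests a simple hub-and-subscribers shape. The toy sketch below is a generic illustration of that idea only; it is not the actual Photon streaming architecture.

```python
class SceneHub:
    """Toy stand-in for a hub that records scene state and fans it out to views."""

    def __init__(self):
        self.subscribers = []      # e.g. the v-cam monitor, a witness view, VR
        self.take = []             # recorded frames for later layout and playback

    def subscribe(self, render_view):
        self.subscribers.append(render_view)

    def publish(self, frame_state):
        """frame_state: camera transform, light settings, character transforms..."""
        self.take.append(frame_state)
        for view in self.subscribers:
            view(frame_state)      # each view draws the same state its own way

hub = SceneHub()
hub.subscribe(lambda s: print("operator monitor:", s["camera"]))
hub.subscribe(lambda s: print("director VR view:", s["camera"]))
hub.publish({"camera": "frame_0001_pose", "lights": ["key", "fill", "dapple"]})
```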


Mowgli meets with King Louie (voiced by Christopher Walken).


Important interfaces that enabled Pope and others to interact with the virtual world included iPad lighting controls, MIDI controllers, and pan-tilt wheels originally used by Legato in real time for previs on The Aviator (AC Jan. '05). According to Balakrishnan, the pan-tilt wheel was important because it provided "an analog feeling of operating a camera head on a computer-controlled dolly track.

"Each tool was carefully tuned to give Bill an accurate hand-operated feel in an otherwise digital world," Balakrishnan continues. "The workflow grew to be so natural that Bill would occasionally even bring in his Steadicam operator, Roberto De Angelis, to operate certain shots in the virtual-camera volume. Then, as shots were captured, we had an editor on set ingesting the live feed and live-cutting the material straight into an Avid, giving the camera operator and director real-time feedback on how a shot would influence the edit."

"The idea," Pope adds, "was to make sure we did not make moves that could not be done by human beings working in a real jungle. So we limited ourselves to small crane moves, dolly moves, handheld and some Steadicam."

The live-action shoot — during which the filmmakers shot Sethi performing as Mowgli — took place on two adjacent 100'x50' stages at Los Angeles Center Studios. Pope's team shot the action in native 3D in the 1.85:1 aspect ratio using dual Arri Alexa XT M cameras, which were set to open-gate mode and mounted in Cameron Pace 3D stereoscopic rigs. A flexible SimulCam volume was also incorporated into each live-action stage, allowing Pope to see — via a headset, iPad or computer screen — Sethi's performance in concert with the digital characters as the actor interacted with puppeteers on the physical set.

The Alexa XT M cameras recorded to onboard 512GB Codex drives, and were outfitted primarily with Panavision Primo zoom lenses. Pope also carried Primo primes, but he notes, "On the 3D side, you don't want to change primes too much. In the virtual environment, to mimic the quality of those lenses on the animatic, we mimicked the millimeters of the Primo lenses — lining a scene up and choosing the same marks we would choose on that lens [for live-action shooting]."

Fiber-optic cable linked the 3D camera rigs to dual 3D engineering racks — one for each camera. Pope and 3D camera engineer/digital-imaging technician Robin Charters created looks for each scene by building LUTs using Framewright's LinkColor software, connected to Blackmagic Design's HDLink Pro for on-set color correction. The filmmakers could also view 3D imagery on set using 42" Sony LMD-4251TD 3D broadcast monitors.

Complementing the filmmakers' insistence that their camera movements be based in reality was their desire to incorporate subtle aberrations that a location shoot in the jungles of India would have introduced. "We wanted imperfect skies, lens flares, sky flares," Pope explains. "We wanted all the things you would really run into if you were shooting in the jungle and trying to maneuver there — all the things those hardships would cause. So we built that into the photography. We wanted to put into [the viewers'] minds that we really went there and did this."

Lighting-wise, this required careful tracking of lighting data back and forth between the virtual realm and the live-action stages, as photographic decisions continued to evolve during the course of production.
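The looks Charters and Pope built are, at bottom, lookup tables: grids that map camera RGB to display RGB so the on-set monitors show the intended image. As a generic illustration of the indexing involved (not Framewright's LinkColor software or the HDLink Pro hardware path), here is an identity 3D LUT and a nearest-neighbor applier:

```python
import numpy as np

def identity_lut(n=17):
    """Build an n*n*n identity 3D LUT laid out as lut[b, g, r] -> (r, g, b)."""
    axis = np.linspace(0.0, 1.0, n)
    b, g, r = np.meshgrid(axis, axis, axis, indexing="ij")
    return np.stack([r, g, b], axis=-1)

def apply_lut(image, lut):
    """Nearest-neighbor application of a 3D LUT to float RGB values in [0, 1].
    Production tools interpolate (trilinear or tetrahedral); this shows indexing."""
    n = lut.shape[0]
    idx = np.clip(np.rint(image * (n - 1)).astype(int), 0, n - 1)
    return lut[idx[..., 2], idx[..., 1], idx[..., 0]]    # index order: b, g, r

# With the identity table, pixels come back unchanged except for grid quantization;
# a graded "look" would instead be baked into the table by the colorist's tools.
frame = np.random.default_rng(0).random((2, 2, 3)).astype(np.float32)
print(np.abs(apply_lut(frame, identity_lut()) - frame).max())
```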


Left: Mowgli keeps a safe distance. Right: Pope mans the controls for the camera crane.


The filmmakers took great care to follow their virtual-lighting blueprints — including their animatics, which incorporated virtual-lighting data and metadata — and gaffer Bobby Finley and lighting-console programmer Joshua Thatcher also carefully logged all real-world lighting setups, instruments and other details for the visual-effects team so that they could ensure the real and virtual lighting always matched.

"What's interesting here," Balakrishnan opines, "is that virtual-lighting data was actually dictated by Pope in the virtual-camera layout in Photon, and then was brought onto the live-action set, all down to the exact position and angles of the sun. MPC and Weta would then receive this lighting information to ensure that the base lighting of their virtual sets matched the live-action captured plates."

Meanwhile, for live-action filming, the art department devised a system of breaking large sets into what Pope calls "palettes," or sections on rollers — many of which were modular and could be reused and reconfigured to represent multiple locations, while others were set-specific. Representing the portions of the jungle that Sethi needed to interact with, these palettes could be quickly rolled onto either of the two stages, assembled, disassembled and rolled back off, keeping the production moving at peak efficiency.

Supporting this strategy, both stages featured identical bluescreens and lighting rigs so that the filmmakers could move between the stages and "change lighting quickly so that we could be anywhere — day or night, dusk or dawn, and [looking] in any direction," Pope explains.

To light the stages' bluescreens, Finley says he used LEDs — specifically Cineo TruColor HS instruments — more than he ever had on a feature. Finley adds, "Mac Tech 960 [LED Sleds] were my overall ambient toplight. I had Quarter Grid bags made for the 960s to give them a space-light feel, [and they were] bulbed at 3,200 degrees. At either end of the stage, we built three 8-by-12-foot soft boxes, which were used to extend the skylight. Each rig was on chain motors, so we could raise and lower them as needed. Each soft box had two 960 LED Sleds in them. To wrap around fill light when needed, [key grip Tony Mazzucchi's team] hung eight 20-foot-wide by 40-foot-high Ultrabounces on traveler track, in front of the bluescreen, into which we bounced more conventional lighting instruments — Arri T12s and [Bardwell & McAlister] 12-light Mac Tech HPLs."
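As a small illustration of the kind of lighting data that can travel between the virtual layout and the stage, logged sun angles reduce to a direction vector that both the virtual scene and the practical key can be checked against. The axis convention and function below are invented for illustration, not the production's actual metadata format.

```python
import math

def sun_direction(azimuth_deg, elevation_deg):
    """Turn logged sun angles into a unit vector (x = east, y = up, z = north)
    so a virtual key and a practical key aimed from the same numbers can be
    compared. The convention here is illustrative only."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    return (math.cos(el) * math.sin(az),   # east
            math.sin(el),                  # up
            math.cos(el) * math.cos(az))   # north

# e.g. a late-afternoon key logged for one setup
print(sun_direction(azimuth_deg=255.0, elevation_deg=18.0))
```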

Standing beside the 3D-camera rig, Favreau examines a setup for a live-action scene.


From a dimmer room outside the stage, Thatcher used his High End Systems Hog 4 console to control the light, which also included 24K tungsten Fresnels configured both as floor units and mounted in bluescreen-covered scissor lifts; according to Finley, the Fresnels were most often projected through tree-branch gobos for a dappled effect. The stages were also networked through two Hog 4PC systems, which controlled two High End Systems DP8000 DMX processors.

The production frequently used interactive light on set in order to complete effects that were begun in the virtual jungle. For example, Finley explains, "There is a point where Mowgli jumps into a ravine, where he is suddenly in the middle of a water-buffalo stampede. You can see his lighting is influenced by the shadows of the animals as they run by. For that scene, we used a 10-foot high and 18-foot wide LED wall [supplied by VER] to light him. The art and visual-effects departments created a video clip populated with water-buffalo shapes, and when played on the LED wall through a media server, it created variations of shadow textures falling across his body.

"There were also several scenes where he walks through the forest with Bagheera [the panther, voiced by Ben Kingsley] and Baloo [the bear, voiced by Bill Murray], and there is no set," he continues. "In those situations, we lit Mowgli with projectors. Again, the art and visual-effects departments created a video clip, this time of dappled light to match the [virtual background]. When projected on Mowgli as he walked on a turntable, it gave the effect that he was walking through the jungle. We created both daylight and moonlight scenarios. It was very important that the dappled effect match closely with the [virtual] world to sell the illusion."


Art director John Lord Booth III consults with Pope.


The digital-intermediate process — which was performed at Technicolor Hollywood with ASC associate Steven J. Scott, Technicolor's vice president of theatrical imaging and supervising finishing artist — was also key to seamlessly weaving together each of the production's many facets. For this work, Scott and his colleagues Mike Sowa and Charles Bunnag used Autodesk's Lustre 2016 Extension 2 color-grading system.

Favreau says the colors where Baloo resides, for instance, were "inspired by the color palette of the [animated] film, in terms of the tone and humor." Therefore, he adds, this portion of the jungle represents "a more golden world, where it is always blooming and the sky is always blue." Other parts of the movie — particularly when Mowgli is in danger — are visually much darker.

Given the movie's unique, hybrid nature, Pope notes, "we were polishing to a much finer level than on an ordinary DI. It was more elaborate, involved and collaborative."

"It was an unusually collaborative process," Scott agrees. "Jon and Bill would [sometimes] end up wanting to go in a bit of a different direction than what was originally delivered. They might want to add a feeling of heat or warmth as Mowgli is running through the jungle, for example. So, in some cases, we took brilliant CGI material and finessed it, with Bill Pope setting looks and Rob Legato also involved in a unique way, guiding the material that was coming to us from MPC and Weta."

Legato suggests that in many ways, The Jungle Book's biggest innovation was the creation of a methodology for filmmakers "who are fluent in analog storytelling to be able to tell their stories with the same fidelity [using] digital tools. My basic thesis is that all filmmaking and all creativity is analog. You need to feel a pencil on a paper — or pluck a string to create a musical composition. So here, we let traditional filmmakers look through a camera — look up, down and sideways — and make the thousands of little decisions in real time about what inspires them about a shot or scene." ●


TECHNICAL SPECS

1.85:1

3D Digital Capture

Arri Alexa XT M

Panavision Primo