Computer Graphics World, August 2009

$6.00 USA $8.25 Canada www.cgw.com August 2009 Spies Like Us It’s mission accomplished for Imageworks and the superhero G-Force team


  • Virtual Archaeology

  • Make your design unforgettable with ATI FirePro 3D graphics accelerators. Give yourself the power and speed to take projects to the next level. To bring your imagination to life. To finalize jobs before their drop-dead dates. To make the competition green with envy. After all, ATI FirePro 3D graphics accelerators detect, optimize, and tune applications for superior performance. Plus, they give you rock-solid reliability for extreme productivity. What does that mean for your final digital content? Images that make people stop, stare, and stare again. And for you? A digital design force to be reckoned with. The time has come. Put ATI FirePro 3D graphics accelerators to work for you.

    Learn more at www.amd.com/atifirepro/dcc Image courtesy of Anton Bugaev. 2009 Advanced Micro Devices, Inc. All rights reserved. AMD, the AMD Arrow logo, ATI, the ATI logo, FirePro, and combinations thereof are trademarks of Advanced Micro Devices, Inc. Other names are for informational purposes only and may be trademarks of their respective owners. 47056A

    [Benchmark chart: 3ds Max 9 OpenGL SPECapc comparison, ATI FirePro vs. Nvidia Quadro: V3700 vs. FX 380, V5700 vs. FX 1800, V7750 vs. FX 3800]

  • August 2009 1

    ON THE COVER

    SEE IT IN

    While the secret operatives starring on screen in G-Force may prefer to work covertly, their human creators at Sony Pictures Imageworks happily shared their mission and its successes with CGW. See pg. 18 for the story.

    Editing Michael Bay's Transformers: Revenge of the Fallen. The directors of Ice Age: Dawn of the Dinosaurs. The colorist/DP relationship.

    Features

    Trickery and Tweaks

    10 Fire and water, as well as a host of amazing VFX, help set the stage (thanks to the efforts of studios such as Industrial Light & Magic, The Moving Picture Company, and Cinesite, plus others) for the action-packed Harry Potter and the Half-Blood Prince. By Barbara Robertson

    Pushing the Bounds

    18 Sony Pictures Imageworks creates CG guinea pigs that assume a major acting role alongside humans in the stereoscopic spy movie G-Force. By Barbara Robertson

    The Rendering Race

    24 Rendering is hardly new, but with a number of recent technology advances, it is hardly the same process that it once was. What's new? Who's new? And, is it a hardware or software game? By Kathleen Maher

    Wet 'n' Wild

    30 In a commercial for the Whole brand of bottled drinking water, Fusion CI Studios devised a unique fluid simulation that enabled the realistic-looking liquid to form various shapes, one after the other. By Karen Moltenbrey

    Building the Perfect Pipeline One Student at a Time

    36 An underground, open-source project at Pratt Institute enables students to experience a working production pipeline before they enter the job market. More immediately, the OpenPipeline project aided students in producing their thesis films and senior works. By Rob O'Neill

    COVER STORY

    August 2009 Volume 32 Number 8
    Innovations in visual computing for the global DCC community

    Departments

    Editor's Note: Kings of their Craft

    2 At SIGGRAPH 2009, a number of notables in the field of computer graphics were recognized for their amazing accomplishments and contributions to the industry. Because of their work, the milestones of today and tomorrow are possible.

    Spotlight

    4 Products StudioGPU's MachStudio Pro. HP's xw9400. Christie's Mirage WU Series projectors. News Despite the recession, the 3D modeling and animation market is strong, as people continue to play games and go to the movies, and new markets open up.

    Viewpoint

    6 Sony Pictures Imageworks' Rob Engle details the unique work that went into creating the stereoscopic effects for G-Force.

    Portfolio

    34 SIGGRAPH 2009 Computer Animation Theater.

    Back Products

    39 Recent hardware and software releases.



    CHIEF EDITOR
    [email protected]

    Editor's Note
    Kings of their Craft

    The Magazine for Digital Content Professionals

    EDITORIAL
    Karen Moltenbrey, Chief Editor
    [email protected] (603) 432-7568
    36 East Nashua Road, Windham, NH 03087

    CONTRIBUTING EDITORS
    Courtney Howard, Jenny Donelan, Audrey Doyle, George Maestri, Kathleen Maher, Martin McEachern, Barbara Robertson

    William R. Rittwage, Publisher, President and CEO, COP Communications

    SALES
    Lisa Black, National Sales Manager (Classifieds, Education, Recruitment)
    [email protected] (903) 295-3699 fax: (214) 260-1127

    Kelly Ryan, Classifieds and Reprints
    [email protected] (818) 291-1155

    Editorial office / LA sales office: 620 West Elk Avenue, Glendale, CA 91204
    (800) 280-6446

    PRODUCTION
    Keith Knopf, Production Director, Knopf Bay Productions
    [email protected] (818) 291-1158

    Michael Viggiano, Art Director
    [email protected]

    Chris Salcido, Account Representative
    [email protected] (818) 291-1144

    Computer Graphics World Magazine is published by Computer Graphics World, a COP Communications company. Computer Graphics World does not verify any claims or other information appearing in any of the advertisements contained in the publication, and cannot take any responsibility for any losses or other damages incurred by readers in reliance on such content. Computer Graphics World cannot be held responsible for the safekeeping or return of unsolicited articles, manuscripts, photographs, illustrations, or other materials.

    Address all subscription correspondence to: Computer Graphics World, 620 West Elk Ave., Glendale, CA 91204. Subscriptions are available free to qualified individuals within the United States. Non-qualified subscription rates: USA, $72 for 1 year, $98 for 2 years; Canadian subscriptions, $98 for 1 year and $136 for 2 years; all other countries, $150 for 1 year and $208 for 2 years. Digital subscriptions are available for $27 per year. Subscribers can also contact customer service by calling (800) 280-6446, opt 2 (publishing), opt 1 (subscriptions), or by sending an email to [email protected]. Change of address can be made online at http://www.omeda.com/cgw/; click on customer service assistance.

    Postmaster: Send address changes to Computer Graphics World, P.O. Box 3551, Northbrook, IL 60065-3551. Please send customer service inquiries to 620 W. Elk Ave., Glendale, CA 91204.


    As I write this editorial, the world continues to mourn the loss of one of the world's biggest pop-culture icons, Michael Jackson. Much can be said about him, and it was, in just about every media outlet possible. Controversy aside, Jackson was an entertainer and an ambassador of his craft. Throughout the years, Jackson had been lauded for his musical genius and groundbreaking endeavors. On the surface, some of his contributions belie their complexity. A single glove. Four sliding steps backward. A music video. But in context, they were unforgettable moments that transcended history. A style. The Moonwalk. The Thriller video that defined the MTV generation.

    Jackson received adulation during his lifetime for the seemingly endless list of contributions to his industry, for instance, having been inducted into the Rock and Roll Hall of Fame twice. All too often, though, it takes a tragedy before great contributors to their field receive their just recognition. Rather, we should celebrate such achievements while these folks are in their prime, as their star, and talent, continues to rise. Fortunately, that is something SIGGRAPH does annually during the first day of its conference for special contributors to the CG industry.

    This year, the following luminaries received these SIGGRAPH awards in recognition of their efforts in the field of computer graphic technology:

    Michael Kass, senior scientist at Pixar. The Computer Graphics Achievement Award is given yearly to an individual for outstanding long-lasting achievements in computer graphics and interactive techniques. ACM SIGGRAPH presented this award to Kass for his extensive and significant contributions to computer graphics, ranging from image processing, to animation, to modeling, and, in particular, for his use of optimization for physical simulation and image segmentation.

    Robert L. Cook, VP of advanced technology at Pixar. The Steven A. Coons Award is given in odd-numbered years in honor of a person's lifetime contribution to the field of CG and interactive techniques. Cook was given this award for his foundational contributions to physically based reflectance models and distributed ray tracing, and his enduring work on behalf of the SIGGRAPH community.

    Wojciech Matusik, senior research scientist at Adobe. The Significant New Researcher Award is given to a researcher who has made a recent significant contribution to the field of CG and is new to the field. The intent is to recognize people early in their career who have already made a notable contribution and are likely to make more. Matusik is receiving this award for his innovative work in data-driven material representations and systems for data acquisition and display.

    Lynn Hershman Leeson, professor emeritus at University of California, Davis, and chair of the film department at San Francisco Art Institute, and Roman Verostko, professor emeritus at Minnesota College of Art and Design. The Distinguished Artist Award recognizes artists who have created a substantial and important body of work that significantly advances aesthetic content in the field of digital art. The first recipients of this award are Leeson, for her paradigm-changing innovations in applying emergent media to visionary forms of creative expression with insightfully cultural discourse; and Verostko, for his contributions to the aesthetics of algorithmic art, by fusing his knowledge of computer programming with a long engagement with diverse cultural traditions to produce masterful prints.

    As the year and the stunning achievements in CGI progress, let's not forget who laid the initial building blocks for the work, and who built the solid foundation that made these works possible.

  • Maximum Speed. Zero Drag.

    "With LightWave's insane speed across the board, not to mention its flexibility and ease of use, it has no equal. For WarDevil, it has become the core of the project and continues to provide us with solutions where other 3D applications give us dead ends."

    LightWave v9 Get it done.

    LightWave 3D Kelly Myers, VFX Supervisor, The WarDevil Project, Digi-Guys, Ltd

    LightWave and LightWave 3D are registered trademarks of NewTek Inc. NewTek Inc. 2009. All rights reserved. 2008 Digi-Guys. WarDevil is a registered trademark of Digi-Guys Ltd. All rights reserved.


    HP Unveils xw9400 Workstation

    PRODUCT: WORKSTATION

    HP announced the immediate integration of the highly anticipated six-core AMD Opteron 2400 Series processor into the company's family of HP workstations.

    Ideal for high-end workstation applications in fields such as engineering, 3D digital content creation, oil and gas, and science, the HP xw9400 taps the power of the new AMD Opteron processors to deliver higher productivity, especially for multithreaded applications, multitasking, and mega-tasking environments.

    The HP xw9400 workstation can accommodate up to two six-core AMD Opteron processors, for a total of 12 cores, each of which offers as much as 34 percent more performance per watt over the previous-generation quad-core processors.

    AMD HyperTransport 3.0 technology (HT3) increases interconnect rates from 2 gigatransfers per second (GT/sec) up to a maximum 4.8 GT/sec. Additionally, the xw9400 can be configured with the ATI FirePro V7750 3D workstation graphics accelerator. The machine includes an 80 PLUS power supply, which is more efficient than a standard power supply.

    The HP xw9400 workstation is priced starting at $1899.

    PRODUCT: RENDERING

    StudioGPU recently released MachStudio Pro, a professional 3D workflow and rendering package. MachStudio Pro offers a seamless way to create and interact with cinematic-quality 3D objects and environments in a nonlinear workspace by leveraging the horsepower of off-the-shelf professional GPUs to deliver real-time and near real-time workflow performance on a desktop workstation.

    MachStudio Pro streamlines real-time 3D workflow, allowing artists to easily manage and interact with complex lighting, caustics, cameras, shaders, materials, ambient occlusion, and color grading.

    With MachStudio Pro, render times can be dramatically reduced from hours to minutes, and minutes to seconds or sub-seconds. Comparable final scenes are consistently rendered with MachStudio Pro at rates of 500 to 900 times faster than with traditional rendering packages. A complex 1.98 million-polygon high-definition image, for example, renders in 14 seconds using MachStudio Pro, while the same scene rendered with a traditional rendering package can take more than three hours to complete.

    Powered by a real-time rendering engine, MachStudio Pro software fits into the creative pipeline after all 3D models are produced, and provides lighting, cameras, materials, compositing, and finishing capabilities. The MachStudio Pro workflow is akin to a virtual 3D studio environment, allowing artists, designers, directors, and TDs to work with lighting, camera views, and multi-point perspectives for a real-time view of frames as they will appear in the final rendered format.

    Developed in a true high-end production environment, MachStudio Pro rendering features can eliminate the need for expensive renderfarms. Key product features include an optimized shader and material pipeline; the ability to adjust and view all render passes independently; a full materials library; interactive ambient occlusion; a fully configurable lighting and animation constraint system; animatable and keyframable properties for all objects, lights, cameras, and materials; real-time HDR cameras and lighting; and a host of others.

    An out-of-box, high-performance solution, MachStudio Pro ships with an AMD ATI FireGL V8650 3D workstation graphics accelerator card featuring 2GB of onboard graphics memory and a parallel-processing Unified Shader architecture.

    StudioGPU is now shipping MachStudio Pro for the Microsoft Windows XP Professional and Windows Vista operating systems.

    MachStudio Pro carries a special introductory price of $4999, which includes a full year of technical support and product maintenance updates. Education and volume licenses are also available upon request.

    StudioGPU Ships MachStudio Pro


    NEWS: MODELING/ANIMATION

    3D Modeling and Animation Market Shows Resilience

    Jon Peddie Research (JPR), the industry's research and consulting firm for graphics and multimedia, has just released a new report on the 3D modeling and animation market. And the news is that the market is poised for growth. The same software that is used primarily for film/TV production and game development is also being put to work for rendering and visualization in architecture, manufacturing, and science, and is on the verge of major breakthroughs due to demand from new vertical markets as well as hobbyist and consumer sectors.

    Like all others, the 3D modeling and animation industry is going through a period of contraction and consolidation. However, as difficult as it is for all participants, the JPR study points out that this is often a prelude to growth.

    The 3D modeling and animation market includes software tools that are used for TV and movie special effects, creating content for games, product design, and the Web. Over the years, the industry has grown steadily, but the tools for this area are still expensive and used primarily by professionals.

    Mainstream Markets Open Up

    Beyond traditional industries, new markets are also opening up for more casual users of 3D modeling and animation tools, defying barriers posed by high cost and complexity. Free 3D modeling and animation tools are becoming available, and millions of copies are being downloaded every year, suggesting a pent-up demand for easy-to-use 3D tools.

    In addition, there is a hard core of hobbyists and casual users who are using 3D tools, even though the learning curve is steep. New distribution models are just now opening up, as well, including online worlds, YouTube, MyToons, the Daz communities, and more. By the end of this decade, new growth will come in mainstream markets, the report states.

    Meanwhile, the DCC market, in general, and the 3D segment, specifically, have suffered, as industries in which 3D tools are used but are not core to their businesses tighten their belts and look for the inessential markets they can cut from their sales and marketing budgets. Some of the areas that have been hardest hit include the advertising industry, marketing, visualization for architecture, science and research, and manufacturing.

    "Ironically, although the 3D modeling and animation market has been one of relatively slow growth, it has been more stable than other graphics markets during this economic downturn," says JPR analyst Kathleen Maher, author of the new report. "The market did not grow as fast, but it did not decline as dramatically as other industries involved in digital content creation. In fact, 2008 was a record year for the market."

    This report includes data on the number of users (casual and professional), market share for the companies, the market segments they participate in, and geographic distribution. The latest edition of the study is available in electronic and hard-copy editions for $3500 (single-user license) or $5000 (full-site license) at www.jonpeddie.com.

    PRODUCT: PROJECTORS

    Christie launched its new Mirage WU Series, an expansion to the firm's three-chip DLP, 3D active-stereo projectors. The four projectors, the Mirage WU3, WU7, WU12, and WU1, offer more pixels, with WUXGA 1920x1200 native resolution, and brightness options of 3000, 6600, 12,000, and 18,000 ANSI lumens.

    The WU Series is a purpose-built, cost-effective 3D stereoscopic projector line with a 16:10 aspect ratio, resulting in greater display flexibility. The projectors feature built-in edge blending and Christie Light Output Control for constant brightness tracking and monitoring of lamp output. The offerings also feature comprehensive color adjustment, allowing the actual RGB channels in individual projectors to be tweaked for even color matching across multiscreen applications. The optional Christie Twist module for internal image warping allows projection on virtually any screen or surface.

    The low-maintenance WU Series is compatible with a range of stereo and non-stereo sources. It joins Christie's lineup of high-res projectors for simulation and visualization.

    Due to their unique image-processing technology, the projectors use far less computing power and work with virtually any computer's GPU, delivering up to 120Hz output from up to 60Hz input over standard, single-link DVI-D or analog.

    The projectors are complemented by the new Spyder X20 video processor, which is capable of up to 20 MPX bandwidth, regardless of image type, and meets multi-windowing, multiple display, and processing requirements. It allows for the rotation of images from any output source to project in landscape or portrait orientation. In addition, the Spyder X20's stereoscopic SSO option can provide seamless displays and flexibility in a mixed 2D/3D image environment.

    Pricing for the WU Series projectors was unavailable.

    Christie Expands Mirage Series

  •

    Moving in Stereo

    In my job as 3D effects supervisor on a wide variety of stereoscopic projects at Sony Pictures Imageworks, I have been fortunate to have witnessed firsthand the emergence of a different kind of cinematic medium. While stereoscopic imaging can be traced back to the mid-1800s and the invention of the first stereoscope, it wasn't until the recent advent of digital production techniques and digital exhibition tools that the ability to create and present truly high-quality 3D became possible.

    During the past five years, I have had several moments when my expectations for the possibility of 3D entertainment have been expanded. My first eureka moment came in 2004 when I saw the test images for The Polar Express: IMAX 3D Experience. It was then that my expectation for what a cinematic experience could be was forever transformed. There are those who think of 3D films and believe they are a gimmick and, oftentimes, associate their poor, headache-inducing experiences watching anaglyph (red/cyan) home-video releases (and, sadly, some theatrical releases) with modern 3D. Anyone who has seen a modern 3D film projected either in an IMAX theater or with digital projection knows that the medium has the power to connect you with characters and into the action in ways that are impossible with traditional cinema.

    It should be made clear, though, that it is a different medium and one that deserves and demands a different set of rules and cinematic language. Approaching 3D filmmaking using strictly 2D thinking will not lead to the real breakthroughs that are just on the horizon.

    My second eureka moment came just last year when I saw the concert film U2 3D, chronicling the Vertigo concert tour of the iconic rock band. The film featured multiple layers of imagery at different depths and 3D graphical elements that wouldn't typically belong in a narrative film but, in this case, allowed it to transcend the term concert film. I believe the film should be viewed as a key step in the innovation of stereoscopic filmmaking.

    For Disney and Jerry Bruckheimer's G-Force, director Hoyt Yeatman came to Imageworks to create the digital visual effects and animation, bringing the hero guinea pigs and their gadgets to life (see "Pushing the Bounds," pg. 18). The film's subject matter spans a wide range of scales, including everything from a tiny housefly to an eight-inch-tall guinea pig and an 80-foot-tall killer robot (yes, this film has giant killer robots, too).

    In order to capture this action, Yeatman deemed it important to be able to film using traditional 2D camera rigs and not be limited by existing 3D camera technology. With that decision made, he came to us again to discuss a postproduction process to convert the plate photography and integrate the visual effects elements into the resulting stereoscopic image. While a postproduction process had been used before on already-completed films (notably on The Nightmare Before Christmas), nobody had attempted a day-and-date release of a feature film using a mix of converted photography and true stereoscopic elements.

    With about three-quarters of the running time of the film featuring visual effects already being executed at Imageworks, it was a natural fit for us to tackle the stereoscopic adaptation, farming out the remaining work to two other facilities. With this approach, for the 3D release of G-Force, every shot in the film is now a visual effect in which the second eye's viewpoint has been created using information from the original photography. At a very high level, the process consisted of isolating all the elements within the shot, offsetting them through projection or 2D techniques, and then filling in the holes created by revealing occluded regions. For shots featuring guinea pigs, for example, the additional step of rendering the character from the other viewpoint and integrating it into the dimensional plate would complete the effect.
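    The high-level recipe described here (isolate elements, offset them, fill the revealed holes) can be sketched at toy scale. This is not Imageworks' pipeline; it is a minimal illustration of depth-based horizontal offsetting with naive hole filling, and every name and parameter in it is an assumption.

    ```python
    import numpy as np

    def second_eye(image, depth, max_disparity=8):
        """Toy 2D-to-3D conversion sketch: shift each pixel horizontally by a
        depth-dependent disparity, then fill the revealed holes.
        image: (H, W) grayscale array; depth: (H, W) in [0, 1], 1 = nearest.
        Illustrative only, not a production algorithm."""
        h, w = image.shape
        out = np.full((h, w), np.nan)            # NaN marks unfilled holes
        disparity = np.round(depth * max_disparity).astype(int)
        for y in range(h):
            for x in range(w):
                nx = x + disparity[y, x]         # nearer pixels shift farther
                if 0 <= nx < w:
                    out[y, nx] = image[y, x]
        # Naive hole fill: propagate the nearest valid value from the left,
        # standing in for the "filling in the holes" step described above.
        for y in range(h):
            last = image[y, 0]
            for x in range(w):
                if np.isnan(out[y, x]):
                    out[y, x] = last
                else:
                    last = out[y, x]
        return out

    img = np.arange(16.0).reshape(4, 4)
    depth = np.zeros((4, 4)); depth[:, 2] = 1.0  # one "near" column of pixels
    right = second_eye(img, depth, max_disparity=1)
    ```

    The hole left behind where the "near" column shifted is filled from its left neighbor; a real pipeline would use projection, inpainting, or rendered CG elements instead.
    
    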

    The end result of using this technique is that we have freed the director to use all the tools and techniques for 2D plate photography developed over the past 100+ years of filmmaking and have also allowed the depth choices for each shot to be deferred until postproduction. While good composition is still important during

    Stereoscopy
    By Rob Engle

    Rob Engle is 3D effects supervisor at Sony Pictures Imageworks, where he most recently worked on the stereo feature G-Force.

  • New DeckLink HD Extreme has Dual Link 4:4:4/4:2:2 SDI, HDMI and analog connections in SD, HD and 2K!

    The new DeckLink HD Extreme is the world's most advanced capture card! With a huge range of video and audio connections plus a hardware down converter, and Dual Link 4:4:4/4:2:2 3 Gb/s SDI, advanced editing

    systems for Microsoft Windows and Apple Mac OS X are now even more affordable!

    Connect to any Deck, Camera or Monitor

    DeckLink HD Extreme is the only capture card that features Dual Link 3 Gb/s SDI, HDMI, component analog, NTSC, PAL and S-Video for capture and playback in SD, HD or 2K. Also included is 2 ch XLR AES/EBU audio and 2 ch balanced XLR analog audio. Connect to HDCAM SR, HDCAM, Digital Betacam, Betacam SP, HDV cameras, big-screen TVs and more.

    Hardware Down Conversion

    If you've ever wanted to monitor in both HD and SD while you work, then you'll love the built-in high-quality down converter. Use the Dual Link SDI outputs as a simultaneous HD and SD output, or you can switch back to Dual Link 4:4:4 when working in the highest-quality RGB workflows. Select between letterbox, anamorphic 16:9, and even center cut 4:3 down-conversion styles!
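    The three down-conversion styles named here differ only in how a 16:9 HD frame is mapped into a 4:3 SD raster. A rough sketch of that geometry (the function name and the 720x480 target raster are illustrative assumptions, not part of any vendor's API):

    ```python
    def down_convert_geometry(style, sd_w=720, sd_h=480):
        """Return (active_w, active_h, src_width_kept) for mapping a 16:9
        HD frame into a 4:3 SD raster, for the three common styles.
        Raster sizes and the 4:3 target are illustrative assumptions."""
        if style == "letterbox":
            # Full 16:9 width preserved; black bars fill the top and bottom.
            active_h = round(sd_w / (16 / 9))    # 405 of the 480 lines
            return sd_w, active_h, 1.0
        if style == "anamorphic":
            # Full raster used; pixels squeezed horizontally on capture and
            # unsqueezed by a 16:9-aware display.
            return sd_w, sd_h, 1.0
        if style == "center_cut":
            # Fill the 4:3 raster, cropping the sides of the 16:9 source.
            kept = (4 / 3) / (16 / 9)            # keep the center 75% of width
            return sd_w, sd_h, kept
        raise ValueError(style)
    ```

    The trade-off is visible in the numbers: letterbox keeps the whole picture but uses only 405 active lines, while center cut uses every line but discards a quarter of the source width.
    
    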

    Advanced 3 Gb/s SDI Technology

    With exciting new 3 Gb/s SDI connections, DeckLink HD Extreme allows twice the SDI data rate of normal HD-SDI, while also connecting to all your HD-SDI and SD-SDI equipment.

    Use 3 Gb/s SDI for 2K and edit your latest feature film using real time 2048 x 1556 2K resolution capture and playback!

    Microsoft Windows or Apple Mac OS X

    DeckLink HD Extreme is fully compatible with Apple Final Cut Pro, Adobe Premiere Pro, Adobe After Effects, Adobe Photoshop, Fusion and any DirectShow or QuickTime based software. DeckLink HD Extreme instantly switches between feature film resolution 2K, 1080HD, 720HD, NTSC and PAL for worldwide compatibility.

    $995DeckLink HD Extreme

    Learn more today at www.blackmagic-design.com

    NEW 4:4:4

    MODEL!

    NOW FOR CS4!

  • Viewpoint


    plate photography in order to achieve a good, final 3D shot, there are fewer constraints on the filmmaker during the production phase. Additionally, because we were using a virtual stereoscopic camera rig for all the 3D (whether plate photography or CG characters), we were able to use the same multi-camera rig techniques (using different stereo parameters on different objects in a shot) refined on past all-CG features. This gave us flexibility that would be impossible with stereoscopic photography.

    The traditional 2D release of G-Force is formatted for widescreen presentation, but in preparing for the 3D release, we quickly realized we could use a simple illusion to help the film feel even deeper than it really is. The filmmakers and the studio were keenly interested in making G-Force as immersive an experience as possible, while still maintaining a comfort level that would keep people from leaving the theater with headaches. Taking inspiration from an illustration for the 3D effects in the 1953 film Bwana Devil, in which a lion is portrayed as leaping off the screen, I suggested we present the entire film as a fullscreen (1.85:1) image letterboxed to appear as a widescreen film (2.40:1). We would occasionally break the mask and allow objects to appear over the top or bottom letterbox area, giving a stronger illusion that objects were in the audience space. This effect was used sparingly for the first part of the film, and then used much more frequently during the end battle sequence. This technique has never been employed to this level in a feature film, and it was gratifying that the filmmakers were so enthusiastic about its potential.
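    The matte trick described here reduces to simple aspect-ratio arithmetic: a 1.85:1 picture masked down to a visible 2.40:1 window leaves bars of real image that objects can break through. A small sketch of the geometry (the function and the 2K frame width are illustrative assumptions):

    ```python
    def matte_bars(width, full_ar=1.85, matted_ar=2.40):
        """Height of each top/bottom letterbox bar when a full_ar picture is
        matted to appear as matted_ar. Objects rendered over the bars
        'break the mask', as described in the article."""
        full_h = width / full_ar        # real picture height
        visible_h = width / matted_ar   # height left visible by the matte
        bar = (full_h - visible_h) / 2  # one bar, top or bottom
        return full_h, visible_h, bar

    full_h, visible_h, bar = matte_bars(2048)  # e.g. a 2K-wide frame
    # roughly 11% of the picture height hides under each bar
    ```

    Because the bars cover live image rather than black, any element composited over them appears to escape the frame, which is the whole illusion.
    
    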

    Stereo's Next Stop

    As we look to the future, the next step in the evolution of 3D entertainment is likely to be its extension into the home. There are already 3D-capable television sets sold for home use, and enthusiasts have found ways to use these displays with PC-based gaming. This holiday season and the next will see the introduction of more 3D monitors. With the work being done to standardize delivery of content over cable, satellite, and physical media, it won't be long before we see high-quality 3D video games, movies, sporting events, and concerts in the home. Lastly, we will probably see the introduction of 3D home video and still cameras to complete the picture and enable everyone to create 3D at home. Of course, stereoscopic films are best enjoyed on a big screen with a few heads in front of you as a point of reference. As such, the home experience will probably never compare to that of seeing a 3D film in a theater.

    I am often asked for advice on how to get started with 3D filmmaking. My advice to prospective 3D filmmakers who are unsure about how to dip their toe in the waters is to dive in. The best way to learn about 3D is to actually make some pictures. Look at them on the small screen and then on the big screen. The same advice applies to almost any creative endeavor. If you are looking for help getting started, look for a studio with an internship program, such as the IPAX program offered by Imageworks.

    There is still so much room to innovate in 3D entertainment. I am waiting for a filmmaker to come along who can take this new medium and truly revolutionize it, taking our perceptions of 3D entertainment and turning them on end. If you are reading this, give me a call!

    Rob Engle served as 3D visual effects supervisor on G-Force. For feedback on this article, contact him at [email protected].

    In G-Force 3D, at times the group would break the mask and allow objects to appear outside the letterbox area.

    © 2009 Disney Enterprises, Inc. and Jerry Bruckheimer, Inc.

  • The new DeckLink Studio has SD/HD-SDI, loads of analog connections, down converter and more for only $695!

    Learn more today at www.blackmagic-design.com

    Turbocharge your creativity with DeckLink Studio, the SD/HD broadcast video card that costs hundreds of dollars less than SD solutions! With SD/HD-SDI and enhanced analog connections, DeckLink Studio connects to a massive range of equipment such as HDCAM, HD-D5, Digital Betacam, Betacam SP and more!

    More Video Connections!

    DeckLink Studio includes 10 bit SD/HD-SDI, component, composite, S-Video, 4 ch balanced analog audio, 2 ch AES/EBU, reference, RS-422 deck control, and a built-in hardware down converter. High speed 1 lane PCI Express gives you more HD real time effects and supports advanced video formats such as ProRes (Mac), DVCPro HD, JPEG, DV, HDV playback and even 10 bit uncompressed capture and playback!

    Hardware Down Conversion

    For monitoring, you'll love the built-in HD down converter that's always active on the SD-SDI, S-Video, and composite video output connections.

    The built in hardware down converter lets all video outputs remain active in both capture and playback mode, and in all HD video formats! Instantly switch between letterbox, anamorphic 16:9 and center cut 4:3 down conversion styles.

Built in SD Keyer

DeckLink Studio includes a built-in internal SD keyer that lets you layer RGBA images over the live video input. You can also use the included Photoshop plug-ins for broadcast graphics! DeckLink Studio also supports external SD keying with key and fill SDI out.

    Windows or Mac OS X

DeckLink Studio is fully compatible with Apple Final Cut Pro, Adobe Premiere Pro, Adobe After Effects, Adobe Photoshop, Fusion and any DirectShow or QuickTime based software. DeckLink Studio instantly switches between 1080HD, 720HD, NTSC and PAL for full worldwide compatibility.

DeckLink Studio: $695

• August 2009 10

    Visual Effects

CG fire created at Industrial Light & Magic whips around Dumbledore, who conjured up the flames to repel the digital Inferi crawling up the crystal island.

Images © 2009 Warner Bros. Entertainment, Inc.

• It's a dangerous world now for magicians and Muggles, as readers of J.K. Rowling's enormously popular Harry Potter series of books know, and as movie audiences will discover. Warner Bros.' sixth film in the franchise, Harry Potter and the Half-Blood Prince, puts the villainous Lord Voldemort back on the attack, with his acolyte Draco Malfoy helping to double the trouble. And, as always, the good wizards, Hogwarts Professor Dumbledore and his student Harry Potter, the chosen one, must stop them.

But Harry and his fellow students of magic have other problems to cope with. They're teenagers now, as are the actors who portray them, and hormonal distractions play a major role in this film, sometimes with a magical twist. The kids are growing up. The plot is grittier and darker. And the result is one of the most critically acclaimed Potter films yet.

David Yates, who directed the previous film, Harry Potter and the Order of the Phoenix, returns to lead the crew of Half-Blood Prince and will continue on for the final two films in 2010 and 2011, both based on the seventh, and final, Potter book. So, too, does senior visual effects supervisor Tim Burke, who has now worked on five of the six films, and the acting crew who have starred as the students of witchcraft and wizardry from the beginning:

    Daniel Radcliffe as Harry Potter, Emma Watson as Hermione Granger, Bonnie Wright as Ginny Weasley, Rupert Grint as Ron Weasley, and Tom Felton as Draco Malfoy.

As is typically the case with Potter films, several VFX studios located primarily in London conjured up most of the effects, with Industrial Light & Magic (ILM) doing some heavy lifting from the US. We talked with ILM, the Moving Picture Company (MPC), and Cinesite, which created some of the most interesting effects, about their work on the movie. In addition, CEG Media, Double Negative, and Rising Sun Pictures contributed digital effects.

ILM: Firestorm

Dumbledore and Harry are on a scouting mission to find one part of Voldemort's soul, a so-called Horcrux, hidden in a crystal cave. ILM provided shots for Harry and Dumbledore's entrance to the cave, and created the cave itself as well as the lake inside. After entering, the two wizards row across the lake to the island, a set piece, and find the Horcrux. And then, the action begins. Dumbledore collapses. Harry dips a shell into the lake to bring him some water. As soon as he touches the water, hundreds of CG Inferi, trapped souls that guard the Horcrux, splash up and pull him beneath the CG water. The camera follows Harry underwater.

To rescue Harry, Dumbledore, now recovered, conjures up fireballs that he slings into the lake. As jets of fire slash through the water, we see thousands of Inferi, arms and limbs wrapped around one another, on the sides of the island. They look like coral, and it's no surprise to learn that ILM's Aaron McBride, who created concept art for Davy Jones' crew in Pirates of the Caribbean, helped design them. The fire scares the Inferi. They release Harry, and as the camera

    Because ILM created the digital fire using a combination of low-res 3D fluid simulations and high-res 2D simulations, TDs can produce quick results. From left to right, a data plane showing the underlying sim, particles colored according to temperature, and an initial composite.



    follows his swim to the surface, we see an orange glow on the water. The visual effects wizardry has only just begun.

On the island, Dumbledore is directing a raging firestorm that swells to 100 feet high and 200 feet in diameter. Waves of fire swirl around him. And then, like Moses parting the Red Sea, Dumbledore says a magical word, and the churning wall of flames splits to create a safe passage across the lake for the two wizards.

Tim Alexander led a crew of approximately 60 people at ILM who worked on the sequence's 160 shots for about a year to create the crystal cave, the CG water, the CG Inferi, and the CG fire. When they began, they did not have the ability to create such a fire, so Chris Horvath took on that problem in November 2007. This year, SIGGRAPH accepted the technical paper he and Willi Geiger submitted on the process, titled "Directable, High-Resolution Simulation of Fire on the GPU."

"I wanted to emulate the ability of really skilled compositors to paint with filmed elements," Horvath says. "So the original intention was to come up with something that would allow us to shape the use of filmed elements with particle simulation."

With this in mind, Horvath first tried to bend and stretch sprite elements along a particle path. It didn't work. Horvath believed that might have been because the filmed elements he had to work with didn't have the long trailing streams of fire he needed. So, because he didn't have the photographed elements he wanted, he decided to use CG elements instead. And thus, the science project began.

"The goal was to not have anything particulate," Horvath says, "no sprites and none of the artifacts that had been present in the past. That was the basis of our technique, which is essentially image-based simulation."

The result is a two-stage process that technical directors used to create the firestorm. The process begins with a 3D particle simulation using Geiger's FakeFlow system. "It does a coarse, quick, low-resolution enforcement of the Navier-Stokes non-divergence component and adds a little viscosity," Horvath explains. "One of the things we learned is that a little of this goes a long way. We used tiny fluid grids, 64 cubed at the most and sometimes 32, but that gave us a base fluidity as a starting point."

The simulation results provided the flowing motion that Horvath needed, but not the detail. "If we had an infinitely large computer that could hold a 2k x 2k x 2k grid, we could theoretically perform a full-blown fluid sim, but it would take days and days and days," Horvath says. "So we found a way to split the grid into slices and exploit parallelism in the GPU."

This is where the magic happens. First, since the only thing that matters in visual effects is what the camera sees, the second part of the process is oriented to the frustum, the camera's field of view. Rather than running a 3D simulation, Horvath stacked, in effect, a series of rectangular, two-dimensional slices of a 3D grid. He spaced these simulation planes equally apart and faced them to the camera from the foreground to the background, to produce high resolution close to the camera and appropriate resolution farther back.

"All the particles from the coarse [3D] simulation are drawn on each plane," Horvath explains. "But a weighting function determines their size. If they're close to the plane, they're drawn strong. If they're far, they fade off, and the size of that fall-off is important." The particles become the input to another

The goal in creating the firestorm surrounding Dumbledore was to create highly detailed and directable fire that didn't look like it was made of particles.

Let It Snow

For a wintery sequence that demanded more than the fake flakes caught on camera, Cinesite drew on proprietary image-based tracking software called Motion Warper to dust the actors with snow. The studio started by photographing the fake snow to create tiny elements. Motion Warper then tracked the live-action image and applied the elements.

"It does pixel analysis on the whole image," says Ivor Middleton, CG supervisor. "It isn't just a corner-pin application. We were able to snow up Hagrid, with his huge beard and furry cloak blowing in the wind, more or less straight out of the box with a little tweaking. It's something we've also used to track scars and for face replacement." –BR


    Navier-Stokes fluid-flow sim, a variation of the same equations we ran earlier.

Because this simulation is two-dimensional, though (the slice is only a plane), the simulation can run at 2k, even at 4k resolution, which adds a tremendous amount of detail. The result of the simulation is a set of images that contain density, temperature, texture, fuel, and velocity data. "These are combined by our renderer to produce the final image," Horvath says. "They're volume-rendered from near to far." Although the simulations run separately for each plane, the underlying particle simulation provides continuity.
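The splatting step described here, each coarse-sim particle drawn onto every camera-facing plane with a strength that falls off with distance, can be sketched in a few lines. This is a simplified illustration, not ILM's code: the Gaussian-style falloff, the scalar particle strengths, and all names are assumptions.

```python
import math

def splat_to_slices(particles, slice_depths, falloff):
    """Weight each coarse-sim particle onto camera-facing slice planes.

    particles: list of (depth, strength) pairs from the low-res 3D sim
    slice_depths: camera-space depth of each 2D simulation plane
    falloff: width of the weighting function; particles far from a plane
             contribute little to it
    """
    slices = [0.0] * len(slice_depths)
    for depth, strength in particles:
        for i, plane_depth in enumerate(slice_depths):
            # Drawn strong near the plane, fading with distance
            w = math.exp(-((depth - plane_depth) / falloff) ** 2)
            slices[i] += strength * w
    return slices
```

A particle sitting exactly on a plane contributes its full strength there and almost nothing to distant planes, which is the "strong near, fading far" behavior Horvath describes.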

Horvath created the images using a clever hack that treats texture maps like large input data arrays and a final display like a large output data array. "This was before CUDA was available," he says, "so we used OpenGL, the GL shading language GLSL, and GPGPU, which tricks OpenGL into using textures as computation planes. And instead of using shaders to draw things, we used them to compute things. Because everything is stored in a texture, we can display everything immediately. You can have our simulation on screen in front of you and hit a key to switch between different data planes [the different sets of images]."

Horvath used EXR files to store the temperature and density data, and GTO files to store the simulation controls. For most of the shots, the TDs ended up using 128 slices. "There's no reason to choose 128," Horvath says. "There's nothing special about that number. It's just that programmers are geeks and like to use powers of two. Typically, the number of slices required is a feature of how much depth complexity we need."

To simulate a fireball, for example, the technical directors used only eight slices, and the simulation took only 10 minutes. But, for the parting of the so-called Red Sea shot, during which Harry and Dumbledore row across the lake between walls of fire, the TDs needed to run six simulations that they layered from back to front.

"That's the only shot in which the main body of the fire is multiple elements," Horvath says. "That's because we needed more than a thousand slices, and that was too big for the renderer to hold in memory. It was our nightmare shot."

Fiery tunnels aside, for the most part, by using the GPU fire-simulation process, the crew could look at a fully rendered fire in about half a day. "It was great," Alexander says. "We had never done photorealistic fire, although we'd tried many times in the past. The standard approach, rendering a heavy-duty 3D particle sim volumetrically, can take days on end. And, now we had super highly detailed photorealistic fire that we could change twice a day."

Cinesite: Magical Explosions

All told, Cinesite created 474 shots, of which 204 had CG elements and the rest were cleanups. Andy Robinson was the compositing supervisor, and Ivor Middleton, the CG supervisor. To the studio, Hermione's lovebirds in scene 63 were the most important.

"We've mostly done environments for Harry Potter, so having CG creatures to work on, as well, was nice," Middleton says. "They're little lovebirds that reflect Hermione's mood." At the beginning of the shot, as Harry comforts a distressed Hermione, the birds twitter around her head. But when Ron Weasley enters with another girl, the birds turn into angry little darts.

"They become aggressive and zoom toward Ron," Robinson says. "When he exits, they hit the door and burst into an explosion of feathers."

To create the feathers, the 3D artists made small planes, rather than using hairs. "We didn't need to build individual barbs," Middleton explains. "We could simulate the feathers with texture maps and an anisotropic shader." Once the birds hit the door, the dynamics switch on. Lead technical director Holger Voss turned the birds into the explosion of feathers using a combination of Autodesk Maya's rigid-body dynamics and nCloth, which handled individual feather dynamics.

The second explosion that Cinesite created for the film involved inanimate objects: a stained-glass window in Hogwarts' great dining hall. "They didn't shoot anything for it," Robinson says. "The shot was an afterthought."

The shot takes place near the end of the film. The Death Eaters, who are invading Hogwarts, explode the stained-glass window. Plates slide off the tables, and cups smash onto the floor. The magic candles move with the shock wave, and the flames go out.

    Although Cinesite had created the magic ceiling and magic candles for the great hall,

    Cinesite created magic candles for the great hall, and then for an explosive scene near the end of the film, blew them out. For that sequence, the studio digitally detonated a stained-glass window and shattered the dinnerware.


as the studio has done in the past, for this scene the artists had to reproduce the set, as well. "We remodeled the entire room from two camera views," Robinson says. "We rebuilt the tables, cups, plates, everything in 3D, using digital stills for textures."

After consulting with Jose Granell, Cinesite's model unit supervisor, the team decided to create 100 different explosion events as the detonation moves down the great hall, to mimic how Granell would rig a series of practical explosions to create a dense effect.

"We staggered the explosions and previs'd it with Tim Burke to get the timing," Middleton says. For the sequential explosions, Voss used Blast Code software to shatter the stained glass, animate the plates moving off the tables, smash the cups, and manage the interaction with the environment.

"The dynamics were rigid-body simulations with Blast Code," Middleton says, "but with Holger [Voss's] careful choreography." Lastly, Robinson's team composited dust and debris passes rendered with Pixar's RenderMan into the explosion using Apple's Shake.
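The staggered timing the team describes can be previsualized with a simple pass that spaces trigger frames along the hall as a detonation front advances. The sketch below is purely illustrative; the hall length, shock speed, and frame rate are invented numbers, not production values.

```python
def stagger_triggers(num_events=100, hall_length_m=60.0,
                     shock_speed_mps=30.0, fps=24):
    """Trigger frame for each explosion event as a detonation front
    travels down the hall at a constant speed. All figures are
    hypothetical defaults for illustration."""
    spacing = hall_length_m / num_events  # distance between events
    return [round(i * spacing / shock_speed_mps * fps)
            for i in range(num_events)]
```

The result is a monotonically increasing list of frame numbers, one per event, which a TD could then hand-adjust against the previs timing.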

MPC: Quidditch

Now that the contestants are teenagers, Yates and Burke wanted to give the Quidditch matches a little edge. "Even though it's still kids on magical brooms, we tried to make the sport more of a sport," says Nicolas Aithadi, visual effects supervisor at

MPC. "We decided to create more credibility by giving the camera more of a television feel. So, we looked at everything having to do with aerial shows and Formula 1 races." They created the broadcast-sports feeling and dynamism they wanted by using the virtual equivalent of 500mm lenses to flatten the perspective and by having the cameras catch up with the action.

During Half-Blood Prince, we see a tryout and a final match. For both, although the crew shot the actors on wires on a bluescreen stage, in 90 percent of the final shots, the competitors are CG. "We wanted something more controllable than what we could shoot on set," Aithadi says, "and more extreme."

Knowing the digital athletes would be close to camera, MPC decided to use videogrammetry to create the photoreal characters. "It's like photogrammetry, but with moving images," Aithadi says. To capture the moving image, they positioned four cameras around an actor wearing 80 tracking markers on his or her face and sitting in a chair. Two cameras were in front, high and low, and two were at 45-degree angles on each side.

"The idea was to use that data to animate the CG face, and at the same time, acquire the textures," Aithadi says. "We'd have moving geometry and moving textures, and the texture would match the geometry exactly."

    The first test took a week and a half and had promising enough results that the crew decided to go ahead and capture all

Mystery Train

During a particular shot, created with help from The Moving Picture Company (MPC), the camera focuses on students Ginny, Luna, and Dean Thomas, who are riding in a train. The camera pulls out of the window, a dining car rolls past, and then, the camera goes inside the window of a third car to focus on Harry, Ron, and Hermione.

But there was no train, only the inside of one train car on set, configured and then reconfigured to simulate the three train cars, and filmed with a motion-control camera. MPC created the rest and put the jigsaw puzzle together.

"It was insane," says Nicolas Aithadi, visual effects supervisor. "We had one person working on the shot for six months, creating everything that didn't exist on set, the train walls, the track, everything outside the windows, dealing with all the motion-control elements, and keeping all the motion-control and CG elements in sync. At one point, I thought, 'Enough already. We'll do it all in 3D. I'll pay for it.' But, it would have been a long process in 3D as well, and the real characters help sell the shot. It has a nice feel to it, and the way it fits into the story is quite cool." –BR

    MPC created a digital stadium for the extreme Quidditch sport and used videogrammetry to put real faces on digital athletes that star in 90 percent of the shots.


the Quidditch competitors performing a library of expressions. Before the crew started the capture, though, they had each actor do a minute-and-a-half warm-up.

"We had done the test with Rupert Grint, who plays Ron Weasley," Aithadi explains. "And, we asked him to do an extreme movement. When we got the data back, we could see blood moving inside his face in a specific way. I had never thought of that before." So the warm-up helped get the blood flowing in the actors' faces.

After shooting each actor, the artists at MPC used camera projection to apply the moving images on a CG model of the actor's head, and then stitched the edges and

The fastest way for a wizard to transport is by apparating: He or she thinks of a destination and magically teleports there. But, it's dangerous, so wizards cannot receive a license to legally apparate until they turn 17, Harry's age now. The question for the filmmakers was, what does apparating look like?

To answer that question, artists at The Moving Picture Company (MPC) created drawings to visualize VFX supervisor Tim Burke's idea of something like chewing gum that causes Harry and Dumbledore's bodies to pull and deform, and blend into each other.

"That was our biggest mistake," says Nicolas Aithadi, MPC's visual effects supervisor. "Tim loved our ideas. And [director] David Yates loved them. But we didn't have a clue about how to do them. I'd explain the ideas to our modelers and animators, and they'd look at me with empty eyes. The shot is eight seconds long, and it's the most complex we had to do, completely abstract from start to finish."

In the final shot, Dumbledore and Harry are standing in a train station. Dumbledore says, "Take my hand, Harry," and they apparate. The world spins around them and sucks them inside. They collide, their bodies distort and twist, and their clothing and skin blend together until you can't tell one from the other. For Dumbledore, it's magic as usual. For the novice Harry, it's painful. Then, they pop out into a village.

The team spent the first two or three months tossing ideas like hot potatoes back and forth between the animation and modeling departments. Finally, they decided to start with models based on five images that represented the stages Harry and Dumbledore go through as they transport.

Meanwhile, the matchmove department rotoscoped Harry and Dumbledore (that is, actors Daniel Radcliffe and Michael Gambon) from footage shot at the train station and in the village. And, the character development team designed a rig that allowed animators to stretch low-res models of the characters in abnormal ways. In addition, one modeler created a train station in CG and then, without changing the topology, turned the train station into a village.

Hand modeling also helped transform the characters. "We didn't have time to do R&D," Aithadi says. "We made high-resolution characters for the rest poses, and then every five frames for eight seconds, a modeler remodeled the blendshapes with the same topology. When the bodies overlapped, we created blends as if they were the same object. We had 10 or 15 modelers die during the project," he adds with a chuckle. Once modeled, animators used special rigging to drive the way the blendshapes worked.

    To control the skin and cloth textures, the team rendered 15 versions of the animated characters, including one with everything made of skin, another with everything made of velvet, and a variety of lighting passes. But, they also ended up blending the textures in 2D.

"I know I should have taken this artwork and thought about it before loving it so much," Aithadi says. "And I did love it. It is so beautiful. But, that's why we like to work on Potter. The effect is very different, a bit of something new." –BR

    The digital stadium for the Quidditch tryouts is not yet draped with large pieces of cloth, so MPC needed to build every post and beam that you can see behind Harry Potter and Ginny Weasley.

    Apparating Sucks


converted the images to UV maps. "It was very freaky to see the mouth and eyes moving on the UV map," Aithadi says.

Furtility, the hair system MPC developed for 10,000 BC (see "Making History," March 2008), had to be upgraded to help the groomers work with hair that's thinner and not as long as mammoth hair, and to create hair that moves but retains a part. "I think the shots with Ginny are so good because the hair helps sell the whole thing," Aithadi says.

For cloth simulation, they relied on Syflex to move three different fabrics used in the team uniforms. More difficult than the kids' capes, though, was the cloth covering the stadium. "It has 50-meter-tall pieces of cloth that are so big you need small details or you'll kill the scale," Aithadi notes. "But, the more detail you have, the more polygons you have. Oliver Winwood spent three months getting that simulation right, and then he had to add the wind, and the problem with wind is that you can quickly see a pattern when the mathematical waving is too uniform."

The tryouts, however, happen in an uncloaked structure. "All you can see is the wooden structure," Aithadi says, "the stairs inside the towers, everything. We had to build every beam. We ended up with 35,000 pieces of geometry."

In addition to the CG digital doubles and the two stadiums, one bare and one dressed with cloth, MPC built the surrounding environment, a task that took the matte painters six months. They started with stills taken by the production crew in Scotland and a panorama that served as a background, but that background was five miles from the stadium. The matte painters needed to fill the space between. "We didn't want flat depth," Aithadi says. "We wanted haze between trees and to have that haze changing every 200 meters."

To give the artists a sense of scale, the crew built a 360-degree environment based on the stills and placed spheres at various depths within that environment, starting with a sphere the size of a foreground tree. "Because of the spheres, we could define depth, which gave us the ability to create a 2D environment that looked 3D," Aithadi says. "Using those spheres in depth, we created level curves, and then created geometry at the right depth that we camera-projected with photographs." The only problem was that the trees were flat, so compositors manually rotoscoped tree edges to place haze behind them.

Growing Up

When Warner Bros. released the first Harry Potter film eight years ago, the actors playing the teenagers now were children. To create their witchcraft and wizardry, however, the visual effects have been state of the art for each Potter film, and the magical effects have helped drive the films' huge successes. But, only one, Harry Potter and the Prisoner of Azkaban, has received a visual effects Oscar nomination. Now that the children have moved toward adulthood, though, both the actors and the students in the stories, the effects have grown edgier and more sophisticated, too. So, perhaps this Potter will break the spell. That is, unless the Muggles take Potter effects for granted.

    Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at [email protected].

Water Zombies

The lake inside the crystal cave where Harry and Dumbledore travel to retrieve a Horcrux churns with Inferi, corpses bewitched to do a dark wizard's bidding. Industrial Light & Magic (ILM) created these naked, soulless creatures, which Voldemort gave the task of protecting the Horcrux. "They look like people but skinnier," says visual effects supervisor Tim Alexander. "We have thousands in some shots."

For shots with up to 100, primarily those that attack Harry and pull him into the water, animators keyframed the performances. To populate the entire underwater cave with thousands of Inferi all tangled together and writhing, the crew used motion cycles created by the animators that they instanced onto cards driven by a particle simulation. "We rendered 800 frames of seven or eight cycles that were spread throughout the cave and randomly chosen at different times," Alexander says. "When they're cards underneath the water, we built them so we could relight them. We didn't bake prelighting into them. We can treat them like sprites, but they have normals."
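The instancing approach Alexander describes, spreading a handful of pre-rendered cycles across thousands of cards and offsetting them randomly in time, amounts to something like the following toy sketch; the function name, cycle labels, and seeding scheme are all assumptions for illustration.

```python
import random

def assign_cycles(num_cards, cycles, cycle_frames=800, seed=42):
    """Pick a motion cycle and a random start frame for each card so
    the instanced crowd doesn't visibly move in lockstep."""
    rng = random.Random(seed)  # deterministic for repeatable renders
    return [(rng.choice(cycles), rng.randrange(cycle_frames))
            for _ in range(num_cards)]
```

Each card then plays its assigned cycle from its own offset, so even a small library of cycles reads as thousands of independently writhing figures.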

For the water, ILM used the studio's PhysBam software to create a 3D simulation for swells, splashes, and ripples from the Inferi. But for the shallow water, CG artist and developer Chris Horvath created a GPU-based solver using a surface plane. "We were doing 1K and 2K shallow-water sims in real time," he says. "It's the same Navier-Stokes equation used for 3D simulations, but we collapsed one of the dimensions in the assumption that the water is shallow. In the 3D sim, the Inferi are geometry. In the 2D sim, which is image-based, they are elements."
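The shallow-water idea, collapsing the vertical dimension so the fluid reduces to a 2D grid of surface heights, can be illustrated with the classic height-field ripple scheme below. This is a textbook approximation written for clarity, not ILM's GPU solver, and the wave-speed and damping constants are arbitrary.

```python
def step_height_field(h, v, c=0.2, damping=0.99):
    """One update of a simple height-field ripple solver: the full 3D
    fluid collapses to a 2D grid of surface heights h with vertical
    velocities v. Each cell accelerates toward the average of its
    neighbors, which propagates ripples outward."""
    rows, cols = len(h), len(h[0])
    new_h = [row[:] for row in h]
    for y in range(rows):
        for x in range(cols):
            # average of the four neighbors (clamped at the borders)
            n = (h[max(y - 1, 0)][x] + h[min(y + 1, rows - 1)][x] +
                 h[y][max(x - 1, 0)] + h[y][min(x + 1, cols - 1)]) / 4.0
            v[y][x] = (v[y][x] + c * (n - h[y][x])) * damping
            new_h[y][x] = h[y][x] + v[y][x]
    return new_h, v
```

Because every cell update is independent of the others within a step, a scheme like this maps naturally onto a GPU, one texel per grid cell, which is in the spirit of the real-time 1K and 2K sims Horvath mentions.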

Underwater, there is no simulation. The filmmakers created the illusion by filming Daniel Radcliffe (Harry) underwater, to have his hair and clothes float realistically. ILM animated the Inferi as if they were moving in water, and then composited Harry and the Inferi into swirling CG particulate matter lit by the glowing fireballs that Dumbledore shoots at the zombie-like creatures. –BR

  • Pushing the Bounds

Images © 2009 Disney Enterprises, Inc. and Jerry Bruckheimer, Inc.


CG Characters • Stereo

Agent Darwin, leader of the G-Force team, uses his specially-rigged guinea pig hands to help fight an evil menace.

• G-Force, Walt Disney Pictures' live-action summer kid flick, packs talking animals, superheroes, robots, and spies into a rollicking action-comedy in which every one of the 1861 shots is a visual effects shot. It's producer Jerry Bruckheimer's first stereo 3D film, and the directorial debut for Hoyt Yeatman, who had received an Oscar (The Abyss) for best visual effects, an Academy Technical Achievement Award, and two Oscar nominations. And, it's arguably the first VFX-heavy, live-action film shot with standard cameras to be converted into stereo 3D.

Sony Pictures Imageworks created all the CG creatures and masterminded 1287 VFX shots. Asylum VFX added on-screen graphics and graphic playbacks to 133 shots. In-Three converted the rest of the film, the shots without CG effects, into stereo 3D.

The story centers on a group of animals trained by the government as covert agents. Their mission? Stop a gazillionaire tyrant who attempts to take over the world by using household appliances.

The core members of the squad are guinea pigs: indomitable team leader Darwin (Sam Rockwell), extreme weapons expert and toy racecar driver Blaster (Tracy Morgan), and sexy martial-arts practitioner Juarez (Penelope Cruz). Mooch, a fly that wears surveillance equipment, is the reconnaissance expert, and a mole named Speckles (Nicolas Cage) is the team computer geek. In addition, the G-Force team encounters Hurley, a slacker pet-shop guinea pig voiced by director Jon Favreau, and Bucky, a territorial hamster voiced by Steve Buscemi, as well as some CG animals and creatures that don't talk. The hero critters perform their secret-agent duties in a live-action world populated by actors Zach Galifianakis, Bill Nighy, and others, and fight to save the world from transforming household appliances and an 80-foot robot.

Flat World

Imageworks' Scott Stockdyk, who won an Oscar for Spider-Man's visual effects, supervised a crew of 326 artists who worked on the 2D version of the film, which the studio calls the flat film. Rob Engle supervised a second crew

    of 150 visual eff ects artists at Imageworks who converted each shot created by the fl at-fi lm team into stereo 3D (see Viewpoint, pg. 6). is is Engles sixth stereo 3D fi lm at Imageworks. We needed to take traditionally shot fl at pho-tography, add dimension, and then integrate our CG visual eff ects characters in a way that would seamlessly blend them together, as if they had been photographed at the same time with a real stereo 3D camera, Engle says.

First, the flat film. Within Imageworks' 60 minutes of animation, 40 minutes entailed swinging the CG creatures into live-action plates; the other 20 minutes were all-CG shots. In addition to the guinea pigs and Speckles the mole, the crew created

an army of cockroaches, mice, a snake, robot appliances that come to life and terrorize people, and an 80-foot-tall robot that stars in many of the all-CG shots. Made from pieces of machinery assembled apparently randomly, the asymmetrical machine has a somewhat organic feel.

"It's unique," says animation director Tony Saliba of the robot. "It's not only a character, it's an environment. There were challenges all over the place. The environment is constantly moving, and the guinea pigs have to make their way into the heart of it."

The stars of the show, however, are the guinea pigs and the mole. I think that what sells the movie is that you believe the real guinea pigs could do the stunts, Stockdyk says. It's like Stuart Little combined with Spider-Man. The guinea pigs swing on cables, but they don't do superhero kinds of


The all-CG G-Force team, Blaster, Darwin, and Juarez, considers its next move. Imageworks' animators could quickly switch the guinea pigs from a bipedal stance to a four-on-the-floor scamper.


things. That gives it more impact, humor, and fun. Even though the film's target is kids, we wanted it to have a sophisticated action-film look and feel. We wanted to use as much realism as we could to make kids feel like they could have their own Transformers or Terminator.

But, guinea pigs are never real in the film. Imageworks painted out any of the furry critters used on set, and replaced them with CG counterparts that could stretch beyond the real animals' physical limitations. "We wanted the guinea pigs to feel real, visceral, and tangible, but to tell our story, we needed to have them do some things the real animals couldn't do," Stockdyk says.

Covert Critters

Modelers and animators worked in Autodesk's Maya to create guinea pigs that differed from the real animals slightly in their looks and sometimes more wildly in their actions. "We angled their eyes a little forward because otherwise we could look at only one eye at a time, and we changed their fur patterns subtly," explains Saliba, who led a crew that expanded to include 68 animators at the peak of the film's production.

The fur stylists used Imageworks' in-house hair system. "One of the biggest issues was in getting the hair and fur to interact with all the gadgets," says Seth Maury, digital effects supervisor. For that, JJ Blumenkranz, the CG supervisor, and hair lead Dustin Wicke came up with a magnet system.

Rendering happened primarily through Pixar's RenderMan for the characters, and primarily through the studio's Arnold software for the backgrounds. Compositors, meanwhile, used Imageworks' own Katana system.

"There was a tendency by the executives and producers to want the guinea pigs more human, with big muscles and big arms and small heads, but Hoyt [Yeatman] dug his heels in," Saliba says. "He was adamant that they play like photoreal guinea pigs. But, in a single shot, they could go from acting like people to skittering across the floor like little rodents, and when they're around real animals in the pet shop, they have to act like real guinea pigs. So, we had to walk a fine line."

That meant the team needed to create photoreal guinea pigs they could stylize without drawing attention to the caricature, and that could change from quadruped to biped within a single shot. Although modelers based the characters on real guinea pigs, the real animals' physique couldn't cope with the extreme poses the G-Force roles called for, so the modelers changed the bone structure of the characters' legs and hands. Their hands, for example, could handle equipment and stunts, yet still perform in a guinea pig way.

"Their hands needed the same functionality as their feet because they run on all fours," Saliba says, "and their feet rig needed to allow them to roll from heel to toe easily. Also, they have amorphous bodies that change quite a bit depending on whether they're climbing, stretching, bunched in a ball, sitting, or lying down."

Shaper tools that worked within the Maya rig helped with the shape-shifting bodies. And rigs populated with precise controls allowed the animators to quickly switch from, for example, a forward-kinematics mode for their hands when the guinea pigs stood up, to an inverse-kinematics mode when their hands became feet on the ground.
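The FK/IK switch described above can be pictured with a toy example. The sketch below is not Imageworks' rig, just a hypothetical two-bone planar limb in Python: an inverse-kinematics solve plants the hand on a ground target, and a blend weight fades between animator-keyed FK angles and the IK result.

```python
import math

def solve_two_bone_ik(tx, ty, l1, l2):
    """Planar two-bone IK via the law of cosines: return (shoulder,
    elbow) angles, in radians, that place the limb tip at (tx, ty)."""
    d = min(math.hypot(tx, ty), l1 + l2 - 1e-9)   # clamp unreachable targets
    cos_elbow = (d * d - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))
    shoulder = math.atan2(ty, tx) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow

def blend_fk_ik(fk_angles, ik_angles, ik_weight):
    """Per-joint linear blend: weight 0.0 is pure FK (hands free),
    1.0 is pure IK (hands planted on the ground as feet)."""
    return tuple((1.0 - ik_weight) * f + ik_weight * i
                 for f, i in zip(fk_angles, ik_angles))
```

In a setup like this, an animator would key `ik_weight` from 0 to 1 over a handful of frames as a guinea pig drops from a bipedal stance to all fours.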

The animals can talk via a translator gizmo invented by the scientist who organized the G-Force team, which converts their squeaks into human speech, an idea also used by Pixar for the dogs in Up (see "The Shape of Animation," July 2009). That gave the animators freedom to concentrate on the CG stars' physical performance without always having to lip-sync dialog, something the Pixar animators also used to their advantage.

Guinea Pig's Eye View

To help envision the shots, the Imageworks animators would often videotape themselves acting to the dialog or performing stunts. "They import the video into a Maya scene and use that as reference," Saliba says. For additional guinea pig reference, they found 20,000 videos of the popular pets posted to YouTube, and observed some office pets.

    A search on YouTube to find reference footage for Speckles, the star-nosed mole,

The pint-sized G-Force team must stop an 80-foot CG robot made from an assortment of constantly moving mechanical parts.


however, produced only 76 results, not all of them useful. "For most people, they're pests, not pets," Saliba says. "They have tendrils coming off their nose in a star pattern that are prehensile, like another set of hands. They're not cute and cuddly."

On set, because the main characters are less than a foot tall, Yeatman brought the cameras down low. "Hoyt and the crew had special lenses and grips that allowed the first-unit camera operator to hand-hold a camera at guinea pig level," Stockdyk says. Once the shots moved into post, the crew would often add a slight zoom, change the framing a touch, and tweak the timing. "In some cases, if we pushed the limits," he adds, "we'd go completely virtual. Almost everything had virtual backup."

To capture the lighting setups, the crew used Yeatman's ChirpiCam. Maury describes it thus: "It's a one-and-a-half-foot cube that has five cameras with fish-eye lenses mounted inside that takes exposures every two stops, from eight seconds to an eighth of a second up and down. We put it in the middle of the set each time the lighting changed, and pressed Go. It stitches the images into HDRI maps." Maury says the camera received its ChirpiCam name because it chirps as it captures the HDRI images.

By the time the production unit finished shooting, ChirpiCam had provided the postproduction crew with 800 HDRI images in 360 degrees from a guinea pig's point of view. Maury followed the images all the way through the digital intermediate process at Company 3. "It was important for our comps to hold up at the DI house, so we bracketed every image up and down six or seven stops to make sure the CG highlights would look good in all the different formats," Maury explains. "We pushed the images around a lot, especially the stereo images because they are not as bright."
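Merging bracketed exposures into a single high-dynamic-range value, the job Maury describes, can be sketched in a few lines. This is a generic weighted-average merge in Python, a simplified stand-in rather than the ChirpiCam's actual stitching software; the shutter list echoes the two-stop steps he mentions.

```python
SHUTTERS = [8.0, 2.0, 0.5, 0.125]   # two-stop steps, 8s down toward 1/8s

def weight(value):
    """Hat weight: trust mid-range pixel values, distrust clipped ones."""
    return value if value <= 0.5 else 1.0 - value

def merge_exposures(brackets):
    """brackets: (linear_pixel_value_0_to_1, shutter_seconds) pairs for
    one pixel.  Each bracket's radiance estimate is value/shutter; the
    merge is a weighted average that ignores blown-out samples."""
    num = den = 0.0
    for value, shutter in brackets:
        w = weight(value)
        if w > 0.0:
            num += w * (value / shutter)
            den += w
    return num / den if den else 0.0
```

For a true radiance of, say, 0.3 units, the eight-second bracket clips to white and is discarded, while the shorter brackets all agree on the answer.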

By choosing to shoot the film with standard cameras rather than bulky stereo 3D cameras, Yeatman and the cinematographers were free to choose any lens and lighting, and to shoot at guinea pig's eye view. But, that choice created interesting challenges for the Imageworks stereo 3D crew.

"In the 2D version of G-Force, about two-thirds of the film includes visual effects, and the rest of the film is live-action plates," Engle says. "In G-Force 3D, every shot in the movie is a visual effect. So, the first thing we did was divide up the workload."

Stereo Solutions

Imageworks gave all the shots without CG effects to In-Three to convert those plates to stereo 3D. "They cut out elements by pulling keys and by using rotoscoping to isolate and offset them, to produce the perspective you would see with the other eye," Engle says.

For the plates that would include the CG characters, Imageworks needed to take a different approach, but even so, it relied on a typical pipeline.

"Our crew is organized like most visual effects crews," Engle says of the stereo 3D artists. We're basically using visual effects processes for everything: Layout takes the matchmoves, adds extra geometry to mimic the real world and make sure it's logical,

    Director Hoyt Yeatman put the cameras low to help the audience relate more directly with the star-nosed mole (at left) and the guinea pigs, including the one above. Similarly, the stereoscopic 3D team tuned its cameras to the stereo depth that a guinea pig might see.


the camera crew creates the stereo cameras and puts renders in front of me, the lighters reproduce the other eye for all the CG objects and create the stereo render of the plate, and the painters clean up the problems. The thing that's different is that we have to fill holes if anything has been cut out of the plate.

As they would with any visual effects pipeline, the stereo 3D artists started with the matchmove of the plate that the flat-film team had generated, which gave them a virtual world (rough geometry representing what was filmed on location, as seen from the point of view of the camera used on the set) with the CG animals positioned in that virtual world.

"Then, we added more detail to the matchmove model that the 2D team didn't have to think about," Engle notes. "Sometimes the 2D team only matchmoves a 2D surface, like a table, so if a guinea pig is on the table, we'd add other objects. And, we had to fill any holes."

Once they had filled the holes, which often happened on a per-shot basis, they created a stereo pair of virtual cameras in the computer and rendered images for each eye. "For CG objects, it was easy," Engle says. "We'd just render the other eye to create the stereo pair. To make the plate itself 3D, though, the easiest way to describe it is that we projected the plate photography onto the geometry in the virtual world and took a picture from the other point of view."
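Projecting the plate onto proxy geometry and photographing it from the other viewpoint is, for parallel cameras, equivalent to shifting each pixel by a depth-dependent disparity. Here is a minimal, hypothetical Python sketch of that idea for one scanline (not Imageworks' pipeline code), which even reproduces the holes the painters then have to fill:

```python
def reproject_row(row, depth, focal_px, interaxial):
    """Forward-warp one scanline of the left-eye plate into the right
    eye.  depth holds per-pixel distances read off the proxy matchmove
    geometry; parallel-camera disparity is focal_px * interaxial / z.
    Pixels nothing lands on stay None: the holes painters later fill."""
    out = [None] * len(row)
    zbuf = [float("inf")] * len(row)
    for x, (value, z) in enumerate(zip(row, depth)):
        xr = x - int(round(focal_px * interaxial / z))
        if 0 <= xr < len(out) and z < zbuf[xr]:   # nearer surface wins
            out[xr] = value
            zbuf[xr] = z
    return out
```

With a flat, constant-depth plate the whole row simply slides sideways, and a gap opens at the edge where no left-eye pixel maps.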

New Stereo Techniques

When the camera was at guinea pig level, the artists tuned the stereo depth to what a guinea pig might see. "Guinea pigs' eyes are on the order of a couple centimeters apart," Engle says. "Humans' eyes are six to six and a half. So, we tended to keep the cameras close together, especially when [the guinea pigs are] in a terrarium, to make the environments feel real."
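The interaxial numbers Engle quotes line up with standard stereography rules of thumb. The sketch below uses generic textbook formulas, not an Imageworks tool, to show why a couple-of-centimeters stereo base makes a terrarium read large and real to a human audience:

```python
HUMAN_IPD_CM = 6.3   # "six to six and a half," per Engle

def apparent_scale(camera_interaxial_cm):
    """Hypostereo rule of thumb: a stereo base narrower than the human
    interpupillary distance makes the world read proportionally larger."""
    return HUMAN_IPD_CM / camera_interaxial_cm

def screen_parallax_cm(interaxial_cm, convergence_cm, depth_cm):
    """Parallax for converged cameras: zero at the convergence distance,
    negative (in front of the screen) for nearer objects."""
    return interaxial_cm * (1.0 - convergence_cm / depth_cm)
```

A 2cm base viewed with human eyes scales the world up roughly threefold, which is exactly what a terrarium needs to feel like a room.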

The artists also developed new ways of working with the stereo image to heighten the experience. "The 2D release is a widescreen, 2.40 format," Engle says. "But the studio also wanted a 1.77 format for high-definition home video. So we decided to take advantage of the extra information for the 3D version. The theaters are projecting the film as if it's 2.35, but we're delivering 1.85 with letterboxing on the top and bottom, so it looks square-ish. But, every once in a while, we allow objects to extend into the letterboxed region. To the audience watching with 3D glasses, it looks like the objects are coming out of the top and bottom of the screen."

"One of the shots in the film is of a snake trying to attack a guinea pig," Engle says. "The point of view is from the guinea pig behind a sheet of glass. The snake rears up, coils, strikes, and hits the glass with its jaws wide open. It's a great 3D moment. We let the jaw go over the top and bottom of the 2.35 picture so it literally feels like the snake is in the audience's space. Breaking the mask this way allowed us to have an overt way of saying, 'This is in your space,' without bringing objects out into the space and having the audience's eyes cross."

In 2008, the film Journey to the Center of the Earth broke new ground by becoming the first to incorporate CG visual effects in a live-action film shot with stereo 3D cameras. G-Force breaks a second barrier. "Rather than shooting the film in 3D, we added the stereo in postproduction," says Engle. "This is the first time that a live-action film integrated with CG visual effects has been converted to 3D."

By using visual effects techniques to create the stereo version of the film, Imageworks discovered new ways in which directors and cinematographers might want to create deeper worlds: Rather than using stereo 3D cameras to film the action, they can continue working with the less-limiting and more-familiar systems. Whether or not they have superagent CG guinea pigs as stars.

"I think the way we ended up doing the conversion gives filmmakers a lot of flexibility," Stockdyk says. "If they have a new movie coming to the table and they're thinking about stereo and live action, I think they have to consider the extremely efficient way we did it."

    Barbara Robertson is an award-winning writer and a contributing editor for Computer Graphics World. She can be reached at [email protected].

    When the guinea pigs meet Hurley (above) in a pet shop, the animators performed them as if they were real animals, not secret agents. A snake attack, one of the most interesting stereo 3D effects in the film, takes place in the pet shop.


Rendering, yes, rendering, is the hot topic of 2009. It's possible you have been following CAD and engineering long enough to remember that rendering programs were offered as an essential add-on more than 20 years ago. You probably also know that the rendering capabilities within most CAD programs have improved considerably. So, why is rendering a hot-button issue now? Again?

In a way, the reason is the same now as it was then: because you can. Rendering is the first love of many graphics doctoral students, and every year, universities crank out new, brilliant PhDs with new, brilliant approaches to rendering. In the past five years, the emphasis has been on shader technology, which the OpenGL and Microsoft DirectX APIs made available to hardware processors. In this new golden age, rendering is jet-fueled with hardware acceleration and rendering algorithms that are better than ever. The broad availability of 64-bit processors with 64-bit operating systems means that ever-cheaper systems can handle ever-higher amounts of data.

Raytracing, the process of calculating the path of light to create photoreal effects, has re-emerged as a practical application for computer graphics now that powerful, new multiprocessor systems have arrived to go to work on the problem. Also, those talented doctoral candidates have devised clever compression algorithms. Given all that, it's not surprising that there are several contenders jockeying for position in the rendering sweepstakes, and there are plenty of interested (materially interested) onlookers.
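At its core, raytracing really is just that calculation. A minimal sketch, assuming nothing more than a single sphere and a directional light: intersect a ray with the sphere, then shade the hit with a Lambertian term. A production renderer adds bounces, sampling, and acceleration structures; this is only the kernel.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Nearest positive hit distance along a unit-length ray, or None.
    Solves |origin + t*direction - center|^2 = radius^2 for t."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0.0 else None

def lambert(hit, center, light_dir):
    """Diffuse shading: clamped dot of the surface normal and light."""
    n = [h - c for h, c in zip(hit, center)]
    inv = 1.0 / math.sqrt(sum(v * v for v in n))
    return max(0.0, sum((v * inv) * l for v, l in zip(n, light_dir)))
```

Loop those two functions over every pixel's camera ray and you have the world's smallest raytracer; everything the contenders compete on is making that loop fast.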

The Players

Among the companies that are lining up excitedly to gain approval in the CAD world are Bunkspeed, Onesia, StudioGPU, ArtVPS, Autodesk (with Showcase), Realtime Technology (RTT), and Luxology. Longtime major-league player Mental Images is redoubling its efforts in the CAD market, and even Pixar has revamped the venerable RenderMan (a product that's been doing its job for more than 20 years in the movie industry) to take advantage of multiple processors and 64-bit technology.

What's interesting is that companies with rendering technology are taking different


Rendering is new again, thanks to significant advances by a number of vendors. Here, David Burgess illustrates the power of Bunkspeed's HyperShot for this Ford Taurus image on behalf of Ford Motor Co.

approaches, and they're looking at different points in the workflow. In fact, many of these rendering companies are hoping to insert themselves into points in the workflow where they have never been before.

Luxology has worked directly with SolidWorks (a subsidiary of Dassault) and Bentley to create modules that work within the SolidWorks and MicroStation products, respectively, allowing engineers, architects, and designers to see and communicate design ideas with the push of a button. Luxology comes from the entertainment world. The use of its products has grown quickly in a few short years because the product is attractively priced, it's powerful, and the Luxology development team has attracted a loyal following over the years. (The company's founders, Stuart Ferguson, Allen Hastings, and Brad Peebler, were also developers of NewTek's LightWave.)

Peebler, Luxology's CEO, reports success as he branches out into new fields. The Luxology rendering modules are included for free in the professional versions of MicroStation and SolidWorks, and Peebler reports hearing that CAD users are giving these tools a try because they're available, they're fast, and they're easy to use.

Peebler says he's been told repeatedly that the ability to create a quick render with materials and lights is something people are trying out because it's there, and as a result of being able to render quickly, they're better able to sell an idea or identify a problem. The point is that these are people who wouldn't normally render a model; it's just not in their workflow.

Bunkspeed was among the first to offer the world a fast, relatively low-cost renderer with its HyperShot raytracing visualization tool. Like Luxology, Bunkspeed also reports fast growth over the few years it has offered HyperShot, and at last year's SIGGRAPH, the company introduced HyperMove, an animation tool that adds physics and lets users add realistic motion to a 3D scene. And of course, Autodesk offers Showcase, for preparing, processing, and presenting 3D CAD data. More recently, Autodesk has beefed up the product with raytracing technology from Swedish company Opticore following the purchase of the company's assets two years ago.



We're seeing the rise of products that enable users to quickly put a model in a scene, adjust a few lights, play with materials and colors a bit, and, voila, a photorealistic scene is born. Germany's RTT has a variety of products to do this at various points in the pipeline. Also, Nvidia last year acquired Utah raytracing startup RayScale for its Mental Images division, and is also building tools for fast-rendered scenes.

There are new customers for rendering; or, at least, that's what all these firms hope. The idea is that there might be users out in the design pipeline. A product has been designed, and the proud designers want to show off what they've got. They are not the ones who will create the TV ads or the full-spread magazine ads, but they can communicate their latest ideas to the full team, and maybe to marketing, to get everybody onboard.

To this end, Autodesk has just released Showcase 2010 in three versions: Viewer, Presenter, and Showcase, and all three now have raytracing and integrated global illumination. In this instance, Showcase takes advantage of the computer's CPUs, all of them; the more cores, the better. Autodesk has expanded the materials capability in Showcase with a unified and calibrated materials library, including hundreds of texture maps and fully editable materials so users can create their own and share them.

Two Parts of the Brain

For the longest time, there has been a divide between the entertainment world and the CAD world, even though the technologies are very similar. All the work CAD users do goes into the creation of a digital version of something that is going to be manufactured or built. In general, it matters less if it is beautiful, but it must be accurate; lives, careers, and dollars depend on it. The entertainment world, of course, is all about appearances. It must live, but it doesn't need to be real.

That divide is gradually getting filled in, and the tools used to create fantasy are also coming into play in the all-too-real world of CAD.

Consider Luxology, for example. The company's roots are deep in the broadcast industry, but its sights are fixed on the broader CAD industry. As Peebler notes, "There's hundreds of thousands of customers in the entertainment business; there's millions in CAD."

The low-cost Brazil renderer (formerly from Splutterfish) was a tool used in conjunction with Robert McNeel & Associates' Rhino. Raytracing upstart Caustic Graphics recently acquired Splutterfish as a tool to reach out to new markets in CAD as well as entertainment. Another new entrant is StudioGPU, a company that has just arrived on the scene with enthusiasm and some crazy ideas that just might work.

StudioGPU has experience in game development, film, and video. Brothers David and Yanni Koenig and Robert Knaack decided to create better rendering tools after too long living the nightmare as creatives in entertainment and advertising. As Yanni Koenig puts it, the artist is still subject to a priesthood that dictates when they can really see their work. Modelers, animators, and artists have to pass their work off to be rendered, all the while waiting hours for a fast render or overnight for a complete view. Every change, every experiment has to wait for the rendering process, which happens elsewhere. Jobs go to the farm, and they're delivered back.

The StudioGPU renderer is fast, really fast. It takes advantage of GPUs and shader technology, and can make changes at the push of a button. It's also clever technology, taking advantage of rendering tricks to create reflections that seem as if they've been raytraced, but the technology is all rasterization. In the future, say the Koenigs, they'll add raytracing technology, but the system they have now is capable of professional rendering that can free the creative process from the drudgery of the render cycle.
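One classic rasterization trick of the kind described, though not necessarily StudioGPU's own, is the mirrored pass: reflect the scene across the mirror plane, render that copy with the ordinary rasterizer, and composite it behind the mirror surface. The reflection itself is one small transform:

```python
def reflect_point(p, n, d):
    """Reflect point p across the plane n.x = d (n unit length).
    Transforming every vertex this way and rendering the result with
    the ordinary rasterizer yields the mirror image to composite."""
    dist = sum(a * b for a, b in zip(n, p)) - d
    return tuple(a - 2.0 * dist * b for a, b in zip(p, n))
```

Applying the transform twice returns every vertex to where it started, which is a handy sanity check when wiring this into a scene graph.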

In their earlier incarnation in 3D design and development, the StudioGPU team worked on games, movies, advertising campaigns, and architectural renderings. They're pushing MachStudio Pro first as a tool for previsualization, for small productions and advertisements, and for visualizations. For example, in the case of creating previsualizations for products, the group thinks it can offer a rendering tool that can help determine the look of the production throughout the process, instead of at the very end.

    And, the StudioGPU guys are well aware that they have their work cut out for them when it comes to changing the way productions are organized.

Rendering newcomer Caustic Graphics has developed an API that takes advantage of a workstation's CPU or GPU to combine raytracing and rasterization technologies.


Unfortunately, no matter how great a renderer is, it has to make sense for the user. In every great wave of enthusiasm for rendering within the technical and engineering fields, there's been a corresponding tsunami of disappointment for the people who have put their heart and soul into these products only to find that the people they hoped would use them, don't.

Rendering is a workflow issue. Who does it, and when? Come to think of it, another good question would be, "Why?"

For the most part, rendering has remained the territory of artists and placed outside the ordinary workflow of most CAD job descriptions. So the answer seems to be simple: Everyone in the design workflow has a job to do and usually not enough time to do it. For most of these people, creating pretty pictures is a luxury, not a requirement. The time for a rendering traditionally has been early, in the original design phase, to visualize and sell ideas, and then again as a product is being readied for production and sale, as advertising and marketing materials are being prepared.

However, the new wave of renderers hopes to offer products that are fast enough and easy enough to change the way people work and to make rendered images and animations a tool throughout the ideation and design processes.

Stakeholders: The Hardware

Selling 3D modeling and animation software is a lot about love and a little less about money. The developers of visualization software, maybe especially rendering software, are all enabling creativity. On the other hand, rendering software has the potential to sell a lot of processors, and that's why companies like AMD, HP, Intel, and Nvidia are supporting and pushing the development of graphics software.

    Nvidia has been a serial investor in

    Luxology has expanded its presence in the CAD industry, offering modules that work with SolidWorks and MicroStation, enabling engineers and the like to create images such as this one by Wiek Luijken.


rendering. The company bought Exluna, maker of Entropy, to develop its Gelato rendering tools. It bought RayScale for its fast raytracing technology, and it bought the very well established Mental Images to take it deeper into the process of rendering for entertainment and design visualization. These moves have paid off for the company. Nvidia's expertise and development in software graphics has enabled it to develop powerful professional graphics systems. The company's Quadro line has greatly benefitted from the relationships, and Nvidia is determined to keep leveraging its work in graphics software to sell hardware systems.

Nvidia's Mental Images has worked closely with companies to integrate its rendering software into Autodesk's 3ds Max, Maya, and Softimage, as well as other companies' 3D content creation software. Mental Images' shader creation technology, Mental Mill, now is an integrated tool within 3ds Max 2010. It lets designers create custom shaders with adjustable parameters that are available to artists as they work. Complex effects can be created by combining shaders into "phenomena," and the Mental Mill promise is that these tools can be created without the user having to

    program. Mental Images has developed an o