Getting More Immersed with Indiewire


The Penske Media purchase of Indiewire has resulted in an expansion of my role as crafts and awards season contributor. Beginning this week, I take on Emmy coverage of below-the-line contenders along with my usual Oscar season crafts reporting, working closely…

Immersed in Blu-ray: Hitchcock and Bogart


The WB Archive Collection gets Hitch and Bogie on Blu-ray, and they've never looked better for home viewing. In his indispensable doc Hitchcock/Truffaut, Kent Jones reminds us that Truffaut was on a mission to correct misconceptions about Hitch as a lightweight…

Immersed in Books: Farber on Film


For the first time, the complete writings of film critic Manny Farber are available from Library of America, edited by Robert Polito (Savage Art: A Biography of Jim Thompson). Manny Farber (1917-2008) was the first modernist film critic to write like a modernist.

Virtual Production

Autodesk and Disney Pact on XGen Tech

Posted by Bill Desowitz in Animation, Movies, Tech, VFX, Virtual Production

Autodesk obtained an exclusive five-year licensing agreement for the XGen Arbitrary Primitive Generator technology (XGen), used most recently by Walt Disney Animation Studios (WDAS) in the hit animated film Tangled. XGen technology was first presented by WDAS in a research paper at SIGGRAPH in 2003 for the creation of computer-generated fur, feathers, and foliage. Since that time, XGen has been used to create the fur, hair, feathers, trees, leaves and rocks in Bolt; the trees and bushes in Up; the dust bunnies, debris, trees, bushes, clover, and flowers in Toy Story 3; and the grass and trees in Cars 2.

In Tangled, WDAS used XGen to bring the lavish CG-animated world to life: from Rapunzel’s perfectly groomed golden locks to the film’s lush, vegetation-filled landscapes, including bushes, flowers, vines, grass, weeds, moss, thistle, ground mulch, fallen leaves, sticks, rocks, butterfly fur, airborne dust, leaves and trees, plus props such as roof tiles, arrow fletchings, a broom, and paint brushes.

XGen is a comprehensive system for generating arbitrary primitives on a surface, and it advances the industry state of the art in versatility, durability, and impact. It gives artists intuitive control over how attributes are interpolated across a surface, providing a powerful, flexible, and highly art-directable framework for primitive generation. The genesis of XGen was a collaboration between the WDAS production and software teams to provide artists with intuitive, creative tools for 3D animation, such as “grooming” tools for fur and hair, so that they can develop the look and feel of their characters and environments more quickly and easily. Tom Thompson, a senior development software engineer at WDAS, was one of the initial creators and remains the chief architect of the software. Walt Disney Pictures’ agreement with Autodesk will allow the technology to be made available to artists throughout the industry for creating digital entertainment.
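To make the core idea concrete, here is a minimal, hypothetical sketch of what "generating primitives on a surface with interpolated attributes" means in practice: scattering hairs or grass blades across a triangle and blending an artist-authored guide attribute at each scatter point. This is not XGen's actual API, just an illustration of the underlying technique.

```python
# Illustrative only: scatter primitives on a triangle and interpolate an
# artist-authored "guide" attribute (e.g., hair or grass length) at each point.
import random

def scatter_on_triangle(v0, v1, v2, guides, count):
    """Emit `count` primitives on one triangle.

    v0, v1, v2 : (x, y, z) vertices of the face
    guides     : per-vertex attribute values authored by the artist
    count      : number of primitives to generate on this face
    """
    prims = []
    for _ in range(count):
        # Uniform random barycentric coordinates on the triangle.
        r1, r2 = random.random(), random.random()
        if r1 + r2 > 1.0:
            r1, r2 = 1.0 - r1, 1.0 - r2
        b0, b1, b2 = 1.0 - r1 - r2, r1, r2

        # Interpolate position and the guide attribute the same way, so the
        # "groom" varies smoothly across the surface.
        pos = tuple(b0 * a + b1 * b + b2 * c for a, b, c in zip(v0, v1, v2))
        length = b0 * guides[0] + b1 * guides[1] + b2 * guides[2]
        prims.append({"position": pos, "length": length})
    return prims

# Example: one ground triangle with longer "grass" toward its third corner.
triangle = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
grass = scatter_on_triangle(*triangle, guides=[0.1, 0.1, 0.5], count=1000)
print(len(grass), "primitives; sample:", grass[0])
```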

“Twenty years ago, visual effects artists creating computer graphics were mostly mathematicians and scientists using highly technical and complex software tools that required significant amounts of custom programming,” explained Walt Disney Animation Studios CTO Andy Hendrickson. “Back then, off-the-shelf software could not create the required details of nuance and emotion. Today, we were able to create XGen as an effective artistic tool because Autodesk provides studios like ours with comprehensive tools and a flexible, extensible platform to develop on. The Autodesk customizable toolset helps visual effects artists do their best work.”

“A key challenge in the visual effects industry continues to be the need to constantly evolve creatively while somehow controlling rapidly escalating production costs,” added Marc Petit, senior vice president of Autodesk Media & Entertainment. “To help customers better address this challenge, Autodesk has been working with industry leaders like Walt Disney Animation Studios to help them innovate faster and to make these new technologies more broadly accessible. Digital Entertainment Creation users are sure to benefit from developments designed by industry visionaries and proven in production.”

Walt Disney Animation Studios Director of Studio Technology Dan Candela said, “A primary focus for my team is to ensure that the production pipeline is streamlined in order to efficiently produce the best possible CG animation. With Autodesk’s Maya as a core piece of our toolset, we’ve developed over 100 plug-ins and extensions for the platform to enable our artists to create a movie of the quality of Tangled within necessary time and budgetary limits. Sharing our technology with the VFX and CG animation community raises the creative bar for the entire industry.”

SIGGRAPH 2011 News

Posted by Bill Desowitz in Animation, Events, Tech, VFX, Virtual Production

Highlights from SIGGRAPH 2011:

Fusion-io, a provider of a next-generation shared data decentralization platform, is collaborating with NVIDIA, Thinkbox Software, and Tweak Software to accelerate entertainment production by demonstrating full-resolution, real-time digital content creation for many of the industry’s most powerful applications.

“Entertainment artists who use Fusion’s ioMemory technologies can now spend more time creating and less time waiting for content to load, playback and render,” said Vincent Brisebois, Fusion-io Product Manager. “Multiple SSDs configured in a RAID can provide basic throughput, but struggle to provide the low latency required for delivering interactivity in powerful content creation applications. By working with our innovative partners NVIDIA, Thinkbox Software and Tweak Software, we are helping studios and artists unlock their creativity. Now, not only can artists do more faster, but with the flexibility offered by Fusion-io and our partners, studios can focus on the artistry that separates good from great.”

In the NVIDIA booth (#453), the Fusion-io video wall showcases how Fusion ioMemory technology combined with the NVIDIA QuadroPlex 7000 Visual Computing System provides the throughput necessary to play 12 full HD (1080p) uncompressed video feeds simultaneously off a single workstation with interactive GPU-based color correction. The video wall demonstration will be running on an HP Z800 workstation equipped with the NVIDIA QuadroPlex 7000 and Fusion ioMemory modules.

“Working with Fusion-io, we’ve created an impressive, large-scale visualization technology demonstration at SIGGRAPH for show attendees,” said Jeff Brown, general manager, Professional Solutions Group, NVIDIA. “By combining Fusion’s ioMemory technology with our powerful QuadroPlex 7000, we’re demonstrating how to enable real-time color correction and processing of a dozen simultaneous uncompressed HD video streams – without being bottlenecked by disk speeds.”
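Some back-of-the-envelope arithmetic shows why a dozen uncompressed HD streams overwhelm conventional disks; the bit depth and frame rate below are assumptions, since the demo materials don't specify them.

```python
# Rough bandwidth estimate for the video wall demo, assuming 8-bit RGB at 30 fps.
streams = 12
width, height = 1920, 1080       # full HD
bytes_per_pixel = 3              # assumed 8-bit RGB, no alpha
fps = 30                         # assumed frame rate

per_stream = width * height * bytes_per_pixel * fps   # bytes per second
total = per_stream * streams

print(f"{per_stream / 1e6:.0f} MB/s per stream, {total / 1e9:.2f} GB/s aggregate")
# Roughly 187 MB/s per stream and about 2.2 GB/s in total -- far beyond a single
# spinning disk, which is why flash-based ioMemory (or a very large RAID) is needed.
```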

At Autodesk booth #429, Fusion ioMemory technology will accelerate demonstrations of Autodesk Composite software, which is included in the 3ds Max, Maya, and Autodesk Softimage software applications. The Autodesk software packages feature integrated 3D modeling, animation, rendering, and compositing tools that enable artists and designers to quickly ramp up for production.

“Autodesk Composite software can be enhanced by technologies like Fusion ioMemory to help artists see their visions come to life more quickly,” said Rob Hoffmann, senior product marketing manager, Autodesk. “When 3D artists can immediately see the impact of each tool and adjustment, their imagination is freed to try new and innovative approaches to creative storytelling.”

Fusion ioMemory will also be integrated into a Supermicro SuperServer 8046B-6RF server in Thinkbox Software’s Pacific Rim suite at the Fairmont hotel. This system provides increased speed and efficiency in demonstrations of Krakatoa, Thinkbox’s production-proven volumetric particle rendering, manipulation, and management toolkit. Krakatoa provides a pipeline for creating, shaping, and rendering vast quantities of particles at unprecedented speed to represent natural phenomena like dust, smoke, silt, ocean surface foam, plasma, and even solid objects.

“We have clients working with billions of particles per frame to create photo-real smoke, fire, water, creatures made of ink, and photorealistic visualization of volumetric objects such as bones and skin. When saving or loading those particles, we have found nothing faster than Fusion-io,” said Chris Bond, Thinkbox Software CEO and founder. “We first tested Krakatoa 1.0 with Fusion-io. When we realized the potential of ioMemory, we optimized Krakatoa 2.0 to take advantage of its capabilities, and now our loading performance is an order of magnitude better.”
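The arithmetic behind that claim is easy to sketch; the per-particle layout and storage bandwidths below are illustrative assumptions, not Krakatoa's actual file format or measured figures.

```python
# Rough estimate of why particle I/O dominates at this scale, assuming a
# hypothetical layout of position (3 x float32), velocity (3 x float32), and
# density (float16) per particle.
particles = 1_000_000_000                  # "billions of particles per frame"
bytes_per_particle = 3 * 4 + 3 * 4 + 2     # 26 bytes with the assumed layout
frame_bytes = particles * bytes_per_particle

# Illustrative sustained-throughput figures for three classes of storage.
for name, gb_per_s in [("single HDD", 0.15),
                       ("SATA SSD RAID", 1.0),
                       ("PCIe flash (ioMemory-class)", 3.0)]:
    seconds = frame_bytes / (gb_per_s * 1e9)
    print(f"{name:28s}: {frame_bytes / 1e9:.0f} GB/frame -> {seconds:.0f} s to load")
```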

Meanwhile, Thinkbox Software launched a new Professional Services offering. Thinkbox clients can now tap the company as outsourced R&D to customize Thinkbox software, integrate it into their pipelines and/or develop custom software tools.

“With tight deadlines and increasingly high client expectations, studios are continually challenged with creating new and compelling imagery and managing efficient workflows while at the same time integrating new software. This is a challenge we know well, as many of us have been developing software and custom artist tools on the job for feature films for years,” said Thinkbox CEO Chris Bond. “Not only do we have the expertise, but beyond our commercial software we have an extensive and diverse codebase that we can tap to customize solutions for our clients.”

The company recently completed its first Professional Services projects, one of which included custom development for the animation and visual effects studio Blur, along with early access to X-Mesh, a highly specialized mesh renderer and geometry-caching toolset Thinkbox has been developing that supports 3ds Max and Softimage 3D animation software.

In booth #963, Tweak Software will be utilizing ioMemory technology from Fusion-io to accelerate its flagship RV software. RV supports dual-stream output for stereo playback, embeds audio in the SDI signal, and offers flexible tools for review, editing, collaboration, annotation, and comparison of media. At SIGGRAPH 2011, Tweak will demonstrate an integration package that combines RV’s real-time playback with the compositing abilities of The Foundry’s Nuke software and Fusion ioMemory. The integration allows artists to save iterations of their Nuke renders to ioMemory and then immediately play them back in real time in RV.

“Artists get a big benefit by combining the blazing fast memory technologies from Fusion-io with RV’s advanced image and sequence playback abilities,” said Seth Rosenthal, co-founder of Tweak Software. “The ability to stream film-res, stereo, high-dynamic-range imagery on the artist desktop or in the screening room gives artists immediate feedback so they have more time to try new things and get better results. This is all made possible by the remarkable data throughput and reduced latency offered by Fusion-io.”
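For a rough sense of that render-then-review loop, here is a minimal sketch using Nuke's Python API and RV's command-line launcher. The mount point, paths, shot names, and the simple Grade node standing in for a full comp tree are all hypothetical; this is an illustration of the idea, not Tweak's actual integration package.

```python
# Render a quick comp iteration to fast local flash storage, then screen it in RV.
import subprocess
import nuke

FAST_SCRATCH = "/mnt/iomemory/comps/shot010"   # hypothetical ioMemory mount point

def render_iteration(read_path, version, first, last):
    read = nuke.nodes.Read(file=read_path, first=first, last=last)
    grade = nuke.nodes.Grade(inputs=[read])    # stand-in for the artist's comp tree
    out_pattern = f"{FAST_SCRATCH}/shot010_v{version:03d}.%04d.exr"
    write = nuke.nodes.Write(inputs=[grade], file=out_pattern, file_type="exr")

    nuke.execute(write, first, last)           # write the frames to the flash volume

    # Immediately play the freshly rendered sequence back in RV.
    subprocess.Popen(["rv", out_pattern.replace("%04d", "#")])

render_iteration("/shows/demo/plates/shot010.%04d.exr", version=2, first=1001, last=1048)
```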

Giving Rise to Apes at IndieWIRE

Posted by Bill Desowitz in Animation, Movies, performance capture, Tech, VFX, Virtual Production

Everyone seems to be going Ape today, so I take the opportunity at IndieWIRE to dig into Weta’s great performance capture advancements on Rise of the Planet of the Apes and how Andy Serkis is the beneficiary with his remarkable performance as Caesar. It looks like Deathly Hallows: Part 2, Transformers: Dark of the Moon, and The Tree of Life have some Oscar VFX competition as we head into the second half of the year.

http://www.youtube.com/watch?v=8YyMqmDeoxI

First Looks at Superman and Catwoman

Posted by Bill Desowitz in Animation, Movies, Tech, Trailers, VFX, Virtual Production

So what are we to make of our first glimpses of Henry Cavill’s Man of Steel and Anne Hathaway’s Catwoman? Cavill, who was deemed too young for Bond, strikes a familiar if grittier pose in keeping with the presumably more grounded reboot being directed by Zack Snyder and shepherded by Chris Nolan. “I’ve never gone after an actual character in making movies from graphic novels or comic books,” Snyder told me a while back. “I’ve gone after literary or thematic concepts. Where I feel like with Superman, you’re going after a mythology in general. Very different… It’s funny because the thing about Superman that’s stylistically interesting to me is that he’s relevant if he’s real. That’s what Chris Nolan and I talked about early on. The only way I could do this is if Superman were living in the real world with us. And I think that helps him to be credible. It’s just funny because, for me, I haven’t made a real film.”

Amy Adams plays Lois Lane; Laurence Fishburne is the new Perry White; Kevin Costner and Diane Lane portray Clark Kent’s adoptive human parents, Jonathan and Martha Kent; Russell Crowe plays Superman’s Kryptonian father, Jor-El; and Michael Shannon takes over as the villainous General Zod.

As for Hathaway’s Selina Kyle, there’s barely a hint of a feline disguise, though she’s certainly high-tech like Batman with her goggles and cycle. She’s nowhere near as sexy as Emma Peel, but she could be a good foil for the grieving Bruce Wayne, who must also battle the menacing Bane (Tom Hardy), with Marion Cotillard joining as new ally Miranda Tate. Nolan vows this will end the trilogy with a sense of realistic and satisfying closure.

The Dark Knight Rises opens July 20, 2012, and The Man of Steel bows June 14, 2013.

Will Zemeckis’ Yellow Submarine Resurface?

Posted by Bill Desowitz in 3-D, Animation, Movies, performance capture, Tech, VFX, Virtual Production

With yesterday’s Hollywood Reporter announcement of Robert Zemeckis’ ImageMovers resurfacing at Universal with a two-year, first look deal, does this mean that his performance capture-animated Yellow Submarine is back on track? We’ll know soon enough.

However, when reporting on Mars Needs Moms in March, the last movie made for Disney at ImageMovers Digital in Marin County, production designer Doug Chiang told me that Zemeckis was still very enthusiastic about re-imagining the 1968 Beatles classic and proud of the test, and apparently Paul McCartney was supportive as well. It was previously announced that the director had secured the rights from Apple Corps. to use 16 Beatles songs, and that Cary Elwes, Dean Lennox Kelly, Peter Serafinowicz and Adam Campbell would portray the Fab Four.

So, even though the ImageMovers Digital gang has disbanded (Chiang wants to direct, Kevin Baillie co-founded Atomic Fiction in Emeryville, and Huck Wirtz launched Bayou FX in San Rafael and Louisiana), they confirmed that they’d be willing to regroup when Zemeckis has a new project. Then again, Zemeckis could return to Sony Pictures Imageworks, where he helmed Beowulf and The Polar Express and produced Monster House. (He’s currently attached as producer at Sony Pictures Animation to adapt Chuck Sambuchino’s book, How to Survive a Garden Gnome Attack.) He could also make it at Digital Domain (Tron Legacy and The Curious Case of Benjamin Button), with its own performance capture prowess.

Who knows? There still might be a new 3-D journey to Pepperland to fight the Blue Meanies.

More Spielberg and Jackson on Tintin at IndieWIRE

Posted by Bill Desowitz in 3-D, Animation, Books, Movies, performance capture, Tech, Trailers, VFX, Virtual Production

I’ve just posted more extensive Tintin coverage from my trip to Weta last week at IndieWIRE’s TOH. There will be more coverage to come from Weta about both Tintin and Rise of the Planet of the Apes.

Trailering Red Tails

Posted by Bill Desowitz in Animation, Movies, Tech, VFX, Virtual Production

Great aerial action from ILM (supervised by Super 8’s Russell Earl) underscores Red Tails, exec produced by George Lucas, about the Tuskegee Airmen: the first African-American fighter pilots in U.S. military history, who helped turn the momentum in Europe during World War II. The Lucasfilm Ltd. production will launch on Jan. 20, 2012, from Twentieth Century Fox, directed by Anthony Hemingway (Treme, The Wire, Battlestar Galactica) and produced by Rick McCallum and Charles Floyd Johnson.

Red Tails stars Cuba Gooding Jr., Terrence Howard, Bryan Cranston, Nate Parker, David Oyelowo (Rise of the Planet of the Apes), Tristan Wilds, Cliff Smith aka Method Man, Kevin Phillips, Rick Otto, Lee Tergesen, Andre Royo, Ne-Yo, Elijah Kelley, Marcus T. Paulk, Leslie Odom Jr., Michael B. Jordan, and Daniela Ruah.

“I’ve wanted to do this film for a great many years,” said Lucas. “So it is especially gratifying to see it all come together. It has been a real pleasure to work with Anthony and the extraordinary cast on a project that we all passionately believe in. The Tuskegee Airmen were such superb pilots that it was essential for us to create visual effects that would live up to their heroism and put audiences in the cockpit with them. They were only in their early 20s when they performed these amazing feats,” Lucas added.  “They became the best of the best—the top guns. It is an honor to bring to the screen a story inspired by their heroics.”

Spielberg and Jackson Show More Tintin at Weta

Posted by Bill Desowitz in 3-D, Animation, Books, Movies, performance capture, Tech, Trailers, VFX, Virtual Production

I attended a special Tintin press visit earlier this week at Weta in Wellington, New Zealand, where Steven Spielberg (via Polycom) and Peter Jackson showed an exclusive sneak peek of a thrilling seaplane chase in 3-D that included the first mix from John Williams’ rousing score.

It’s a frantic and funny scene that typifies the tone of the film, capturing the essence of Herge’s illustrative style and slapstick humor along with Spielberg’s iconic cinematic signature. While Tintin (Jamie Bell) struggles to pilot a seaplane through the rain, pursued by baddies, a nervous Captain Haddock (Andy Serkis) attempts to grab a bottle of Scotch (whose contents harden), and then winds up climbing outside to burp into the engine when they run out of fuel.

We also saw the same reel shown at Comic-Con, containing lots of action and some exposition between the intrepid Tintin and the cantankerous Haddock (an Odd Couple, according to Jackson). Judging from the footage, this looks like the best performance-capture film yet, utilizing the latest Weta advances in facial modeling and subsurface scattering. Indeed, we saw a presentation on how they use silicone facial casts to achieve finer detail through displacement maps and painting in Mari.

During a Q&A afterward, Spielberg explained that it was a “crazy and very worthwhile learning curve.” He told me that “it all gets down to the basics: story, plot, narrative, and characters, especially with the Herge books… to exonerate these characters in a way that if Herge were with us, he could look up at the screen and say, ‘Yep, that looks like Captain Haddock to me.’”

Spielberg also said that he shot The Adventures of Tintin (Dec. 23) like a conventional movie. In fact, it reminded him of using a Super 8 Kodak camera during his youth. “I was running around with a PlayStation controller with a 6″ monitor in between the handles,” he added. “I had all the x/y buttons on my right and I could crane up and down, I could dolly in, dolly out; I could basically be the focus puller, the camera operator, the dolly grip. I wound up lighting the movie with some of the artists at Weta. And so I did a lot of jobs I don’t normally do myself on a movie, and it gave me the chance to actually start to see the picture cut together.”

By getting into the volume with the actors, he was able to bring a conventional filmmaking sensibility to the set each day (he shot in sequence for 32 days in LA), and to maintain objectivity nearly two years later, when he was able to tweak camera, lighting, atmospherics, and expressions to emphasize different story points.

Afterward, Jackson gave us a tour of the MoCap stage at Weta, using a slightly different virtual camera rig from the wheel-style controller, built for James Cameron, that Spielberg used. Jackson was absolutely giddy shooting his two performance capture actors in the volume. All the assets are built in advance so the director can compose shots while viewing low-res versions of the animated characters in their CG environments. Here’s hoping that Jackson gets the chance to direct the next one. He’s still undecided about which book to adapt, but promises a little more from The Crab with the Golden Claws and Red Rackham’s Treasure.

Spielberg and Jackson Tout Tintin at Comic-Con

Posted by Bill Desowitz in 3-D, Animation, Books, Events, Home Entertainment, Movies, performance capture, Tech, Trailers, VFX, Virtual Production

Sorry I’m unable to report directly from Comic-Con’s Hall H in San Diego to bring you Steven Spielberg’s historic appearance, but, rest assured, I will have some very privileged Tintin access very soon. However, according to Rebecca Keegan of the Los Angeles Times, Spielberg showed off some action-packed footage of Tintin engaging in both a gun fight and a fist fight, and pursuing some baddies on wet cobblestone streets. The celebrated director also discussed raising the performance capture bar at Weta with his surprise guest, Peter Jackson, who still plans on directing the second installment if The Adventures of Tintin proves popular after its North American release on Dec. 23.

“Do I shoot this live-action with a digital dog or do I shoot this computer animated?” he recalled asking himself. “This was the medium which was begging us to use it.” While he wanted to capture a physical resemblance to the Herge comics, he didn’t want the characters to look cartoony, which is why photoreal skin textures were applied to them.

Like Cameron, Spielberg had a virtual camera to see the rough performance capture renders and shot the whole thing using the V-Cam; this gave him a lot more freedom with action sequences than he’s accustomed to with a real camera. He also enjoyed the intimacy with the actors: “This is much more of a direct to canvas art form.” He was amazed at the emotion they were able to achieve with the animation. As for the virtual technology, he praised it for being “realistic to the point where the animators can create the musculature, nerves, and replica of a human body which responds the same way as we do.”

Oh, by the way, Spielberg took the opportunity to announce that Jurassic Park 4 is among his many projects (Universal Home Ent. releases Jurassic Park Ultimate Trilogy on Oct. 25).

http://www.youtube.com/watch?v=uqn_rjQudps

Nuke & Mari Showcased at DreamWorks

Posted by Bill Desowitz in Animation, Events, Movies, Tech, Trailers, VFX, Virtual Production

The Foundry showed off the latest integration of Nuke compositing and Mari paint at the DreamWorks Animation Glendale campus last night (July 21). Artists I spoke with were especially impressed with the potential now for full Photoshop/Mari layering.

Brandon Fayette, CG supervisor at Bad Robot Prods., demonstrated the Denoise tool on noisy plates from Super 8, as well as the efficiency of the new Nuke/Mari bridge workflow; Joe Farrell, compositing supervisor at Scanline VFX in L.A., revealed the layering complexity achieved with Nuke on last year’s Hereafter. Judging by the host venue and the fact that DreamWorks screened a trailer for Puss in Boots (Nov. 4), we look forward to learning more about how Mari played a role in its latest animated feature.

Now shipping, Mari 1.3v2 delivers a focused workflow that provides Nuke artists with dedicated 3D paint tools, making digital environment and projection work more efficient and final composited scenes more believable. This is especially attractive to artists working in Photoshop who require better layering.

Released this week, Nuke and NukeX 6.3 offer Deep Image compositing, 3D particles, a Planar Tracker, an updated Denoise with cutting-edge algorithms, updated and redesigned spline and grid warping, and an audio scratch track.
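To illustrate what deep image compositing buys artists, here is a minimal sketch of a deep merge using Nuke's Python API; the file paths are hypothetical, and the point is simply that depth-aware samples let two renders intersect and hold each other out without pre-rendered mattes.

```python
# Minimal deep-compositing graph: merge two deep renders, flatten, and write out.
import nuke

fg = nuke.nodes.DeepRead(file="/shows/demo/renders/creature_deep.%04d.exr")
bg = nuke.nodes.DeepRead(file="/shows/demo/renders/dust_deep.%04d.exr")

merged = nuke.nodes.DeepMerge(inputs=[fg, bg])   # depth-aware merge of the two streams
flat = nuke.nodes.DeepToImage(inputs=[merged])   # flatten deep samples to a 2D image

out = nuke.nodes.Write(inputs=[flat], file="/shows/demo/comps/shot020_comp.%04d.exr")
nuke.execute(out, 1001, 1010)                    # render a short frame range
```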