(Originally published on praxistheatre.com as part of Harbourfront Centre's HATCH 2014.)
It looks like he’s wearing a bike lock on his head.
A protruding horizontal rectangle shadows the actor’s face—it’s attached to him by a black headband. The tiny camera is suspended just inches in front of his nose. A cable comes out of the contraption leading somewhere behind the curtains...
Centre stage, a huge animated head is projected, a donkey whose lips are moving in time with the man’s, whose head turns when his does—a big 3D cartoon puppet. It's March 2013, Shakespeare’s A Midsummer Night's Dream is getting a high-tech treatment over at York University, and poor Bottom has been turned into an ass for real this time.
Adam Bergquist as Bottom in the 2013 Theatre@York production of A Midsummer Night's Dream
The Dream's director Alison Humphrey first met Pascal Langdale during this production, introduced by creative producer Vanessa Shaver of Invisible Light.
Langdale is a RADA-trained actor with more than 33 television and film credits to his name, including the interactive movie Heavy Rain and the series Bitten, currently airing on Space. He’s also a performance capture specialist and business developer for Dynamixyz, the company that provided the hardware and software for this live-animated glowing blue donkey adventure.
Humphrey has a master's degree in theatre directing and another in digital media, and has had an interest in experimental storytelling since writing for Global's "instant drama" Train 48 and producing one of the earliest web-based alternate reality games to promote Douglas Adams's Starship Titanic.
Together, they think motion capture technology, and the real-time animation it makes possible, belong on the stage. Their new work, Faster than Night, is one of four projects chosen for HATCH, Harbourfront Centre's annual performing arts residency programme. Facial capture is a big part of the work.
That bike-lock contraption is actually a head-mounted camera rig from Dynamixyz, made up of a helmet, a miniature video camera, an illumination strip (visible or infrared light), and a 9V battery. The camera tracks facial movements, sending video at up to 120 frames per second to a computer backstage, either wirelessly or by USB cable.
That video feeds into a software suite called Performer, which breaks the footage down into points of movement and retargets them onto a pre-existing animated face, one already calibrated to the actor's range of motion and expression.
Facial capture used to require painted dots on the actor's face, but this system is markerless. Dynamixyz's camera is sensitive enough to read a person's wrinkles, even their blushes. Each filmed pixel acts as a marker on the actor's skin, and as the pixels are tracked, the system builds up a web of interconnected motion, a sense of realistic physicality. The actor opens his mouth; the animated face opens its mouth.
It turns a 3D computer model face into a marionette, controlled by the movements of the live actor's face.
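The retargeting idea can be sketched in miniature. This is a deliberately simplified illustration, not Dynamixyz's actual algorithm (which is proprietary): imagine a single tracked measurement, the pixel distance between the actor's lips, being converted into one "jaw open" control on the digital puppet. All names and numbers here are invented for the example.

```python
def mouth_open_weight(upper_lip_y, lower_lip_y, neutral_gap, max_gap):
    """Map the tracked lip gap (in pixels) to a 0..1 'jaw open' puppet weight.

    neutral_gap and max_gap come from a per-actor calibration session:
    the gap at rest, and the gap with the mouth fully open.
    """
    gap = lower_lip_y - upper_lip_y
    t = (gap - neutral_gap) / (max_gap - neutral_gap)
    return max(0.0, min(1.0, t))  # clamp, so uncalibrated extremes don't break the puppet

def retarget(frame_landmarks, calibration):
    """Turn one video frame's tracked landmarks into puppet controls."""
    weight = mouth_open_weight(
        frame_landmarks["upper_lip_y"],
        frame_landmarks["lower_lip_y"],
        calibration["neutral_gap"],
        calibration["max_gap"],
    )
    return {"jaw_open": weight}

# One frame: lips 30 px apart, calibrated range 10-50 px -> puppet jaw half open.
controls = retarget(
    {"upper_lip_y": 100, "lower_lip_y": 130},
    {"neutral_gap": 10, "max_gap": 50},
)
print(controls)  # {'jaw_open': 0.5}
```

A real system solves for dozens of such expression weights at once, every frame, which is what makes the donkey's lips move in time with the actor's.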
Motion-capture has long been the domain of videogames and Hollywood blockbusters. Other versions of the technology capture not just the human face but the whole body. It's helped game developers (and biomechanical researchers) model human movement with incredible realism: How does the rest of my body react when I move my leg? How does breathing affect my shoulders? In the long run, many large game companies find it cheaper to invest in mocap technology and wire up a couple of actors than to hire crews of animators to model and labour over every possibility the game offers.
Zoe Saldana as Neytiri in James Cameron's Avatar
Sitting around a kitchen table, Langdale and Humphrey are working on the script for Faster than Night. The sci-fi narrative hinges on a moral dilemma set on a spaceship in the far future, but the question they’re currently discussing is a bit more down-to-earth: Where will their 3D model astronaut be looking when he answers a question live-tweeted by a member of the audience? Should he make eye contact? Or should he maintain the fourth wall?
Motion capture technology onstage is exciting, a futuristic version of mask work and puppetry, but with its own risks and rewards. Like traditional puppets, it can’t keep still without looking a little bit dead, and to turn away from the audience risks losing the effect, just as with any mask.
In some ways, this technology is a thespian’s dream, a chance to literally become someone or something else, to totally transform into a role. But the head-mounted camera is an unfamiliar piece of paraphernalia to have on the body – it can be distracting both for the actor, and for an audience. If we're meant to watch only the projected animation, where does the man in the headcam go? If he’s on-stage, how do we write in his funny hat? Or maybe instead, as the Wizard of Oz suggests, we should "Pay no attention to that man behind the curtain."
Pascal Langdale as Caleb Smith in Faster than Night
(photo by Vanessa Shaver, 3D model by Dionisios Mousses/SIRT Centre)
And breakthrough tech doesn't come risk-free: this is live theatre at its most exposed. In movies and games, there's a chance to edit the footage, to make it perfect. But during a live show any number of things can go wrong, from lighting mishaps to headcam battery death to a range of motion the system hasn't been calibrated for...
But that’s show business, right? Even in the 21st century.
Heather Gilroy is a Toronto writer/editor whose work has appeared in a variety of formats and publications, from the Toronto Animated Arts Festival International to BlogTO to Raconteurs (formerly MothUP Toronto, in association with the popular podcast, The Moth). Follow Heather at @HLGilroy.