Tuesday, July 19, 2016

Live Cello Mocap and Projection Mapping

59 Productions have some wicked plans in store for projection mapping in live performance, but here's Sol Gabetta performing Elgar's Cello Concerto in E minor at the BBC Proms, as an example of what they can already do:


(h/t to Liz Barber)

The video below shows the how, and highlights four key challenges of motion capture and projection mapping in live performance.





The first issue is the projection surface itself. Creative Director Richard Slaney explains that the cello is "quite a difficult projection surface for us because it's not a regular shape and size, and also it's moving – very fractionally, but it's moving. So as Sol plays the cello, she wants to be free enough to move the cello as she plays. So therefore we need to track where that cello is in 3D space, to allow us to project back onto the right object."
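"Projecting back onto the right object" ultimately means pushing every tracked 3D point on the cello through the projector's own lens model each frame, so the content lands on the moving surface rather than beside it. Here's a minimal sketch of that last step, treating the projector as a pinhole camera; the intrinsics `K` and pose `R`, `t` are made-up calibration values for illustration, not anything from 59 Productions' actual pipeline:

```python
import numpy as np

def project_point(point_3d, R, t, K):
    """Map a tracked 3D point (world coords) into projector pixel coords.

    R, t: projector extrinsics (world frame -> projector frame).
    K:    3x3 projector intrinsics matrix (hypothetical values below).
    """
    p_proj = R @ point_3d + t        # world -> projector frame
    uvw = K @ p_proj                 # pinhole projection (homogeneous)
    return uvw[:2] / uvw[2]          # divide out depth -> pixel coords

# Hypothetical calibration: projector at the origin, looking down +Z,
# with a 1920x1080 image centred at (960, 540).
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)

# A point 2 m straight ahead lands dead centre of the projector image.
uv = project_point(np.array([0.0, 0.0, 2.0]), R, t, K)  # → [960., 540.]
```

In practice the projector is calibrated like a camera (same maths, light going the other way), and the mocap system supplies the cello's pose so every point on its surface can be re-projected per frame.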

A traditional cinema screen is flat and white and generally stays where you put it. Theatre has been incorporating film or video projections for decades, but has always had to grapple with the question of how to design a workable and decent-looking projection surface into a three-dimensional playing space. When you start projecting onto something like an actor's body or the face of a cello, you add colour, texture and surface undulation into the mix, and unless you're clamping them in a vise, you've also got to deal with a greater or lesser degree of lag between their movement and the projection mapping system's ability to keep up with them.
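One common way to fight that lag is to predict where the surface will be by the time the light actually lands on it. Here's a deliberately naive constant-velocity sketch of the idea; the latency figure is illustrative, and real tracking systems typically use a proper filter (e.g. a Kalman filter) rather than raw two-sample extrapolation:

```python
import numpy as np

def predict_position(positions, timestamps, latency):
    """Extrapolate a tracked position forward by the system latency.

    Assumes roughly constant velocity between the last two samples --
    a toy model; jittery real-world data needs filtering first.
    """
    p0, p1 = positions[-2], positions[-1]
    t0, t1 = timestamps[-2], timestamps[-1]
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * latency

# The cello scroll drifts 1 cm along x over 100 ms of samples;
# with ~50 ms of projector latency we aim half a step ahead.
samples = [np.array([0.00, 0.0, 0.0]), np.array([0.01, 0.0, 0.0])]
times = [0.0, 0.1]
predicted = predict_position(samples, times, latency=0.05)
```

The trade-off is familiar from any predictive system: the further ahead you extrapolate, the worse the error when the performer changes direction, which is one reason "very fractional" movement like Gabetta's is far more tractable than, say, a dancer's.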

The second challenge is where to put the markers. (If you're using a markerless system, as we did with the Dynamixyz facial capture for A Midsummer Night's Dream and Faster than Night, you still have to figure out a good in-fiction explanation for the honking great head-mounted camera on your actor.)

In traditional mocap, priority is given to the tech, in order to capture the cleanest data, so you can put your actors in ridiculous-looking spandex suits with little reflective pingpong balls stuck all over their bodies, and control the lighting, set and prop design for minimal interference. Photos or video taken in a mocap volume are usually about as ugly as it gets.

But in a theatrical setting, all the priorities change, and you've got to work around other considerations including visual design and technical stability. In this case, even though the piece was recorded in a studio with seven infrared cameras, the limitation seems to be that the IR emitters or beacons could only be attached, by what look like rubber bands, to the very few non-original parts of the 300-year-old musical instrument.
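With only a handful of markers to work from, the tracking problem reduces to fitting a rigid-body pose (rotation plus translation) to those few observed points. A standard way to do this is the Kabsch least-squares fit, sketched below; the four-marker layout in the example is invented for illustration and has nothing to do with the production's actual marker placement:

```python
import numpy as np

def rigid_fit(model_pts, observed_pts):
    """Best-fit rotation R and translation t mapping model marker
    positions onto observed ones (Kabsch algorithm, Nx3 arrays)."""
    mc = model_pts.mean(axis=0)           # centroids
    oc = observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = oc - R @ mc
    return R, t

# Hypothetical markers on the instrument, in its own coordinate frame.
model = np.array([[0.0, 0.0, 0.0],
                  [1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])

# Simulate the cameras seeing the cello rotated 90° about z and moved.
R_true = np.array([[0.0, -1.0, 0.0],
                   [1.0,  0.0, 0.0],
                   [0.0,  0.0, 1.0]])
t_true = np.array([1.0, 2.0, 3.0])
observed = model @ R_true.T + t_true

R, t = rigid_fit(model, observed)   # recovers R_true and t_true
```

The sparser and more collinear the markers, the more ambiguous and noise-sensitive that fit becomes, which is exactly why marker placement constrained by a priceless instrument is a genuine engineering problem and not just a cosmetic one.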

The third difficulty the video points out concerns the perspective of the live performers themselves. Given their position in relation to the projection surface(s), the artist usually has a significantly reduced or skewed sightline. If the projections are on a scrim or Musion Eyeliner in front of them, at an extreme angle, or simply on a screen behind them which they can't turn around and look at, the performer may barely be able to see the projections at all. The production may need to provide them with audio cues, or marks to hit on the stage floor, in order to help them stay in sync. Ironically for a live art, the actor must usually wait to watch the video afterward to get a proper view of their digital scene partner.

The final issue is the question of what it is that you're actually going to project. As captivated as I was by Nobumichi Asai's astonishing Omote when it went viral two years ago, its progression of effects reminded me a tiny bit of the experience you get when, drunk with new-app power, you put All The Filters (or if you're old enough to remember, All The Fonts) into a single work.

I recall realizing we had done the same thing on the Dream, when we tried to fit all our various (and mismatched) motion-tracked projection-effect experiments into the production simply because we didn't want to waste any of our R&D. Creators may also change up effects out of fear that once past the first flush of awe at the work's technical amazingness, the viewer will get bored, which is an admirable concern, but risks making the content itself into a glorified screensaver.

The trick here is to find a way to make the visual effects click with the context, to give them an internal logic and perhaps even an evolving narrative. Creatively, this may be the biggest challenge of all.