Tuesday, October 4, 2016

Viral Vanier

So on the very same day the Vanier Canada Graduate Scholarships are announced, I learn that not one but two of my PhD classmates in Cinema and Media Studies at York are through to the next round for next year. Go, Claudia and David!

And congratulations, Zachary, Jesse and Syrus Marcus!! Reading about their work, and the research abstracts for the 162 other new Vanier Scholars, feels like looking at one of those gorgeous rainbow MRI scans of Canada's brain...

Henrietta Howells, NatBrainLab, Wellcome Images (Creative Commons BY-NC-ND 4.0)

Tuesday, July 19, 2016

Live Cello Mocap and Projection Mapping

59 Productions have some wicked plans in store for projection mapping in live performance, but here's Sol Gabetta performing Elgar's Cello Concerto in E minor at the BBC Proms, as an example of what they can already do:

(h/t to Liz Barber)

The video below shows how it's done, and highlights four key challenges with motion capture and projection mapping in live performance.

The first issue is the projection surface itself. Creative Director Richard Slaney explains that the cello is "quite a difficult projection surface for us because it's not a regular shape and size, and also it's moving – very fractionally, but it's moving. So as Sol plays the cello, she wants to be free enough to move the cello as she plays. So therefore we need to track where that cello is in 3D space, to allow us to project back onto the right object."
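For the technically curious, that "project back onto the right object" step amounts to treating the projector as a camera in reverse. Here's a minimal sketch of the principle in Python with OpenCV – not 59 Productions' actual pipeline, and every calibration number and surface point below is invented:

```python
# Minimal sketch: treat the projector as an inverse camera. Given the
# tracked pose of an object (rotation + translation reported by the
# mocap system each frame) and the projector's calibration, map 3D
# points on the object's surface to projector pixel coordinates.
import numpy as np
import cv2

# Projector intrinsics from a one-time calibration (values made up).
K = np.array([[1400.0,    0.0, 960.0],
              [   0.0, 1400.0, 540.0],
              [   0.0,    0.0,   1.0]])
dist = np.zeros(5)  # assume negligible lens distortion

# Points on the cello's surface, in the cello's own coordinate frame
# (e.g. sampled from a 3D scan of the instrument), in metres.
surface_points = np.array([[ 0.00, 0.1, 0.02],
                           [ 0.05, 0.3, 0.03],
                           [-0.05, 0.5, 0.02]])

def project_onto_object(rvec, tvec):
    """Map object-space points to projector pixels for the current pose."""
    pixels, _ = cv2.projectPoints(surface_points, rvec, tvec, K, dist)
    return pixels.reshape(-1, 2)

# Each frame the tracker reports where the cello sits relative to the
# projector; the content is re-rendered at the returned pixel positions.
rvec = np.array([0.0, 0.1, 0.0])   # axis-angle rotation from the tracker
tvec = np.array([0.0, -0.2, 3.0])  # cello roughly 3 m from the projector
print(project_onto_object(rvec, tvec))
```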

A traditional cinema screen is flat and white and generally stays where you put it. Theatre has been incorporating film or video projections for decades, but has always had to grapple with the question of how to design a halfway decent and decent-looking projection surface into a three-dimensional playing space. When you start projecting onto something like an actor's body or the face of a cello, you add colour, texture and surface undulation into the mix, and unless you're clamping them in a vise, you've also got to deal with some degree of lag between their movement and the projection that chases it.
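The crudest defence against that lag – purely a sketch, assuming a known and roughly constant end-to-end delay – is to extrapolate, projecting where the tracker says the object will be rather than where it was last seen:

```python
# Minimal sketch: hide tracking-to-projection lag with constant-velocity
# prediction. If the whole pipeline (camera, solve, render, projector)
# runs ~50 ms behind, aim the content where the marker *will* be.
import numpy as np

LATENCY = 0.050  # assumed end-to-end system delay, in seconds

class Predictor:
    def __init__(self):
        self.prev_pos = None
        self.prev_time = None

    def predict(self, pos, t):
        """Extrapolate the marker position LATENCY seconds ahead."""
        pos = np.asarray(pos, dtype=float)
        if self.prev_pos is None:
            velocity = np.zeros_like(pos)   # first frame: no history yet
        else:
            velocity = (pos - self.prev_pos) / (t - self.prev_time)
        self.prev_pos, self.prev_time = pos, t
        return pos + velocity * LATENCY

# A cello scroll drifting slowly to the right:
p = Predictor()
print(p.predict([0.00, 1.2, 3.0], t=0.000))  # no velocity estimate yet
print(p.predict([0.01, 1.2, 3.0], t=0.016))  # now leads the motion slightly
```

Raw extrapolation amplifies jitter, so real systems pair it with smoothing (a Kalman or One Euro filter, say), but the trade-off is always the same: lead the motion and risk overshoot, or trail it and let the image slide off the instrument.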

The second challenge is where to put the markers. (If you're using a markerless system, as we did with the Dynamixyz facial capture for A Midsummer Night's Dream and Faster than Night, you still have to figure out a good in-fiction explanation for the honking great head-mounted camera on your actor.)

In traditional mocap, priority is given to the tech, in order to capture the cleanest data, so you can put your actors in ridiculous-looking spandex suits with little reflective pingpong balls stuck all over their bodies, and control the lighting, set and prop design for minimal interference. Photos or video taken in a mocap volume are usually about as ugly as it gets.

But in a theatrical setting, all the priorities change, and you've got to work around other considerations including visual design and technical stability. In this case, even though the piece was recorded in a studio with seven infrared cameras, the limitation seems to be that the IR emitters or beacons could only be attached by what looks like rubber bands to the very few non-original parts of the 300-year-old musical instrument.
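A handful of rigid markers is enough, though. The textbook recipe for recovering a rigid body's pose from a few tracked points is the Kabsch algorithm, a least-squares fit via SVD – what follows is a minimal sketch with invented marker positions, not any claim about 59 Productions' actual solver:

```python
# Minimal sketch of rigid-body tracking from a handful of markers: the
# Kabsch algorithm recovers the rotation and translation that best align
# the markers' known rest layout with where the cameras see them this
# frame (a least-squares fit, computed via SVD).
import numpy as np

def rigid_pose(rest, observed):
    """Find R, t such that observed ~= (R @ rest_point) + t for each point."""
    rest = np.asarray(rest, float)
    observed = np.asarray(observed, float)
    c_rest, c_obs = rest.mean(axis=0), observed.mean(axis=0)
    H = (rest - c_rest).T @ (observed - c_obs)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c_obs - R @ c_rest
    return R, t

# Three beacons banded to the tailpiece, endpin and bridge area, measured
# once in the cello's rest pose (coordinates invented):
rest_markers = [[0.00, 0.0, 0.00], [0.00, -0.4, 0.05], [0.08, 0.1, 0.04]]
# ...and where the infrared cameras triangulate them this frame:
seen_markers = [[1.00, 0.0, 2.00], [1.00, -0.4, 2.05], [1.08, 0.1, 2.04]]

R, t = rigid_pose(rest_markers, seen_markers)
print(np.round(R, 3), np.round(t, 3))  # identity rotation, pure translation
```

Three non-collinear markers are the mathematical minimum; more markers buy robustness when one gets occluded by a bowing arm.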

The third difficulty the video points out concerns the perspective of the live performer themselves. Given their position in relation to the projection surface(s), the artist usually has a significantly reduced or skewed sightline. If the projections are on a scrim or Musion Eyeliner in front of them, at an extreme angle, or simply on a screen behind them which they can't turn around and look at, the performer may barely be able to see the projections at all. The production may need to provide them with audio cues, or marks to hit on the stage floor, in order to help them stay in sync. Ironically for a live art, the actor must usually wait to watch the video afterward to get a proper view of their digital scene partner.

The final issue is the question of what it is that you're actually going to project. As captivated as I was by Nobumichi Asai's astonishing Omote when it went viral two years ago, its progression of effects reminded me a tiny bit of the experience you get when, drunk with new-app power, you put All The Filters (or if you're old enough to remember, All The Fonts) into a single work.

I recall realizing we had done the same thing on the Dream, when we tried to fit all our various (and mismatched) motion-tracked projection experiments into the production simply because we didn't want to waste any of our R&D. Creators may also change up effects out of fear that, once past the first flush of awe at the work's technical amazingness, the viewer will get bored – an admirable concern, but one that risks turning the content itself into a glorified screensaver.

The trick here is to find a way to make the visual effects click with the context, to give them an internal logic and perhaps even an evolving narrative. Creatively, this may be the biggest challenge of all.

Saturday, July 9, 2016

Prospero meets Ariel at the Imaginarium

More fascinating glimpses behind the scenes on preparations for the Royal Shakespeare Company's new production of The Tempest: "Simon Russell Beale, playing Prospero, and Mark Quartley, playing Ariel, meet for the first time, coming together with Imaginarium crew and Intel technology to explore how Ariel's avatar will work."

Thursday, July 7, 2016

Performance Capture for Stage, Film and Games

A round-up of links for further exploration of performance capture, offered with respect and gratitude to the massively talented graduating RADA actors who participated in the first-ever Shadowpox workshop – Fehinti Balogun, Natasha Cowley, Sayre Fox, Skye Hallam, Tom Martin, Polly Misch, Abraham Popoola, Maisie Robinson and Jamael Westman.

Classic capture: on screen

Acting for video games

The next frontier: capture in live performance

Tuesday, June 14, 2016

Science + Fiction + Theatre

[This post was originally published in March 2015, but I'm reposting it because it's good background for a workshop I'll be doing next week. As collaborator Simon Eves of PLASTIK pointed out recently, there's still not enough sci-fi onstage...]

At a recent conference on public health history at the University of Toronto, I had some intriguing conversations about crossovers between science and art/entertainment, particularly how science fiction can welcome audiences deep into issues in public health.

It brought back a chat I had last year with Conall Watson of the Department of Infectious Disease Epidemiology at the London School of Hygiene and Tropical Medicine.

He described his work on a promenade theatre event called Deadinburgh, set during a zombie outbreak besieging the Scottish capital.

His team ran the public health advisory cell, "tasked with guiding the audience through different approaches to controlling the zombie epidemic; giving them insight into the usually back-of-house practices of the public health authorities. We also had input into the epidemiological parameters and narrative of the overall show."

In a paper titled "Deadinburgh: zombie epidemics, citizen power and public health", he and his colleagues Kate Harvey and Nigel Field of University College London describe the scenario:

"An unknown pathogen was ravaging Scotland’s capital in April 2013, turning unlucky infected souls into bloodthirsty, ambling beasts. The city was under military lock-down and scientists were working around the clock to identify the pathogen and develop means of control.

"Each night, 250 uninfected citizens reached the safe zone at a former veterinary college, taking democratic responsibility for the public health and military response.

"Whether immersive theatre and simulated situations can get people to engage with public health on a larger scale and help build trust and empathy with the way that science is used to inform public sector decision-making remains to be seen. What we do know is that people like science; people like zombies; and the two combined can help us to reflect on our own practice as public health professionals."

In the video below, Kate Harvey says, "Bringing in something from popular culture helps to appeal to a wider audience.... Public health has both art and science at its core. Public health is the art and science of promoting health and preventing disease and prolonging life.... But maybe what we haven't done so much of is using it as a means of communication, and actually putting some of the science back into art as well."

Conall Watson and Kate Harvey discuss
"Deadinburgh - the science of zombies"
in a London School of Hygiene and Tropical Medicine podcast

Tuesday, April 12, 2016

So long, green screen

As I work with a Kinect, a projector and a digital video camera to drive and record real-time interactive projected effects, I'm already fantasizing about hacking the Lytro Cinema system together with one o' these.

Come on, it's only a $125,000 rental...
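What makes both rigs green-screen killers is depth keying: once every pixel comes with a distance, the matte is just a threshold. Here's a minimal sketch with a first-generation Kinect, assuming the libfreenect Python bindings and a build that reports registered depth in millimetres:

```python
# Minimal sketch of depth keying: anything inside a distance band
# becomes the matte -- no green paint required. Assumes libfreenect's
# Python bindings and a Kinect reporting depth in millimetres.
import numpy as np
import cv2
import freenect

NEAR_MM, FAR_MM = 500, 1500  # keep whatever sits 0.5-1.5 m from the sensor

while True:
    depth, _ = freenect.sync_get_depth(format=freenect.DEPTH_MM)
    rgb, _ = freenect.sync_get_video()
    mask = ((depth > NEAR_MM) & (depth < FAR_MM)).astype(np.uint8) * 255
    mask = cv2.medianBlur(mask, 5)             # knock out depth-noise speckle
    fg = cv2.bitwise_and(rgb, rgb, mask=mask)  # keyed foreground, no spill
    cv2.imshow("depth key", cv2.cvtColor(fg, cv2.COLOR_RGB2BGR))
    if cv2.waitKey(1) == 27:                   # Esc to quit
        break
```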

(H/t Jos Humphrey)

Wednesday, February 17, 2016

Live-Animated Simpsons

This just in from the Hollywood Reporter: the May 15th episode of The Simpsons will feature a three-minute segment with Homer Simpson taking questions... live on air. In a genre of animation that normally takes at least six months to produce, this is a 180.
Showrunner Al Jean told THR that the series... will use a motion capture technology in which Homer's voice and motions will be depicted in an animated scene talking about things he "could only be saying live on that day."
"HOMЯ", Simpsons episode 9, Season 12
Apparently the segment will look just like the rest of the episode, which means they'll probably be using a 3D model, but toonshading it to flatten the look, which is what we did on Faster than Night and The Augmentalist.
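For the curious, toon-shading boils down to a single move: light the 3D model normally, then quantize the light level into a few flat bands so the surface reads as 2D. Here's a sketch of the idea in numpy – in a real pipeline (ours included) the same quantization runs per pixel in a shader, and I make no claims about the Simpsons team's setup:

```python
# Minimal sketch of toon (cel) shading: compute ordinary Lambertian
# lighting, then quantize it into a few flat bands.
import numpy as np

LIGHT = np.array([0.0, 0.5, 1.0])
LIGHT = LIGHT / np.linalg.norm(LIGHT)  # unit vector pointing at the light
BANDS = 3                              # shadow, midtone, highlight

def toon_shade(normals, base_color):
    """normals: (N, 3) unit surface normals; returns (N, 3) RGB colours."""
    lambert = np.clip(normals @ LIGHT, 0.0, 1.0)  # smooth shading term
    banded = np.ceil(lambert * BANDS) / BANDS     # flatten into steps
    return np.outer(banded, base_color)

# Three sample normals: facing the light, oblique to it, and edge-on.
normals = np.array([[0.0, 0.5, 1.0], [1.0, 0.0, 1.0], [1.0, 0.0, 0.0]])
normals = normals / np.linalg.norm(normals, axis=1, keepdims=True)
print(toon_shade(normals, base_color=np.array([1.0, 0.85, 0.1])))
# Homer-yellow at full, two-thirds and zero intensity: flat steps with
# no smooth gradient, which is exactly what flattens the look.
```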

Sports aside, not much these days can get audiences to tune in live, so it's a good stunt for Fox. Of course, the biggest creative conundrum for real-time animation is getting the audience to believe it is in fact actually live:
"The installment will feature Homer — voiced by Dan Castellaneta — discussing topical subjects and responding to fan questions for the final three minutes of its regularly scheduled episode."...

"As far as I know, this is the first time that's been done by any animated show," Jean said, noting the live bit works with the theme of the episode, which explores improv comedy. "And Dan is a great improviser."
Which only leaves the question of how to convince the 99.999% of the audience whose questions won't get answered that the ones Homer is responding to are not themselves plants. Especially since the questions have to be submitted eleven days in advance:
Fox is urging fans to tweet their questions using the hashtag #HomerLive beginning Sunday, May 1, through Wednesday, May 4.

via legend166, neogaf.com
UPDATE 18 May 2016: Cartoon Brew tells How ‘The Simpsons’ Used Adobe Character Animator To Create A Live Episode.

Sunday, February 7, 2016

What is twenty-first-century magic?

More from the team behind the Royal Shakespeare Company's performance-capture Tempest:
"What's twenty-first-century magic? Well, I guess it comes in some kind of digital form these days. We've started to see a lot more in film, in the world of movies, but we've never really explored that in the theatre context.... It's the creative process that drives the technology, and not the other way around. So it started to be a match made in heaven, from that point of view." 
– Stephen Brimson Lewis, Director of Design, RSC

"With The Tempest we're really trying to redefine theatre, in some respects, and find a way to bring in new digital technology and really leverage it to make the story deeper, to find new ways to connect with a character, and maybe a different audience... an entirely new generation."
– Tawny Schlieski, Research Scientist, Intel

"The most exciting thing about what we're doing at the moment is enabling an actor to have a real connection with an avatar. It's the sort of thing that I do on a daily basis in the studio for films and video games. In this instance, we're doing something that's going to be a live theatrical experience. What you see is what you get." 
– Ben Lumsden, Head of Studio, The Imaginarium

"The play demands a spectacle. There's a masque in the middle of the play, an insubstantial pageant, which fades into nothing, but it creates wonder. I want to let the guys at Intel know a bit about what that tradition was, just how elaborate those masques were, and how they were pushing the envelope." 
– Gregory Doran, Artistic Director, RSC 

Thursday, January 14, 2016

Capturing Coldplay

The making of Coldplay and The Imaginarium's Adventure of a Lifetime video – one of the best behind-the-scenes on performance capture I've ever seen, and definitely the most fun!

And here's the original video: Coldplay – Adventure of a Lifetime.

(H/t to Neil Richards)

Wednesday, January 13, 2016

So this goes on the Christmas list...

New prototype high-speed projection mapping system – at CES 2016, Panasonic will unveil its projector technology for high-speed and synchronized mapping on unmarked target objects:

"Prototype." I hope that doesn't mean Christmas 2018...

Monday, January 11, 2016

"Stratford upon Avatar"

(Sorry, couldn't resist nicking the Daily Telegraph's headline.)

Very exciting announcement today by the RSC:
"Today the Royal Shakespeare Company announced a new production of William Shakespeare’s late play The Tempest, produced in collaboration with Intel and in association with leading performance capture company, The Imaginarium Studios. The companies will combine their passion for storytelling and innovation to create a truly revolutionary production as part of the RSC’s winter season 2016.

Directed by RSC Artistic Director, Gregory Doran, with RSC Associate Artist, Simon Russell Beale, as Prospero, and designed by Stephen Brimson Lewis, this partnership will see the RSC’s skills at theatre-making come together with Intel’s digital innovation and the expertise of The Imaginarium in pushing technical boundaries to create a truly innovative production for a new generation to mark Shakespeare’s 400th anniversary.

For the very first time, performance capture technology will be used to render an animated character - Ariel the sprite - live on the Royal Shakespeare Theatre stage.

The technology works by capturing an actor’s facial expressions as well as their movements, which ensures that an actor’s full performance is translated into the animated character. It has most famously been used in films and gaming, but together the RSC, Intel and The Imaginarium have undertaken more than a year of research to bring digital avatars to life on stage in real-time, interacting with live actors."
Intel expands:
"Today the Royal Shakespeare Company announced a new production of William Shakespeare's play The Tempest, produced in collaboration with Intel and in association with The Imaginarium Studios. The companies will collaborate to create a revolutionary production as part of RSC's winter season 2016. For the first time, performance capture technology will be used to render an animated character live on the Royal Shakespeare Theatre stage. Standard Intel technology is used to manage massive data processing required for live digital content projection - from Intel Xeon to Intel Core i7 processors."
I heard some tantalizing murmurs about this plan last summer, and can't wait to see what they and The Imaginarium cook up together!

The Telegraph's report is the most extensive so far:

Such stuff as dreams are made on:
Ariel to appear as '3D digital apparition' in RSC's The Tempest 

"More than 400 years ago, Shakespeare was busy dazzling his very first audiences with the latest in baffling theatre technology, from trap doors to bursting fireworks and the sound of rumbling thunder. 
This year, the Royal Shakespeare Company is aiming to conjure the same effect, as it becomes the first theatre in the world to incorporate live digital avatars to join actors on stage.  
The company has partnered with the very latest digital technology to build an all-action hologram effect, for a new production of The Tempest starring Simon Russell Beale. 
They will work with the studios run by Andy Serkis, best-known for his astonishing transformation to Gollum in the Lord of the Rings film franchise, and Intel to utilise cutting-edge technology on a live stage for the first time. 
While the details of the project are still in development, it hopes to transform the character of Ariel into a mystical 3D digital apparition which will react in real-time as it is projected onstage, just as effectively as the human members of the cast.  
The final version will incorporate and update the technology already used regularly in Hollywood, which sees actors dressed in specially-designed suits, whose movements are then mimicked by a character on screen. 
Rather than being pre-recorded and projected on stage each night, it is hoped the 2016 Ariel will be seen in live motion capture, with an expert controlling the movements from backstage. The teams are currently experimenting with special effects which allow it to appear covered in fire or water, changing shape to captivate a new generation. 
The high-tech special effect will be matched by traditional, centuries-old techniques such as Pepper's Ghost, utilising a mirror image to allow objects and people to fade or transform.... 
Gregory Doran, the RSC’s artistic director, said: 'Shakespeare includes a Masque in The Tempest. They were the multi-media events of their day, using innovative technology from the Continent to produce astonishing effects, with moving lights, and stage machinery that could make people fly, and descend from the clouds. In one such masque apparently, Oberon arrived in a chariot drawn by a live polar bear! 
'So I wanted to see what would happen if the very latest cutting edge twenty-first century technology could be applied to Shakespeare's play today. We contacted the leaders in the field, Intel, and they were delighted to come on board. And we have been developing our ideas with them, and Andy Serkis's brilliant Imaginarium Studios to produce wonders...."
Read the full article here.

Alec McCowen (Prospero) and Simon Russell Beale (Ariel) in Sam Mendes's 1993 Tempest.

I had the luck to see Beale's Ariel many centuries ago – if ever there was a Prospero who needed no digital augmentation, he's yer man. All the more intriguing to imagine what they'll bring to each other...

Tuesday, November 17, 2015

The movement of air

More magic from Adrien M / Claire B in October 2015's Le mouvement de l'air / The movement of air:
A frontal show designed for three dancers in an immersive environment shaped by projected images. These are computer-generated for the dancers to play with, making up a digital score performed live by a digital interpreter. The performance conjures seemingly impossible visions: images that look alive, bodies that fly in defiance of gravity. The acrobatic and digital choreography sketches out a body language built on a new relationship to time, space and the world. Beyond the search for technical achievement, what matters is the attempt to create a dreamscape of motion through images.

Monday, August 17, 2015

Haiku in motion

La compagnie Adrien M / Claire B presented Hakanaï at the International Symposium on Electronic Art (ISEA) in Vancouver last Friday. Those of us who couldn't see it live will have to be content with the trailer:

Hakanaï / trailer from Adrien M / Claire B

PS: I do love what these guys do...

Le mouvement de l'air: Work in progress from Adrien M / Claire B

XYZT - 2015 from Adrien M / Claire B

Tuesday, July 7, 2015

What's New, Pussycat?

The moment I saw our Bottom up, I figured it was a matter of months till we saw the Cheshire Cat mocapped on stage. And here it is! The one thing I can't tell from these videos is whether the motion capture is live or pre-recorded.

Lysander Ashton of 59 Productions talked with The Guardian about wonder.land's visual inspirations, and also touched on the question of capture:
Q: Will you be making any technological innovations? 
A: We’ll be using quite a lot of motion-capture, which is quite new and novel in a theatrical environment, if not in a film-making one. Because there are going to be moments where the actor onstage interacts with a projected character, we’re going to have a motion-capture suit in the rehearsal room and have the scene played out between the two actors.
That sounds like it might possibly be pre-recorded, but I'm still not sure. I'll keep digging to see if I can find more in-depth intel on the making-of.

On a not unrelated note, fellow Manchester International Festival artist Ed Atkins explains his more conceptual Performance Capture to Aesthetica magazine.

(hat tip to Ryan Porter and Alan McLaughlin)

Friday, June 26, 2015

One step closer to the Holodeck...

Connected Worlds by studio Design I/O: a large-scale immersive, interactive ecosystem developed for the New York Hall of Science.
"The installation is comprised of six interactive ecosystems spread out across the walls of the Great Hall and connected together by a 3000 sqft interactive floor and a 45ft high waterfall. Visitors can use physical logs to divert water flowing across the floor from the waterfall into the different environments, where children can then use their hands to plant seeds. As the different environments bloom creatures appear based on the health of the environment and the type of plants growing in it. If multiple environments are healthy creatures will migrate between them causing interesting chain reactions of behaviours. 
"The installation is designed to encourage a systems thinking approach to sustainability where local actions in one environment may have global consequences. Children work with a fixed amount of water in the system and have to work together to manage and distribute the water across the different environments. Clouds return water from the environments to the waterfall which releases water to the floor when it rains."
(h/t Nick Pagee!)

Stills after the break...

Tuesday, June 16, 2015

Full-length video of The Augmentalist at AWE 2015

This live presentation, the grand finale at last week's Augmented World Expo 2015, features Pascal Langdale as "The Augmentalist", with real-time animation powered by Dynamixyz Performer.

The eight-minute demo is followed by a four-minute behind-the-scenes talk by Pascal Langdale (performer, co-writer, producer) and Alison Humphrey (director, co-writer, producer) about The Augmentalist and the 2014 project that informed it, Faster than Night.

This story didn't actually start here, though. 

The previous day, our "augmented mentalist" worked his digital psychic mojo one-on-one with any visitor who stopped by our fortune-telling booth on the Expo floor. For more details, see these photos from the expo floor, and this teaser trailer:

And throughout the Expo's last day, leading up to his presentation as the grand finale, The Augmentalist introduced several keynote speakers, starting with the grandfather of VR and AR, Tom Furness:

Followed by renowned science-fiction author David Brin:

And finally Google Project Tango lead Johnny Lee:

Aug-ust company!

The Augmentalist
Presented June 10th 
at Augmented World Expo 2015 
in Silicon Valley, California.

Performed and co-written by Pascal Langdale
Directed and co-written by Alison Humphrey
Facial capture operator Solène Morvan

Performance capture powered by Dynamixyz
Facial animation by Centaur Digital and Dynamixyz

Produced by Motives in Movement
in collaboration with Dynamixyz and Augmented World Expo 2015

With development support from
and the City of Toronto through the Toronto Arts Council

Monday, June 15, 2015

Photos of The Augmentalist at AWE 2015

Pics from Augmented World Expo... Huge thanks to Ori Inbar, Tom Emrich, Gal Yaguri and the whole AWE 2015 team for three incredible days of futurevision!

Friday, June 12, 2015

Synetic Theater Gets Wet Wet Wet

I need to see this company live. At first I thought the water was a projected digital effect (as in Crystal Pite's The Tempest Replica), but nope – those folks are soaked!

Synetic Theater's The Tempest – multimedia design by Riki K.

(Hat tip to Matthew Olwell)