An Opera Set in Your Home
25th January, 2021

Early in 2018 Google’s Creative Lab approached Opera Queensland and invited them to explore what might be possible if Augmented Reality (AR) were to become part of the process of staging an opera. The driving artistic question of the project would be: what would happen if we could step outside the traditional setting for opera and render an operatic performance in a user’s home? Could we make live performance more accessible – by offering everyone a front row seat – and at the same time allow a user to feel a deeper connection to the music and the performers?

Opera is perhaps the least mobile art form of all. Producing an opera requires the combined forces of an orchestra, an ensemble of singers, a chorus, complex sets, grand costumes, and a small army of technicians. The idea of using cutting-edge digital technology to present operas in people’s living rooms, not as a recording of a pre-existing production but as a wholly new experience in which the user’s home became the site for the performance, was as daunting as it was irresistible.

A New Storytelling Grammar

At the core of our vision was a desire to make a user feel that an opera performer was ‘right there in their living room’, singing for them. With this in mind, we began a process of interrogation and discovery that pushed everyone involved to the limits of the known universe – in digital terms, at least – to create a prototype app that allows anyone with a mobile device to stage an opera in their home.

How do you begin creative development for an experience that involves a fusion of art and technology so unfamiliar that it has yet to develop its own ‘grammar’?

In every aspect of the creative process for this project – music, dramaturgy, scripting, set design, user experience, not to mention the visual properties of the operatic performances themselves – we had to question our expectations.

While the emotional experience of watching an opera in the theatre can be intensely dynamic, the audience’s relationship to the action is static, fixed from the perspective of whatever seat they are in. AR revolutionises this experience by placing the action in the audience’s home, where they can interact with it in whatever way they choose.

Using a mobile device as a lens, the user can zoom in and out of the action of the scene as they wish – sitting back in a wide shot, zooming in to circle the singers as if standing beside them, and every variant in between.

Reflecting on this newfound freedom for the audience, more questions emerged:

– How do you think about set design when the viewer’s home, which you have never seen before, is the set?
– How do you design the flow of dramatic action without any knowledge of the layout of the house?
– How do you direct the action when you have no control over what a viewer will be looking at?
– How can you compel a viewer to physically follow the action around a house?
– How should you represent the opera performances? With motion capture, animation, or something more film-like?
– How long should the experience be?

Before we could explore any of these questions, we needed to choose an opera to work with. We chose Mozart’s The Magic Flute, as it offered numerous narrative and design possibilities. We extracted the story of the character Tamino to create a condensed 15-minute ‘hero’s arc’, with the aim of connecting the user intimately with his journey through the story. From the opening scene, as he attempts to fight off an angry, fire-breathing dragon, Tamino would speak ‘down the camera’ directly to the user as they held their phone – beckoning them to join him on his quest to rescue Pamina, his newfound love.

A Theatrical Lens

The more we investigated, the more we realised the task of helping a viewer navigate an entertainment experience through a physical space – in this case their own home – would be informed in part by the language of live performance. Given the intimate relationship the user has to the action, we cast them as another character within the scene, free to move as they choose. This meant thinking about the dynamics of the action from a three-dimensional perspective, unlike cinema, where the flat screen imposes a rigid two-dimensionality.

Using techniques such as blocking (how characters are placed in relation to each other), lighting to conjure an emotional as well as a spatial effect, eyelines to indicate where other characters are present within the space, and sound to spatially indicate direction and focus, we slowly started to establish the parameters of our own performance grammar. These rules were discovered on the fly, written and rewritten with each new discovery.
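To make the last of those techniques concrete, here is a minimal sketch, in Python with entirely hypothetical names, of how a character’s voice might be panned and attenuated based on where they stand relative to the phone. The coordinate conventions and the inverse-square falloff are illustrative assumptions, not the project’s actual audio engine:

```python
import math

def spatialize(listener_pos, listener_yaw, source_pos):
    """Return (pan, gain) for a sound source relative to a listener.

    pan runs from -1.0 (hard left) to 1.0 (hard right); gain falls off
    with distance. Names, falloff curve, and coordinate conventions are
    illustrative assumptions only.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    # Angle of the source relative to the direction the phone is facing.
    angle = math.atan2(dx, dz) - listener_yaw
    pan = max(-1.0, min(1.0, math.sin(angle)))
    distance = math.hypot(dx, dz)
    gain = min(1.0, 1.0 / max(distance, 1.0) ** 2)
    return pan, gain

# A singer two metres to the listener's right: panned hard right, quieter.
print(spatialize((0, 0, 0), 0.0, (2, 0, 0)))  # -> (1.0, 0.25)
```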

We found that by charting visual pathways on the floor plane – for instance, laying down a ‘path’ of digital forest flowers – we could compel a user to move from location A to location B surprisingly consistently. It became our own digital version of the yellow brick road.
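A minimal sketch of the idea, again with hypothetical names: given the user’s position and a target, interpolate evenly spaced points along the floor and spawn a flower at each. In the real app these points would be anchored against the AR system’s detected floor plane:

```python
def flower_path(start, end, spacing=0.5):
    """Evenly spaced (x, y, z) floor points leading from start to end.

    spacing is in metres. Hypothetical sketch: in the app, a flower
    model would be spawned at each returned point.
    """
    sx, sy, sz = start
    ex, ey, ez = end
    length = ((ex - sx) ** 2 + (ez - sz) ** 2) ** 0.5
    steps = max(1, int(length / spacing))
    return [
        (sx + (ex - sx) * i / steps, sy, sz + (ez - sz) * i / steps)
        for i in range(steps + 1)
    ]

# Flowers every half metre from the user to a point three metres ahead.
for point in flower_path((0.0, 0.0, 0.0), (0.0, 0.0, 3.0)):
    print(point)
```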

In a similar vein, rather than blocking every piece of action – which was impossible without knowing the layout of a user’s house ahead of time – we experimented with magically ‘dissolving’ characters from A to B, using a trail of digital elements in the air.
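In code terms, a dissolve like this reduces to a pair of opacity curves plus a moving trail. The timing curves below are invented for illustration; they are not the app’s actual effect:

```python
def dissolve_state(a, b, t):
    """State of a character 'dissolving' from position a to position b.

    t runs from 0.0 to 1.0 over the effect. Returns the opacity of the
    character at a, the opacity at b, and the current position of the
    airborne trail of digital elements. Curves are illustrative only.
    """
    fade_out = max(0.0, 1.0 - 2.0 * t)   # copy at a is gone by t = 0.5
    fade_in = max(0.0, 2.0 * t - 1.0)    # copy at b appears from t = 0.5
    trail = tuple(pa + (pb - pa) * t for pa, pb in zip(a, b))
    return fade_out, fade_in, trail

# Halfway through, both copies are invisible; the trail is at the midpoint.
print(dissolve_state((0.0, 0.0, 0.0), (2.0, 0.0, 4.0), 0.5))
```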

Shooting in 3D

As singing is the beating heart of opera as an art form, the integrity of the opera singers’ performances was paramount. We had to believe that the digital characters we were watching in the app were really singing.
This meant capturing them on film rather than, say, using animation or motion capture – which prompted the question: how do we film a performer and make them viewable in three dimensions? Enter a new filming technique called ‘volumetric capture’, which surrounds a subject with cameras to film it from every possible angle.

This photo-real, 360-degree view of the performer is then recreated as a ‘volume’ in 3D space, allowing it to be placed ‘in the real world’ as seen through a smartphone’s camera.

The viewer can inspect the performer from all sides, and because the performer is imbued with the properties of a real object – the ability to cast a shadow, for instance – they become a surprisingly believable new ‘element’ in the room.

There are very few volumetric capture studios in the world, but we were fortunate to work with a Google team in Los Angeles to shoot the performances and develop the production pipeline that allowed us to display the output of the film capture on a mobile phone.

Technical Challenges

Rendering volumetrically captured performances in real time on a mobile phone is a huge undertaking, involving revolutionary capture technology, machine learning, and state-of-the-art mobile hardware.

Our volumetric capture stage consisted of 90 12.4MP cameras and hundreds of programmable LED lights inside a geodesic sphere. This generates a massive amount of data: just 20 seconds of footage outputs over a terabyte (>1,000GB) of information, which needs to be processed and compressed.

Even after it’s processed, every frame of volumetric data is roughly 5MB – about the size of a pop song in MP3 format. So to play three opera performers in real time at 30 frames per second, you’re looking at displaying ~450MB of data per second. That adds up to 4.5GB – about the size of a full-length feature film in high definition – every 10 seconds.
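As a quick back-of-the-envelope check, the article’s own figures are consistent with each other:

```python
MB_PER_FRAME = 5   # one processed volumetric frame, per performer
FPS = 30           # playback frame rate
PERFORMERS = 3

mb_per_second = MB_PER_FRAME * FPS * PERFORMERS
print(mb_per_second)              # 450 (MB of data per second)
print(mb_per_second * 10 / 1000)  # 4.5 (GB every 10 seconds)
```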

We had to build every aspect of the film pipeline – from capture and processing to streaming and playback in AR – from scratch, all the while without features we take for granted in regular video, such as play, pause, and rewind!
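To give a feel for what playback without the usual video machinery means, here is a heavily simplified, hypothetical sketch of a forward-only streaming loop: frames are fetched a little ahead of the playhead and rendered in order, with no seeking. Every name is invented for illustration; this is not the project’s actual pipeline:

```python
import collections
import time

def stream_performance(fetch_frame, render_frame, total_frames,
                       fps=30, lookahead=8):
    """Play volumetric frames strictly forward: no pause, no rewind.

    fetch_frame(i) returns frame i's data; render_frame(frame) draws it
    in the AR scene. A small lookahead buffer hides fetch latency.
    """
    buffer = collections.deque()
    next_to_fetch = 0
    frame_time = 1.0 / fps
    for _ in range(total_frames):
        # Keep the buffer topped up ahead of the playhead.
        while next_to_fetch < total_frames and len(buffer) < lookahead:
            buffer.append(fetch_frame(next_to_fetch))
            next_to_fetch += 1
        started = time.monotonic()
        render_frame(buffer.popleft())
        # Sleep off whatever remains of this frame's time slot.
        time.sleep(max(0.0, frame_time - (time.monotonic() - started)))

# Dummy usage: "fetching" returns the index, "rendering" prints it.
stream_performance(lambda i: i, lambda f: print("frame", f), total_frames=5)
```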

Displaying this data in AR requires an extremely powerful smartphone, which uses its sensors and machine learning to build a digital map of the viewer’s environment.

Seen in the app, the performers may appear deceptively simple, but considered in the context of the above, they represent a phenomenal achievement – and the technical team deserve enormous kudos for what they accomplished in such a short space of time.

Where next?

The prototype we created is not a finished vision, nor even a specific proposal for an opera. It is an exploration that attempts to map some early contours in the emerging landscape of AR-based storytelling as it relates to opera. By situating dramatic action in the immediate surrounds of the user, it attempts to chart a new kind of immersive experience, and by creating a vision in which any user can experience world-class operatic performance in their own home, it continues an ambition to make the arts more accessible to more people.

We are excited to share this story about the new forms of creative expression that are being made possible on emerging platforms such as AR. We hope to continue the process of developing AR-ia to a point where we can present complete operatic stories to audiences in their homes, and would welcome conversation with anyone interested in seeing where this extraordinary marriage of art and technology may go.

The Team

Opera Queensland

Artistic Director Patrick Nolan
Original design concept Genevieve Blanchett
Music Director Narelle French
Orchestra Camerata
Tamino Brenton Spiteri
Pamina Emma Pearson
Sarastro Wade Kernot

Google

A number of people in different Google offices around the world worked to make this project possible, including Google’s Creative Lab in Sydney (Sean Kelly, Samantha Cordingley, Jude Osborn, Kelly Cameron, Kirstin Sillitoe, Eden Payne, Tea Uglow, Jonathan Richards) and the Augmented Perception team in San Francisco and Los Angeles (Christoph Rhemann, Sean Fanello, Danhang Tang, Jay Busch, Philip Davidson, Paul Debevec, Peter Denny, Graham Fyffe, Kaiwen Guo, Geoff Harvey, Peter Lincoln, Wan-Chun Alex Ma, Jonathan Taylor, Xueming Yu, Matt Whalen, Jason Dourgarian, Shahram Izadi).
