A pioneering collaboration between The Royal Shakespeare Company, Manchester International Festival, Marshmallow Laser Feast and Philharmonia Orchestra.
The Royal Shakespeare Company (RSC), in collaboration with Manchester International Festival (MIF), Marshmallow Laser Feast (MLF) and Philharmonia Orchestra, will stage a live performance of Dream at Portsmouth Guildhall’s Studio using motion capture, as the culmination of a major piece of cutting-edge research and development (R&D) involving the University of Portsmouth.
Of the exciting project, Chief Executive Officer of The Guildhall Trust, Andy Grays said, “Portsmouth Guildhall is delighted to welcome The Royal Shakespeare Company (RSC), in collaboration with Manchester International Festival (MIF), Marshmallow Laser Feast (MLF) and Philharmonia Orchestra with the University of Portsmouth, for their pioneering collaboration on Dream, where live performance and gaming technology come together using motion capture in a major piece of cutting-edge research and development that explores the future of live theatre and audiences. It has been exciting to see the rehearsals and preparation for Dream come together in our purpose-built Guildhall Studio under strict Covid-secure protocols”.
The pioneering collaboration explores how audiences could experience live performance in the future, in addition to a regular visit to a performance venue. Dream was due to open in spring 2020 as an in-person and online live performance, and has been recreated during the pandemic for online audiences while theatres remain closed. The project is one of four Audience of the Future Demonstrator projects, supported by the government’s Industrial Strategy Challenge Fund, which is delivered by UK Research and Innovation (UKRI).
Dream is inspired by Shakespeare’s A Midsummer Night’s Dream and gives audiences a unique opportunity to directly influence the live performance from wherever they are in the world. Audiences will experience a new performance environment, easily accessed on their mobile, desktop or tablet via the dream.online website. The performance uses the latest gaming and theatre technology together with an interactive symphonic score that responds to the actors’ movements during the show.
The live performance is set in a virtual midsummer forest. Under the shadow of gathering clouds at dusk, lit by the glimmer of fireflies, Puck acts as the guide. Audiences are invited to explore the forest from the canopy of the trees to the roots, meet the sprites, Cobweb, Mustardseed, Peaseblossom and Moth, and take an extraordinary journey into the eye of a cataclysmic storm. Together with Puck they must regrow the forest before the dawn. When day breaks, the spell breaks.
The 50-minute online event will be a shared experience between remote audience members and the seven actors who play Puck and the sprites. Audiences can buy a £10 ticket to take part, directly influencing the world of the actors at key points in the play, or view the performance for free. The ten Dream performances are scheduled so that audiences across the world can join the event.
Gregory Doran, RSC Artistic Director, said: “What’s brilliant about Dream is the innovation at play. An audience member sitting at home influencing the live performance from wherever they are – that’s exciting. It’s not a replacement for being in the space with the performers, but it opens up new opportunities. By bringing together specialists in on-stage live performance with specialists in gaming and music, you see how much they have in common. For instance, the RSC’s deep understanding of scripted drama combined with Marshmallow Laser Feast’s innovation in creative tech brings thrilling results.
“The story is king, whether you are a gamer, or an audience member. Stories haven’t changed, but the way we engage audiences with them has. Shakespeare was our greatest storyteller and it’s brilliant that we get the opportunity to use one of his plays to discover what could be possible for live performance.”
Robin McNicholas, Director and Co-Founder of Marshmallow Laser Feast, added: “Our focus has been on creating an experience with the natural world at its centre. It’s a celebration of the magic of biodiversity, brought to life by an incredible cast on this adventurous virtual production. The team has created a work that explores new narrative techniques, opening doors to a vast story-world that offers new perspectives enabled by cutting-edge technologies performed live on a motion capture stage.
“We hope audiences find a new and unique way to engage with immersive storytelling. Virtual productions such as this offer new creative forms of expression and opportunities for performers, musicians, artists, designers and creative coders”.
A major piece of research, led by i2media research at Goldsmiths, University of London, and NESTA, runs through the project, including an investigation of the potential for making similar online performances financially viable for the arts sector. All findings will be shared with the wider UK cultural sector throughout 2021, after the live performances are completed.
DREAM – A live, online performance set in a virtual midsummer forest.
Cast and creatives:
Robin McNicholas – Director
Pippa Hill – Script Creation
Robin McNicholas & Pippa Hill – Narrative
Esa-Pekka Salonen – Music Director & Composer
Jesper Nordin – Composer, Interactivity Designer and Creative Advisor, Music
Sarah Perry – Movement Director
Cast: Maggie Bain (Cobweb), Phoebe Hyder (Understudy Puck and Mustardseed), Durassie Kiangangu (Moth), Jamie Morgan (Peaseblossom), Loren O’Dair (Mustardseed), EM Williams (Puck), Edmund Wood (Rehearsal Assistant; Understudy Moth, Cobweb & Peaseblossom)
THE TECHNOLOGY Building on the technology used in the RSC’s ground-breaking 2016 production of The Tempest – the first play to feature live performance capture rendered in Unreal Engine – Dream harnesses live performance, virtual production and gaming technology. The production is performed by seven actors in a specially built 7m x 7m motion capture volume at Portsmouth Guildhall, supported by a team from the University of Portsmouth. The performance space includes an LED backdrop that displays the virtual world, allowing performers to see their place and act within the virtual environment.
Vicon motion capture cameras and state-of-the-art facial rigging capture the movements of the performers. This data drives the virtual avatars of each of the characters in real time, routed through a traditional performance lighting desk into Epic Games’ Unreal Engine. The live performance is mixed with pre-recorded animation sequences.
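The mix of live capture and pre-recorded animation described above can be pictured as a per-joint cross-fade. The sketch below is purely illustrative – the function name, the joint-to-angle pose representation and the linear blend rule are assumptions, not the production pipeline:

```python
# Illustrative sketch only: a live-captured pose and a pre-recorded clip pose
# are both dictionaries mapping joint names to angles (degrees); the engine
# blends them with a single weight per frame.

def blend_pose(live_pose, recorded_pose, weight):
    """Linearly blend two poses.

    weight = 0.0 -> fully live capture; weight = 1.0 -> fully pre-recorded.
    """
    if not 0.0 <= weight <= 1.0:
        raise ValueError("blend weight must be in [0, 1]")
    return {
        joint: (1.0 - weight) * live_pose[joint] + weight * recorded_pose[joint]
        for joint in live_pose
    }

live = {"elbow_l": 40.0, "knee_r": 10.0}   # from the mocap stream
clip = {"elbow_l": 80.0, "knee_r": 30.0}   # from a pre-recorded sequence
mixed = blend_pose(live, clip, 0.25)       # mostly live, a quarter clip
```

Ramping the weight up or down over a few frames is one simple way to hand control between the live performer and an animation sequence without a visible snap.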
The audience is led by Puck (EM Williams), who takes them from the real world into the digital world. As fireflies, the audience can guide Puck through the forest at key points in the play using the movement of their touchscreen, trackpad or mouse. The actors perform and respond to audience interaction and direction, making each performance unique, as the audience will behave differently at each event.
A bespoke web-player has been created for Dream to enable the effective distribution of real-time content from individual audience members to the Unreal Engine server and back to the audience. This new software allows dynamic, real-time interaction with a mass volume of users (up to 2,000 per performance) in a live environment.
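One way to picture the fan-in step such a system needs: thousands of individual gestures must be reduced to a single signal before they can steer anything in the engine. The sketch below is a hypothetical illustration – the function name and the simple averaging rule are assumptions, not the actual Dream web-player:

```python
# Hypothetical sketch: each audience gesture arrives as a 2D direction
# vector (dx, dy); the server averages all vectors received in a time
# window and normalises the result into one unit steering vector.

def aggregate_directions(gestures):
    """Reduce many (dx, dy) gestures to a single unit direction vector."""
    if not gestures:
        return (0.0, 0.0)
    sx = sum(dx for dx, _ in gestures) / len(gestures)
    sy = sum(dy for _, dy in gestures) / len(gestures)
    norm = (sx * sx + sy * sy) ** 0.5
    if norm == 0:
        return (0.0, 0.0)  # gestures cancelled out; no net steer
    return (sx / norm, sy / norm)

# Three audience members swipe roughly rightwards:
steer = aggregate_directions([(1.0, 0.0), (1.0, 0.2), (1.0, -0.2)])
```

An averaging reduction like this scales gracefully: whether 20 or 2,000 users are connected, the engine only ever receives one small steering value per update.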
James Golding, Technical Director – Character Tech, at Epic Games added: “It has been such a great experience to support phenomenal collaborators, across many different disciplines, on this ambitious project. It’s exciting to witness creators exploring the wonderful possibilities ahead for live entertainment.”
THE MUSIC Music is integral to the experience of Dream. An interactive symphonic score recorded by the Philharmonia Orchestra will be manipulated at key points in the play, in real time, by the performers, who will create interactive music with their movements. The result will be a living, dynamic soundtrack that adapts and interacts live with the narrative and the pre-recorded orchestral tracks.
The installation will feature core classical repertoire and two contemporary orchestral works – excerpts from Gemini, the latest composition by Esa-Pekka Salonen, the Philharmonia’s Principal Conductor and Artistic Advisor, and Ärr, by the Swedish composer Jesper Nordin. The music was recorded by the Orchestra, conducted by Salonen, on Friday 13 March 2020 – the last full-scale orchestral recording, involving 100 players, before the pandemic struck.
Alongside his growing recognition as a composer, Nordin is the creator of the ground-breaking interactive music tool Gestrument, giving parts of Dream an interactive musical layer. Gestrument allows the performers to generate music from their movements. This real-time generated music can be shaped by the performers but will always be in perfect sync with the pre-recorded orchestral score.
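The idea of movement-generated music that stays in sync with a fixed recording can be sketched in miniature. Everything below is an illustrative assumption – the scale, the energy-to-pitch mapping and the beat-grid quantisation are invented for this example and are not Gestrument’s actual API:

```python
# Illustrative sketch: movement "energy" (0..1) selects a pitch from a fixed
# scale, and the note's onset time is snapped to the pre-recorded score's
# beat grid so the generated layer never drifts out of sync.

PENTATONIC = [60, 62, 64, 67, 69]  # MIDI note numbers: C, D, E, G, A

def movement_to_note(energy, time_s, bpm=120):
    """Map movement energy to a pitch, with onset quantised to the beat."""
    energy = min(max(energy, 0.0), 1.0)          # clamp sensor noise
    pitch = PENTATONIC[int(energy * (len(PENTATONIC) - 1))]
    beat_len = 60.0 / bpm                        # seconds per beat
    onset = round(time_s / beat_len) * beat_len  # snap to nearest beat
    return pitch, onset

# A vigorous gesture at t = 1.1 s against a 120 bpm recording:
note = movement_to_note(energy=1.0, time_s=1.1, bpm=120)
```

Quantising onsets to the recording’s tempo grid is what lets the performer shape the music freely while the result, as the text puts it, stays in perfect sync with the pre-recorded orchestral score.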
Esa-Pekka Salonen, Principal Conductor & Artistic Advisor, commented: “Immersive technologies are going to change the way we compose, perform and experience music, and I believe we need to reimagine the symphony orchestra for this new landscape. This is primarily an R&D project – to reimagine immersive technology for live performance – where the creative process itself is one of the most valuable aspects of the whole project.
“Dream has brought us together with some of the world’s leading theatre practitioners, VR world-builders and game designers to reimagine storytelling. The collaboration has moved from the stage to the real-time games engine. As a composer, it has allowed us to reimagine composition for a new landscape where the audience has agency, and the performers can become part of a living, dynamic score that is integral to the live performance.”
Talking about the project, Prof. Andrew Chitty, Challenge Director, Audience of the Future, UKRI, said: “Dream is an extraordinary achievement by the RSC and its partners, but also a demonstration of how entirely new audience experiences are created as immersive and digital technologies become integrated into performance. When we set out the Audience of the Future Challenge, no one would have predicted this entirely digital performance of Dream; it goes far further in putting new technologies at the heart of performance than we had dared to hope. It also stands as a beacon showing what our world-class performance and creative technology companies can do given the right support for research and innovation.”
Gabrielle Jenks, Digital Director at MIF concluded: “Audience of the Future has been an invaluable space for world-leading organisations to consider the future of virtual production and understand the opportunities and challenges of working with real-time technologies. From DYSTOPIA987, Skepta’s extraordinary mixed-reality experience created for MIF19, to Dream, the learnings from these ambitious and collective approaches to live performance will help us imagine new boundary-pushing ways to present work in The Factory, MIF’s future home being built in the heart of Manchester.”
Tickets for the live, online performances, Friday 12 March – Saturday 20 March 2021, can be booked online at dream.online.