Dream On with grandMA3: Shakespeare becomes virtual

“Dream” is an ingenious and engaging multi-layered mix of movement, music, visuals, cool technology and narrative magic, an immersive digital performance fusing the drama of Shakespeare with the dynamic worlds of gaming and theatre in a ground-breaking production by the UK’s Royal Shakespeare Company in collaboration with Manchester International Festival, Marshmallow Laser Feast, and the Philharmonia Orchestra.

Staged physically in the Studio at Portsmouth Guildhall and inspired by the classic A Midsummer Night’s Dream, the show featured six real actors tracked by Vicon motion capture cameras, their avatars and effects appearing onscreen – centring on the antics of the cocky and capricious mischief-maker Puck, who runs amok in the virtual midsummer forest, hellraising and spoofing four other sprites on a disruptive and chaotic journey!

Matt Peel was responsible for lighting design and show control. He utilised the power – and specifically the OSC (Open Sound Control) and DMX remote-triggering capabilities – of his grandMA3 system running the new grandMA3 software.

The highly acclaimed show was broadcast live for 10 evenings and enjoyed by thousands worldwide who logged in, either paying for an interactive ticket – with the chance to shoot fireflies into the story to help illuminate Puck’s pathway through the forest – or simply watching the performance for free.

Dream was originally intended to be an in-person performance during 2020, but due to the pandemic, was adapted and re-worked as a live performance concept to be fully enjoyed remotely … from wherever the audience happen to be!

As part of the project’s R&D phase, Matt explains, he worked closely with the RSC’s Daniel Orchard, an Unreal Engine developer, to integrate known event technologies that could communicate with both the virtual and real-world show control systems.

An MVR (My Virtual Rig) importer developed for Unreal enabled the Vectorworks pipeline for real-world lighting to be received both by Unreal and natively in grandMA3; OSC was integrated into Arduino-powered proximity sensors to communicate the status of the physical world; and a PosiStageNet (PSN) plugin for Unreal – PSN being a live 3D positional data protocol – enabled grandMA3 and Unreal (which already has built-in DMX and OSC support) to communicate bi-directionally.
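To illustrate what travels over the wire in the OSC links described above, here is a minimal sketch of encoding a single-argument OSC message in pure Python. The address pattern and value are invented for illustration and not taken from the production; only the packet layout (null-padded address, type tag string, big-endian int32) follows the OSC 1.0 specification.

```python
import struct

def osc_pad(data: bytes) -> bytes:
    """Null-terminate and zero-pad to a 4-byte boundary, per OSC 1.0."""
    data += b"\x00"
    while len(data) % 4:
        data += b"\x00"
    return data

def osc_message(address: str, value: int) -> bytes:
    """Build a minimal OSC message carrying a single int32 argument."""
    return (osc_pad(address.encode())     # address pattern, e.g. "/sensor/1"
            + osc_pad(b",i")              # type tag string: one int32
            + struct.pack(">i", value))   # big-endian int32 payload

# Hypothetical proximity-sensor update; the address is illustrative only
packet = osc_message("/sensor/1/proximity", 1)
```

In practice a library such as python-osc would handle this encoding, but the raw layout shows why OSC is easy to bridge between microcontrollers, consoles, and game engines: it is just padded strings and fixed-width numbers over UDP.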

By building custom grandMA3 fixture types, certain aspects of the game world could be controlled via DMX – for example, the height and brightness of an object like the sun, or the colour of an avatar. Using grandMA3, all these elements could be tweaked rapidly in real time on the console.
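Conceptually, a fixture type like this is a mapping from 8-bit DMX channel values to game-world parameters. The sketch below is a hypothetical example of that idea – the channel layout, parameter names, and ranges are invented, not taken from the production’s actual fixture definitions.

```python
def dmx_to_range(dmx: int, lo: float, hi: float) -> float:
    """Scale an 8-bit DMX value (0-255) linearly into [lo, hi]."""
    if not 0 <= dmx <= 255:
        raise ValueError("DMX channel values are 8-bit (0-255)")
    return lo + (hi - lo) * dmx / 255.0

def apply_sun_fixture(channels: dict) -> dict:
    """Hypothetical 'sun' fixture: channel 1 drives elevation, channel 2 intensity."""
    return {
        "elevation_deg": dmx_to_range(channels.get(1, 0), -10.0, 90.0),
        "intensity":     dmx_to_range(channels.get(2, 0), 0.0, 1.0),
    }

# Channel 1 at full, channel 2 at roughly half
state = apply_sun_fixture({1: 255, 2: 128})
```

Because the console only ever sees ordinary DMX channels, existing programming tools – faders, presets, effects – work on the virtual sun exactly as they would on a real moving light.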

“Using the grandMA3 in this way meant we could work really fast in this context to make adjustments to these game effects, rather than using game engine keyframing which is a lot more time consuming jumping Unreal in and out of ‘editor’ mode,” says Matt.

A project like Dream was a perfect opportunity to experiment with this bi-directional control and state awareness, merging game-based event logic to create a new style of live performance. It allowed the grandMA3 to be the master show controller sending and receiving unicast OSC between 16 different role machines.
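Fanning one cue out as unicast messages to many machines can be sketched with nothing more than stdlib UDP sockets. The host list, port, and payload below are invented for illustration – the article only tells us there were 16 role machines, not their addresses or message formats.

```python
import socket

# Hypothetical addresses for the role machines; the production used 16,
# but these hosts and the port are invented for illustration.
ROLE_MACHINES = [(f"10.0.0.{n}", 8000) for n in range(101, 117)]

def fan_out(payload: bytes, targets) -> int:
    """Unicast the same UDP datagram to every target; returns count sent."""
    sent = 0
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for host, port in targets:
            sock.sendto(payload, (host, port))
            sent += 1
    return sent

# e.g. fan_out(b"/cue/go\x00", ROLE_MACHINES)
```

Unicast (one datagram per machine) rather than broadcast keeps each role machine individually addressable, which matters when different machines play different roles in the show.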

Unreal Engine instances created the rich and complex visual environments in which the action took place, while OBS instances handled vision mixing of the final output between Unreal and four broadcast cameras.

Matt explained that a small rig of traditional theatre lighting – eight moving lights – in the real studio motion capture volume (capture space) helped direct the actors to respond to interactions from the remote audience.

In addition to orchestral music by composer Jesper Nordin and Philharmonia Orchestra principal conductor and artistic advisor Esa-Pekka Salonen, at strategic points the actors’ movement was fed into Gestrument software (a Jesper Nordin project), allowing them to ‘play’ digital instruments and interact together via their motion – another clever twist that created a stream of beautifully ethereal sonic moments.

The overall show was cued and run in traditional theatre style by a DSM – a synergetic mix of innovative technologies and well-respected techniques that pushed several boundaries.

Matt has been using grandMA3 for some time in his work. In addition to the possibilities of bidirectional communication, vital for a cutting-edge performance like this, he also likes having a large number of playbacks on one page offering all the major elements “at your fingertips”.

Like everyone, Matt was delighted to be back working on another show after the disruption of the pandemic, and especially being alongside “so many talented people with huge expertise in numerous highly specialised areas” to make Dream such a success.

Dream was directed by Robin McNicholas of Marshmallow Laser Feast, Sarah Perry was the movement director, and the project lead and executive producer was Sarah Ellis, director of digital development at the RSC. Ambersphere Solutions Ltd is the exclusive distributor of MA Lighting in the UK.

www.malighting.com

Photo: © Stuart Martin / RSC