Virtual Production & Digital Scenography

Wayfinding: Tools > Other Tools > Virtual Production & Digital Scenography

The first chapter of The VES Handbook of Virtual Production defines the production method as: ‘the augmentation or replacement of traditional visual effects or animation workflows by the use of real-time, digital technology’. This cinematic approach to editing and production is perhaps most akin to in-person theatre’s digital scenography, which the monograph Digital Scenography in Opera in the Twenty-First Century defines as a practice that combines stage design elements, including backdrops and lighting, with any digital technology incorporated onto the stage. Onstage, digital scenography is most noticeable in moving projected backdrops or large LED panels displaying video or animations. For example, Starweaver was a co-production between Madness of Two and The State Theatre in Adelaide, South Australia; the team partnered with Flinders University's The Void to develop set and character designs for an in-person, videogame-like science fiction production that relied on the fast scene changes and animation afforded by the LED screen background.

One account argues that the increasingly elaborate approach to virtual production and digital scenography in second wave digital theatre made it much more engaging for audiences: ‘Designing and layering of foreground and background elements, objects and planes was key in developing virtual scenographies with sufficient visual depth and perspective to truly resemble a theatre stage set’. As to why theatre makers chose this method of virtual and scenographic production: ‘Many virtual settings and effects such as these, and others including vertiginous ones with characters placed in the sky or upside down, were created by the companies specifically because they would be impossible (or at least prohibitively expensive) for them to stage in a physical theatre’.
Michael Deacon confirmed this to me in our interview when he discussed Creation Theatre’s production of Aphra Behn’s The Emperor of the Moon. When the show was originally performed in the 17th century, it used Spectacular Theatre conventions, meaning it was laden with special effects that would be very expensive and time-consuming to replicate onstage in 2024; this makes it a perfect production for low-cost virtual production and digital scenography methods in second wave digital theatre.

Within the context of second wave digital theatre, virtual production and digital scenography overlap. Theatres using Zoom, for instance, often changed the casts’ backdrops using Zoom’s simple background replacement tool, thus using one of the medium’s affordances to set an online stage. Other companies, including Creation Theatre, Big Telly Theatre, and Streamed Shakespeare, engaged in more extensive virtual development; Streamed Shakespeare’s Henry IV Parts 1 & 2, for example, used Unity to design sci-fi animated backdrops and assets that placed the actors on other worlds. On Unity and the related tool Unreal Engine, one study of virtual dance performance observes: ‘The virtual performance spaces were generated by real-time games engines Unity and Unreal Engine, with live interaction design through sets of digital tools such as shaders, colliders, and GPU particle effect. These tools permit reactions between avatars and objects, and give dancers the feeling of virtual touch, proximity and gravity, that allows a sense of kinesthetic embodiment within the virtual scene’. Unity and Unreal Engine may be especially important for Virtual Reality Theatre in particular, although companies can also use an existing VR platform like VRChat, which is what Ferryman Collective did for their production of Finding WiiLii.
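The ‘colliders’ mentioned in the quotation above are, at bottom, overlap tests between simple volumes attached to avatars and objects: the engine fires an event when two volumes intersect, which is what lets a dancer ‘feel’ virtual touch or proximity. Unity and Unreal use C# and C++/Blueprints respectively; the sketch below is a simplified Python analogue, and the names (`Avatar`, `touching`) are illustrative rather than drawn from any engine’s API.

```python
from dataclasses import dataclass
import math

# A simplified analogue of a game-engine sphere "collider": each avatar or
# prop is modelled as a sphere, and a "virtual touch" is registered when two
# spheres overlap. Real engines test richer shapes, but the principle is this.

@dataclass
class Avatar:
    name: str
    x: float
    y: float
    z: float
    radius: float = 0.5  # collider radius in scene units

def touching(a: Avatar, b: Avatar) -> bool:
    """True when the two sphere colliders overlap (a 'virtual touch')."""
    distance = math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))
    return distance <= a.radius + b.radius

dancer_a = Avatar("A", 0.0, 0.0, 0.0)
dancer_b = Avatar("B", 0.8, 0.0, 0.0)
print(touching(dancer_a, dancer_b))  # two radius-0.5 spheres 0.8 apart overlap: True
```

An engine runs a test like this every frame for every relevant pair of colliders, triggering particle effects or avatar reactions on contact, which is the ‘reactions between avatars and objects’ the quotation describes.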
Some of the virtual production tools commonly used include VJing applications (which allow real-time manipulation of digital imagery, normally in synchronisation with music), such as Resolume Avenue 6 (first released 2017) and the VJing/video processing application VDMX; WebRTC (Web Real-Time Communications, first released 2011); and vMix (first released 2009), a complete video production platform: ‘Using a Google Chrome browser and the vMix Call website, the remote actors simply called into the Telepresence Stage vMix platform operated by a member of the research team, who could apply a full range of video switching functions and effects’.
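The workflow in that last quotation, where remote performers call in and an operator cuts between their feeds and applies effects, can be modelled in a few lines. This is a toy sketch only, under assumed names (`Switcher`, `cut_to`, `program_out`); it is not vMix’s actual API, which is a commercial GUI application with its own HTTP interface.

```python
# A toy model of a telepresence video-switching desk: several remote
# performers' feeds register with the switcher, and an operator chooses
# which one is broadcast and which effects are applied to it.

class Switcher:
    def __init__(self):
        self.inputs = {}      # performer name -> feed label
        self.live = None      # performer currently on the programme output
        self.effects = []     # effects applied to the live feed

    def connect(self, performer, feed):
        """A remote performer 'calls in', registering their feed."""
        self.inputs[performer] = feed

    def cut_to(self, performer):
        """Operator switches the programme output to this performer's feed."""
        if performer not in self.inputs:
            raise KeyError(f"{performer} has not called in")
        self.live = performer
        self.effects = []  # in this toy model, effects are per-shot

    def apply_effect(self, effect):
        self.effects.append(effect)

    def program_out(self):
        """Label describing what is currently being broadcast."""
        if self.live is None:
            return "black"
        return self.inputs[self.live] + "".join(f"+{e}" for e in self.effects)

desk = Switcher()
desk.connect("Actor 1", "cam1")
desk.connect("Actor 2", "cam2")
desk.cut_to("Actor 2")
desk.apply_effect("vignette")
print(desk.program_out())  # prints cam2+vignette
```

The point of the sketch is the division of labour the quotation describes: performers only need a browser to contribute a feed, while all switching and effects logic lives with a single operator.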
