In a dark basement room in the northwest corner of University College, I’m catching a glimpse of the future of live theatre and performance. Or at least a possible future.
There is no stage of any kind here, and no audience seating. Nevertheless, for the past four years, some of the country’s most talented theatre artists, including writers, actors and directors, have walked into this space (and its predecessor, a few blocks south at the Koffler Centre) to act, speak and move – and test how they could use artificial intelligence and other technologies to push theatre in new directions.
“This used to be the University College kitchen,” explains the lab’s genial director, David Rokeby, as he fiddles with computer cables and switches. Yet what Rokeby, his research students and international artists have been cooking up at the lab – which is part of U of T’s Centre for Drama, Theatre and Performance Studies – is anything but traditional theatre fare.
I sit in on a session in July with Nick Flynn, an American poet who is visiting the lab for a few days to experiment with a voice-to-image system that uses artificial intelligence to create visual representations of what a performer is saying. As he reads a poem from his upcoming collection, images flash onto a makeshift screen along one wall: birds fly by, corpses pile up, a forest burns. When Flynn says, “A blue mass of cold,” a blue stone appears, hanging in the sky, like something out of a Magritte painting. To influence the type of image the computer produces, he can provide the system, before he starts reading, with prompts, such as “landscape,” “black and white” or “in the style of Hieronymus Bosch.”
Flynn, who thrives on collaboration, is excited by the possibilities of what he sees. When his book is published later in the year, he hopes to incorporate the technology into his public readings, adding an immersive layer to his performances.
Later, as he gets ready to return home, Flynn tells Rokeby it’s a shame that his book is finished. Because of this experiment, he says, he can “feel the poems rewriting themselves.”
“Nick saw the system go off on tangents from the words he used, showing him new possibilities,” says Rokeby. “But [the system] also pointed things out the way a proofreader might, asking him to clarify a point or make things more concrete. It made me think about the potential creative roles of these technologies in real life.”
Real life. Real space. These are terms Rokeby returns to during our conversations. As an installation and interactive artist, Rokeby has been working alongside new technologies since the early 1980s. He’s interested in how they affect living, breathing actors and vice versa – not in flashiness for its own sake (he mentions the Royal Shakespeare Company’s Intel-funded production of The Tempest, complete with 27 projectors, as a spectacle that didn’t add anything to the play). So, he’s choosy about the kind of technology the lab will work with. Video has been around for too long. “It’s not a place for the lab to invest its time,” he says. Virtual reality doesn’t interest him so much, either: he finds it alienating for audience members to have to put on a clunky headset.
Although he enjoyed parts of Canadian playwright Jordan Tannahill’s 2021 show Draw Me Close, which combined virtual reality with a live performance by a single actor for one audience member at a time, he says the technology is still unproven – and may not gain widespread appeal. “The fact that VR did not take off massively during the COVID pandemic was fascinating to me,” he says. “Some people are 100 per cent in, but others aren’t going there.”
But he is intrigued by working with artists who want to pose questions about the role these technologies should play in society. And he wants to give U of T students from a variety of fields a chance to work creatively with these kinds of tools, which they will likely use at some point during their careers.
During the early days of the pandemic, the lab and Canadian Stage created a paid eight-month residency to bring in artists to experiment with the technologies. The program launched with Sébastien Heins and Ryan Cunningham, both writers and actors. It quickly expanded to include writer and director Rick Miller and Stratford star Maev Beaty. “Nobody was working [because live theatres were closed], and so this was an amazing opportunity for all of us,” says Rokeby.
Heins worked closely with Rokeby on his autobiographical play, No Save Points, figuring out how to use the latest motion-capture technology, which tracks human movement to animate a digital character, to tell a story about his mother’s Huntington’s disease, a debilitating neurological condition.
Heins was in his early 20s when he found out about his mother’s illness and was confronted with the possibility that he had inherited it as well. The play, which enjoyed sold-out shows in Toronto in June, illustrated his attempts to escape from his real-life worries by retreating into a fantasy world through his Game Boy. “I found myself wanting to escape from the truth of her diagnosis – and so the Game Boy became a symbol of taking back control,” Heins said in a U of T News interview just before the play’s debut.
The lab helped Heins use motion-capture to create an animated, 10-year-old version of himself that would appear on a screen onstage. When certain audience members pressed buttons on a control pad they’d been given, they sent signals to sensors planted on Heins that would tell him whether to move left or right, jump or duck. His movements triggered his on-screen avatar, who was trying to escape danger, to do the same.
Using this technique, Heins effectively brought several “video games” in the show to life. The metaphor illustrated Heins’s anxiety about his mother’s – and his own – health. And it captured his feeling of helplessness. “The use of technologies as a metaphor here is really key because it shifts it from being, ‘OK, here’s a cool thing you can do,’ to ‘Here is a way this character is working through things,’” Rokeby told U of T News.
“I was really specific about what I thought we needed the tech for,” says Heins. “I needed to test a prototype of a 3D-looking avatar that moved on the screen. If the motion-capture character looked a little weird, David tweaked it. And he gave philosophical mentorship as well, talking about what to do if and when the tech went wrong.”
One day, while working on No Save Points, Heins finished what he was doing and took off the motion-capture suit. A stick-figure avatar of himself connected to the sensors in the suit was still up on a screen. And suddenly Heins saw his avatar’s head flop over and its arm begin to move erratically – much like his mother’s limbs would move with her disease. That haunting image – a poignant mix of theatre and technology – made it into the final moments of No Save Points. “That’s the sort of thing you don’t get through discussion and theorizing,” says Heins. “It’s pure discovery.”
The lab, which is supported with a $5-million gift from BMO Financial Group, is also experimenting with a large-language-model AI tool similar to ChatGPT, the popular chatbot released late last year. Rokeby has fed the system the complete works of Shakespeare and can now get the computer to generate lines of text that resemble the Bard’s. This in itself may not seem especially useful. But what if a contemporary playwright fed the system their own collected works? Could the AI help generate ideas for future projects? Or assist them in writing?
Rokeby tried something like this with Rick Miller, who was interested in seeing what the system could come up with. He wasn’t especially impressed with the output, but he saw its potential. “You wouldn’t say [the result] was a play, but it was a kind of live performance,” Miller observed. “It made me wonder, ‘Am I expendable?’”
The text generator also offers a “chat mode,” which allows for a real-time, improvised dialogue between the performer and the system. Such an AI could act, effectively, as a rehearsal partner, challenging actors to respond authentically and creatively, and encouraging them to experiment with different emotional nuances.
For the next several months, Rokeby will focus on preparing for the lab’s move into a new space in UC’s Laidlaw Wing. There, it will occupy one of three state-of-the-art studios equipped with full lighting grids and sound systems. (The other two studios will belong to the Centre for Drama, Theatre and Performance Studies.) Set to open in 2024, the new facility will be used for teaching, experimenting and workshopping new productions.
Rokeby’s goal is to outfit the lab in such a way that artists – both students and visitors – can follow their instincts and intuition. Do they want to use the voice-to-text-to-image tech for a scene? It’s available. Do they want an interactive light and sound system that responds to gestures you make while wearing a motion-capture suit? It’s also there. Do they want to use a large-language-model AI system to help generate ideas for characters or plot for a play they’re writing? That’s there, too.
“We want to be able to shift the space quickly and fluidly, and show them what’s possible,” he says. “It’s one thing to make cool, interactive experiences. But the process of making these things should itself be interactive and responsive – you should be able to use the toolset at the speed of creative inspiration. That’s not just useful and efficient; it actually changes what’s possible, because you can feel your way through the possibilities.”