Interview
Cinema
Digital creation

Mathias Chelebourg

We’re seeing the birth of a new storytelling medium that’s a perfect hybrid of film, video game and theatre.

Author and director Mathias Chelebourg has put his name to the virtual reality (VR) films Alice, The Virtual Reality Play (2017), Jack (2018), The Real Thing (2018) and Doctor Who: The Runaway (2019). He premiered his latest film, Baba Yaga, at Venice VR Expanded 2020, the VR section of the Venice International Film Festival.

Updated on 28/05/2021


Image
Mathias Chelebourg
Credits
© By Athys de Galzain on the virtual set of l'Atelier Daruma

You started your career as an assistant in film and fiction. How did you come into contact with virtual reality?

In 2012, I took part in the Kickstarter crowdfunding campaign of the Oculus start-up, which has since been bought by Facebook and is the leader in VR (virtual reality) headsets. At the time, I was making films for the luxury industry and I wanted to convince my clients of VR's potential. I started experimenting in the freest sector, contemporary art, with my exhibition Variations (2015), before going on to create various immersive experiences for brands like Prada, Dior and Hermès.

As a creator of stories, why use virtual reality as a medium rather than another?

We’re seeing the birth of a new storytelling medium that’s a perfect hybrid of film, video game and theatre. Those who have moved into VR come from varied backgrounds, from app development to dance, from film to live performance! An incredible interplay has sprung up between all these means of expression, which remain highly compartmentalised today, and I’m convinced that out of the wealth of these exchanges VR is developing a language that is specific to it alone.

VR is a technology that is developing fast, particularly with new headsets being launched on the market each year. How do you reconcile the creative aspect with the technological constraints?

A permanent dialectic exists in new media between technological innovation and artistic creation. But it would be misleading to believe that artists blindly follow the path laid down by innovations. It is also through the works that technology finds its inspiration and the direction of its development. Maintaining close relationships with hardware and software manufacturers has always been at the heart of my work, both to keep a constant eye on developments and to work with them on solutions that widen their storytelling potential. Today, partners like Epic (the real-time engine Unreal Engine), HP, Arri, NVidia and the motion capture manufacturer Optitrack have directly integrated updates following the live motion capture work on Jack, Alice, The Virtual Reality Play and MADO.

With the VR films Alice, The Virtual Reality Play (2017) and Jack (2018), which you directed, you are creating a new type of interactive experience at the crossroads of several artistic genres. How did these works come about and what guides your artistic choices?

After having had the opportunity to experiment with various commissioned formats, I wanted to get radically off the beaten track and explore a language specific to VR. The need for interactivity was central: thanks to VR, the spectator would be able to become an actor who acts on the narrative. There was also the desire to engage all the senses: hearing and vision obviously, but also taste, touch and smell. In short, the fantasy of totally immersing the body in a fictional space where anything is possible. It was when we saw the immersive theatre play Sleep No More in New York that we had a lightbulb moment! What’s more interactive than human contact? What’s more transformative than wandering around a genuinely built environment? That’s how the idea for Alice, The Virtual Reality Play came about: a performative, multisensory theatre experience that takes advantage of live motion capture technology to give a privileged viewer-actor a thrill alongside an actor who plays several roles in a row. The experience was presented at the Cannes Film Festival in 2017, then in the first immersive selection of the Venice International Film Festival, before undertaking an international tour and opening the way to a new, singular possibility for VR.

Why choose fairy tales? 

I like my imagination to draw from fairy tales, legends and European children’s literature, which have also been the melting pot for Disney. I’m passionate about the idea that VR can bring these imaginary worlds to life by giving viewers the opportunity to immerse themselves in real dreams while awake. Thanks to Alice, The Virtual Reality Play I had the opportunity to meet major names in animation and former employees of DreamWorks who had just founded the first independent interactive animation studio in the world: Baobab Studio. We immediately worked together to push back the limits of this new form of performative storytelling through an adaptation of Jack and the Magic Beanstalk, with the Oscar-winning actress Lupita Nyong'o, which was presented at the Tribeca festival in 2018.


At Venice VR Expanded you’re presenting Baba Yaga, produced by Baobab Studio, which you co-directed with Eric Darnell (director and scriptwriter of the animated films Madagascar, Madagascar 2 and Madagascar 3). How did you come to join this project?

Working with Baobab Studio on Jack: Part One, we shared the desire to design experiences that enthral, but also to explore the idea of reproducing the depth of emotion, the feeling of freedom and the organic interactivity we had been able to try out with Alice, The Virtual Reality Play and Jack, this time within the constraints of wider distribution on platforms. They approached me with the idea of adapting the Slavic fairy tale Baba Yaga, and we set off once again on the hard task of giving the viewer the freedom to progress through a story and interact naturally with fictional characters, but this time in the comfort of their own living room!

Baba Yaga brings together a prestigious cast: Kate Winslet, Glenn Close, Daisy Ridley and Jennifer Hudson. How did these collaborations come about, and how did these actresses take to the VR medium?

These collaborations happened naturally because the actresses were very curious about this new medium. It was very moving when Daisy Ridley saw her character move for the first time in the headset and was able to interact with it, or when Glenn Close told us that she had been passionate about the beginnings of 3D animation at Disney in the 1990s and felt a similar excitement with us today.

With Baba Yaga, you recreate a unique fairy tale and through it explore the idea of courage in particular. What can viewers expect when they try the work?

Often with virtual reality there’s a big temptation to give the viewer the junior, contemplative role of the “sidekick”, or, at the opposite extreme, to make them all-powerful as in a video game. With Baba Yaga we tried to explore the complexity of a real relationship between equals: two characters from the same family, Magda and Sasha, the latter played by the viewer. Viewers can therefore expect to experience an adventure through the eyes of a child, with all the poetic distance and subjectivity that this entails.

I find the mechanics of dreams fascinating. In a dream you can turn around and no longer be in the same setting, or experience a leap in time in the blink of an eye with no problem of continuity or cutting. You experience the moment, and reality revolves around you. VR is very good at reproducing this sensation and I try to push this principle as far as possible, by using the amazing potential of interactive off-screen space, but also through a very particular art direction that takes advantage of theatrical half-light.

At the end of August you were involved in the VR residency co-created by the Institut français and the VR Arles festival. What did you think of this experience? 

The residency shows the huge diversity that already exists in virtual reality projects. There are people from the visual arts, film, video games and interactivity, and being in contact with people who don’t share the same references changed how the participants saw their work. You can really speak of a French cultural exception in VR: I don’t know of another country where such formal and technical rigour exists, along with the freedom given to authors to develop singular and innovative visions around interactive works.

How do you see the future of interactive creation?

VR must manage to get out of the admittedly wildly inspiring but very closed universe of festivals and reach the public. I’m convinced that the key is to move into large spaces with collective experiences. The huge successes of teamLab or l’Atelier des Lumières, and by extension those of Sleep No More or the Disney theme parks, are very good signs. The public has a real appetite for the experiential, and this aspiration hasn’t yet found its way to VR.

I’m currently working on reconciling these two worlds through an ambitious adaptation of Oscar Wilde’s The Canterville Ghost. I think the future of interactive storytelling lies in the free and collective exploration of huge virtual sets in which demanding narratives unfold. This means reconciling the visceral power of immersive theatre with the visual and storytelling quality of major animated films. Who wouldn't buy a ticket to see that!?

Image
Jack (Part I)
© Baobab Studio
Jack Teaser - Tribeca Film Festival Premiere
The Institut français and the artist

Jack (Part I) and Alice, The Virtual Reality Play, by Mathias Chelebourg, are presented on culturevr.fr, an Institut français platform which offers a panoramic view of cultural innovation in virtual reality.

L'Institut français, LAB