Skygge is the alias of songwriter Benoît Carré, who has been working on music assisted by Artificial Intelligence (AI) since 2015, a novel approach that can now be experienced live on stage.
Updated on 25/11/2021
You have had a long career as a singer-songwriter since the mid-90s. Can you explain how you "became" Skygge?
Skygge means "Shadow" in Danish. It's the title of a fairy tale by Andersen about a learned poet who asks his shadow to go and explore the house next door from which a melodious song emanates every night. The shadow disappears and becomes a sort of whimsical character who travels the world. I was reminded of this tale during my earliest experiments with the team of researchers I joined in 2015 on the Flow Machines project. I wasn't sure if I was their shadow or if the artificial intelligence was mine, but I found it exciting to surrender control and let my shadow surprise me.
The origins of AI-generated music go back to the 1950s. And yet, the process has rarely been used. Why did it take so long to develop this technology?
At the time, it was largely a concept, developed by John McCarthy in particular. The first algorithms appeared in the 1980s, but it wasn't until the 2000s that computers had the power to run neural networks that could analyse musical elements and generate compositional fragments. Laboratories like the one I work with then started to develop tools based on musical style, offering musicians a new way of creating.
Have you seen an evolution in the public's perception of artificial intelligence in music?
I don't think we can talk about a trend just yet because too few artists are experimenting with these new approaches to composition. This is why I believe the support of the Institut français since 2019 is forward-thinking and therefore invaluable. Joining the Catalogue of the Institut français with my project Interface Poetry will help to highlight my new artistic approach and will also allow me to share my experience and to meet other cultures and other artists. I am fortunate enough, for example, to be invited to present my work in Saudi Arabia next November as part of Novembre Numérique, alongside well-known artists. I am always keen to show my creative process through concrete examples. This helps to break down negative preconceptions about Artificial Intelligence.
What is the difference between music generated by Artificial Intelligence and Computer Assisted Music (CAM)?
AI tools allow for the creation of entirely new material, whereas in CAM we work with samples of existing music. I feed the machine with my desires, my musical "fantasies". For example, I can give it a Purcell score, which it will adapt to a voice, as in "Black Is the Color", based on Pete Seeger's recording, from my album American Folk Songs. The point is not to replace musicians but to open up endless possibilities for exploration. Although they are very powerful, these tools have no imagination, and only a musician can recognise the beauty of a result and be inspired by it in their work. In fact, I asked musicians from the Radio France Philharmonic to play what the AI had generated. They accepted and played magnificently.
What is it like to compose a song assisted by artificial intelligence?
AI allows me to widen my scope of experimentation and increases my powers! As I build my songs, from composition to arrangement to production, I call on these prototypes to shake me up. Like any creator, I try to reach a creative state that could be compared to hypnosis, a controlled letting go. Despite these new powers, it is not any easier: I spend a lot of time sorting through the results, looking for beauty or poetry. A song can take me a month to complete.
Can you tell us about your new project, Interface Poetry?
I started writing my new album in March 2020, just after a concert in Tokyo organised by the Institut français, during which I gleaned sounds from the street and listened to a lot of trendy J-pop! I fed the algorithms with these sounds and they adapted them to my music. Lockdown forced me to work alone, with my shadow, and to explore a feeling that we all perhaps shared a little: the melancholy of that period when we were stuck at home in the middle of a deserted city that was off-limits to us. As the album progressed, the idea of a deserted city was born, one where Andersen's Shadow wanders around and tells of his ennui.
Is there a difference between using AI during the composition and production process and using it live on stage?
I have been working with the visual artists' collective OYE Label to create this city in 3D. AI exists in the relationship between sound and image. I am so happy to be able to travel again and share this Ballad of the Shadow on stage. In the concert I also show my creative process with AI on the famous song "Daisy Bell", sung for the first time by an IBM computer in 1961 (and referenced in 2001: A Space Odyssey). After the concert, where possible, I offer a Q&A with the audience, illustrated with concrete examples.
Skygge's new project, Interface Poetry, is presented as part of Novembre Numérique 2021 and in La Collection 2022. La Collection offers turnkey proposals for the French cultural network abroad in the fields of performing arts, visual arts, and architecture, town planning and landscape, in formats that are easy to distribute and modular.
In 2021, Skygge also took part in the Webinar "Music and Innovation" organised by the Institut français.
Invited by the Institut français of Tokyo in 2020 as part of Digital Choc, Skygge will also be the guest of other locations in the French cultural network abroad in 2022.