Concerta: AI for Everyone


The word ‘concert’ comes from the Italian ‘concerto’, itself from ‘concertare’, which loosely means ‘to bring into agreement or harmony.’ That is a powerful idea to sit with, especially if you are standing in a concert right now: a hundred voices chiming in to sing a song that means something different to each and every one of us is a feeling to remember forever, perhaps as one of the greatest moments of your life. I recently attended one of my first concerts, and it is safe to say I will not be forgetting it in a hurry. There were, however, some other moments I will not be forgetting any time soon either, and not for the right reasons.

Concerts are stressful things. I felt stressed just attending one, and I can only imagine what it must be like for the people organizing them, or for the artists themselves. The crowds, the trash littered everywhere, that one drunk person taking their shirt off in front of you, being too short to see the stage and consoling yourself by stretching your arms as high as possible to record your favorite singer, only to end up recording a sea of other people’s heads: these are the downsides of the experience. And did I mention the queues and the pushing? At one point I could literally feel my feet lift off the ground as the crowd rushed through the barricades into the venue. It was hard enough to breathe, let alone enjoy the music. And yet, at a concert, people are for a moment united by something larger than themselves, a sense of purpose, a feeling that they were meant to be there and only there at that time. That sense of purpose does remarkable things to our brains. The closest thing I can liken it to is a soldier going off to war, knowing the experience will not be comfortable, yet going anyway out of belief in it.

As I stood in that long concert line, it got me thinking: is it possible for me, as an engineer and (according to myself) a practical observer of how the world works, to think of ways the experience could be enhanced, so it feels less like going to war and fighting your fellow concert-goers to catch a glimpse of whoever is performing, and more like a group of friends dancing excitedly to their favorite song in a bedroom, speakers blaring, without a care in the world?

The answer dawned on me: AI was the key. Artificial intelligence is already being implemented in various ways by artists and tech conglomerates joining forces to create something spectacular. Take, for example, the virtual tech idol Hatsune Miku from Japan, an online global sensation. The beauty of Hatsune Miku is that she is built on Yamaha’s Vocaloid 2, 3 and 4 singing-synthesis engines, alongside Crypton Future Media’s Piapro Studio vocal editor, which means anyone’s songs can be sung in Hatsune Miku’s voice. Whether you are a full-fledged music creator in a large studio or an amateur musician in your mother’s basement, you could be a global singing sensation. However, I am not here to talk about the many ways in which AI is influencing music composition, but rather the many ways in which it could influence the music experience.

Katy Perry, during her ‘Witness’ tour, embraced AI to create stunning, immersive concert experiences by enhancing sound and light design with technology. AI-powered stage lighting, visuals and sound design can be used to build more immersive experiences suited to the mood of the crowd present at the event. To my mind, this is the most basic and easiest step artists can take to improve their concerts through artificial intelligence. Traditional setups rely heavily on manual operation and pre-configured settings, which are cumbersome, but AI allows some of these processes to become automated and adaptive. Machine learning algorithms can analyze tracks similar to the ones an artist performs and predict lighting effects for them; systems can interact with audiences, shifting colors as the audience’s voices grow louder; computer vision can monitor participants’ movements, gestures and facial expressions in real time; and GANs, or Generative Adversarial Networks, can create powerful visuals that blur the line between reality and digital artistry. GANs are a class of AI models that pit two neural networks against each other: one network generates images, while the other discriminates between them on the basis of how realistic they look.

There are already several companies working on projects that improve sound and visual design. Lumen is a neural-network-powered psychedelic visual agency whose system continuously analyzes input data such as audio frequencies and motion cues, transforming them into psychedelic patterns. SoundSwitch synchronizes lighting with music by analyzing audio patterns to generate lighting effects. Lightform, an interactive projection-mapping company, uses computer vision to generate a 3D map of any physical surface, which could be useful for mapping visual and sound installations onto a concert stage.
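To make the audio-reactive part concrete, here is a rough Python sketch of what the simplest version of that loop might look like. Everything in it is invented for illustration: the audio frames are synthetic sine waves and the “light rig” is just a printed colour, not any real lighting API or any of the products mentioned above.

```python
# A toy sketch of audio-reactive lighting: map the loudness (RMS) of each
# audio frame to a light colour, shifting from blue to red as the crowd
# and the music get louder. Frame source and light rig are stand-ins.
import numpy as np


def frame_rms(frame: np.ndarray) -> float:
    """Root-mean-square loudness of one audio frame (samples in [-1, 1])."""
    return float(np.sqrt(np.mean(frame ** 2)))


def loudness_to_rgb(rms: float) -> tuple[int, int, int]:
    """Map loudness to a colour: quiet -> deep blue, loud -> hot red."""
    level = min(rms / 0.5, 1.0)          # normalise, clip at a chosen ceiling
    red = int(255 * level)
    blue = int(255 * (1.0 - level))
    return red, 32, blue


if __name__ == "__main__":
    # Simulate a quiet verse and a loud chorus with synthetic sine frames.
    t = np.linspace(0, 1, 44_100)
    verse = 0.1 * np.sin(2 * np.pi * 220 * t)
    chorus = 0.8 * np.sin(2 * np.pi * 220 * t)
    for name, frame in [("verse", verse), ("chorus", chorus)]:
        print(name, loudness_to_rgb(frame_rms(frame)))
```

A real system would stream frames from the live mix and push colours out to the fixtures, but the mapping from loudness to light is the heart of the idea.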

VR, or virtual reality, technologies can also improve the concert experience. The rapper Travis Scott famously used virtual avatars to perform, but VR could be so much more than that. VR can create a simulated environment of the concert through headsets, while input devices and sensors track your movements and adjust the virtual landscape accordingly. This could finally let those of us too short to see over the crowd actually watch our favorite artists perform right in front of us, and make it feel worth giving up a Saturday night to see them. Real-time streams or pre-recordings of concerts, along with spatial audio recordings, would need to be captured and distributed, but artists and fans could truly interact with each other, and better still, you could join from any corner of the world. Artists could create customizable storylines for concerts with holograms, special effects and 3D visuals. Companies working on these technologies include Oculus Venues, a platform owned by Meta that lets you experience concerts in virtual reality; Sansar, produced by Linden Lab, which focuses on socially immersive experiences for concerts and music festivals; and AltspaceVR, owned by Microsoft, which provides VR experiences for concerts and other events.
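Spatial audio is the piece that sells the illusion of actually standing in the venue. As a toy illustration, and not how any headset SDK really does it (real systems use full head-related transfer functions), here is a Python sketch that pans a mono concert feed between the listener’s ears based on where the virtual stage sits:

```python
# A toy take on spatial audio: pan a mono concert feed between the left and
# right ears based on the stage's direction relative to the listener's head.
# This is simple equal-power panning, not a real HRTF renderer.
import numpy as np


def spatialise(mono: np.ndarray, azimuth_deg: float) -> np.ndarray:
    """Return stereo audio. azimuth_deg: 0 = stage straight ahead,
    -90 = fully to the left, +90 = fully to the right."""
    pan = (np.clip(azimuth_deg, -90, 90) + 90) / 180.0   # map to [0, 1]
    left_gain = np.cos(pan * np.pi / 2)                   # equal-power gains
    right_gain = np.sin(pan * np.pi / 2)
    return np.stack([mono * left_gain, mono * right_gain], axis=1)


if __name__ == "__main__":
    tone = np.sin(2 * np.pi * 440 * np.linspace(0, 1, 48_000))
    stereo = spatialise(tone, azimuth_deg=-45)   # stage off to the listener's left
    print(stereo.shape, stereo[:3])
```

If the headset also reports where you are looking, the azimuth can be updated every frame, which is roughly how turning your head keeps the band anchored in place.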

Another application of AI at concerts is enhancing fan engagement, and this was done by none other than Taylor Swift. During her ‘Reputation’ tour, facial recognition technology was reportedly used to identify concert-goers and screen out unwanted visitors. She also uses AI to enhance her social media presence and to understand fan behavior and preferences, allowing her to cater more closely to their needs when producing music. AI-driven crowd monitoring and heatmaps are used at Wembley Stadium, and Coachella uses RFID technology for contactless crowd management. Companies like WaitTime provide real-time crowd monitoring for events, Eventio offers AI-powered event-planning assistants and software, and EventHex.ai works towards better event management, again through AI.
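Crowd heatmaps in particular are less magical than they sound. Here is a rough Python sketch of the core idea, with everything invented for illustration: random points stand in for whatever anonymised location data a venue’s cameras, RFID wristbands or Wi-Fi access points would actually provide, and the “safe limit” is a number I picked out of thin air.

```python
# A rough sketch of crowd monitoring: bin anonymised attendee positions into
# a density grid and flag cells that exceed a chosen limit. Where the
# positions come from (cameras, RFID, Wi-Fi) is out of scope here.
import numpy as np


def crowd_heatmap(xs, ys, venue_w=100.0, venue_h=60.0, cell=5.0):
    """Return a 2D array of people per cell for a venue_w x venue_h metre floor."""
    x_bins = np.arange(0, venue_w + cell, cell)
    y_bins = np.arange(0, venue_h + cell, cell)
    counts, _, _ = np.histogram2d(xs, ys, bins=[x_bins, y_bins])
    return counts


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulate 5,000 attendees crowding toward the stage near x = 0.
    xs = np.abs(rng.normal(10, 8, size=5_000)).clip(0, 100)
    ys = rng.uniform(0, 60, size=5_000)
    heat = crowd_heatmap(xs, ys)
    print("densest cell holds", int(heat.max()), "people")
    print("cells over the limit:", np.argwhere(heat > 100).tolist())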

Recently, India hosted Coldplay, an immensely popular band that designs its concert experiences by blending music and visuals. They are known to use data science to improve fan engagement, AI to sync visuals with the music, and 3D mapping to help those visuals blend into the stage. They also fine-tune the sound so that transitions are crisp and every concert-goer can actually hear the music.

As a student of computer science, however, I have to think beyond what is already being implemented in the field. Standing in line, the ideas kept popping into my head. The first thing I noticed when I entered was the number of people rushing over to the merchandise section; how amazing would it be if fans could input concepts to generate their own personalized merchandise? Holographic duets could let artists collaborate with other legendary musicians, dead or alive. AI models could summarize key moments of a concert and send fans personalized recaps. If a fan wanted to isolate certain performers, say, hear only the drum solo in one song, that could be possible too, and AR glasses could be handed out so fans see custom effects or receive exclusive content like photographs and videos during certain parts of songs. Really, the sky is the limit.
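That drum-solo idea is closer to reality than it sounds. Here is a small Python sketch of the mixing half of it, under one big assumption: that a source-separation model has already split the live feed into per-instrument stems. The “stems” below are synthetic placeholder waveforms, not real audio.

```python
# A small sketch of the "hear only the drums" idea: given per-instrument
# stems (which a source-separation model would have to provide), remix the
# track with one stem soloed and the rest ducked.
import numpy as np

SR = 44_100  # sample rate


def remix(stems: dict[str, np.ndarray], solo: str | None = None) -> np.ndarray:
    """Sum the stems, keeping only `solo` at full volume and ducking the rest."""
    mix = np.zeros_like(next(iter(stems.values())))
    for name, audio in stems.items():
        gain = 1.0 if (solo is None or name == solo) else 0.05   # ~-26 dB duck
        mix += gain * audio
    return mix


if __name__ == "__main__":
    t = np.linspace(0, 2, 2 * SR)
    stems = {
        "vocals": 0.3 * np.sin(2 * np.pi * 440 * t),
        "guitar": 0.3 * np.sin(2 * np.pi * 330 * t),
        "drums": 0.3 * np.sign(np.sin(2 * np.pi * 2 * t)),   # crude beat pulse
    }
    drum_solo = remix(stems, solo="drums")
    print("peak level of drum-solo mix:", float(np.abs(drum_solo).max()))
```

The hard part, of course, is the separation itself and doing it with low enough latency for a live show; the personal remix on top of it is almost trivial.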

Critically, we are at a junction in the entertainment industry where technology and art are becoming integrated with each other, and when we talk about AI taking over the world and all the jobs with it, art and music are, in my opinion, something it could never colonize. The finer touches of the more humane expression of our emotions through these mediums are something a few shocks of electricity through a circuit could never understand. Technology, however, is forever a tool that can help define a new generation of concerts and artists. ‘AI for Everyone’ is a series I would like to pursue on this blog, because it means that technology is for everyone and everything: not a big bad wolf ready to blow the house down, but help in building a new foundation for the future.



PS: LOVED YOU IN BANGALORE, ED!
