In this article, Belgian data scientist Aubin Huet explains how he created an adaptive soundtrack for a video game using Cyanite’s data. The overarching question is nothing less than: how do you find the right music for the right moment?
Boxing music into neat little pre-made packages belongs to another era, as does Monet’s type of impressionism. What we are evolving toward is beauty like the apple tree I can see from my window. No two days does it look the same; sometimes the wind shakes the branches, and sometimes the sun falls through the leaves. It’s a brave new world, where beauty is a passing thing, and capturing it means allowing it to never be the same.
It is no coincidence that omniPhony uses video games as its context. For many years, games have been at the forefront of what is known as adaptive music: music specifically composed and arranged so that it can easily react to the player’s actions. OmniPhony is different because it is not limited to a predefined score; its framework can take any collection of music and craft a fitting soundtrack out of it. In this article, I’ll explain how omniPhony uses music data from Cyanite’s API to add music to the awesome video game Elite: Dangerous.
For more on how AI can be used in the music industry and beyond, see the article: The 4 Applications of AI in the Music Industry.
Space odyssey docking in Elite: Dangerous
- For the character of a piece, we use Cyanite’s mood keywords. This powerful feature returns a float value for tags such as Epic, Sad, Calm, or Romantic. Each of these keywords is mapped to the game states the computer vision module can detect. When a state such as conflict is detected, the engine loads a set of weights for the keywords: low for Calm, high for Epic. Multiplying each weight by its keyword value and summing the results yields a score per track, and filtering for the highest scores gives us music that matches the scene.
- This gives us a good playlist for each scene, but we must also be ready for the scene to change from one state to the next. That means we need to be able to mix two pieces of music at any given point in the playback. For this, we can use other Cyanite data points such as valence, arousal, instrumentation, and genre, and again assign weights to each of them.
Because these data points are associated with timestamps we can look for a close match at the exact moment the game state changes, and transition into any other piece at any point in the music.
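The weighted scoring described above can be sketched in a few lines of Python. The track names, mood values, and weight tables below are purely illustrative stand-ins, not actual Cyanite API output or omniPhony’s real configuration:

```python
# Hypothetical mood values per track, in the style of a mood-tagging API
# (floats between 0 and 1; names and numbers are made up for illustration).
tracks = {
    "track_a": {"epic": 0.9, "calm": 0.1, "sad": 0.2, "romantic": 0.1},
    "track_b": {"epic": 0.2, "calm": 0.8, "sad": 0.3, "romantic": 0.6},
    "track_c": {"epic": 0.7, "calm": 0.2, "sad": 0.6, "romantic": 0.2},
}

# Per-game-state keyword weights: "conflict" favours Epic, "docking" favours Calm.
state_weights = {
    "conflict": {"epic": 1.0, "calm": 0.1, "sad": 0.2, "romantic": 0.2},
    "docking":  {"epic": 0.2, "calm": 1.0, "sad": 0.3, "romantic": 0.6},
}

def rank_tracks(state: str) -> list[str]:
    """Score each track as the weighted sum of its mood values and
    return the track names sorted from best to worst match."""
    weights = state_weights[state]
    scores = {
        name: sum(weights[mood] * value for mood, value in moods.items())
        for name, moods in tracks.items()
    }
    return sorted(scores, key=scores.get, reverse=True)

print(rank_tracks("conflict"))
print(rank_tracks("docking"))
```

With these toy numbers, the epic-leaning track tops the conflict ranking while the calm track tops the docking ranking; filtering the top of such a list yields the scene playlist.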
Searching across all of these data points for every track at every timestamp quickly becomes expensive, so the omniPhony engine uses feature reduction with autoencoder networks to optimize performance. An autoencoder is a type of neural network that takes an input, compresses it down to a very small representation called the bottleneck, and then reconstructs it. It is trained by calculating the loss between the input and the reconstruction. In doing this, you force the model to learn its own rules for compressing the data.
When the model is fully trained, you can use the latent-space variables generated in the bottleneck as new features to base the search on. Using this technique, we reduced the number of features by 70% in our case.
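To make the idea concrete, here is a minimal NumPy sketch of the technique: a simplified single-layer linear autoencoder trained on toy data, whose bottleneck activations then serve as compressed features for a nearest-neighbour search. The data, dimensions, and training setup are illustrative assumptions, not omniPhony’s actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for per-segment music features: 200 segments x 10 features,
# generated with an underlying 3-dimensional structure plus noise.
X = rng.normal(size=(200, 3)) @ rng.normal(size=(3, 10))
X += 0.1 * rng.normal(size=(200, 10))

n_in, n_bottleneck = X.shape[1], 3        # 10 -> 3 is a 70% reduction
W_enc = 0.1 * rng.normal(size=(n_in, n_bottleneck))
W_dec = 0.1 * rng.normal(size=(n_bottleneck, n_in))

lr, losses = 0.01, []
for _ in range(500):
    Z = X @ W_enc                          # bottleneck (latent) representation
    X_hat = Z @ W_dec                      # reconstruction of the input
    err = X_hat - X
    losses.append(float((err ** 2).mean()))
    # Gradient steps on the reconstruction loss (up to a constant factor)
    grad_dec = (Z.T @ err) / len(X)
    grad_enc = (X.T @ (err @ W_dec.T)) / len(X)
    W_dec -= lr * grad_dec
    W_enc -= lr * grad_enc

latent = X @ W_enc                         # compressed features for the search

# Transition-search sketch: find the latent segment closest to a query vector
query = latent[0]
dists = np.linalg.norm(latent - query, axis=1)
nearest_other = int(np.argsort(dists)[1])  # index 0 is the query itself
print(latent.shape, round(losses[0], 3), round(losses[-1], 3))
```

A real system would use a deeper, nonlinear network and real audio features, but the principle is the same: the reconstruction loss falls as training progresses, and the much smaller latent vectors make the moment-to-moment similarity search cheap.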
But don’t take my word for it, here it is in action:
Watch the omniPhony engine in action on YouTube
The omniPhony project was started in 2019 by Aubin Huet.
I want to use the Cyanite API for my project – how can I get started?
If you want to get a first feel for Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches without any coding needed.