At Cyanite we empower music companies to take their catalogs and music search to the next level with the help of AI. At the same time, our API can be used by anyone to develop their own breakthrough products. This new article on the blog presents one of the creative uses of the Cyanite API in game development.

In the article, Belgian data scientist Aubin Huet explains how he created an adaptive soundtrack for a video game using Cyanite’s data. The overarching question is nothing less than: how do you find the right music for the right time?

Originally written by Aubin Huet
The omniPhony Project
The omniPhony engine builds the soundtrack to your life, and it never does the same thing twice. It takes in what happens in the world around you and translates it into music by weaving together different pieces.

The boxing of music into neat little pre-made packages is from another era, as is Monet’s type of impressionism. What we are evolving toward is beauty like the apple tree I can see from my window. No two days does it look the same: sometimes the wind shakes the branches, and sometimes the sun falls through the leaves. It’s a brave new world, where beauty is a passing thing, and capturing it means allowing it never to be the same.

It is no coincidence that omniPhony uses video games as its context. For many years, games have been at the forefront of what is known as adaptive music: in short, music specifically composed and arranged so that it can easily interact with the player’s actions. omniPhony is different because its framework is not limited to a predefined score; it can take any collection of music and craft a fitting soundtrack out of it. In this article, I’ll explain how omniPhony uses music data from Cyanite’s API to add music to the awesome video game Elite: Dangerous.

For more ways AI can be used in the music industry and beyond, check out the article: The 4 Applications of AI in the Music Industry.

Space odyssey docking in Elite: Dangerous

In the game, the player travels a vast open universe and is free to do as they please. omniPhony uses computer vision to track the player’s actions and detect what they’re doing. Examples of these game states include being engaged in conflict, docking with a space station, traveling through the universe in supercruise, or racing through the canyons of a planet at breakneck speed.

Elite: Dangerous docking

The Score
Much like in a movie, where a film composer looks for synergy between a scene and its score, the engine looks for suitable music in a collection of data points extracted by Cyanite’s analysis tools. These are divided into two categories: those that describe the character of a piece and those that can be used to describe similarity.

  • For the character of the piece, we use Cyanite’s mood keywords, a very powerful feature that gives a float value for tags like Epic, Sad, Calm, or Romantic. Each of these keywords is mapped to the game states the computer vision module can detect: if a state of conflict is detected, the engine loads a set of weights for the keywords, low for Calm and high for Epic. Multiplying these weights with the tag values and summing the results scores each track, and filtering for the highest scores yields music that matches the scene (see the sketch after this list).
  • This gives us a good playlist for each scene, but we should be ready for the scene to change from one state to the next, which means we must be able to mix two pieces of music at any given point in the playback. For this, we can use other Cyanite data points such as valence, arousal, instrumentation, and genre, and again assign a weight to each.
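To make the weighted-keyword scoring concrete, here is a minimal sketch in Python. The state names, weights, and mood values are illustrative placeholders, not Cyanite’s actual API schema:

```python
# Hypothetical per-game-state weights for each mood keyword.
STATE_WEIGHTS = {
    "conflict": {"epic": 1.0, "aggressive": 0.8, "calm": 0.05, "romantic": 0.0},
    "docking":  {"epic": 0.2, "aggressive": 0.0, "calm": 0.9,  "romantic": 0.4},
}

def score_track(mood_values, state):
    """Weighted sum of a track's mood values for the current game state."""
    weights = STATE_WEIGHTS[state]
    return sum(weights.get(tag, 0.0) * value for tag, value in mood_values.items())

def playlist_for_state(library, state, top_n=10):
    """Rank the library by score and keep the best matches for the scene."""
    ranked = sorted(library, key=lambda title: score_track(library[title], state),
                    reverse=True)
    return ranked[:top_n]

# A tiny illustrative library of analyzed tracks (mood floats per track).
library = {
    "track_a": {"epic": 0.9, "aggressive": 0.7, "calm": 0.1, "romantic": 0.0},
    "track_b": {"epic": 0.1, "aggressive": 0.0, "calm": 0.95, "romantic": 0.3},
}
print(playlist_for_state(library, "conflict"))  # ['track_a', 'track_b']
```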

Because these data points come with timestamps, we can look for a close match at the exact moment the game state changes and transition into any other piece at any point in the music.

The whole library, with every piece analyzed by its global character.

Mixing the music based on local similarities while switching between different global characters.
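A rough sketch of that transition search: given the local feature vector of the currently playing piece at the moment the state changes, find the timestamp in a candidate piece whose features are closest. The feature set and weights here are assumptions for illustration, not the engine’s actual configuration:

```python
import numpy as np

# Illustrative per-feature weights: valence, arousal, instrumentation, genre.
FEATURE_WEIGHTS = np.array([1.0, 1.0, 0.5, 0.5])

def best_entry_point(current_features, candidate_timeline):
    """Return the index of the candidate timestamp whose local features are
    closest to the current playback moment (weighted Euclidean distance)."""
    diffs = (candidate_timeline - current_features) * FEATURE_WEIGHTS
    return int(np.argmin(np.linalg.norm(diffs, axis=1)))

# Placeholder data: a 4-minute candidate piece analyzed at 1-second resolution.
candidate = np.random.rand(240, 4)       # rows = timestamps, cols = features
now = np.array([0.3, 0.8, 0.6, 0.2])     # local features at the transition moment
print(best_entry_point(now, candidate))  # the second to cut into
```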

Optimization
Combining these two categories is no trivial task: you’ll be performing database lookups a few times per second, over all the timestamps of all the data points, for every piece of music you’re selecting from. On top of that, you’ll be running computer vision networks and audio playback modules at the same time.

For that reason, the omniPhony engine uses feature reduction with AutoEncoder networks to optimize performance. An AutoEncoder is a type of neural network that takes an input, compresses it down to a very small representation called the bottleneck, and then reconstructs it. It is trained by calculating the loss between the input and its reconstruction. In doing this, you force the model to learn its own rules for memorizing the data.
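As a minimal sketch, an AutoEncoder like the one described can be written in a few lines of PyTorch. The layer sizes here (64 input features, a 16-dimensional bottleneck) are assumptions for illustration, not omniPhony’s actual dimensions:

```python
import torch
from torch import nn

class AutoEncoder(nn.Module):
    def __init__(self, n_features=64, bottleneck=16):
        super().__init__()
        # Compress the input down to the bottleneck...
        self.encoder = nn.Sequential(
            nn.Linear(n_features, 32), nn.ReLU(),
            nn.Linear(32, bottleneck),
        )
        # ...then try to reconstruct the original input from it.
        self.decoder = nn.Sequential(
            nn.Linear(bottleneck, 32), nn.ReLU(),
            nn.Linear(32, n_features),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = AutoEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

def train_step(batch):
    """One training step: the loss is the gap between input and reconstruction."""
    optimizer.zero_grad()
    loss = loss_fn(model(batch), batch)
    loss.backward()
    optimizer.step()
    return loss.item()
```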

When the model is fully trained, you can use the latent-space variables generated at the bottleneck as new features to base the search on. In our case, this technique reduced the features by 70%.
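Continuing the sketch above, once training is done the encoder alone produces the latent features, and the per-frame search runs over those instead (again with placeholder data):

```python
# Placeholder data standing in for one analyzed piece and the playback moment.
all_timestamp_features = torch.rand(240, 64)
current_features = torch.rand(64)

with torch.no_grad():
    latent_index = model.encoder(all_timestamp_features)  # (240, 16)
    query = model.encoder(current_features.unsqueeze(0))  # (1, 16)

# Nearest-neighbour lookup now runs over 16 dimensions instead of 64.
best_match = torch.cdist(query, latent_index).argmin().item()
```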

But don’t take my word for it; here it is in action:

Watch the omniPhony engine in action on YouTube

Adaptive music is most closely associated with video games, but in the area of music recommendation, context-based recommendation algorithms take the same approach: finding music that fits the environment or a user activity such as walking or running. See the article How do AI Music Recommendation Systems Work, where Cyanite discusses this case in detail.
The Mission
When all is said and done, this project is about setting music free. While I’m writing this, more music is being created in recording studios and bedrooms than ever before, and because of that, its meaning is getting lost in sheer quantity. Whether it be for a video game or something else, omniPhony tries to find the right music at the right time. Giving music context gives it back its meaning.
About the author

The omniPhony project was started in 2019 by Aubin Huet.

Being both a musician and a data scientist, he combines cutting-edge technology with a deep understanding of music to build the future of streaming.

I want to use the Cyanite API for my project – how can I get started?

Please contact us with any questions about our Cyanite AI via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

If you want to get a first feel for Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches, no coding needed.

More Cyanite content on AI and music