How AI Empowers Employees in the Music Industry

According to a McKinsey report, 70% of companies are expected to adopt some kind of AI technology by 2030. The music industry is no exception. Yet when it comes to AI, skepticism often overshadows the potential: AI is cast as a job killer and a force that will outperform humans in creativity and value production, a view that fuels anxiety across all industries. This article adds another perspective on AI in music and shows how AI can help solve typical business problems such as employee motivation and making work more meaningful.

The music industry is special in its AI adoption journey, as it faces very visible problems that AI can solve. The number of tracks uploaded to the internet reaches 300,000 a day, and the number of artists on Spotify could hit 50 million by 2025. The output exceeds the human capacity to manage it, so sometimes there is no way around AI. Despite the benefits, the negative impact of AI remains a common subject in the media – see the article here.

Rise of Robots

Annie Spratt © Unsplash

For a more balanced view on the topic, let’s explore how AI can actually help humans working in the music industry when the goal is not to maximize profits and productivity but to empower the workforce.

How AI can improve the employee experience

Speed up music tagging and search 

In their work, music companies often have to deal with music search. The essential part of music search is a well-managed, consistently tagged music library. Such a library relies on a lot of metadata. Some of it consists of “hard” factors such as release date, recording artist, or title. But “soft” factors such as genre, mood, energy level, and other music-describing qualities are becoming more and more important.
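To make the hard/soft distinction concrete, here is a minimal sketch of such a track record. The field names and values are purely illustrative, not an actual Cyanite or industry metadata schema:

```python
from dataclasses import dataclass, field

@dataclass
class TrackMetadata:
    # "Hard" factors: objective facts about the recording
    title: str
    artist: str
    release_year: int
    # "Soft" factors: subjective, music-describing tags
    # that an auto-tagger could fill in
    genres: list = field(default_factory=list)
    moods: list = field(default_factory=list)
    energy_level: str = "medium"  # e.g. "low" / "medium" / "high"

track = TrackMetadata(
    title="Into the Sun",
    artist="Traumprinz",
    release_year=2015,
    genres=["house", "electronic dance"],
    moods=["uplifting"],
)
print(track.moods)  # → ['uplifting']
```

The hard fields come from the release itself; the soft fields are exactly the ones that are tedious and subjective to assign by hand.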

Obviously, assigning tags for those soft factors is a tedious and subjective task. AI helps by automatically detecting this data in a music track and categorizing the song. It can distinguish rock from pop, detect emotions such as “sad” or “uplifting”, tag the instruments in a track, and process thousands of songs in a very short time.

That said, music-tagging AI does not perform without errors: it is not perfect and still needs human supervision.

Automatic tagging increases work speed, giving companies the opportunity to quickly add songs to their catalog and pitch them to customers.

Clean up a huge amount of data 

As with AI, there is always a possibility of errors when people manage a catalog. Especially when different team members have managed it over the years, the number of errors can become overwhelming.

AI helps reduce human errors in existing catalogs and prevent them in the future. For existing catalogs, AI can analyze the data, discover oddly tagged songs, and eliminate discrepancies. AI will also make some mistakes, but it will make them in a consistent way. You can see how it works step by step in this case study here.
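One simple form such a discrepancy analysis can take is spotting the same tag spelled in different ways across the catalog – a toy sketch, not Cyanite’s actual pipeline:

```python
import re

def normalize(tag):
    """Collapse spelling variants: lowercase, strip punctuation and spaces."""
    return re.sub(r"[^a-z0-9]", "", tag.lower())

def find_tag_discrepancies(catalog):
    """Group raw tags that normalize to the same key, i.e. likely duplicates
    introduced by different people tagging the catalog over the years."""
    variants = {}
    for track in catalog:
        for tag in track["tags"]:
            variants.setdefault(normalize(tag), set()).add(tag)
    # Any normalized key with more than one raw spelling is a discrepancy.
    return {k: sorted(v) for k, v in variants.items() if len(v) > 1}

catalog = [
    {"title": "A", "tags": ["Hip-Hop", "chill"]},
    {"title": "B", "tags": ["hip hop", "Chill"]},
    {"title": "C", "tags": ["techno"]},
]
print(find_tag_discrepancies(catalog))
# → {'hiphop': ['Hip-Hop', 'hip hop'], 'chill': ['Chill', 'chill']}
```

A human (or an AI model with audio context) would then decide which spelling becomes canonical; the point is that the machine surfaces the inconsistencies consistently.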

Enrich tedious music discussions with AI-generated data 

AI is great at visualizing data and making complex information digestible – and that holds for music, too! In some visualizations, songs are grouped by emotion, giving you a very comprehensive view of the library. For single tracks, AI can analyze emotions across the whole track or within a custom segment. Another way to visualize a catalog is to group songs by similarity, so that the most similar artists are bundled together.

Ever tried to convince a data-driven CMO that this one song has a melancholic touch and doesn’t fit the campaign? Next time, back up your expertise with some data!

Instruments in Detail View
Instruments analysis in Cyanite app

In any case, AI can complement marketing and sales efforts by giving companies tools to visualize catalog and song data and then use this data to sell. On a song-by-song basis, visualization provides an easily understood snapshot of the song. More broadly, data visualization underlines a company’s innovative, data-centered positioning and adds a bit of spice to sales efforts.

At Cyanite, we even created several music analysis stories using visualization.

Reduce human bias and make data-based decisions 

AI in music can ensure that decisions are based on data, not emotions. For example, when choosing tracks for a brand video, it is important that the music adheres to the brand guidelines. Too often, though, tracks are chosen simply because someone liked them.

To avoid human bias, checking in with AI can be built into the business strategy for more consistent and better branding efforts. In the case of a branded video, AI will offer songs that correspond to the brand profile, whether it is “sexy”, “chill”, or “confident”.

All these capabilities of AI drastically improve the quality of the end result for customers and free employees from tedious, boring tasks.

How AI can boost motivation

Spend more time on creative and meaningful tasks 

One of AI’s benefits is that employees can focus on creative solutions rather than repetitive tasks. This frees up individual qualities such as empathy, communication, and problem-solving – qualities that are useful for customer acquisition and service, as studies confirm. If you work in sync licensing and are answering a sync briefing, finding right-fitting songs from the catalog with AI’s help leaves you more time to craft a creative story about why a song is a great fit beyond its pure sound. In the end, AI has the potential to raise the level of customer service while contributing to higher employee satisfaction.

Employee

LinkedIn Sales Solutions © Unsplash

Speed up learning and training new employees 

In one of our case studies, we’ve shown the process of cleaning a music library. Some assets created during the project can be used by the company to teach and train new employees. For example, a visualization of song categories can serve as a guide for new staff in charge of tagging new songs. See here for more details.

Starting to work with a catalog of 10,000 songs also represents a very high entry barrier – it usually takes months to understand a catalog in depth. With a Similarity Search, like the one from Cyanite or other services such as Musiio, AIMS, or MusiMap, a catalog search can start intuitively and easily with a reference track. It provides guidance and creates more opportunities for meaningful human work.

Overall, AI is characterized by ease of use: it is highly intuitive, doesn’t need much time to set up, and produces results in an instant. The better UX helps not only employees but also customers, if they have access to the catalog. To see for yourself, you can try the Cyanite web app here.

Ensure consistent and collaborative approach to work processes and policies

In general, AI follows one consistent tagging scheme and does so automatically, which means less human oversight is needed to keep things going. Clean metadata means that at any point in time the catalog can be repurposed, offered to a third party, or integrated elsewhere. And integration will only become more important in the future: could you directly serve a new music-tech startup that wants to offer your catalog on its licensing platform? How well are you equipped to seize such business opportunities?

When a catalog includes many different music libraries and a unified approach is needed, AI can scan the catalog for keywords that are equal in meaning and eliminate the redundancies. When a catalog is integrated into a larger audio library, AI can draw parallels between the two tagging systems and automatically retag every song in the style of the new catalog with little to no information loss.

In general, having clean metadata and the ability to repurpose catalogs allows music companies to experiment with their offers and be more agile and innovative.
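The retagging step can be pictured as a translation between two tag vocabularies. Here is a toy, rule-based sketch – the mappings below are invented for illustration, and real systems infer such correspondences from the audio itself rather than from a hand-written table:

```python
# Hypothetical mapping from a source catalog's tags to a target scheme.
TAG_MAP = {
    "edm": "electronic dance",
    "electronica": "electronic dance",
    "downtempo": "chill",
    "laid-back": "chill",
}

def retag(tags, tag_map):
    """Translate tags into the target scheme, dropping duplicates
    while keeping the order of first appearance."""
    seen, out = set(), []
    for tag in tags:
        new = tag_map.get(tag.lower(), tag.lower())
        if new not in seen:
            seen.add(new)
            out.append(new)
    return out

print(retag(["EDM", "Electronica", "downtempo"], TAG_MAP))
# → ['electronic dance', 'chill']
```

Note how two redundant source tags collapse into one target tag – that is the “eliminating redundancies” step described above, in miniature.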

Summary 

There are many benefits of AI for music companies, but there are also quite a few risks. When looking at AI in the music industry, it is important to understand that AI isn’t replacing jobs; it is a tool to work with that helps employees improve. Of course, AI tools differ. In the case of Cyanite, the AI handles boring, repetitive tasks such as music analysis, tagging, and search. At the same time, it gives people the opportunity to work on something more meaningful and inspiring.

However, the introduction of AI, not only in music but in any industry, has the potential to bear a variety of risks. That is why we advocate empowering human work with AI. It is important to stay critical, question new technology, and help its creators make the right decisions.

The Sound of Traumprinz – AI Music Analysis

In the world of electronic music, shifting between aliases allows artists to explore different sounds and demonstrate their versatility without transforming their existing musical identities too much. But is it really true that different aliases mean a different sound for every DJ? In this article, we analyze the music of one famous German DJ across all of their aliases to confirm or challenge that view.

Late last year, Dutch musician Afrojack, known for EDM, revived his house and techno alias Kapuchon, releasing a housey single ‘10 Years Later’. With the TESTPILOT alias, deadmau5 flaunts his techno chops. 

But, not all aliases signal a vastly different sound change. Some reveal more subtle transformations, which makes it a little more challenging for listeners, fans, and reviewers to articulate, but never any less pleasing to the ears.

In these cases, how can machine learning help to identify even the more granular differences in your music catalog? 

We put our Cyanite music intelligence tools to the test through tracking and analyzing the different aliases of quite possibly one of the most reclusive German producers in the underground electronic world, Traumprinz. 

The Banksy of the underground music community, his presence is marked by sporadic SoundCloud releases across various identities, no announcements of live gigs, and definitely no hints at his real name. Having produced under 7 different aliases throughout his career, the elusive producer’s musical output spans techno, ambient, house, and everything in between, yet somehow remains recognizably ‘Traumprinz-sounding’.

We analyzed EPs and albums from all 7 aliases, amounting to over 150 songs. Today, we share with you some interesting insights gleaned using our mood and genre algorithms.

Analyzing tracks from all his aliases, we obtained unique statistics on each track’s genre breakdown, emotion breakdown, BPM, and more. From there, we arrived at alias-level breakdowns and a combined, whole-of-Traumprinz profile of his ‘average’ sound across all aliases.
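The step from per-track statistics to an alias-level breakdown can be pictured as simple averaging. A sketch with invented numbers, not the actual Cyanite pipeline or the real analysis data:

```python
def alias_profile(tracks):
    """Average per-track genre percentages into one alias-level breakdown.
    `tracks` is a list of {genre: share} dicts from a per-track analysis."""
    totals = {}
    for breakdown in tracks:
        for genre, share in breakdown.items():
            totals[genre] = totals.get(genre, 0.0) + share
    return {g: round(s / len(tracks), 3) for g, s in totals.items()}

# Made-up per-track breakdowns, just to show the aggregation step:
dj_healer = [
    {"ambient": 0.8, "electronic dance": 0.2},
    {"ambient": 0.6, "electronic dance": 0.3, "classical": 0.1},
]
print(alias_profile(dj_healer))
# → {'ambient': 0.7, 'electronic dance': 0.25, 'classical': 0.05}
```

Averaging the alias-level profiles again (weighted by track count) would yield the whole-of-Traumprinz ‘average’ sound mentioned above.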

The emotions of Traumprinz’s many aliases

Mood Analysis Traumprinz

Sentiment analysis in the world of Music AI goes beyond positive and negative. Our Cyanite models detect 13 different facets of emotions in the tracks it analyzes. 

Overall, our analysis shows that Traumprinz’s sound veers toward contemplative, melancholy territory, being detected by our models as largely spherical, sad, and dark. Musical output from the DJ Healer era is detected as the saddest, calmest, most spherical, and most chilled-out of all 7. Occasional experimentation with the lighter side of things appears in his releases under the DJ Metatron and The Phantasy aliases.

If you are an artist, check out this article on how to build trust with the gatekeepers in the music industry.

The ‘benchmark’ Traumprinz sound

The Cyanite intelligence tools revealed that the Traumprinz sound can be summed up (if that were ever possible!) as largely electronic dance (techno- and house-oriented), with a significant touch of ambient and, in some parts, classical.

Traumprinz pie chart

And now, on to a deep-dive:

Genre Analysis Across Aliases Traumprinz

The most electronic dance era: Traumprinz

With more house-tinged tracks in releases such as Into the Sun, Mothercave, and Intrinity, and a more upbeat average BPM of 121, the Traumprinz era was detected as 82.6% electronic dance in genre makeup, soaring above the average of 59.5%.

The most ambient-sounding era: DJ Healer 

We found that Traumprinz’s songs were most ambient-sounding under the DJ Healer alias, with the solemn, sophisticated Nothing 2 Loose release moderately paced at an average BPM of 102. At 71.9% ambient, this was far above the average Traumprinz ambient level of 28.4%.

The most classical-sounding era: DJ Healer

At 8.0% classical, songs from the DJ Healer days were found to be the most classical, although the DJ Metatron alias, with releases like Loops of Infinity, was a very close runner-up at 7.8%.

Looking at the other aliases, ambient makes up slightly more than a quarter of DJ Metatron’s overall sound, while electronic dance dominates. Compared to DJ Metatron, Golden Baby has more of that electronic dance feel and less ambient. Musical output from The Phantasy era closely mirrors Golden Baby’s genre profile, with just a little more ambient.

Even dancier than Golden Baby is the Prince of Denmark sound. Finally, the closest runner-up to Traumprinz for the most electronic dance-sounding era is the Prime Minister of Doom alias.

Energy levels and emotional profiles

Apart from the data on genre and moods, our analysis uncovered the energy level and emotional profile of each song. Our general summary of the analysis can be described as follows: 

  • The emotional profile tends to be negative, with very few songs being neutral and even fewer tracks tagged as positive

  • The energy level, for the most part, alternates between low and medium. But when it comes to Prince of Denmark and The Phantasy, there are definitely more high-energy songs than under the other aliases. Are these two aliases a way for the producer to show a more energetic side?

Here is a detailed Excel sheet with each song’s data. Click on the links in the file to see the full albums. 

Our final thoughts: AI as a tool for discovering new ways for music curation

Through this music analysis experiment, we can once again see how music tagging and categorization software can complement human judgment and instinct in the appreciation of music. This data can also be used to select tracks for DJ sets so that they transition smoothly. For music companies, it demonstrates the inner workings of AI, which can be used to sort music in a catalog and make similar-song recommendations.

To end off, here’s a mix of Traumprinz for your listening pleasure.

How to Create Custom Audiences for Pre-Release Music Campaigns in Facebook, Instagram, and Google

As a music label, you know how hard it is to promote a new artist and cut through the noise. Most music advertising agencies and labels choose Facebook and Instagram marketing as an easy way to start. There are some important things to know about setting up such campaigns, but there are already great guides and tips on that here and here. What we want to cover in this article are the steps you can take to identify the audience for a completely new track.

An interest-based audience is a tool that lets you select customers based on their interests. You can target people who are fans of other artists or people who browse websites similar to your artist’s webpage. The interest-based audience feature significantly narrows your audience to the most relevant group, increasing your chances of reaching the right people.

If you are a big music label, you can also use custom audiences on Facebook. You might think custom audiences only apply to music that has already gained a following, but there is a workaround. All you need to do is find similar artists who fit your label’s roster and then launch a campaign for the new artist based on the same audience. That way you make the most of your advertising efforts.

With Spotify, it’s easy to find similar artists and their respective fan communities. But when a song is not yet released or you are breaking a new artist, Spotify’s algorithms won’t work. So how do you identify similar artists for an unreleased track? You can use Cyanite’s Similarity Search to solve that problem. It compares the sound of the song you want to promote with hundreds of thousands of other tracks and finds the ones that sound similar.

At this point, kudos to Maximilian Pooschke of Virgin Music Label & Artist Service for bringing this use case of Cyanite’s Similarity Search to our team’s attention.

“Cyanite’s Similarity Search is an intuitive tool when I’m creating custom audiences for new artists on social media. Especially for music that doesn’t easily fit into a box, the similarity search is a great entry point for campaign planning.”

Maximilian Pooschke

Virgin Music Label & Artist Service

Now, here is a step-by-step guide on how to create custom audiences using similar artists identified by Cyanite.

Step 1. Upload music to the library view and let Cyanite analyze it

First, register for free at https://app.cyanite.ai/register. Then drag and drop your music into the library view.

Library view
Picture 1. Cyanite library view

The library view will show some data about the song such as mood, genre, energy level, emotional profile, and more. You can explore this data or move to the next step.

Step 2. Find similar songs using Similarity Search

Click on Similarity next to an analyzed song in the library to find similar songs from our showcase database of around 600k popular songs. Our similarity algorithms focus purely on the actual sound and feel of a song to offer you the most relevant and precise results. Additionally, you will see the same analysis data for all the songs, including moods, energy level, and emotional profile.

Step 3. Play around with the different filters for more granular insights

Often the magic occurs when you apply different filters. Use the custom interval, play around with tempo, genre, and key, and dive deeper into different results. Then pick the artists and tracks you find most relevant from the Cyanite suggestions.

Step 4. Enrich your findings with additional data from sources like Chartmetric

To get more details on the similar songs and artists you discover, use other data sources to further narrow down your selection and be as precise as possible. You can check out festivals, radio stations, and/or magazines to enrich your search and select more source audiences for your campaign.

Custom interval
Picture 2. Cyanite Similarity Search based on custom interval

Step 5. Go and select your audiences

Off to Facebook or Instagram to create your audiences with the popular artists you have found and selected with the Similarity Search. Use interest-based targeting and enter a similar artist’s name as a keyword. Play around with keywords for maximum results: artists’ names, song names, genres, and more. A good comprehensive resource on how to use and manage Facebook, Instagram, and Google ads is AdEspresso.

Facebook Ad Settings
Picture 3. Facebook Ads Detailed Targeting

This is just one of many ways to use Cyanite for your purposes. You can check out this article to find out more on how to use Cyanite for playlist pitching or this one to find out how to use Cyanite to find music for your videos.

The 4 Applications of AI in the Music Industry

A couple of weeks ago, Cyanite co-founder Jakob gave a lecture in a music publishing class at Berlin’s BIMM Institute. The aim was to give concrete examples of AI’s real use cases in today’s music industry – to get away from the overload of buzzwords surrounding AI and shed more light on its actual applications and benefits.

This lecture was well received by the students, so we decided to publish its main points on the Cyanite blog. We hope you enjoy the read!

Introduction

Many people, when they hear about “AI and music”, think of robots creating and composing music. Understandably, this comes with a very fearful and critical perception of robots replacing human creators. But music created by algorithms represents merely a fraction of AI applications in the music industry.

AI Robot & Music
Picture 1. AI Robot Writing Its Own Music
This article is intended to explore:

1. Four different kinds of AI in music.

2. Practical applications of AI in the music industry. 

3. Problems that AI can solve for music companies.

4. Pros and cons of each AI application.

How does AI work? 

Before we dive into the four kinds of AI in the music industry, here are some basics of how AI works. These concepts are not only valuable to understand, they can also help you come up with new applications of AI in the future.

Just like humans, some AI methods like deep learning need data to learn from. In that regard, AI is like a child. Children absorb and learn to understand the world by trial and error. As a child, you point your finger at a cat and say “dog”. You then get corrected by your parents who say, “No, that’s a cat”. The brain stores all this information about the size, color, looks, and shape of the animal and identifies it as a cat from now on. 

AI is designed to follow the same learning principle. The difference is that AI is still not even close to the magical capacity of the human brain. A normal AI neural network has around 1,000 – 10,000 neurons in it, while the human brain contains 86 billion!

This means that AI can currently perform only a limited number of tasks and needs a lot of high-quality data to learn from.

One example of how data is used to train AI to detect objects in pictures is a process called reCAPTCHA. This is a system that asks you to select traffic lights in a picture to “prove you are human”.

The system collects highly valuable training data that teaches neural networks what traffic lights look like.

ai learning
Picture 2. AI Learning with reCAPTCHA
If you are interested in learning more about how this process works for detecting genres in music, check out this article.

The 4 types of AI in music

Now that you understand the basic AI concept, here is an overview of the four main applications of AI in the music industry. Keep in mind that there are many more possible applications.

1. AI Music Creation

2. Search & Recommendation

3. Auto-tagging

4. AI Mastering

Let’s have a closer look at what problems each area addresses, how the solutions work, and also explore their pros and cons!

Application 1: AI-Generated Music

Problem

The problems AI can solve in music creation are not very apparent at first; AI-generated music is primarily a creative and artistic field. But in a business context, existing problems emerge: when music needs to adapt to changing situations, for instance in video games or other interactive settings, AI-created music can adapt more natively to a changing environment.

Solution

AI can be trained to create custom music. For that, it needs input data and then it needs to be taught to make music – just like a human.

To understand current AI creation capabilities here are a couple of real-world examples:

Yamaha analyzed many hours of Glenn Gould’s performances to create an AI system that can potentially reproduce the famous pianist’s style and maybe even create an entirely new piece in the manner of Glenn Gould.

A team of Australian engineers won the AI “Eurovision Song Contest” by creating a song with samples of noises made by koalas and Tasmanian devils. The team trained a neural network on animal noises to produce an original sound and lyrics.

Who is AI-generated music for?

  • Game Studios
  • Art Galleries
  • Brands  
  • Commercials  
  • Films  
  • YouTubers  
  • Social Media Influencers

Implementation Examples

Pros of this solution

  • Cheap to produce new content
  • Customizable
  • Great potential for creative human & AI collaboration
  • Creative tools for artists.

Cons of this solution

  • The quality of fully synthesized AI music is still very low
  • No concrete application in the traditional music industry
  • Legal issues over the copyright including rights to folklore music 
  • Most AI creation models are trained on western music and can reproduce western sound only
  • Very high development cost.

Bottom line

It will take some time for AI-created music to sound adequate or have a clear use case. However, hybrid approaches that use AI to compose music from pre-recorded samples, loops, and one-shots show that the AI-generated future is not far away.

Application 2. Search & Recommendation

Problem

It can be hard to find the one song that fits a moment perfectly, whether it is a movie scene or a podcast. And the more music a catalog contains, the harder it is to search efficiently. With 500 million songs online and 300,000 new songs uploaded to the internet every day (!!), this can easily be called an inhuman task. Platforms like Spotify develop great recommendation algorithms for seamless and enjoyable listening experiences. In sync licensing, however, it gets a lot more difficult. Imagine a music publisher who administers around 50,000 copyrights: effectively, they can oversee maybe 10% of that catalog, leaving a lot of potential unused.

Solution

AI can be trained to detect sonic similarities between songs.
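Under the hood, similarity engines typically represent each track as a numeric embedding and rank candidates by distance to the reference track. A toy sketch with made-up 3-dimensional “sound” vectors – real systems use learned embeddings with hundreds of dimensions:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def most_similar(query, catalog):
    """Rank catalog tracks by cosine similarity to the query embedding."""
    return sorted(catalog, key=lambda t: cosine(query, t["vec"]), reverse=True)

# Invented embeddings for illustration only:
catalog = [
    {"title": "dark techno track", "vec": [0.9, 0.1, 0.0]},
    {"title": "ambient pad piece", "vec": [0.1, 0.9, 0.2]},
    {"title": "peak-time banger",  "vec": [0.8, 0.2, 0.1]},
]
query = [0.85, 0.15, 0.05]  # embedding of the reference track
for track in most_similar(query, catalog):
    print(track["title"])  # techno-like tracks rank first
```

The learning part consists of training a network so that tracks humans perceive as similar end up close together in this vector space; the search itself is then just a nearest-neighbor lookup.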

Who are Similarity Searches for?

  • Music publishers: using reference songs to search their catalog
  • Production music libraries and beat platforms
  • DSPs that don’t have their own AI team
  • Radio apps
  • More use cases in A&R (artist and repertoire), etc.
  • DJs needing to hold the energy high after a particularly well-received track (in the post-Covid world)
  • Basically, anyone who starts sentences like “That totally sounds like…”
  • Managers targeting look-alike audiences. 

Implementation Examples

Pros of this solution

  • Finding hidden gems in a catalog which goes far beyond the human capacity for search. Here both AI-tagging and AI search & recommendation are employed
  • Low entry barrier when working with big catalogs
  • Great and intuitive search experiences for non-professional music searchers.

Cons of this solution

  • Technical similarity vs. perceived similarity – there is still quite a lot of difference in how humans and AI function. Human perception is highly subjective and may assign higher or lower similarity to two songs than the AI does.

Bottom line

All positive. Everyone should use Similarity Search algorithms every day.

Application 3. Auto-tagging

Problem

To find and recommend music, you need a well-categorized library that can deliver tracks exactly matching a search request. The artist and song name are “descriptive metadata”, while genre, mood, energy, tempo, voice, and language are “discovery metadata”. More on this topic here. The problem is that tagging music manually is one of the most tedious and subjective tasks in the music industry: you have to listen to a song and then decide which mood it evokes in you. Doing that for one song might be OK, but forget about it at scale. At the same time, tagging requires extreme accuracy and precision. Inconsistent and wrong manual tagging leads to a poor search experience, which results in music that can’t be found and monetized. Imagine tagging the 300,000 new songs uploaded to the internet every day.

Solution

Tagging music is a task that can be done with the help of AI. Just like in the example in the first part of this article, where an algorithm detects traffic lights, neural networks can be trained to learn how, for example, rock music differs from pop or rap music.
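Once such a network is trained, the tagging step itself is straightforward: the model emits a confidence score per label, and labels above a threshold become the track’s tags. A sketch in which the scores are invented stand-ins for real model output:

```python
def tags_from_scores(scores, threshold=0.5):
    """Turn per-label confidence scores from a (hypothetical) tagging model
    into a tag list, highest-confidence first."""
    picked = [(label, p) for label, p in scores.items() if p >= threshold]
    return [label for label, p in sorted(picked, key=lambda lp: -lp[1])]

# Pretend output of a trained classifier for one track:
scores = {"rock": 0.03, "pop": 0.12, "house": 0.91, "uplifting": 0.66, "sad": 0.08}
print(tags_from_scores(scores))
# → ['house', 'uplifting']
```

Raising or lowering the threshold trades precision against recall, which is one place human supervision of an auto-tagger still matters.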

Here is a Peggy Gou song, analyzed and tagged by Cyanite:

AI-tagged song
Who is AI-tagging for? 

Every music company knows the pain of manual tagging. If you work in music, chances are high that you have had or will have to tag songs. If you pitch a song on Spotify for Artists, you have to tag it. If you have ever made a playlist, you most probably had to deal with its categorization and tagging. If you’re an A&R presenting a new artist to your team and you say something like, “This is my rap artist’s new party song,” you literally just tagged a song. In all these cases it is good to have an objective AI companion to tag the song for you.

AI-tagging is a really powerful tool at scale. You just bought a new catalog with tons of untagged songs and want to use it for sync: AI-tagging is the way to go. You’re a distributor tired of clients uploading unfinished or false metadata: AI-tagging can help. You’re a production music library that picked up tons of legacy data from years of manual tagging: the answer is also AI-tagging.

Implementation Example

In the BPM Supreme library, you can see the different moods, energy levels, voice presence, and energy dynamics neatly tagged by an AI.

BPM Supreme Interface
Picture 3. BPM Supreme Cyanite Search Interface
Pros of this solution

  • Speed 
  • Consistency across catalog
  • Objectivity / reproducibility
  • Flexibility: whenever something changes in the music industry, you can re-tag songs with new metadata at lightning speed.

Cons of this solution

  • Development cost and time (luckily, Cyanite has a ready-to-go solution)
  • High energy consumption of deep learning models – though still less resource-heavy than manual tagging.

Bottom line

Tagging AI cannot replace human work completely. But it is a powerful, practical tool that dramatically reduces the need for manual tagging and can increase the searchability of a music catalog with little to no effort.

Application 4. AI Mastering

Problem

Mastering your own music can be very expensive, especially for DIY and bedroom producers, who often resort to technology to create new music. But to distribute music to Spotify or similar platforms, it needs to meet certain sound-quality criteria.

Solution

AI can be used to turn a mediocre-sounding music file into a great-sounding one. For that, AI is trained on popular mastering techniques and on what humans have learned to recognize as good sound.

Who is AI mastering for?

  • DIY and bedroom producers
  • Professional musicians
  • Digital distributors 

Implementation Example

One company leading the field of AI mastering is LANDR. The Canada-based company has a huge community of creators and has already mastered 19 million songs. Other players include eMastered and Moises.

LANDR AI Mastering
Picture 4. LANDR AI Mastering
Pros of this solution

  • Very affordable ($48/year for unlimited mastering of LO-MP3 files plus $4.99/track for other formats, vs. professional mastering starting at $30/song)
  • Fast
  • Easy for non-professionals. 

Cons of this solution

  • A standardized process that doesn’t allow room for experiments and surprises
  • Some say AI mastering is “lower quality compared to human mastering”.

Bottom line

AI mastering is an affordable tool for musicians on low budgets. For up-and-coming artists, it's a great way to get professionally mastered music out to DSPs. For professional songwriters, it's a convenient means to make demos sound reasonably good. Professional mastering engineers usually serve a different target group, so the two fields complement each other rather than AI taking over human jobs.

Summary

To sum it up, we presented four concrete use cases for AI that cover almost every part of the value chain in the music industry. Still, their practical maturity and quality differ. AI is far from having the complex thinking and creativity of a professional music tagger, mastering engineer, or musician. But it can already help creatives do their work, or even completely take over some of the expensive and tedious tasks.

One of the biggest obstacles to embracing new technology is misplaced expectations. There are two common extremes: on one side, people overestimate AI and expect more than it can currently deliver, e.g. tagging a million songs without a single mistake or always being spot-on with music recommendations. On the other side, people fear AI will take over their jobs.

The answer may lie somewhere in between. We can embrace technology while remaining critical and not blindly relying on algorithms, as there are still many facets of the human brain that AI cannot imitate.

We hope you enjoyed this read and learned more about the four use cases of AI in music. If you have any feedback, questions, or contributions, you are more than welcome to reach out to jakob@cyanite.ai. You can also contact our content manager Rano if you are interested in collaborations.

PR: BPM Supreme integrates Cyanite’s algorithms


PRESS RELEASE

 

BPM Supreme is the first digital record pool worldwide to integrate Cyanite’s artificial intelligence for individual music recommendation

 The digital record pool BPM Supreme from San Diego will soon use the algorithms of the Berlin and Mannheim-based technology company Cyanite. The AI enables BPM Supreme to suggest music according to moods and to provide users with individualized music suggestions. BPM Supreme is one of the world’s first online music services for DJs that integrates algorithms to enhance the user experience.

Mannheim/Berlin/San Diego, December 1st, 2020 – BPM Supreme is one of the world’s leading digital record pools. For a monthly fee, DJs get unlimited access to the entire catalog featuring thousands of new releases and exclusive remixes. The BPM Supreme brand also includes a new online sample library for producers and music makers, BPM Create, as well as a record pool specialized in Latino music, BPM Latino. BPM Latino will also integrate Cyanite’s algorithms.

Supported by Cyanite’s Deep Learning technology, the music search on BPM Supreme will be made even more intuitive, e.g. by introducing moods as search categories. BPM Supreme users will be able to find suitable music for their DJ sets and playlists easily and optimize them with the help of intelligent recommendations.

With the cooperation with BPM Supreme, Cyanite has taken on its first major customer in the United States. In addition, SWR, Mediengruppe RTL, NEUBAU Music, and Meisel Music, as well as production music providers Soundtaxi, filmmusic.io, and RipCue Music, already use Cyanite’s technology.

Jakob Höflich, founder and co-director of Cyanite: “BPM Supreme embodies the ability to break through the industry and quickly adapt to the constantly evolving market. They have proven many times that they have the spirit to pioneer the industry through new business models and technologies. We are very proud that they have chosen Cyanite as their AI partner to go this crucial step into the future with.”

Angel “AROCK” Castillo, Founder and CEO of BPM Supreme, said: “Together with Cyanite, we will enter the next phase of BPM Supreme towards an AI-driven future and enable our users to find music even better with state-of-the-art discovery functions.”

Anyone wishing to try Cyanite’s technology can register for the free Web App and upload music or integrate Cyanite into an existing database system via an API.

Try Cyanite for free: https://app.cyanite.ai/login

Full press release and material here: https://drive.google.com/drive/folders/1X9Ug29ISA-QdOBOHUMQFLOaqelL9HrXl?usp=sharing

 

BPM Supreme’s music search interface

Background to BPM Supreme:
BPM Supreme is a leading digital music service for professional DJs delivering an extensive selection of new releases and exclusive tracks through a user-friendly platform and mobile app. With an innovative approach to music discovery, the company’s mission is to be the most trusted source for DJ-ready content. BPM Supreme names many notable DJs as users of the platform, such as DJ Jazzy Jeff, Z-Trip, A-Trak, The Chainsmokers, and DJ Snoopadelic. Over the past ten years, BPM Supreme has partnered with some of the music industry’s most prominent companies, including Sony Music Entertainment, Universal Music Group, Empire Records, Dim Mak Records, Mad Decent, Roland, Pioneer DJ, Denon DJ, and Serato.
Learn more at
www.bpmsupreme.com

 

Background to Cyanite:
Cyanite believes that state-of-the-art technology should not be exclusive to big tech companies. The start-up is one of Europe’s leading independent innovators in the field of music AI and supports some of the most renowned and innovative players in the music and audio industry. Customers and music companies using Cyanite include the Mediengruppe RTL, the radio station SWR, the music publishers NEUBAU Music and Meisel Music, and the production music libraries Soundtaxi, RipCue Music, and filmmusic.io. Cyanite’s mission is to help music companies make the transition to the age of AI without spending expensive resources on tech innovation. The 13-person team from Mannheim and Berlin operates at the interface between the music industry, data science, and software engineering. The founding team emerged from the Popakademie Baden-Württemberg – Germany’s top university for the music business. They are complemented by a team of data scientists from one of the world’s most renowned chairs for Music Information Retrieval at the Technical University of Berlin. The company, known from the magazines Musikwoche and Music Ally among others, has received numerous awards from TechCrunch, Google, the German Government, Business Punk, Music WorX, and German Accelerator, and is currently a participant in the Accelerator Marathon LABS of the music company Marathon Artists in London. Cyanite is supported by the city of Mannheim and various business angels, such as Germany’s Business Angel of the Year 2019, Dr. Andrea Kranzer.

Press contact
Jakob Höflich
Co-Founder
+49 172 447 0771
Jakob (@) cyanite.ai 

Headquarters Mannheim
elceedee UG (haftungsbeschränkt)

Badenweiler Str. 4
68239 Mannheim

Office Berlin
Cyanite
Gneisenaustraße 44/45
10961 Berlin

Website: https://cyanite.ai/
LinkedIn: Cyanite.ai 
Twitter: Cyanite.ai