
The Power of Automatic Music Tagging with AI

In the ever-evolving landscape of the music industry, staying ahead of the curve is essential. As professionals in the music field, you understand the importance of organization and accessibility when it comes to your vast catalog of tracks. That’s where automatic music tagging, empowered by AI, comes into play.

In this article, we delve into the fascinating world of AI tagging, explaining what it is and how it’s changing the game.

What is Automatic Music Tagging?

At Cyanite, we’re at the forefront of this transformative technology. Automatic music tagging is a process in which AI algorithms analyze audio tracks and assign relevant metadata to them automatically. This metadata includes information like genre, mood, tempo, instruments used, and more. Imagine not having to manually tag and categorize each track in your library—auto-tagging streamlines this labor-intensive task with remarkable precision.

How Does AI Make It Possible?

The magic behind automatic music tagging lies in AI’s ability to decipher audio content. Advanced machine learning models are trained on vast datasets of music, allowing them to recognize patterns, structures, and features in audio. These models can identify subtle nuances that human ears might miss, resulting in highly accurate and consistent tagging.
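As a toy illustration of the final step of such a system (not Cyanite's actual model), a trained network typically outputs one confidence score per tag, and tags above a threshold are assigned to the track. The scores below are invented for illustration; a real system derives them from the audio signal itself:

```python
def assign_tags(tag_scores, threshold=0.5):
    """Keep every tag whose model confidence meets the threshold."""
    return sorted(tag for tag, score in tag_scores.items() if score >= threshold)

# Hypothetical model output for one track:
scores = {"rock": 0.91, "energetic": 0.78, "guitar": 0.66, "sad": 0.08}
print(assign_tags(scores))  # ['energetic', 'guitar', 'rock']
```

The thresholding step is what turns fuzzy model confidences into the clean, human-readable tags that end up in a catalog.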

Benefits of Auto-Tagging for Professionals

As a professional in the music industry, you’ll find numerous advantages in incorporating automatic music tagging into your workflow:

  • Efficiency: Save valuable time and resources by automating the tagging process. This means more time for creativity and less time on administrative tasks.
  • Consistency: AI ensures that tags are applied consistently, reducing errors and maintaining a standardized catalog.
  • Discoverability: Precise tagging enhances the discoverability of your music. Whether you’re curating playlists or licensing tracks, accurate metadata is crucial.
  • Content Monetization: Easily identify tracks that fit specific licensing opportunities, increasing your revenue potential.

Cyanite’s Automatic Music Tagging Solution

At Cyanite, we’re dedicated to pushing the boundaries of what’s possible in the music industry. Our automatic music tagging solution is built on state-of-the-art AI technology that has been meticulously trained on a vast and diverse music dataset. This enables us to provide industry-leading accuracy in music metadata tagging.

We offer a user-friendly API that seamlessly integrates with your existing systems, allowing you to harness the power of AI tagging without disruption. You can also easily receive Cyanite’s tags as a CSV file or spreadsheet. Our solution covers a wide range of tags, including genre, mood, instrumentation, and more, ensuring that your music library is enriched with valuable metadata.
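As a rough sketch of what such a CSV export can look like (the column names and rows here are invented for illustration, not Cyanite's actual schema), tag data serializes naturally with Python's standard `csv` module:

```python
import csv
import io

# Hypothetical tag rows; in practice these would come from the tagging API.
rows = [
    {"title": "Track A", "genre": "rock", "mood": "energetic", "bpm": 128},
    {"title": "Track B", "genre": "ambient", "mood": "calm", "bpm": 70},
]

buffer = io.StringIO()  # stands in for a file on disk
writer = csv.DictWriter(buffer, fieldnames=["title", "genre", "mood", "bpm"])
writer.writeheader()
writer.writerows(rows)
print(buffer.getvalue())
```

A flat file like this is easy to import into any spreadsheet tool or DAM system downstream.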

To try out Cyanite’s tagging yourself, you can register for the web app via the link below, which allows you to analyze 5 songs for free each month. Please note that the web app doesn’t contain the full Cyanite tag set but serves as a showcase. Reach out to us to get a full tagging sample.

If you are an artist, be sure to explore the 4 best ways to use Cyanite for artists, producers, and DJs.

Why Choose Cyanite?

  1. Unparalleled Accuracy: Our AI models are fine-tuned to deliver the most accurate music tagging results.
  2. Customization: Tailor our solution to your specific needs. Choose the tags and metadata that matter most to you.
  3. Scalability: Whether you have a small library or a vast collection, our solution scales effortlessly to meet your demands.
  4. Comprehensive Support: Cyanite is not just a technology provider; we’re your partners in music innovation. Our dedicated team is ready to assist you every step of the way.

Conclusion

Automatic music tagging with AI is revolutionizing the music industry by simplifying the management of vast music libraries. As a professional in the field, embracing this technology can significantly boost your efficiency, discoverability, and revenue potential. At Cyanite, we’re committed to delivering cutting-edge solutions that empower music professionals to excel in a fast-paced and dynamic industry. Explore the world of automatic music tagging and unlock the full potential of your music catalog with Cyanite.

Discover how Cyanite can transform your music management. Contact us today to learn more about our services and take the first step toward a more streamlined and efficient music workflow.

Key Take-Aways From MUSEXPO 2023 In Los Angeles – Part 2

Written by our CMO Jakob Höflich

This is the second part of my take-aways from Musexpo 2023. If you missed the first part, you can read it here.

Besides a noisy market and the importance of back catalog, these are the further topics that stuck with me as I travelled back to Germany.

AI & Data

Of course, as an AI representative I would have loved to see more AI players on stage, such as Beatoven.ai. On the other hand, it was refreshing that this hype topic was not as front and center as it is at many other conferences; instead, actual applications and use cases came up while discussing real-world challenges. Nevertheless, it became clear that the current AI discussion is dominated by AI-generated music. Some industry representatives voiced the fear that it will take away creativity and replace it. But then there was also the beautiful quote that in music, “only hearts will touch hearts” – unfortunately I forgot who said it, but I think it is very true. Still, the entire Castaway in Burbank, where the conference was held, held its breath when Dennis Hausammann, CEO of iGroove, put it out upfront: “Guys, AI is here and it’s here to stay. It will change the industry and you can either embrace it or decide not to. But let’s face it, it is here to stay and it is happening right now.” As you can imagine, I loved that.

What I also experienced in my conversations is that the value and benefit of AI for tagging and searching music, such as we do at Cyanite.ai, is not yet fully leveraged by music publishers. So even though this technology already proves hands-on benefits such as saving money on tagging and licensing more music by leveraging the depths of a catalog with AI, everything is still young. I feel we are really at the beginning of a new wave of tech-driven publishers, supervisors and sync teams who are super data and music-savvy and leverage the huge opportunity of data and play it back to their artists and teams meaningfully.

Internationalization

I really loved the panel “Market Discovery India“. We deal with quite a lot of requests from India and I can really see this market blowing up. What was fascinating to hear is that 5-10 years ago, around 90% of the popular music in India came from movie soundtracks. There was no separate film and music industry, it was one big industry with no separation of video and audio. Today, that number has dropped to 30-50%, which is still very high compared to other markets, but also shows that a new Indian music industry is on the rise.

But it’s not only about India. One panelist spoke of an exceptionally famous artist from South Africa who is not represented on a single streaming service. There are new, emerging markets that not only have the opportunity to transform the global music industry, but also to redefine streaming payout models as they are currently applied in the Western world.

Another point dropped here was the importance of subtitles. With good subtitles, regional music is no longer limited to its countries of origin: Chilean kids can enjoy K-Pop and Japanese teenagers can dig underground Macedonian rap.

Bottom line was that we will see a shift from a US- and UK-dominated music industry to something more international. I find this truly fascinating, as it also opens the Western-dominated music industry model to new influences from new cultures, which bring different business ethics, new ideas, and simply more diversity to this fascinating industry.

MARKET FOCUS INDIA AT MUSEXPO 2023

Music For Mental Health

A little bit more niche but by no means less fascinating was the Alchemic Sonic Environment experience created by Satya Hinduja and her team. In a multi-sensory listening experience, they presented an intimate, spatial audio installation that demonstrated the potential of music for mental health. Personally, I am deeply convinced that music makes our inner walls permeable and better connects us to our true desires and needs, which is why it was so great to see and, more importantly, experience this outstanding work. They also easily won the award for the most beautiful setting and booth.

The most interesting question to me is whether and how an industry that is primarily focused on entertainment is also able to tap into the healing aspects of music. A good example of this might be Endel, which offers soundscapes for all kinds of scenarios from studying to sleeping, and also collaborates with artists like Grimes or James Blake to offer “functional” musical experiences designed by actual artists. I believe something very big is starting there that also contains lots of potential for new and innovative revenue streams for artists and their work.

BEAUTIFUL SETTING OF ALCHEMIC SONIC ENVIRONMENT

Conclusion

Honestly, I would have liked to have one or two more days at Musexpo to further connect with people and possibly have some hands-on workshops that could be initiated and led by delegates, working together on some of the topics discussed in the panels (as it’s done at Future Music Camp for example). It was an intimate setting that made it easy to share openly and meet people in person that you usually only see on screen. Although the focus is very much on A&Ring, I felt there was almost a 360-degree view of the music industry’s most pressing challenges, and I’m sure everyone enjoyed getting out of the usual bubble and enjoying other perspectives as much as I did.

It became so clear to me at the conference that the biggest challenge in the music industry right now is not that AI will replace artists; it is discovering the great music, the hidden gems, the outstanding artists that are out there, and finding ways to connect those artists with audiences that resonate with their music. At the end of each day, every single job of every single person attending the event goes back to human creativity and the artists who write and produce music. We need technology to help us navigate the masses; we need an open dialog between the old and new music industry; and we need events like Musexpo to bring all of this together.

Key Take-Aways From MUSEXPO 2023 In Los Angeles – Part 1

Written by our CMO Jakob Höflich

I just came back to Berlin after visiting this year’s Musexpo on behalf of Cyanite. We had originally planned to attend in 2020, but Covid shut the event down.

It was a four-day event packed with panels featuring some of the industry’s leading figures, such as Adam Taylor (President, APM Music), Evan Bogart (Founder & CEO, Seeker Music) and Kristin Graziani (President, Stem Disintermedia Inc.), as well as evening showcase performances at the iconic S.I.R. Studios Hollywood by an international group of artists such as Caity Baser from the UK and Holly Riva from Australia. My first eye-opener came when the German band KAMRAD played their hit song “Believe” on the first night of the showcase: a song I definitely knew from the radio, streamed over 70 million times on Spotify, yet still completely unknown in the American market. It made me realize again how isolated Western music markets can still be.


The panels were mainly about “traditional” craft in the areas of sync, publishing, artist promotion and distribution. In addition, many artists had the opportunity to meet supervisors and A&Rs. Technology topics such as AI, NFTs and the Metaverse were not represented in the panel topics. However, on the panels themselves and in the audience Q&As, AI came up again and again. Of course, coming from an AI company, I would have liked to see a bit more tech talk, but on the other hand, it was interesting to approach these topics from the “inside out”.

One thing that’s always refreshing to see is that everyone puts on their trousers one leg at a time. The challenges of mass content production, an extremely decentralized media and distribution landscape, and the future of creativity in the age of AI were topics to which no one had a perfect answer or a concrete solution. The challenges are obvious, and it became very clear at a conference like this that these challenges can only become solutions that benefit all players equally if they are worked on together and a dialog is cultivated between the music industry, artists and technology providers – as Cherie Hu recommends as well in this article.

Besides meeting really inspiring and genuine people in person, such as a leading NASA researcher turned music composer, here are the main take-aways that I brought back to Germany and that were interesting to see addressed.

Before I start, a huge thanks to Sat Bisla and his team who put together a fabulous event and provided a setting in which new and old relationships can evolve, nurture, and deepen.

Without further ado, here are my personal key take-aways – of course, there was much more and I won’t be able to cover the whole scope of the conference.

It’s noisy and crowded

The conference started off by taking a look at the industry’s most pressing problems and opportunities. It directly became clear that the biggest challenge for all players involved is the mass of content and the numerous outlets for it. It was said that “it is freedom and chaos at the moment”. It’s extremely hard to cut through the noise, and in contrast to the times when there was MTV and your local record shop to distribute music, it is an extremely individualized, case-by-case decision which target groups to focus on, where to reach them, and what kind of content to produce for them.

All of this also brings lots of new challenges for artists, who were often called “brands” at the conference. Artist development lies more and more in the hands of the artists themselves (and their teams), as the big players in particular focus on placing bets on the single hits that often dominate today’s streaming landscape. However, it is said that fans engage with artists, not with songs, and that is where true fandom is created.

Lots of question marks in this space of freedom and chaos revolve around TikTok and Co. and how those platforms will be able to set up fair royalty payouts. And as the industry shifts toward poorly paid licensing models such as TikTok’s, artist teams need to find new revenue streams.

The importance of back catalog & sync

There were a couple of really amazing panels around sync, publishing, and music supervision. The Hello Group’s President Phil Quartararo said in the opening panel: “People have unlearned to work their back catalog” and have forgotten how to maximize the use of it. And he subtly but directly addressed the majors with this statement. Apparently, the majors are so focused on breaking new artists and “going where the money is” that they forget about all the brilliant music that’s in their back catalogs. According to him, the industry should pay more attention to the dusty corners of the catalogs where the real gems can be very well hidden.

What also became clear is that, despite the fact that access to music has become so easy, access to the influential people who recommend your music to the music directors at Netflix et al. or at the most influential radio stations creates a very tough bottleneck to pass through. Both radio stations and music supervisors have their so-called “trusted sources” who not only provide them with music that could work amazingly well in sync, but who they also know will make sure that the music is easy to clear.

One thing that I found mind-blowing is that supervisors apparently often prefer to take older music, where the rights don’t have to be cleared with 15 co-writers but maybe just 2 or so. Contemporary music takes more time to clear across the breadth of songwriters involved. This is another motivation for all songwriters out there to pay meticulous attention to clean and neat metadata!

Last but not least, commercial music produced only to get attention in sync is not really favored by supervisors. Yes, it can be a great fit sound-wise, but the initial motivation might reveal a lack of authenticity. And authenticity is what supervisors are looking for when they connect music with movie productions and especially with brands. Here, again, people engage with the people behind the songs, not only the songs themselves.

More insights tomorrow in Part 2 on AI, data and the internationalization of the music industry.

How to Create Mood- and Contextual Playlists With Dynamic Keyword Search

In the last article on the blog, we covered how Cyanite’s Similarity Search can be used in music catalogs. In this article, we explore another way to search for songs using Dynamic Keyword Search and how to leverage it to create mood- and contextual-based playlists. 

Rather than relying on a reference track, Dynamic Keyword Search allows you to select and combine keywords from a list of 1,500 and adjust each keyword’s impact on the search. This is especially helpful for creating playlists where songs match in mood, activity, or other characteristics.

But before we explain how this feature works, let’s explore how playlists are created. What makes a perfect playlist? Why are playlists so essential when utilizing a music catalog? And how can the Dynamic Keyword Search help with that?

How are playlists created?

There are three techniques for playlist creation:

  1. Manual creation (individually picking songs)
  2. Automatic generation and recommendation
  3. Assisted playlist creation

Historically, manual creation is the oldest and most basic approach: picking songs for playlists one by one. It might be the simplest technique, but the amount of time and effort that goes into it can be overwhelming. Imagine working with a catalog of 100,000 audio files and having to create an “Energetic Workout” and a “Beach Party” playlist.

Automatic generation uses various algorithms to create playlists with no human intervention. One of the most famous ones is, for example, “Discover Weekly” by Spotify. 

Assisted playlist creation uses music technology to guide and support manual playlist creation. 

In the research by Dias, Goncalves, and Fonseca, manual playlist creation was found to be most effective in terms of control, engagement, and trustworthiness. In other words, people trust handmade playlists. Manual creation also provides the greatest control over the outcome and engages editors in the creation process.

Automatic creation was found to be the most effective in adapting to the listeners’ needs. There is no manual control involved, so automatic tools can adapt and change playlists in no time. 

Assisted techniques were found to be most effective in terms of engagement and trustworthiness while also being quick to use. They also performed well on the song selection criteria. Song selection was identified as the most critical factor in the playlist creation process according to this study. However, while song selection is considered very important, the question of what makes a song right for a particular playlist remains open. Apart from that, assisted techniques proved to be optimal in control and serendipity, and they can also adapt to listening preferences rather easily.

To anticipate the conclusion: Dynamic Keyword Search is exactly such an assisted technique for playlist creation.

Why are search tools for playlist creation important in a catalog?

Playlists have been known to be the ultimate tool for promoting music. We already covered the ways artists can get on Spotify and other people’s playlists in other articles on the blog. But creating playlists can also be beneficial for catalog owners and catalog users, be it professional musicians or labels. Here is why: 

  • You can unlock new, passive ways to exploit and monetize your catalog. If you make it easier for your users and/or customers to explore your catalog, you directly increase its value.
  • Playlists are used as a promotional tool to showcase the works of an artist or the inspirations behind the artist. This article recommends creating two playlists: a vibe playlist and a catalog playlist for brand engagement and streams.
  • Playlists help organize music by theme or context.
  • With playlist creation features, users save time finding the right songs.
  • Playlists can be indexed separately in search results, which helps music get discovered.

So playlist creation tools in a catalog are pretty important. Similarity Search is one of these tools. Another one, which we focus on in this article is Dynamic Keyword Search.

How does Dynamic Keyword Search Work?

Cyanite’s Dynamic Keyword Search allows you to search for tracks based on multiple keywords simultaneously, where each keyword can be weighted for its impact on the search. This leads to more relevant search results with less time and effort spent searching.

Usually, the keywords you choose represent your idea of what you’re searching for, but they alone don’t give you full control over the search. With Dynamic Keyword Search, you can increase the precision of the search results by adjusting the impact of each keyword, so you can express exactly what you’re looking for. There are 1,500 keywords to choose from, representing characteristics of a song such as mood, genre, situation, brand values, and style. Each keyword’s impact on the search can then be adjusted on a scale from -1 to 1, from actively excluding a keyword to giving it heavy impact.

Cyanite Dynamic Keyword Search interface

What playlist features can be improved with Dynamic Keyword Search?

Not all playlists are created equal. Some are better than others. This study outlines 5 characteristics of playlists that can indicate a good or bad playlist. The authors of the study assumed that user-generated playlists could be an indicator for the algorithms to create good playlists. Here are the 5 playlist characteristics they outlined: 

  • Popularity – most user-generated playlists feature popular tracks first. This is not a hard rule, but grabbing the attention of listeners from the start is important.
  • Freshness – playlists should contain recently released tracks. Most playlists in the study contain tracks released on average in the last 5 years.
  • Homogeneity and diversity –  playlists on average cover a very limited number of genres so playlists should be rather homogenous. However, diversity plays a significant part in listeners’ satisfaction so it should be incorporated into the playlist as well.
  • Musical Features – in terms of energy, playlists with a narrow energy spectrum with a low average energy level are preferred, but there can be some high-energy tracks in the list. 
  • Transition and Coherence – the similarity between the tracks defines the smoothness in transition and coherence of the playlist. Usually, user-generated playlists have a better similarity in the first half and a lesser similarity in the second half. 

As the study deals with a variety of user-generated playlists, it can’t be said that all of them were equally good. But the criteria outlined above can help improve playlists by understanding a playlist’s character. With Dynamic Keyword Search, you can control criteria such as homogeneity and diversity, musical features such as energy level, and the similarity between tracks to ensure transition and coherence.

PRO TIP: To improve a playlist’s transition and coherence you can combine the Dynamic Keyword Search with our Similarity Search to further filter music on Camelot Wheel. The Camelot Wheel indicates which songs transition harmonically well giving you an extremely powerful tool to perfect the song order. You can find a deeper explanation of that in this article.
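The Camelot Wheel rule of thumb can be sketched in a few lines: two keys mix harmonically if they share the same code, sit one step apart on the wheel (same letter, number ±1, wrapping 12 back to 1), or share a number with the opposite letter (relative major/minor). This is a minimal illustration of the rule, not Cyanite's filter implementation:

```python
def camelot_compatible(a, b):
    """Check whether two Camelot codes (e.g. '8A', '12B') mix harmonically."""
    num_a, let_a = int(a[:-1]), a[-1]
    num_b, let_b = int(b[:-1]), b[-1]
    if let_a == let_b:
        # Same slot, or one step either way around the 12-position wheel.
        return (num_a - num_b) % 12 in (0, 1, 11)
    # Different letter: compatible only as relative major/minor.
    return num_a == num_b

print(camelot_compatible("8A", "9A"))   # True: one step around the wheel
print(camelot_compatible("12B", "1B"))  # True: the wheel wraps around
print(camelot_compatible("8A", "8B"))   # True: relative major/minor
print(camelot_compatible("8A", "3B"))   # False
```

Applied to a similarity-ranked shortlist, a check like this is enough to order songs so that consecutive tracks stay in compatible keys.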

Creating Playlists with Dynamic Keyword Search – Step-by-step

Here is how to access Dynamic Keyword Search in the Cyanite app. This feature is also available through our API.

  1. Go to Search in the menu and select the Keyword Search tab. Choose whether to display results from the Library or Spotify. 
  2. Select keywords from the Augmented Keywords set. For example, these are some of the keywords in the list: joy, travel, summer, motivating, pleasant, happy, energetic, electro, bliss, gladness, auspicious, pleasure, forceful, determined, confident, positive, optimistic, agile, animated, journey, party, driving, kicking, impelling, upbeat. We recommend selecting up to 7 keywords out of 1,500. 
  3. Adjust the weights for each keyword from -1 to 1 to define their impact on the search. For example, let’s set the search input as sparkling: 0.5, sad: -1, rock: 1, dreamy: 1 
  4. Scroll down for search results. The search results will return tracks from the library that are dreamy, slightly sparkling, and not at all sad. They will also all be rock songs.

Dynamic Keyword Search can be requested from our support team.

Conclusion

There are various ways to create playlists, from manual creation to automatic and assisted techniques. An assisted approach that combines automatic and manual creation has proved to be the most effective for playlist creation. It meets almost all of an editor’s needs, such as providing control over the process, maintaining a high level of engagement and trustworthiness, and offering a good selection of songs. However, the automatic approach is developing fast, and algorithms might replace human work entirely in the future. 

Our Dynamic Keyword Search feature can help you create playlists as one of these assisted techniques. It provides search results that take into account the search intent in terms of keywords and the impact of those keywords on the search. This doesn’t mean that Dynamic Keyword Search replaces manual work completely, but it can help artists, labels, and catalog owners do the creative work and engage fans and listeners with the support of the right tools, saving time, money, and effort. This is what we’re striving to achieve here at Cyanite: to help you fully unlock your catalog’s potential.

Let us know if this article has been helpful and stay tuned for more on the Cyanite blog! 

I want to try Dynamic Keyword Search – how can I get started?

Please contact us with any questions about our Cyanite AI via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

If you want to get the first grip on Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches without any coding needed.

Best of Music Similarity Search: Find Similar Songs With the Help of AI

In the past, music search was limited to basics like artist name and song title. Today’s vast and diverse music landscape calls for better ways of discovering music. Studies show that people connect with music for emotional and social reasons, making style, mood, genre, and similarity crucial to music discovery.

In this article, we explore how AI-driven music similarity search works and practical ways to find songs that sound alike using Cyanite’s Similarity Search tool.

 

How does Cyanite’s Music Similarity Search Work?

Our AI-powered music Similarity Search uses a reference track to pull a list of matching songs from a library. First, the AI analyzes the entire catalog, comparing the audio features of each song to enable accurate similarity searches. You can also filter results, for example, by BPM or genre to refine your search.

These algorithms compute the distance between songs based on their audio features. The smaller the distance, the more similar the tracks are. As music libraries expand, Similarity Search makes finding music easier and more efficient. Unlike platforms like Spotify that recommend songs based on user behavior, Cyanite focuses purely on sound, making our matches more accurate.
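The distance idea above can be sketched in a few lines. Each song is reduced to a feature vector (the 3-number vectors here are invented for illustration; real systems use high-dimensional embeddings), and the closest vectors are the most similar songs:

```python
import math

# Hypothetical audio feature vectors for a tiny catalog.
catalog = {
    "Song A": [0.90, 0.10, 0.50],
    "Song B": [0.88, 0.12, 0.52],  # almost identical features to Song A
    "Song C": [0.10, 0.90, 0.20],  # very different features
}

def euclidean(u, v):
    """Straight-line distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def most_similar(reference, k=2):
    """Return the k catalog songs closest to the reference track."""
    ref = catalog[reference]
    others = [name for name in catalog if name != reference]
    return sorted(others, key=lambda name: euclidean(catalog[name], ref))[:k]

print(most_similar("Song A"))  # ['Song B', 'Song C'] -- B is far closer
```

The same ranking principle scales to catalogs of any size; only the vector dimensionality and the indexing strategy change.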

Find Similar Songs by Audio – 9 Best Practices

Here we outline 9 best ways to use music Similarity Search in a music catalog.

1. Finding similar songs using audio references for sync and music briefs

Music supervisors often work under tight deadlines. Our research with the University of Utrecht shows that 75% of music searches are done in a rush. Using a reference track within music Similarity Search can speed up this process and boost the chances of licensing tracks that otherwise get overlooked. Unlike Spotify’s “Similar Artist” feature, Cyanite analyzes sound characteristics, making it perfect for precise sync projects.

With the help of Cyanite’s AI tags and the outstanding search results, we were able to find forgotten gems and give them a new life in movie productions. Without Cyanite, this might never have happened.

Miriam Rech

Sync Manager, Meisel Music

Photo at Unsplash @dillonjshook

 

2. Finding duplicates

Music libraries often have duplicates, which can clutter your catalog. Similarity Search easily identifies and removes these duplicates, saving time and effort.
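Duplicate detection follows directly from the distance idea: pairs of tracks whose feature vectors are nearly identical are likely the same audio. A minimal sketch with invented vectors, assuming features have already been extracted:

```python
import itertools
import math

# Hypothetical feature vectors; the first two files are the same audio.
catalog = {
    "take_01.wav":      [0.30, 0.70, 0.10],
    "take_01_copy.wav": [0.30, 0.70, 0.10],
    "other_song.wav":   [0.90, 0.20, 0.40],
}

def find_duplicates(tracks, tol=1e-6):
    """Flag every pair of tracks whose feature distance is near zero."""
    dupes = []
    for (n1, v1), (n2, v2) in itertools.combinations(tracks.items(), 2):
        if math.dist(v1, v2) < tol:
            dupes.append((n1, n2))
    return dupes

print(find_duplicates(catalog))  # [('take_01.wav', 'take_01_copy.wav')]
```

Loosening the tolerance would also catch near-duplicates such as re-encodes or slightly trimmed versions of the same recording.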

3. Social media campaigns

Want to promote a new artist? Use Similarity Search to find songs by popular artists with similar sounds. This data helps target fans on platforms like Facebook, Instagram, and Google, increasing campaign effectiveness.

Read more about this use case in our article on Custom Audiences for Pre-Release Music Campaigns.

Photo at Unsplash @William White

 
 

4. Determining type beats

Beat producers often create “type beats” to mimic the style of popular artists. With Similarity Search, they can compare their beats to the intended style and refine them. Catalog users can also find unique, niche matches to avoid oversaturation.

5. Playlist pitching

Use music Similarity Search to target your pitches to Spotify editors and playlist curators. Ingest full playlists and find the closest match for a more personalized approach. Providing references, like “Fans of Max Richter and Dustin O’Halloran,” makes your pitch stronger and more relatable.

Learn more in our article on Playlist pitching with Cyanite.

👉 Ready to try it out? Register for our free web app and start using Similarity Search here.

6. Playlist optimization

Similarity Search helps generate playlists automatically based on a reference track, inspiring playlist curators to create cohesive collections for study sessions or specific moods.

7. DJ mixing and DJ crates optimization

DJs can use Similarity Search to find tracks that match in key and vibe, creating smoother transitions. The Camelot Wheel filter ensures harmonic mixing for an optimal DJ set.

Discover more in our article on Optimizing Playlists and DJ Sets.

 

Cyanite’s music Similarity Search interface

8. Uncovering Catalog Blind Spots 

Older or niche songs often get lost in catalogs. Similarity Search reveals hidden gems, expanding your options and keeping users engaged with more variety.

9. Finding Samples

Instead of wasting hours searching for samples, Similarity Search pulls up similar sounds instantly. Refine results by key or BPM to quickly build your ideal sample stack.

Why use music Similarity Search in a Catalog?

Similarity Search doesn’t just find similar tracks. It helps clean up your catalog, surface hidden songs, and optimize playlist curation. It’s also invaluable for strategic playlist pitching and social media targeting. As the music industry evolves, tools like these will be essential for staying competitive.

Cyanite provides Similarity Search via an API or web app. Our tool uses audio and metadata to deliver results, reducing search time by up to 86% and simplifying tedious tasks. Check out our Cinephonix integration video for a real-world example.
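For developers, a similarity lookup over the API boils down to an authenticated HTTP request. The sketch below assumes a GraphQL endpoint; the URL, field names, and token handling are placeholders, not Cyanite's documented schema, so consult the official API documentation before building on it:

```python
import json
import urllib.request

# Hypothetical sketch of calling a GraphQL similarity endpoint.
# Endpoint, query fields, and auth scheme are assumptions for illustration.

API_URL = "https://api.cyanite.ai/graphql"  # assumed endpoint
TOKEN = "YOUR_ACCESS_TOKEN"

query = """
query SimilarTracks($trackId: ID!) {
  similarTracks(trackId: $trackId) {   # hypothetical field names
    title
    score
  }
}
"""

payload = json.dumps({"query": query, "variables": {"trackId": "123"}}).encode()
req = urllib.request.Request(
    API_URL,
    data=payload,
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {TOKEN}",
    },
)
# response = urllib.request.urlopen(req)  # would return JSON with the matches
```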

FAQs

Q: How accurate is Cyanite’s Similarity Search compared to Spotify’s recommendations?
A: Unlike Spotify, which relies on user behavior, Cyanite focuses on the actual sound. This makes our matches more sonically accurate for use cases where the song’s tonality is crucial.

Q: Can I use Similarity Search without coding skills?
A: Yes! Our free web app lets you analyze music and run similarity searches without any coding knowledge.

Q: How does Similarity Search help in marketing campaigns?
A: By finding songs with similar sounds to popular artists, you can target fans of those artists on social media, making your campaigns more effective.

Q: Can DJs benefit from Similarity Search?
A: Absolutely. DJs can use it to find tracks that blend well for seamless transitions and harmonic mixing.

Q: How can I try Similarity Search for free?
A: Simply register for our free web app here to start using Similarity Search today!

From Data to Decision – How to Use Music Data and Analytics for Intelligent Decision Making


We continue writing about the Data Pyramid, and in this article we conclude the series with an overview of the fourth level of the pyramid – Intelligence: the supreme discipline of data utilization and, when done right, a path to success.

Other articles in the series include: 

How to Turn Music Data into Actionable Insights: This is an overview of the Data Pyramid and how it can be used in the music industry. 

An Overview of Data in The Music Industry: This article gives a list of all types of metadata in the music industry.

Making Sense of Music Data – Data Visualizations: This article explores data visualizations as the second step of the pyramid and gives examples of visualizations in the music industry. 

Benchmarking in the Music Industry – Knowledge Layer of the Data Pyramid: This article deals with Knowledge and how it is used to benchmark performance and set expectations.

Data Pyramid and the Intelligence Layer
The Intelligence layer of the pyramid deals with the future and answers the questions “So what?” and “Now what?”. By the time this level is reached, company stakeholders usually have an organized, structured dataset as well as information about the past outcomes of their decisions. They also need access to real-time data so they can learn and adjust on the fly. Having all this information at hand enables them to anticipate the outcomes of future decisions and choose the most suitable course of action.

Intelligence can be described as the ability to choose one decision out of millions of possible decisions, based on knowledge of how each of them might affect the outcome.

Intelligence can be generated by a machine: a self-driving car, for example, is a form of intelligence that scans the environment and predicts the course of action for the next section of the road. In the music industry, intelligent decisions are still, for the most part, made by humans – by examining information, reading graphs and charts, memorizing past outcomes, and monitoring real-time data. In this article, we explore some of the emerging intelligence technology in the music field, so keep reading to find out more.

Prescriptive, not Predictive Analytics
Intelligence in data science is produced by prescriptive analytics: the process of using data to determine the best possible course of action. Prescriptive analytics often employs machine learning algorithms to analyze data and consider all the “if” and “else” scenarios. Multiple datasets over different periods of time can be combined to account for various scenarios and model complex situations.

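In miniature, the prescriptive step is just “score every candidate action with a predictive model, then pick the best one.” The following toy sketch (invented numbers, not industry code) makes that concrete:

```python
# Toy sketch of prescriptive analytics: a predictive model scores each
# candidate action, and the prescription is the action with the best outcome.

def predicted_streams(action: str) -> float:
    """Stand-in for a trained ML model; the forecasts here are made up."""
    forecasts = {
        "release_single_friday": 120_000,
        "release_single_monday": 80_000,
        "hold_for_album": 95_000,
    }
    return forecasts[action]

actions = ["release_single_friday", "release_single_monday", "hold_for_album"]
best = max(actions, key=predicted_streams)
print(best)  # release_single_friday
```

Real prescriptive systems differ mainly in scale: the model is learned from data, and the action space can be enormous, but the argmax structure is the same.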
Intelligence Layer – Examples in the Music Industry

1. Recommendation systems that learn and adapt effectively to individual users’ preferences

Recommendation systems already use some sort of prescriptive analytics when they make a selection of songs based on past user behavior. Recommendation systems can also take into account the sequence of songs and context that affect the enjoyment level of the playlist as a whole. As previously played songs influence the perception of the next song, the playlist can be adjusted accordingly. The ability to prescribe a listening experience by recommendation systems is, perhaps, the most common and well-developed example of intelligence in the music industry.
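A sequence-aware recommender can be caricatured as a greedy ordering that always picks the candidate closest in feel to the song just played. The sketch below uses a single made-up "energy" feature and is not any platform's actual algorithm:

```python
# Toy sketch: order a pool of songs so that each transition is as smooth
# as possible, measured here by a single "energy" feature.

def sequence_playlist(seed, pool):
    playlist, current = [seed], seed
    remaining = list(pool)
    while remaining:
        # pick the song whose energy is closest to the one just played
        nxt = min(remaining, key=lambda s: abs(s["energy"] - current["energy"]))
        remaining.remove(nxt)
        playlist.append(nxt)
        current = nxt
    return [s["title"] for s in playlist]

seed = {"title": "Opener", "energy": 0.4}
pool = [
    {"title": "Banger", "energy": 0.9},
    {"title": "Mellow", "energy": 0.35},
    {"title": "Builder", "energy": 0.6},
]
print(sequence_playlist(seed, pool))  # ['Opener', 'Mellow', 'Builder', 'Banger']
```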

Additionally, recommendation systems can prescribe music that directly affects user behavior. This project, for example, uses data from running exercises, predicts the future running performance, and recommends songs that maximize running results. It does so continuously, as the system stores and learns from each updated running exercise record.

To learn more about different types of recommendation systems, check out the article How Do AI Music Recommendation Systems Work. 


2. Automatic playlist generation based on context

Generating music, or suggesting existing music, based on context is the music industry's analog of a self-driving car: the music adapts to the listening situation to amplify the current experience. Take video games, for instance, where the music adjusts to the plot as the player progresses through the levels. More on that in our article on the Omniphony engine, which explores adaptive soundtracks and musical context in game development.

Such systems also power location-aware music recommendations for travel destinations (where music is chosen based on the sight you are visiting) or computer-vision systems for museum experiences (where the artwork dictates the audio choice). In these cases, the constantly changing environment serves as the basis for recommendations.

Another example of intelligence in this field is generating music in the metaverse, a virtual environment that includes augmented reality and can be accessed through VR headsets such as the Oculus or through a smartphone. Virtual streams and concerts are already being held in the metaverse, so it is only a matter of time until curated immersive experiences that adjust to the audience's needs are delivered using intelligence.

3. Prescriptive curatorship – What’s going to be hot next? 

Prescriptive curatorship entails an understanding of how up-and-coming artists and tracks will perform and who is more likely to break in the near future. In the past, platforms like Hype Machine indexed music sites and helped find the best new music. 

Nowadays, there are systems that can predict future hits and breaking artists automatically. For example, Spotify is developing algorithms that can predict future-breaking artists. The algorithm takes into account the preferences of the early adopters and then determines whether the artist can be considered breaking. This data can then be used to sign deals with the artist at a very early stage.


4. Tracking changes in music preference distribution – making music that hits current or even future preferences

Unlike prescriptive curatorship, which relies on a group of experts, music preference distribution data shows artists how their chosen genres and formats fit audience demographics, and how their music could be adapted to current or future preferences. The general consensus in the music industry is that music preference algorithms should come after the music is produced; otherwise, all music would end up sounding the same in an attempt to mimic popular artists.

There is not yet a system that would automatically recommend changing the content of a song based on what users prefer. Nevertheless, attempts are still being made to use these numbers to create songs people will like.

5. Royalty Advances

Royalty advances are a complex undertaking that requires comprehensive tracking of music consumption across all platforms. Distributors such as Amuse and iGroove offer royalty-advance services that predict upcoming payout amounts so that artists can invest in their music long before the actual royalties kick in. These systems analyze streaming data to calculate upcoming earnings.
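At its simplest, such a payout forecast is an extrapolation of recent streaming revenue. The sketch below is deliberately naive (real services model catalogs, platforms, and trends far more carefully), with made-up numbers:

```python
# Naive sketch of a royalty-advance estimate: average the last months'
# streaming earnings and offer a fraction of the projected total as an advance.

def estimate_advance(monthly_earnings, months_ahead, payout_ratio=0.8):
    """Project average monthly earnings forward and apply a payout ratio."""
    avg = sum(monthly_earnings) / len(monthly_earnings)
    return round(avg * months_ahead * payout_ratio, 2)

# Last three months earned 410, 450, and 490; project six months ahead.
print(estimate_advance([410.0, 450.0, 490.0], months_ahead=6))  # 2160.0
```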

Recently, the topic has received even more attention through the NFT hype, as crypto-investors want to predict future royalty payouts and the value of their assets.

Future platforms will most likely be able to prescribe a course of action, such as which distribution platform to focus on, based on predicted royalty amounts.

Conclusion
True intelligence in music is still hard to come by. Most of the technology described in this article falls in the space between Knowledge engines, which make predictions, and Intelligence machines, which prescribe the most appropriate course of action out of millions of possible actions.

The main concern in the industry is how far one can go with technological intelligence, considering that music is a creative activity in which the human element still largely prevails. An intelligence machine that can tell which music to produce, based on a prediction of future user preferences, generally prompts an adverse reaction in the industry.

Nevertheless, intelligent decisions – such as adjusting the content of songs or signing future-breaking artists identified by AI – can already be made by artists and labels based on the available data.

At Cyanite, we provide an API for accessing data and for developing any kind of intelligence engine. As always, at each level of the pyramid the quality of the data plays a vital role. Cyanite generates data about each music track, such as BPM, dominant key, predominant voice gender, voice presence profile, genre, mood, energy level, emotional profile, energy dynamics, emotional dynamics, instruments, and more.

Cyanite Library view

Each parameter is provided with its respective weight across the duration of the track. Based on these audio parameters, the system determines the similarity between items and lists similar songs for a given reference track. These capabilities can be used to develop intelligent products and tools, as well as to make intelligent, data-driven decisions within the company.
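Under the hood, similarity between tracks can be pictured as a distance between feature vectors. A common textbook choice is cosine similarity; the sketch below is purely illustrative and not Cyanite's actual model:

```python
import math

# Illustrative sketch: represent each track as a vector of audio features
# (e.g. energy, valence, danceability) and compare tracks with cosine similarity.

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

reference = [0.8, 0.3, 0.7]  # hypothetical feature vector of the seed track
candidate = [0.7, 0.4, 0.6]  # hypothetical feature vector of a catalog track
print(round(cosine_similarity(reference, candidate), 3))  # 0.991
```

Ranking a whole catalog then amounts to computing this score against the reference track for every item and sorting in descending order.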

I want to analyze my music data with Cyanite – how can I get started?

Please contact us with any questions about our Cyanite AI via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

If you want to get the first grip on Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches without any coding needed.