AI Music Recommendation Fairness: Gender Balance

Eylül

Data Scientist at Cyanite

Part 2 of 2. For a more general overview of AI music recommendation fairness – specifically the topic of gender bias – click here to check out part 1.

Diving Deeper: The Statistics of Fair Music Discovery

While the first part of this article gave a general overview of gender fairness in music recommendation systems, this section delves into the statistical methods and models that we employ at Cyanite to evaluate and ensure AI music recommendation fairness, particularly in gender representation. This section assumes familiarity with concepts like logistic regression, propensity scores, and algorithmic bias, so let’s dive right into the technical details.

Evaluating Fairness Using Propensity Score Estimation

To ensure our music discovery algorithms offer fair representation across different genders, we employ propensity score estimation. This technique allows us to estimate the likelihood (or propensity) that a given track will have certain attributes, such as the genre, instrumentation, or presence of male or female vocals. Essentially, we want to understand how different features of a song may bias the recommendation system and adjust for that bias accordingly to enhance AI music recommendation fairness.

Baseline Model Performance

Before diving into our improved music discovery algorithms, it’s essential to establish a baseline for comparison. We created a basic logistic regression model that utilizes only genre and instrumentation to predict the probability of a track featuring female vocals. 

A model is considered well-calibrated when its predicted probabilities (represented by the blue line) closely align with the actual outcomes (depicted by the purple dashed line in the graph below). 

Calibration plot comparing the predicted probability to the true probability in a logistic regression model. The solid blue line represents the logistic regression performance, while the dashed purple line represents a perfectly calibrated model. The x-axis shows the predicted probability, and the y-axis shows the true probability in each bin

Picture 1: Our analysis shows that the logistic regression model used for baseline analysis tends to underestimate the likelihood of female vocal presence within a track at higher probability values. This is evident from the model’s performance, which falls below the diagonal line in reliability diagrams. The fluctuations and non-linearity observed suggest the limitations of relying solely on genres and instrumentation to predict artist representation accurately.
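For readers who want to reproduce this kind of reliability diagram, here is a minimal sketch using scikit-learn. The data is synthetic and the feature encoding is illustrative – this is not Cyanite’s actual training set or pipeline.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.calibration import calibration_curve

# Synthetic stand-in data: X one-hot-encodes attributes such as genre
# and instrumentation, y marks whether a track features female vocals.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(1000, 10))
y = (X[:, 0] + rng.random(1000) > 0.9).astype(int)

model = LogisticRegression().fit(X, y)
proba = model.predict_proba(X)[:, 1]

# True vs. predicted probability per bin; a well-calibrated model
# lies on the diagonal of the resulting reliability diagram.
true_prob, pred_prob = calibration_curve(y, proba, n_bins=10)
```

Plotting `pred_prob` against `true_prob`, together with the diagonal, yields a calibration plot of the kind shown above.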

Propensity Score Calculation

In Cyanite’s Similarity Search – one of our music discovery algorithms – we model the likelihood of female vocals in a track as a function of genre and instrumentation using logistic regression. This gives us a probability score for each track, which we refer to as the propensity score. Here’s a basic formula we use for the logistic regression model:

P(Y = 1 | X) = 1 / (1 + e^−(β₀ + β₁X₁ + β₂X₂ + … + βₙXₙ))

Logistic regression formula used to calculate the probability that a track contains female vocals, given input features X such as genre and instrumentation. β₀ is the intercept and β₁, β₂, …, βₙ are the coefficients for each input feature.

Picture 2: The output is a probability (between 0 and 1) representing the likelihood that a track will feature female vocals based on its attributes. 
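As a quick illustration of the formula, the propensity for a single track can be computed directly from the coefficients. The feature names and coefficient values below are made up for demonstration only.

```python
import math

def propensity(features, intercept, coefs):
    """P(Y=1 | X) = 1 / (1 + exp(-(b0 + b1*x1 + ... + bn*xn)))."""
    z = intercept + sum(b * x for b, x in zip(coefs, features))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical one-hot features: [is_pop, has_piano, has_electric_guitar]
p = propensity([1, 1, 0], intercept=-0.4, coefs=[0.9, 0.3, -0.6])
# p is the estimated probability that this track features female vocals
```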

Binning Propensity Scores for Fairness Evaluation

To assess the fairness of our music recommendation models, we examine how input features such as genre and instrumentation correlate with the gender of the vocals. Because propensity scores are continuous, we bin them into discrete ranges and compute the average female vocal presence within each range.

We then calculate the percentage of tracks within each bin that have female vocals as the outcome of our models. This allows us to visualize the actual gender representation across different probability levels and helps us evaluate how well our music discovery algorithms promote gender balance.
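A minimal sketch of this binning step, using synthetic scores in place of real model output:

```python
import numpy as np
import pandas as pd

# Synthetic stand-ins: a propensity score and an observed
# female-vocal label for each track.
rng = np.random.default_rng(1)
scores = rng.random(5000)
female_vocals = (rng.random(5000) < scores).astype(int)

df = pd.DataFrame({"propensity": scores, "female_vocals": female_vocals})
# Discretize the continuous scores into 10 equal-width bins...
df["bin"] = pd.cut(df["propensity"], bins=np.linspace(0, 1, 11),
                   include_lowest=True)
# ...and compute the share of female-vocal tracks in each bin.
ratio_per_bin = df.groupby("bin", observed=True)["female_vocals"].mean()
```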

 

A bar chart comparing the average female vocal presence in Cyanite's Similarity Search results across different metadata groups.

Picture 3: We aim for gender parity in each bin, meaning the percentage of tracks with female vocals should be approximately 50%. The closer we are to that horizontal purple dashed line, the better our algorithm performs in terms of gender fairness.

Comparative Analysis: Cyanite 1.0 vs Cyanite 2.0

By comparing the results of Cyanite 1.0 and Cyanite 2.0 against our baseline logistic regression model, we can quantify how much fairer our updated algorithm is.

  • Cyanite 1.0 showed an average female presence of 54%, indicating a slight bias towards female vocals.

  • Cyanite 2.0, however, achieved 51% female presence across all bins, signaling a more balanced and fair representation of male and female artists.

This difference is crucial in ensuring that no gender is disproportionately represented, especially in genres or with instruments traditionally associated with one gender over the other (e.g., guitar for males, flute for females). Our results underscore the improvements in AI music recommendation fairness.
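One simple way to quantify such a comparison is the mean absolute deviation of the per-bin female-vocal share from the 50% parity target. The per-bin numbers below are invented for illustration; only their averages (54% vs. 51%) mirror the figures above.

```python
# Hypothetical per-bin female-vocal shares for the two versions
cyanite_v1 = [0.60, 0.55, 0.52, 0.56, 0.48, 0.53]  # averages 54%
cyanite_v2 = [0.52, 0.50, 0.49, 0.53, 0.50, 0.52]  # averages 51%

def parity_gap(shares, target=0.5):
    """Mean absolute deviation of per-bin shares from gender parity."""
    return sum(abs(s - target) for s in shares) / len(shares)

# A smaller gap means the algorithm sits closer to the parity line.
```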

How Propensity Scores Help Balance the Gender Gap

Propensity score estimation is a powerful tool that allows us to address biases in the data samples used to train our music discovery algorithms. Specifically, propensity scores help ensure that features like genre and instrumentation do not disproportionately affect the representation of male or female artists in music recommendations.

The method works by estimating the likelihood that a track has certain features (such as instrumentation, genre, or other covariates) and then testing whether those features directly influence our Similarity Search results. In doing so, we investigate spurious correlations between these features and vocal gender – correlations that reflect gender bias in our dataset, stemming in part from societal biases.

We aim for a scenario in which genders are represented equally across all kinds of music. This understanding allows us to fine-tune the model’s behavior to ensure more equitable outcomes and further improve our algorithms.

Conclusion: Gender Balance 

In conclusion, our comparative analysis of artist gender representation in music discovery algorithms highlights the importance of music recommendation fairness in machine learning models.

Cyanite 2.0 demonstrates a more balanced representation, as evidenced by a near-equal presence of female and male vocals across various propensity score ranges.

If you’re interested in using Cyanite’s AI to find similar songs or learn more about our technology, feel free to reach out via mail@cyanite.ai.

You can also try our free web app to analyze music and experiment with similarity searches without needing any coding skills.

Music CMS Solutions Compatible with Cyanite: A Case Study

In today’s digital age, efficiently managing vast amounts of content is crucial for businesses, especially in the music industry. For those who decide not to build their own library environment, music Content Management Systems (CMS) have become indispensable tools. At Cyanite, we integrate our AI-powered analysis and search algorithms with these systems – helping you create music moments.

In this blog post, we’ll delve into Cyanite’s compatibility with various CMS. We’ll provide an overview of the features Cyanite offers for each platform, recommend the ideal user types for each CMS, and include relevant examples.

Additionally, you’ll find information on how to use Cyanite via each of these providers.

A Spreadsheet giving an overview of what Cyanite features are implemented into which content management system.

    Synchtank

    Synchtank provides cutting-edge SaaS solutions specifically designed to simplify and streamline asset and rights management, content monetization, and revenue processing. 

    It is trusted by some of the world’s leading music and media companies, including NFL, Peermusic, Warner Music, and Warner Bros. Discovery, to drive efficiency and boost revenue.

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions
    • Similarity Search

    Recommended for

    • Music Publishers
    • Record Labels
    • Production Music Libraries
    • Broadcast Media/Entertainment Companies
    A Screenshot showing United Masters Sync's website using the CMS Synchtank

    Synchtank in United Masters Sync

    How to use Cyanite via Synchtank

    Cyanite is directly integrated into Synchtank.

    If you want to use Cyanite with Synchtank, please get in touch with a member of the Synchtank team or schedule a call with us to learn more via the button below.

    Reprtoir

    Reprtoir is a France-based CMS offering solutions for asset management, playlists, contacts, contracts, accounting, and analytics – providing supported data formats for various music platforms, distributors, music techs, and collective management organizations.

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions
    • Similarity Search
    • Free Text Search
    • Visualizations

    Recommended for

    • Record Labels
    • Music Publishers
    • Production Music Libraries
    • Sync Teams
    A screen recording of Reprtoir, a music content management system. It provides a brief overview of Cyanite's integration into the platform.
    Screen Recording of Reprtoir with Cyanite

    How to use Cyanite via Reprtoir

    Cyanite is directly integrated into Reprtoir.

    If you want to use Cyanite with Reprtoir, please get in touch with a member of the Reprtoir team or schedule a call with us to learn more via the button below.

    Source Audio

    US-based Source Audio is a CMS that features built-in music distribution and offers access to broadcasters and streaming networks. While it offers its own AI tagging and search functions, larger catalogs in particular will find deeper, more accurate tagging necessary to navigate their repertoire effectively.

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions

    Recommended for

    • Production Music Libraries
    • TV-Networks and Streaming Services
    A Screenshot showing the Interface of the Music CMS Source Audio

    How to use Cyanite via SourceAudio

    Cyanite is directly integrated into SourceAudio.

    If you want to use Cyanite inside SourceAudio, send us an email or schedule a call below.

    Harvest Media

    Harvest Media is an Australian cloud-based music business service. They were founded in 2008 and offer catalog managing, licensing, and distribution tools based on standardized metadata and music search engines.

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions
    • Similarity Search
    • Free Text Search

    Recommended for

    • Production Music Libraries
    • Music Publishers
    • Music Licensing & Subscription Services
    • Record Labels
    • TV Production, Broadcast and Entertainment Companies
    A screen recording of Human Librarian's interface, based on the CMS Harvest Media. It provides a brief overview of Cyanite's integration into the platform.

    Screen Recording of Harvest Media in Human Librarian

    How to use Cyanite via Harvest Media

    Cyanite is directly integrated into Harvest Media.

    If you want to use Cyanite inside Harvest Media, send us an email or schedule a call below.

    MusicMaster

    MusicMaster is the industry-standard software for professional music scheduling. It offers flexible rule-based planning, seamless integration with automation systems, and scalable tools for managing music programming across single stations or complex broadcast networks.

    Cyanite Features Available

    • Auto-Tagging
    • Visualizations

    Recommended for

    • Broadcast radio groups
    • FM/AM radio stations
    • Satellite radio networks

    Screenshot of MusicMaster Scheduling Software

    How to use Cyanite via MusicMaster

    Cyanite is directly integrated into MusicMaster.

    If you want to use Cyanite inside MusicMaster, send us an email or schedule a call below.

    Cadenzabox

    Cadenzabox is a UK-based music Content Management System offering tagging, search, and licensing tools as a white-label service, enabling brand-specific designs and a deep level of customization. It was built by Idea Junction – a full-service digital creative studio.

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions
    • Similarity Search
    • Free Text Search

    Recommended for

    • Production Music Libraries
    • Music Publishers
    A screen recording of Music Mind Co., a music library using the content management system Cadenzabox. It provides a brief overview of Cyanite's integration into the platform.

    Screen Recording of Cadenzabox in MusicMind Co.

    How to use Cyanite via Cadenzabox

    Cyanite is directly integrated into Cadenzabox.

    If you want to use Cyanite inside Cadenzabox, send us an email or schedule a call below.

    Tunebud

    UK-based Tunebud offers an easy, no-code music library website-building solution complete with extensive file delivery features, music search, playlist creation, e-commerce solutions, watermarking, and bulk downloads. It’s an all-in-one music library website solution suitable for individual composers wanting to showcase their works to music publishers and labels looking for a music sync solution for catalogs of up to 500k tracks.  

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions
    • Similarity Search
    • Free Text Search

    Recommended for

    • Musicians
    • Composers
    • Music Publishers
    • Record Labels
    • Music Library and SFX Library Operators
    A Screenshot showing an example website using the CMS Tunebud
    Tunebud with Cyanite’s similarity search

    How to use Cyanite via Tunebud

    Cyanite is directly integrated into Tunebud.

    If you want to use Cyanite with Tunebud, please get in touch with a member of the TuneBud team or schedule a call with us to learn more via the button below.

    Supported CMS

    DISCO

    DISCO is an Australia-based sync pitching tool to manage, share, and receive audio files. While DISCO offers its own audio tagging, catalogs north of 10,000 songs in particular may prefer Cyanite’s deeper, more accurate tagging to organize and browse their repertoire.

    Cyanite Features Available

    • Auto-Tagging
    • Auto-Descriptions

    Recommended for

    • Music Publishers
    • Record Labels
    • Sync Teams
    A Screenshot of the Music CMS DISCO

    DISCO

    How to use Cyanite via DISCO

    All you need to do is reach out to your DISCO customer success manager and ask for a CSV spreadsheet of your catalog including mp3 download links. We’ll download, analyze, and tag your music, according to your requirements, and you can effortlessly upload the updated spreadsheet back to DISCO.

    You decide which tags to use, which to keep, and which to replace.

    Are you missing any music Content Management Systems? Feel free to chat with us and share your thoughts!

    Haven’t decided on a CMS yet? Contact us for free testing periods.

    Your Cyanite Team.

    The Importance of Music Auto-Tagging for Content Strategies

    An Introduction

    By Jakob Höflich, Co-Founder and CMO of Cyanite

    When I was 19, I worked at community radio 4ZZZ in Brisbane, tasked with digitizing daily CD deliveries, tagging their genre, and sorting them in the library. It was a tedious and challenging task – every mistake could persist in the library until corrected. And let’s face it, that was rarely the case. This was one of the experiences that motivated me to found Cyanite many years later – to help catalog owners tag their catalogs with AI and to eradicate the legacy of tagging mistakes made over the past 25 years of digitization.

    While Auto-Tagging to create a clean, better searchable library has become a commodity – with music companies worldwide leveraging it to alleviate the burden on their tagging teams and create more space for creative work – one underappreciated use case has recently grown in significance: using Auto-Tagging data on a global catalog basis to derive actionable insights for your content strategy.

    From Hunches to Data-Driven Insights

    If you own or work with a music catalog, you likely have a solid understanding of its character. But what if the number of songs goes into the tens of thousands or beyond? How confident can you be that you know the profile of the catalog and what it stands for? When making important decisions about the creative direction of your catalog, especially with multiple stakeholders involved, this ‘feeling’ tends to be subjective and leaves room for guesswork. It’s the subjective and “magical” nature of music that makes it hard to quantify and discuss.

    That’s why keywords remain crucial when managing and developing a catalog – and this is where AI can be so helpful to your work. By providing data insights, AI can turn these hunches into consistent and concrete knowledge.

    Here are three benefits of leveraging music Auto-Tagging for your content strategy.

    1. Deep Understanding of a Catalog’s Character

    AI music Auto-Tagging dives deep into the sound character of a catalog. By translating the complexity of music into concrete datapoints such as genres and moods, it allows for a shared, objective, and consistent understanding across your team or company. Imagine having a precise breakdown of your catalog’s characteristics at your fingertips: What percentage of my repertoire is Rock, Funk, or Disco? Does it stand for upbeat or more melancholic tones? Do I have a gender imbalance by favoring one gender over the other? This not only enhances internal cohesion but also aligns strategies and decisions.

    Pro Tip: Our Auto-Tagging focuses on creative metadata, extracting information such as genres, BPM, key, and energy. Curious? Check out our full taxonomy. It does not extract copyright or performance metadata. A smart move here is to pair the AI-driven insights with other data pools. For example, pairing AI-insights with performance and sales data can reveal things like: Only 2% of my catalog is Hip Hop, yet this content has a 200% higher performance rate.
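    As a toy illustration of pairing AI tags with performance data (the numbers, track IDs, and column names below are invented):

```python
import pandas as pd

# Invented example: AI genre tags joined with streaming performance
tags = pd.DataFrame({"track_id": [1, 2, 3, 4],
                     "genre": ["Hip Hop", "Rock", "Rock", "Pop"]})
perf = pd.DataFrame({"track_id": [1, 2, 3, 4],
                     "streams": [3000, 900, 1100, 1000]})

df = tags.merge(perf, on="track_id")
genre_share = df["genre"].value_counts(normalize=True)  # catalog profile
avg_streams = df.groupby("genre")["streams"].mean()     # performance per genre
```

    In this toy catalog, Hip Hop is only a quarter of the repertoire but streams about three times better than the other genres – the kind of insight the text describes.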

    2. Uncovering Blind Spots and Highlighting Trends

    AI’s ability to uncover blind spots and highlight trends within a catalog is another significant benefit. This data-driven approach can reveal underutilized niches or trends when data is placed on a timeline. Whether it’s identifying a resurgence in a particular genre or pinpointing areas with high sync opportunities, AI insights shed light on the hidden corners of a catalog. Particularly for sync teams at companies without a distinct genre profile, a balanced catalog makes it possible to answer as many briefs as possible with adequate content.

    3. Informed Decisions for Catalog Acquisition

    Lastly, AI-driven insights are not limited to managing your existing catalog. They are invaluable when evaluating catalogs you are looking to acquire. While frontline repertoire might be familiar, B-sides and deep cuts often remain mysterious territories. By thoroughly analyzing these lesser-known tracks, AI can contribute a creative due-diligence aspect, providing a comprehensive understanding that informs better acquisition decisions. This ensures you’re investing in a catalog that can complement your existing one.

    Conversely, if you want to sell a catalog, comprehensive tagging data on your repertoire can help you identify the perfect acquirer or demonstrate the future longevity of your catalog to drive up the multiple.

    A Real-World Example

    A German publisher utilizing Cyanite’s AI insights discovered previously underappreciated genres, allowing them to optimize their catalog strategy effectively. The analysis showed that the genres Hip Hop, Funk, and RnB and the mood Epic were underrepresented, even though these have been extremely valuable qualities for successful sync placements in recent years.

    Visual representations, such as pie charts and graphs, can further show how AI can dissect and categorize catalog elements, providing clear, actionable insights.

    All data above can be retrieved via our API.

    Conclusion: Embracing the Future with AI

    AI music Auto-Tagging can be a great help for developing content strategies in the music industry. These actionable insights provide a deep, data-driven understanding of catalogs, uncover blind spots, highlight trends, and inform strategic decisions for catalog acquisitions.

    Undoubtedly, AI can’t and shouldn’t replace the final decision-making process, as it can’t anticipate the future the way we humans do. But it can be used as a great tool to navigate this process with data that makes it easier – and often more convincing – to talk about the magic of music.

    As we live in a time when content production is at an all-time high, providing the sync market with opportunities as never before, every song in a catalog should have the same chance of being discovered. Having a well-organized and indexed catalog is key to that.

     

    How to Use AI Music Search for Your Music Catalog

    Ready to level up your search workflows? Try AI-powered music search in Cyanite.

    Even the most carefully organized catalog reaches a point where text metadata can no longer support effective search on its own. Genres blur, moods can overlap, and large libraries hold thousands of tracks that look similar on paper but sound different when you listen. When you’re working on a brief, your search method needs to reflect the sound itself—not just the words attached to it.

    AI music search enables your catalog to reveal more. By working with audio alongside the metadata, it returns search results that match the intent behind a brief rather than the exact words used in a query. You get a shortlist faster and surface strong tracks that would otherwise stay buried.

    We see this need showing up across the catalogs we serve, so we put together this guide to outline how AI music search works in Cyanite and how it supports faster, more intuitive discovery in real-world workflows.

    Learn more: See how AI music tagging works in Cyanite and how it supports large catalogs.

    What is AI music search?

    Traditional catalog search depends heavily on how consistently tracks are described. It works well when metadata is uniform and when everyone searches in the same way. But this is rarely the case in practice. Different people use different language, and many musical qualities are easier to hear than to articulate precisely.

    AI music search approaches the problem by analyzing the sound itself. This allows the system to understand rhythm, harmony, instrumentation, intensity, and voice presence. These sonic attributes are then used alongside existing metadata to guide search results.

    Instead of matching exact keywords, the system focuses on musical similarity and intent. That means you can start a search from a reference track or a descriptive sentence without losing nuance along the way.

    AI music search does not replace structured tagging. Instead, it builds on it as an additional way to explore a catalog when sound, context, or creative intent are easier to hear than describe.

    At the same time, well-structured tagging remains the baseline to navigate a catalog in many day-to-day scenarios. AI-driven search becomes most valuable when teams need to move beyond fixed labels or explore music from a different angle.

    How different types of AI music search work together

    In practice, AI music search is most effective when it supports multiple ways of thinking about music. These are three ways we enable catalog music search in Cyanite:

    1. Audio-based search
    2. Prompt-based search
    3. Customizable advanced search features

    These tools are designed to work together. Audio gives a clear view of how a track moves, text helps describe what you’re looking for, and advanced filters narrow the field to traits that matter for the request. Using them together keeps the catalog flexible and reduces the chance of great tracks being missed.

    Exploring your catalog through Similarity Search

    Similarity Search starts from sound. Cyanite analyzes a reference track’s audio and compares it with the rest of your catalog, returning tracks with a similar shape or mood. 

    The reference can come from within your library or from an external source, such as Spotify, YouTube, or an uploaded audio file. You can also choose which part of the reference track to use, such as the chorus, the intro, or a specific section that best represents the desired direction.

    This approach is especially useful when a brief comes with a musical example rather than a written description. Instead of translating sound into words and back again, you can search directly from what you hear. If you work with multiple reference tracks or an entire playlist, the Advanced Search features below are here to help.

    Read more: Similar song finder AI for catalogs: Use Cyanite to search your library by sound

    Searching with language using Free Text Search

    Not every search starts with a reference track. Free Text Search allows users to describe music in natural language, using full sentences rather than rigid keywords.  

    Prompts can reference mood, pacing, instrumentation, scene context, or use case. They can also include cultural references and be written in different languages. The system interprets the prompt’s meaning and matches it against the audio-based understanding of the catalog, without relying on external language models.

    This makes search accessible to a wider range of users, including those who may not be familiar with a catalog’s internal tagging conventions.

    Read more: How to prompt: the guide to using Cyanite’s Free Text Search

    Advanced Search

    For more specific searches, you often need additional control. Advanced Search builds on Similarity and Free Text Search by adding structured filters and deeper insight into why tracks appear in the results.

    This mode allows teams to:

    • View similarity scores that show how closely results align with a reference or prompt
    • Run similarity searches using up to 50 reference tracks at once
    • Upload custom metadata and use it as additional filters
    • Identify the most similar segments within each track

    “Testing Advanced Search free for a month gave us the confidence we needed to update our search and tagging systems. The integration was smooth, and we were able to ship several exciting features right away—but we’ve only scratched the surface of its full capabilities!” – Jack Whitis, CEO at Wavmaker

    Read more: How to level up your AI search with Advanced Search features

    AI music search: build vs buy

    Organizations considering AI search often decide based on whether they want to build internally or integrate an existing solution. It typically depends on the time, cost, and ongoing work you can take on.

    Building an in-house system can make sense for teams with significant machine-learning expertise and long-term resources. It typically requires a dedicated engineering team, a large and well-structured training dataset, and ongoing investment to maintain and improve model quality as catalogs and user needs evolve.

    However, for most catalogs, integrating a tested system is the more practical path. Cyanite offers AI music search through a web app, an API, and integrations with major catalog management systems. Teams can adopt advanced search capabilities without taking on the long-term cost and complexity of maintaining their own models.

    Smaller teams can start with the web app and scale usage over time. Larger organizations can integrate search directly into their own platforms, with pricing that aligns more predictably with catalog size.

    Cyanite’s approach to AI music search

    Cyanite is built to help teams understand their catalog through sound. We bring audio, language, and filters into one place so you can move through briefs without switching tools.

    Audio-first analysis

    Cyanite listens to the full track from beginning to end and captures how it develops in instrumentation, energy, and mood. This audio-first approach drives Similarity Search, Free Text Search, and Advanced Search. Because the focus stays on the audio rather than popularity and text-only metadata, you reach tracks that often get overlooked.

    Data security and model ownership

    Your audio remains within Cyanite’s environment.

    • Audio analysis and search models are built and maintained in-house.
    • No files are sent to external AI providers.
    • All processing meets GDPR requirements.

    Teams with specific copyright needs can use upload workflows specifically designed for internal and client-facing work.

    Built for catalog scale

    Full tracks are analyzed in depth, with thousands of sonic details compared. This means large libraries can be processed quickly, and search performance remains steady at high volume – making it easier to bring new material into the library without disrupting ongoing work.

    Search that adapts to the workflow

    Similarity Search, Free Text Search, and Advanced Search all draw from the same audio analysis, which makes it easy to move between a reference track, a written prompt, or a set of filters in a single workflow. Advanced Search adds scoring and segment highlights when you need more context, while the other modes help you move quickly through creative requests. Together, these tools support different working styles and keep results consistent across teams and briefs.

    Try AI music recognition with your own tracks

    AI music search helps catalogs stay workable as they grow. By reading the audio and supporting both reference-based and prompt-based queries, it reduces search time and brings more of the catalog into play.

    Want to see how this works with your own tracks? You can test Similarity Search and Free Text Search in the web app, or explore Advanced Search through the API.

    FAQs – API Integration

    Q: How does AI music recognition work in a catalog?

    A: AI music recognition interprets patterns in the audio and compares them across the catalog. This reduces reliance on metadata wording and supports searches that begin with a reference track or a natural-language prompt.

    Q: Is Cyanite the same as an AI music finder or consumer music search engine?

    A: No. Consumer-facing music search and recommendation systems are typically driven by listening behavior and user interaction data. Cyanite focuses on sound-based analysis and metadata, making it suitable for professional catalog search, editorial workflows, and internal systems.

    Streaming platforms use Cyanite to complement behavioral data with objective audio understanding, especially for catalog organization, discovery, and editorial use cases.

    Q: Can Cyanite be used in my CMS for music?

    A: Cyanite is fully integrated with SourceAudio, Cadenzabox, Harvest Media, Music Master, Reprtoir, Synchtank, and TuneBud. DISCO users can also import Cyanite’s Auto-Tagging and Auto-Descriptions into their libraries. These integrations support a wide range of Cyanite use cases across catalog management systems.

    Q: Who uses Cyanite?

    A: Music publishers, production libraries, sync teams, audio branding agencies, and music-tech platforms use Cyanite for tagging, search, playlist building, onboarding, and catalog analysis. Artists and producers use the web app for fast tagging and discovery.

    Q: Can I integrate AI search into my own platform?

    A: Yes. The API supports Similarity Search, Free Text Search, Advanced Search, and audio analysis, making it possible to add AI-powered discovery directly into your product.

    How to Create Mood- and Contextual Playlists With Dynamic Keyword Search

    In the last article on the blog, we covered how Cyanite’s Similarity Search can be used in music catalogs. In this article, we explore another way to search for songs using Dynamic Keyword Search and how to leverage it to create mood- and contextual-based playlists. 

    Rather than relying on a reference track, Dynamic Keyword Search allows you to select and combine from a list of 1,500 keywords and adjust the impact of these keywords on the search. This is especially helpful to create playlists where songs match in mood, activity, or other characteristics. 

    But before we explain how this feature works, let’s explore how playlists are created. What makes a perfect playlist? Why are playlists so essential when utilizing a music catalog? And how can the Dynamic Keyword Search help with that?

    How are playlists created?

    There are three techniques for playlist creation:

    1. Manual creation (individually picking songs) 
    2. Automatic generation and recommendation 
    3. Assisted playlist creation. 

    Historically, manual creation is the oldest and most basic approach: picking songs for playlists one by one. It may be the simplest technique, but the time and effort involved can be overwhelming. Imagine working with a catalog of 100,000 tracks and having to create an “Energetic Workout” and a “Beach Party” playlist.

    Automatic generation uses various algorithms to create playlists with no human intervention. A famous example is “Discover Weekly” by Spotify.

    Assisted playlist creation uses music technology to guide and support manual playlist creation. 

    In the research by Dias, Goncalves, and Fonseca, manual playlist creation was found to be the most effective in terms of control, engagement, and trustworthiness. This means that people trust handmade playlists. Manual creation also provides the most control over the outcome and engages editors in the creation process.

    Automatic creation was found to be the most effective at adapting to listeners’ needs. With no manual control involved, automatic tools can adapt and update playlists in no time.

    Assisted techniques were found to be the most effective in terms of engagement and trustworthiness while remaining quick to use. They also performed well on the song selection criteria; song selection was identified as the most critical factor in the playlist creation process in this study. However, while song selection is considered very important, the question of what makes a song right for a particular playlist remains open. Beyond that, assisted techniques proved optimal in control and serendipity, and they adapt to listening preferences rather easily.

    To anticipate the conclusion: Dynamic Keyword Search is exactly such an assisted technique for playlist creation.

    Why are search tools for playlist creation important in a catalog?

    Playlists have long been known as the ultimate tool for promoting music. We already covered the ways artists can get onto Spotify and other people’s playlists in other articles on the blog. But creating playlists can also be beneficial for catalog owners and catalog users, whether professional musicians or labels. Here is why:

    • Playlists open up new, passive ways to exploit and monetize your catalog. If you make it easier for your users and customers to explore your catalog, you directly increase its value.
    • Playlists serve as a promotional tool to showcase an artist’s works or the inspirations behind them. This article recommends creating two playlists for brand engagement and streams: a vibe playlist and a catalog playlist.
    • Playlists help organize music by theme or context.
    • With playlist creation features, users save time finding the right songs.
    • Playlists can be indexed separately in search results, which helps music get discovered.

    So playlist creation tools in a catalog are pretty important. Similarity Search is one of these tools. Another one, which we focus on in this article is Dynamic Keyword Search.

    How does Dynamic Keyword Search Work?

    Cyanite’s Dynamic Keyword Search lets you search tracks based on multiple keywords simultaneously, where each keyword can be weighted for its impact on the search. This leads to more relevant search results with less time spent searching.

    Usually, the keywords you choose represent your idea of what you’re searching for, but on their own they don’t give you full control over the search. With Dynamic Keyword Search, you can increase the precision of the results by adjusting each keyword’s impact, so you can express exactly what you’re looking for. There are 1,500 keywords to choose from, representing characteristics of a song such as mood, genre, situation, brand values, and style. Each keyword’s impact on the search can then be adjusted on a scale from -1 to 1, where negative values push a characteristic out of the results, 0 means no impact, and 1 means maximum impact.

    Cyanite Dynamic Keyword Search interface

    What playlist features can be improved with Dynamic Keyword Search?

    Not all playlists are created equal – some are better than others. This study outlines 5 characteristics that can distinguish a good playlist from a bad one. The authors assumed that user-generated playlists could serve as a model for algorithms to create good playlists. Here are the 5 playlist characteristics they outlined:

    • Popularity – most user-generated playlists feature popular tracks first. This is not an obvious rule, but grabbing listeners’ attention from the start is important.
    • Freshness – playlists should contain recently released tracks. Most playlists in the study contain tracks released, on average, within the last 5 years.
    • Homogeneity and diversity – playlists on average cover a very limited number of genres, so playlists should be rather homogeneous. However, diversity plays a significant part in listeners’ satisfaction, so it should be incorporated as well.
    • Musical Features – in terms of energy, playlists with a narrow energy spectrum and a low average energy level are preferred, though a few high-energy tracks are fine.
    • Transition and Coherence – the similarity between tracks defines how smooth and coherent a playlist feels. Usually, user-generated playlists show higher similarity between tracks in the first half and lower similarity in the second half.

    As the study deals with a wide variety of user-generated playlists, not all of them can be assumed to be equally good. But the criteria outlined above can help improve playlists by clarifying a playlist’s character. With Dynamic Keyword Search, you can control criteria such as homogeneity and diversity, musical features such as energy level, and the similarity between tracks that ensures transition and coherence.

    PRO TIP: To improve a playlist’s transition and coherence, you can combine Dynamic Keyword Search with our Similarity Search to further filter music on the Camelot Wheel. The Camelot Wheel indicates which songs transition well harmonically, giving you an extremely powerful tool for perfecting the song order. You can find a deeper explanation in this article.
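    For readers curious what a Camelot-based filter might look like in code, here is a minimal Python sketch of the wheel’s transition rule (our own illustration, not Cyanite’s implementation). On the Camelot Wheel, a key such as 8A mixes harmonically with the same code, the adjacent numbers with the same letter (7A and 9A, with 12 wrapping around to 1), and the same number with the other letter (8B).

```python
def camelot_neighbors(code):
    """Return the Camelot codes that mix harmonically with `code`.

    Codes are '1A'..'12A' (minor keys) and '1B'..'12B' (major keys).
    """
    number, letter = int(code[:-1]), code[-1]
    up = number % 12 + 1            # one step clockwise; 12 wraps to 1
    down = (number - 2) % 12 + 1    # one step counterclockwise; 1 wraps to 12
    other = "B" if letter == "A" else "A"
    return {f"{up}{letter}", f"{down}{letter}", f"{number}{other}"}

# 8A mixes with 7A, 9A, and 8B
assert camelot_neighbors("8A") == {"7A", "9A", "8B"}
```

    A filter built on this rule would keep only those similar tracks whose key falls in the neighbor set of the previous song, so consecutive tracks stay harmonically compatible.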

    Creating Playlists with Dynamic Keyword Search – Step-by-step

    Here is how to access Dynamic Keyword Search in the Cyanite app. This feature is also available through our API.

    1. Go to Search in the menu and select the Keyword Search tab. Choose whether to display results from the Library or Spotify. 
    2. Select keywords from the Augmented Keywords set. For example, these are some of the keywords in the list: joy, travel, summer, motivating, pleasant, happy, energetic, electro, bliss, gladness, auspicious, pleasure, forceful, determined, confident, positive, optimistic, agile, animated, journey, party, driving, kicking, impelling, upbeat. We recommend selecting up to 7 keywords out of 1,500. 
    3. Adjust the weights for each keyword from -1 to 1 to define their impact on the search. For example, let’s set the search input as sparkling: 0.5, sad: -1, rock: 1, dreamy: 1.
    4. Scroll down for search results. The search results will return tracks from the library that are dreamy, slightly sparkling, and not at all sad. They will also all be rock songs.
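    The weighting logic in the steps above can be sketched in a few lines of Python. This is a simplified illustration rather than Cyanite’s actual implementation: assume each track carries keyword scores between 0 and 1, and the search ranks tracks by the weighted sum of the selected keywords, so negative weights push a characteristic out of the results.

```python
def keyword_search(tracks, weights):
    """Rank tracks by the weighted sum of their keyword scores.

    tracks  -- list of (title, {keyword: score between 0 and 1}) pairs
    weights -- {keyword: weight between -1 and 1}; negative weights
               penalize tracks that have that characteristic
    """
    def score(tags):
        return sum(w * tags.get(kw, 0.0) for kw, w in weights.items())
    return sorted(tracks, key=lambda t: score(t[1]), reverse=True)

# The example from step 3: sparkling 0.5, sad -1, rock 1, dreamy 1
weights = {"sparkling": 0.5, "sad": -1.0, "rock": 1.0, "dreamy": 1.0}
tracks = [
    ("Dreamy rock anthem", {"rock": 0.9, "dreamy": 0.8, "sparkling": 0.4}),
    ("Sad ballad",         {"sad": 0.9, "dreamy": 0.5}),
    ("Plain rock track",   {"rock": 0.8}),
]
ranked = keyword_search(tracks, weights)
# The dreamy, sparkling rock track ranks first; the sad ballad ranks last
```

    The hypothetical track names and scores here exist only to show how the weights interact: the sad ballad scores negatively despite being dreamy, which matches the “not at all sad” behavior described in step 4.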

    Dynamic Keyword Search can be requested from our support team.

    Conclusion

    There are various ways to create playlists, from manual creation to automatic and assisted techniques. An assisted approach that combines automatic and manual creation has proved to be the most effective for playlist creation. It meets almost all of an editor’s needs, such as providing control over the process, maintaining a high level of engagement and trustworthiness, and offering a good selection of songs. However, the automatic approach is developing fast, and algorithms might one day replace manual work completely.

    Our Dynamic Keyword Search feature, as one such assisted technique, can help you create playlists. It provides search results that reflect your search intent, both through the keywords themselves and through the impact you assign to each one. This doesn’t mean that Dynamic Keyword Search replaces manual work completely. Rather, it helps artists, labels, and catalog owners do the creative work and engage fans and listeners with the support of the right tools, saving time, money, and effort. This is what we’re striving to achieve here at Cyanite – to help you fully unlock your catalog’s potential.

    Let us know if this article has been helpful and stay tuned for more on the Cyanite blog! 

    I want to try Dynamic Keyword Search – how can I get started?

    Please contact us with any questions about our Cyanite AI via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

    If you want to get the first grip on Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches without any coding needed.

    From Data to Decision – How to Use Music Data and Analytics for Intelligent Decision Making

    We continue writing about the Data Pyramid and in this article we finalize the series with an overview of the fourth level of the pyramid – Intelligence. The supreme discipline of data utilization and a path to success when done right.

    Other articles in the series include: 

    How to Turn Music Data into Actionable Insights: This is an overview of the Data Pyramid and how it can be used in the music industry. 

    An Overview of Data in The Music Industry: This article gives a list of all types of metadata in the music industry.

    Making Sense of Music Data – Data Visualizations: This article explores data visualizations as the second step of the pyramid and gives examples of visualizations in the music industry. 

    Benchmarking in the Music Industry – Knowledge Layer of the Data Pyramid: This article deals with Knowledge and how it is used to benchmark performance and set expectations.

    Data Pyramid and the Intelligence Layer
    The Intelligence layer of the pyramid deals with the future and answers the questions “So what?” and “Now what?”. By the time this level is reached, company stakeholders usually have an organized and structured dataset as well as information about the past outcomes of their decisions. They also need access to real-time data to learn and adjust on the fly. Having all this information at hand enables them to anticipate the outcomes of future decisions and choose the most suitable course of action.

    Intelligence can be described as the ability to choose one decision out of a million other decisions based on knowledge of how these decisions might affect the outcome. 

    Intelligence can be generated by a machine: a self-driving car, for example, is a form of intelligence that scans its environment and predicts the course of action for the next section of road. In the music industry, intelligent decisions are still, for the most part, made by humans by examining information, reading graphs and charts, memorizing past outcomes, and monitoring real-time data. In this article, we’ll explore some of the emerging intelligence technology in the music field, so keep reading to find out more.

    Prescriptive, not Predictive Analytics
    Intelligence in data science is produced by the use of prescriptive analytics, which is the process of using data to determine the best possible course of action. Prescriptive analytics often employ machine learning algorithms to analyze data and consider all the “if” and “else” scenarios. Multiple datasets over different periods of time can be combined in prescriptive analytics to account for various scenarios and model complex situations. 
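    The difference between predictive and prescriptive analytics can be made concrete with a toy Python sketch (our own illustration with hypothetical numbers, not a production system): a predictive model estimates an outcome for each candidate action, while the prescriptive step compares those estimates and recommends the action with the best expected outcome.

```python
def prescribe(actions, predict):
    """Prescriptive step: pick the action with the highest predicted outcome.

    actions -- iterable of candidate courses of action
    predict -- a predictive model mapping an action to its expected outcome
    """
    return max(actions, key=predict)

# Hypothetical forecast: expected weekly streams per release strategy
forecast = {"single": 12_000, "EP": 9_500, "album": 7_000}

best = prescribe(forecast.keys(), forecast.get)
# 'single' has the highest forecast, so it is the prescribed action
```

    In a real system the `predict` function would be a trained model fed with historical and real-time data, and the candidate actions could be far richer than three release formats, but the structure – predict per scenario, then choose – stays the same.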
    Intelligence Layer – Examples in the Music Industry

    1. Recommendation systems that learn and adapt effectively to individual users’ preferences

    Recommendation systems already use some sort of prescriptive analytics when they make a selection of songs based on past user behavior. Recommendation systems can also take into account the sequence of songs and context that affect the enjoyment level of the playlist as a whole. As previously played songs influence the perception of the next song, the playlist can be adjusted accordingly. The ability to prescribe a listening experience by recommendation systems is, perhaps, the most common and well-developed example of intelligence in the music industry.

    Additionally, recommendation systems can prescribe music that directly affects user behavior. This project, for example, uses data from running exercises, predicts the future running performance, and recommends songs that maximize running results. It does so continuously, as the system stores and learns from each updated running exercise record.

    To learn more about different types of recommendation systems, check out the article How Do AI Music Recommendation Systems Work. 

    Photo at Unsplash @skabrera

    2. Automatic playlist generation based on context

    Generating music, or suggesting existing music, based on context is the music industry’s analog of a self-driving car: the music adapts to the listening situation to amplify the current experience. A good example is video games, where music adjusts to the plot as the user progresses through the levels. More on that in our article on the Omniphony engine, which explores adaptive soundtracks and music context in game development.

    Such systems are also used as location-aware music recommendations for travel destinations (when music is chosen based on the sightseeing place you visit), or computer vision systems for museum experiences (when the artwork dictates the audio choice). In these cases, the constantly changing environment serves as the basis for recommendations. 

    Another example of intelligence in this field is generating music in the metaverse, a virtual environment that includes augmented reality. The metaverse can be accessed through Oculus headsets or a smartphone. Virtual streams and concerts are already being held in the metaverse, so it is only a matter of time until curated immersive experiences that adjust to the audience’s needs are delivered using intelligence.

    3. Prescriptive curatorship – What’s going to be hot next? 

    Prescriptive curatorship entails an understanding of how up-and-coming artists and tracks will perform and who is more likely to break in the near future. In the past, platforms like Hype Machine indexed music sites and helped find the best new music. 

    Nowadays, there are systems that can predict future hits and breaking artists automatically. For example, Spotify is developing algorithms that can predict future-breaking artists. The algorithm takes into account the preferences of the early adopters and then determines whether the artist can be considered breaking. This data can then be used to sign deals with the artist at a very early stage.

    Photo at Unsplash @jhjowen

    4. Tracking changes in music preference distribution  – making music that hits the current preferences or even future preferences

    Unlike prescriptive curatorship, which relies on a group of experts, music preference distribution numbers show artists how their chosen genre and formats fit audience demographics and how music could be changed to match current or future preferences. The general consensus in the music industry is that music preference algorithms should come after the music is produced; otherwise, all music would end up sounding the same to mimic popular artists.

    There is not yet a system that would automatically recommend changing the content of the song based on what users prefer. Nevertheless, attempts to use the numbers to create songs people will like are still being made.

    5. Royalty Advances

    Royalty advances are a complex task that requires comprehensive tracking of music consumption across all platforms. Distributors such as Amuse and iGroove offer a royalty advance service that is able to predict upcoming payout amounts so that artists can invest in their music long before the actual royalties kick in. These systems analyze streaming data to calculate upcoming earnings. 

    Recently the topic got even more attention through the hype of NFTs. Crypto-investors want to predict future royalty payouts and the value of their asset. 

    Future platforms most likely will be able to prescribe a course of action regarding which distribution platform to focus on based on the predicted royalty amounts. 

    Conclusion
    True intelligence in music is still hard to come by. Most of the technology described in this article falls in the space between Knowledge engines, which make predictions, and Intelligence machines, which prescribe the most appropriate course of action out of a million other possible actions.

    The main concern in the industry is how far one can go with technological intelligence, considering that music is a creative activity where the human element still largely prevails. An intelligence machine that dictates which music to produce based on a prediction of future user preferences generally prompts an adverse reaction in the industry.

    Nevertheless, intelligent decisions to adjust the content of songs or to sign future-breaking artists identified by the AI can already be made by the artists and labels based on available data. 

    At Cyanite, we provide our API for access to data and the development of any kind of intelligence engines. As always, at each level of the pyramid, the quality of data plays a vital role. Cyanite generates data about each music track such as bpm, dominant key, predominant voice gender, voice presence profile, genre, mood, energy level, emotional profile, energy dynamics, emotional dynamics, instruments, and more.

    Cyanite Library view

    Each parameter is provided with its respective weight across the duration of the track. Based on different audio parameters, the system determines the similarity between the items and lists similar songs based on a reference track. These capabilities can be used for the development of intelligent products and tools as well as making intelligent decisions based on data within the company.
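    Similarity search of this kind typically represents each track as a numeric feature vector and compares vectors with a distance or similarity measure. As a simplified sketch in Python (cosine similarity over a few hand-picked, illustrative features; Cyanite’s actual model is more involved):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def most_similar(reference, catalog):
    """Return catalog titles sorted by similarity to the reference vector."""
    return sorted(catalog, key=lambda t: cosine_similarity(reference, catalog[t]),
                  reverse=True)

# Toy vectors: (energy, valence, normalized tempo) -- illustrative features only
reference = (0.9, 0.7, 0.6)
catalog = {
    "Track A": (0.85, 0.75, 0.55),   # close to the reference
    "Track B": (0.1, 0.2, 0.9),      # quite different profile
}
ranking = most_similar(reference, catalog)
# Track A comes out on top
```

    In practice, the vectors would contain many more dimensions derived from the audio analysis, and large catalogs would use an index structure rather than a full sort, but the core idea – nearest vectors are the most similar songs – is the same.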

    I want to analyze my music data with Cyanite – how can I get started?

    Please contact us with any questions about our Cyanite AI via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

    If you want to get the first grip on Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches without any coding needed.