
Empowering Researchers with Free AI Music Analysis – Cyanite for Innovators Spotlight


Last updated on March 6th, 2025 at 02:14 pm

We believe in the power of our AI music analysis tools to fuel creativity across diverse fields, from the arts and research to grassroots movements and creative coding.

That’s why we launched Cyanite for Innovators in 2023, a support programme designed to empower individuals and teams working on non-commercial projects across the arts, research and creative coding communities. Participants gain access to our AI music analysis tools, opening the door for groundbreaking experimentation in areas like interactive art, digital installations, and AI-driven research.

So far, we’ve received 20+ applications, five of which have been selected for the programme – you can read about each of them below.

 

Are you working on a groundbreaking AI-driven project? Apply now and join a community of innovators shaping the future.

 

How to Apply

1. Click the button below.

2. Submit a detailed proposal outlining your project’s objectives, timeline, and expected outcomes.

3. Allow us 4-8 weeks to review your application and get back to you.

Music and Dance Visualizer by Dilucious

Agustin Di Luciano, a digital artist and developer, is pushing the limits of real-time interactivity with audio processing, motion capture, procedural generation, and Cyanite’s AI music analysis and ML technology.

His project creates immersive, AI-powered sensory landscapes, transforming movement and sound into stunning real-time visuals. These images are from his recent exhibition at Art Basel Miami, where he showed his Music Visualizer to Miami’s top tech entrepreneurs as well as notable Latin American art collectors.

Soundsketcher

Led by Asteris Zacharakis, PhD and funded by the Hellenic Foundation for Research and Innovation, Soundsketcher (2024–2025) is a cutting-edge research project that blends computational, cognitive, and arts-based methodologies to develop an application that automatically converts audio into graphic scores.

This technology assists users in learning, analyzing, and co-creating music, making music notation and composition more accessible and intuitive.

MusiQ

Kristóf Kasza developed MusiQ, a multi-user service that sorts song requests based on parameters analyzed by Cyanite, for his thesis at the Budapest University of Technology and Economics. It uses a FastAPI backend with Docker and a Flutter-based frontend.

Cyanite’s AI music analysis API enabled precise sorting by key, genre, BPM, and mood, and the project earned him top marks. Although he has since started a full-time software development job, he looks forward to further enhancing MusiQ in the future.
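To illustrate the idea, here is a minimal sketch of how a request queue could be ordered by analyzed parameters such as BPM and mood. The data model and scoring are hypothetical and for illustration only – this is not MusiQ’s actual implementation or the Cyanite API.

```python
# Hypothetical sketch: ordering a song-request queue by analyzed track parameters.
# Field names and scoring are assumptions for illustration, not MusiQ's code or the Cyanite API.
from dataclasses import dataclass


@dataclass
class SongRequest:
    title: str
    bpm: float
    mood: str


def sort_queue(requests: list[SongRequest], target_bpm: float, target_mood: str) -> list[SongRequest]:
    """Rank requests so tracks closest to the current tempo and mood play first."""
    def score(req: SongRequest) -> float:
        bpm_distance = abs(req.bpm - target_bpm)
        mood_penalty = 0.0 if req.mood == target_mood else 20.0  # simple mismatch penalty
        return bpm_distance + mood_penalty

    return sorted(requests, key=score)


queue = [SongRequest("Track A", 124, "energetic"), SongRequest("Track B", 98, "chill")]
print([r.title for r in sort_queue(queue, target_bpm=122, target_mood="energetic")])
```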

Expanding the Boundaries of Queer Music Analysis: A Comparative Study with AI Insights

A research team around Dr. Jörg Mühlhans at the University of Vienna’s Institute of Musicology is conducting a large-scale quantitative analysis of 125 queer music songs, revealing key trends in emotional tone and queer representation. The team aims to integrate Cyanite’s AI to validate, expand, and refine these insights for future research.

Thesis: What is the greatest factor in making a timeless song?

Huw Lloyd is conducting primary research for his dissertation, investigating the key musical, historical/cultural, and economic factors that contribute to a “timeless song” – one that resonates across generations. He aims to determine the most influential elements and to provide insights for musicians seeking to understand and apply these trends.

Feel inspired?

If you have a research project in mind or would like to try out the technology that supports these innovative projects, click the button below to analyze your own songs!

The Evolution of Electronic Music (2022-2024) – AI Data Analysis with RA’s Top Tracks


Vincent


Marketing Intern @ Cyanite

The landscape of electronic music is constantly changing through artistic innovation, technological breakthroughs, and cultural trends. To examine these changes methodically, Cyanite’s AI thoroughly evaluated Resident Advisor’s Top Tracks of 2022, 2023, and 2024 in a detailed AI Data Analysis.

Such analyses are valuable because they provide data-driven insights into listening behavior and musical trends, confirming or challenging existing assumptions. A good example of this is Cyanite’s Club Sounds Analysis, which examined trends in club music and uncovered clear patterns in tempo, energy, and emotional shifts over time.

One of the most prominent examples of this kind of analysis is Spotify Wrapped, which has shown how data-backed insights into listening habits generate interest and engagement, offering artists, labels, and listeners a deeper understanding of musical developments. Cyanite’s AI-driven approach brings the same level of clarity to the ever-evolving electronic music landscape, making implicit trends measurable and comparable over time.

Most importantly, Cyanite’s AI delivers an objective perspective on music, which opens up possibilities for deeper analysis.

This data story identifies notable changes in vocal presence, emotional tone, and genre diversity, using Cyanite’s machine learning models, which can differentiate between more than 2,500 genres and offer in-depth mood and compositional evaluations.

The findings indicate a progressive fragmentation of electronic music, an increasing integration of vocal elements, and a marked shift towards darker, more introspective moods.

1. Increasing Prominence of Vocals and the Decline of Instrumental Tracks

A notable trend observed in the analysis is the diminishing presence of instrumental compositions alongside an increase in male vocals.


Key Findings:

  • Male vocals have become increasingly prominent, suggesting a shift towards vocal-driven electronic music.

  • The overall balance between instrumental and vocal compositions has changed, with lyric-based narratives gaining a stronger foothold in the genre, while instrumental tracks have seen a significant decline between 2022 and 2024.

This trend suggests a convergence between electronic and vocal-centric musical styles, potentially influenced by developments in popular music consumption patterns and the growing demand for more emotionally direct musical expressions.

2. Mood Data Analysis: A Shift Toward Darker, More Introspective Compositions

Over the last three years, there has been a noticeable shift in the emotional terrain of electronic music. Cyanite’s AI-generated mood classifications show an increase in darker, more ambiguous emotional tones and a decline in upbeat and joyful musical elements.

Key Findings:

  • Reduction in the prevalence of “happy” and “uplifting” moods.

  • Growth in moods classified as “mysterious,” “weird,” and “strange”, reflecting an increasing tendency toward introspection and abstraction.

  • Energetic and determined moods remain stable, indicating continuity in the genre’s dynamic core.

These findings align with broader sociocultural shifts, where uncertainty, complexity, and experimentation are becoming more prominent themes in contemporary artistic expression.

3. Genre Expansion and Increased Diversification 

One of the most significant discoveries pertains to the increasing diversification of genre influences. Our AI, which is capable of differentiating between thousands of genres, has identified a 40% increase in distinct genre influences between 2023 and 2024.

This increased hybridization suggests that the boundaries of electronic music are opening up more and more, allowing non-traditional influences to be incorporated into the genre.

Key Findings:

  • Techno and house music are losing ground to more experimental subgenres.

  • Subgenres such as Breakbeat, IDM, and bass music have gained prominence.

  • Genres previously outside the electronic domain—such as indie pop, shoegaze, and noise pop—are increasingly integrated into electronic compositions.

This genre fragmentation suggests that electronic music is moving toward greater stylistic pluralism, potentially leading to a subcultural diversification within the broader electronic music ecosystem.
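As a rough illustration of how such diversification can be measured, the sketch below counts distinct genre tags per year across a track list. The tag structure is an assumption for illustration, not the format of Cyanite’s API responses.

```python
# Hypothetical sketch: counting distinct genre tags per year to track diversification.
# The data structure is assumed for illustration, not Cyanite's API response format.
from collections import defaultdict

tracks = [
    {"year": 2023, "genres": ["techno", "house"]},
    {"year": 2024, "genres": ["breakbeat", "idm", "shoegaze"]},
    {"year": 2024, "genres": ["bass music", "noise pop"]},
]

distinct_genres = defaultdict(set)
for track in tracks:
    distinct_genres[track["year"]].update(track["genres"])

for year in sorted(distinct_genres):
    print(year, len(distinct_genres[year]))  # year-over-year growth indicates diversification
```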

Implications for the Future of Electronic Music

These findings have significant implications for artists, producers, and industry professionals seeking to understand and anticipate the trajectory of electronic music.

Key Takeaways:

  • The integration of vocals into electronic music is increasing, signaling a shift away from purely instrumental compositions.
  • Mood expressions are evolving, with a growing emphasis on introspection, complexity, and abstraction.
  • Electronic music is becoming increasingly hybrid, incorporating elements from a diverse range of musical traditions.
  • The rate of subgenre fragmentation is increasing, which raises questions about how electronic music communities and their audiences will develop in the future.

Future Research Directions

Given these findings, further research could explore:

  • The relationship between sociopolitical factors and musical mood shifts.
  • The extent to which AI-generated insights can predict future genre evolution.
  • How these trends correlate with streaming and consumption behaviors in digital music platforms.

Tagging Beyond Music Discovery – A Strategic Tool

Beyond pure music discovery, this data story highlights how tagging and metadata analysis are expanding into strategic decision-making. As previously discussed in the Synchblog, structured tagging not only helps with search and recommendation but also shapes business strategies.

For example, one German music publisher used Cyanite’s insights to identify a critical gap in their catalog: while epic and cinematic music remains highly relevant for sync licensing, they had almost none of it in their repertoire. By shifting from gut feeling to data-driven content acquisition, they were able to adjust their catalog strategy accordingly.

AI Data Analysis for labels, publishers, and music libraries:

Data-driven insights generally provide a competitive advantage by optimizing key business areas:

  • Strategic Content Acquisition: Identify gaps in the catalog (e.g., missing genres or moods) and align acquisitions with data-driven demand trends.

     

  • Licensing & Sync Optimization: Prioritize metadata tagging to improve discoverability and match content to industry needs (e.g., film, gaming, advertising).

     

  • Market Positioning & Trend Monitoring: Track shifts in listener preferences, adjust marketing strategies, and ensure the catalog aligns with emerging industry trends.

     

  • A&R & Artist Development: Use genre and mood insights to guide signings and support artists in exploring high-demand styles.

These insights help catalog owners make informed, strategic decisions, replacing gut feeling with actionable market data.


Conclusion

Cyanite’s AI data analysis of Resident Advisor’s Top Tracks (2022–2024) provides compelling evidence of a rapidly evolving electronic music landscape. With vocals becoming increasingly integral, emotional expressions growing darker, and genre boundaries dissolving, the industry is entering a phase of heightened complexity and innovation.

For artists, labels, and curators, understanding these shifts is crucial for adapting to the changing demands of audiences and staying at the forefront of musical development.

By leveraging advanced AI-driven music analysis, we can gain deeper insights into the intricate mechanisms shaping the future of sound.

AI Music Search Algorithms: Gender Bias or Balance?


This is part 1 of 2. To dive deeper into the data we analyzed, click here to check out part 2.

Gender Bias in AI Music: An Introduction

Gender Bias in AI Music Search is often overlooked. With the upcoming release of Cyanite 2.0, we aim to address this issue by evaluating gender representation in AI music algorithms, specifically comparing male and female vocal representation across both our current and updated models.

Finding music used to be straightforward: you’d search by artist name or song title. But as music catalogs have grown, professionals in the industry need smarter ways to navigate vast libraries. That’s where Cyanite’s Similarity Search comes in, offering an intuitive way to discover music using reference tracks. 

In our evaluation, we do not focus solely on perceived similarity but also on the potential gender bias of our algorithms. In other words, we want to ensure that our models not only meet qualitative standards but are also fair—especially when it comes to gender representation.

In this article, we evaluate both our currently deployed algorithm, Cyanite 1.0, and the upcoming Cyanite 2.0, to see how they perform in representing artists of different genders, using a method called propensity score estimation.

Cyanite 2.0 – scheduled for release on Nov 1st, 2024 – will include an updated version of Cyanite’s Similarity and Free Text Search, which scored higher in blind tests measuring the similarity of recommended tracks to the reference track.

    Why Gender Bias and Representation Matter in Music AI

    In machine learning (ML), algorithmic fairness ensures automated systems aren’t biased against specific groups, such as by gender or race. For music, this means that AI music search should equally represent both male and female artists when suggesting similar tracks.

    An audio search algorithm can sometimes exhibit gender bias as an outcome of a Similarity Search. For instance, if an ML model is trained predominantly on audio tracks with male vocals, it may be more likely to suggest audio tracks that align with traditional male-dominated artistic styles and themes. This can result in the underrepresentation of female artists and their perspectives.

    The Social Context Behind Artist Representation

    Music doesn’t exist in a vacuum. Just as societal biases influence various industries, they also shape music genres and instrumentation. Certain instruments—like the flute, violin, and clarinet—are more often associated with female artists, while the guitar, drums, and trumpet tend to be dominated by male performers. These associations can extend to entire genres, like country music, where studies have shown a significant gender bias with a decline in female artist representation on radio stations over the past two decades. 

    What this means for AI Music Search models is that if they aren’t built to account for these gendered trends, they may reinforce existing gender and other biases, skewing the representation of female artists.

    How We Measure Fairness in Similarity Search

    At Cyanite, we’ve worked to make sure our Similarity Search algorithms reflect the diversity of artists and their music. To do this, we regularly audit and update our models to ensure they represent a balanced range of artistic expressions, regardless of gender.

    But how do we measure whether our models are fair? That’s where propensity score estimation comes into play.

    What Are Propensity Scores?

    In simple terms, propensity scores measure the likelihood of a track having certain features—like specific genres or instruments—that could influence whether male or female artists are suggested by the AI. These scores help us analyze whether our models are skewed toward one gender when recommending music.

    By applying propensity scores, we can see how well Cyanite’s algorithms handle gender bias. For example, if rock music and guitar instrumentation are more likely to be associated with male artists, we want to ensure that our AI still fairly recommends tracks with female vocals in those cases.

    Bar chart comparing the average female vocal presence across two Cyanite AI models. The blue bars represent the old model (Cyanite 1.0), and the green bars represent the improved model (Cyanite 2.0). A horizontal dashed purple line at 50% indicates the target for gender parity. The x-axis displays the likelihood of female vocals in different ranges, while the y-axis shows the percentage of female presence.

    Picture 1: We aim for gender parity in each bin, meaning the percentage of tracks with female vocals should be approximately 50%. The closer we are to that horizontal purple dashed line, the better our algorithm performs in terms of gender fairness.

    Comparing Cyanite 1.0 and Cyanite 2.0

    To evaluate our algorithms, we created a baseline model that predicts the likelihood of a track featuring female vocals, relying solely on genre and instrumentation data. This gave us a reference point to compare with Cyanite 1.0 and Cyanite 2.0.

    Take a blues track featuring a piano. Our baseline model would calculate the probability of female vocals based only on these two features. However, this model struggled with fair gender representation, particularly for female artists in genres and instruments dominated by male performers. The lack of diverse gender representation in our test dataset for certain genres and instruments made it difficult for the baseline model to account for societal biases that correlate with these features.
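    As a simplified illustration (not our production pipeline), a baseline of this kind can be expressed as a logistic regression over one-hot genre and instrument features; the features and labels below are invented for the example.

```python
# Simplified baseline sketch: logistic regression over one-hot genre/instrument features.
# The feature columns and labels are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: [blues, rock, piano, guitar] -- hypothetical one-hot features per track.
X = np.array([
    [1, 0, 1, 0],  # blues track with piano
    [0, 1, 0, 1],  # rock track with guitar
    [1, 0, 0, 1],  # blues track with guitar
    [0, 1, 1, 0],  # rock track with piano
])
y = np.array([1, 0, 0, 1])  # 1 = female vocals present

baseline = LogisticRegression().fit(X, y)
print(baseline.predict_proba([[1, 0, 1, 0]])[:, 1])  # P(female vocals | blues + piano)
```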

    The Results

    The baseline model significantly underestimated the likelihood of female vocals in tracks with traditionally male-associated characteristics, like rock music or guitar instrumentation. This shows the limitations of a model that only considers genre and instrumentation, as it lacks the capacity to handle high-dimensional data, where multiple layers of musical features influence the outcome.

    In contrast, Cyanite’s algorithms utilize rich, multidimensional embeddings to make more meaningful connections between tracks, going beyond simple genre and instrumentation pairings. This allows our models to provide more nuanced and accurate predictions.

    Despite its limitations, the baseline model was useful for generating a balanced test dataset. By calculating likelihood scores, we paired male vocal tracks with female vocal tracks that had similar characteristics using a nearest-neighbour approach. This helped eliminate outliers, such as male vocal tracks without clear female counterparts, and resulted in a balanced dataset of 2,503 tracks covering both male and female vocal representation.
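    A minimal sketch of this kind of nearest-neighbour pairing on propensity scores is shown below; the scores are synthetic placeholders, not the actual evaluation data.

```python
# Sketch of nearest-neighbour matching on propensity scores; values are synthetic placeholders.
import numpy as np

male_scores = np.array([0.12, 0.45, 0.80])          # P(female vocals) for male-vocal tracks
female_scores = np.array([0.10, 0.50, 0.78, 0.95])  # P(female vocals) for female-vocal tracks

pairs = []
for i, score in enumerate(male_scores):
    j = int(np.argmin(np.abs(female_scores - score)))  # closest female-vocal counterpart
    pairs.append((i, j))

print(pairs)  # matched index pairs used to build the balanced dataset
```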

    When we grouped tracks into bins based on the likelihood of female vocals, our goal was a near-equal presence of female vocals across all bins, with 50% representing the ideal gender balance. We conducted this analysis for both Cyanite 1.0 and Cyanite 2.0.

    The results were clear: Cyanite 2.0 produced the fairest and most accurate representation of both male and female artists. Unlike the baseline model and Cyanite 1.0, which showed fluctuations and sharp declines in female vocal predictions, Cyanite 2.0 consistently maintained balanced gender representation across all probability ranges.

    To see more explanation of how propensity scores can help address gender bias in AI music and balance the gender gap, check out part 2 of this article.

    Conclusion: A Step Towards Fairer Music Discovery

    Cyanite’s Similarity Search has applications beyond ensuring gender fairness. It helps professionals to:

     

    • Use reference tracks to find similar tracks in their catalogs.
    • Curate and optimize playlists based on similarity results.
    • Increase the overall discoverability of a catalog.

    Our comparative evaluation of artist gender representation highlights the importance of algorithmic fairness in music AI. With Cyanite 2.0, we’ve made significant strides in delivering a balanced representation of male and female vocals, making it a powerful tool for fair music discovery.

    However, it’s crucial to remember that societal biases—like those seen in genres and instrumentation—don’t disappear overnight. These trends influence the data that AI music search models and genAI models are trained on, and we must remain vigilant to prevent them from reinforcing existing inequalities.

    Ultimately, providing fair and unbiased recommendations isn’t just about gender—it’s about ensuring that all artists are represented equally, allowing catalog owners and music professionals to explore the full spectrum of musical talent. At Cyanite, we’re committed to refining our models to promote diversity and inclusion in music discovery. By continuously improving our algorithms and understanding the societal factors at play, we aim to create a more inclusive music industry—one that celebrates all artists equally.

    If you’re interested in using Cyanite’s AI to find similar songs or learn more about our technology, feel free to reach out via mail@cyanite.ai.

    You can also try our free web app to analyze music and experiment with similarity searches without needing any coding skills.

    AI Music Recommendation Fairness: Gender Balance


    Eylül


    Data Scientist at Cyanite

    Part 2 of 2. To get a more general overview of AI music recommendation fairness – more specifically, the topic of gender bias – click here to check out part 1.

    Diving Deeper: The Statistics of Fair Music Discovery

    While the first part of this article gave an overview of gender fairness in music recommendation systems, this section delves into the statistical methods and models that we employ at Cyanite to evaluate and ensure AI music recommendation fairness, particularly in gender representation. This section assumes familiarity with concepts like logistic regression, propensity scores, and algorithmic bias, so let’s dive right into the technical details.

    Evaluating Fairness Using Propensity Score Estimation

    To ensure our music discovery algorithms offer fair representation across different genders, we employ propensity score estimation. This technique allows us to estimate the likelihood (or propensity) that a given track will have certain attributes, such as the genre, instrumentation, or presence of male or female vocals. Essentially, we want to understand how different features of a song may bias the recommendation system and adjust for that bias accordingly to enhance AI music recommendation fairness.

    Baseline Model Performance

    Before diving into our improved music discovery algorithms, it’s essential to establish a baseline for comparison. We created a basic logistic regression model that utilizes only genre and instrumentation to predict the probability of a track featuring female vocals. 

    A model is considered well-calibrated when its predicted probabilities (represented by the blue line) closely align with the actual outcomes (depicted by the purple dashed line in the graph below). 

    Calibration plot comparing the predicted probability to the true probability in a logistic regression model. The solid blue line represents the logistic regression performance, while the dashed purple line represents a perfectly calibrated model. The x-axis shows the predicted probability, and the y-axis shows the true probability in each bin

    Picture 1: Our analysis shows that the logistic regression model used for baseline analysis tends to underestimate the likelihood of female vocal presence within a track at higher probability values. This is evident from the model’s performance, which falls below the diagonal line in reliability diagrams. The fluctuations and non-linearity observed suggest the limitations of relying solely on genres and instrumentation to predict artist representation accurately.
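    As an illustration only (not our actual evaluation code), a reliability check of this kind can be produced with scikit-learn’s calibration_curve; the labels and probabilities below are synthetic.

```python
# Illustrative reliability check: compare predicted probabilities with observed outcome
# rates per bin. The data is synthetic and deliberately miscalibrated.
import numpy as np
from sklearn.calibration import calibration_curve

rng = np.random.default_rng(0)
y_prob = rng.uniform(0, 1, size=1000)      # predicted P(female vocals)
y_true = rng.binomial(1, y_prob * 0.8)     # synthetic true labels

prob_true, prob_pred = calibration_curve(y_true, y_prob, n_bins=10)
for pred, true in zip(prob_pred, prob_true):
    print(f"predicted {pred:.2f} -> observed {true:.2f}")  # compare against the diagonal
```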

    Propensity Score Calculation

    In Cyanite’s Similarity Search – one of our music discovery algorithms – we model the likelihood of female vocals in a track as a function of genre and instrumentation using logistic regression. This gives us a probability score for each track, which we refer to as the propensity score. Here’s a basic formula we use for the logistic regression model:

    Logistic regression formula used to calculate the probability that a track contains female vocals based on input features like genre and instrumentation. The equation shows the probability of the binary outcome Y being 1 (presence of female vocals) given input features X. The formula includes the intercept (β0) and coefficients (β1, β2, ..., βn) for each input feature.

    Picture 2: The output is a probability (between 0 and 1) representing the likelihood that a track will feature female vocals based on its attributes. 
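    Written out, the formula described above is the standard logistic function:

    P(Y = 1 | X) = 1 / (1 + e^−(β0 + β1X1 + β2X2 + … + βnXn))

    where Y = 1 denotes the presence of female vocals, X1, …, Xn are the input features (genre and instrumentation), β0 is the intercept, and β1, …, βn are the feature coefficients.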

    Binning Propensity Scores for Fairness Evaluation

    To assess the fairness of our models’ recommendations, we look at how input features such as genre and instrumentation correlate with the gender of the vocals: for each propensity score range, we analyze the ratio of female-vocal tracks in the model’s output. To make this trend visible, we bin the continuous propensity scores into discrete ranges and compute the average female vocal presence within each range.

    We then calculate the percentage of tracks within each bin that have female vocals as the outcome of our models. This allows us to visualize the actual gender representation across different probability levels and helps us evaluate how well our music discovery algorithms promote gender balance.
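    A minimal sketch of this binning step, assuming per-track propensity scores and a binary female-vocal label (the values are synthetic, for illustration only):

```python
# Sketch of binning propensity scores and computing female-vocal share per bin;
# the scores and labels are synthetic placeholders.
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
tracks = pd.DataFrame({
    "propensity": rng.uniform(0, 1, 500),      # P(female vocals) per track
    "female_vocals": rng.integers(0, 2, 500),  # 1 = female vocals in the result
})

tracks["bin"] = pd.cut(tracks["propensity"], bins=np.linspace(0, 1, 11))
per_bin = tracks.groupby("bin", observed=True)["female_vocals"].mean() * 100

print(per_bin.round(1))  # gender parity would put every bin near 50%
```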

     

    A bar chart comparing the average female vocal presence in Cyanite's Similarity Search results across different metadata groups.

    Picture 3: We aim for gender parity in each bin, meaning the percentage of tracks with female vocals should be approximately 50%. The closer we are to that horizontal purple dashed line, the better our algorithm performs in terms of gender fairness.

    Comparative Analysis: Cyanite 1.0 vs Cyanite 2.0

    By comparing the results of Cyanite 1.0 and Cyanite 2.0 against our baseline logistic regression model, we can quantify how much fairer our updated algorithm is.

    • Cyanite 1.0 showed an average female presence of 54%, indicating a slight bias towards female vocals.

    • Cyanite 2.0, however, achieved 51% female presence across all bins, signaling a more balanced and fair representation of male and female artists.

    This difference is crucial in ensuring that no gender is disproportionately represented, especially in genres or with instruments traditionally associated with one gender over the other (e.g., guitar for males, flute for females). Our results underscore the improvements in AI music recommendation fairness.

    How Propensity Scores Help Balance the Gender Gap

    Propensity score estimation is a powerful tool that allows us to address biases in the data samples used to train our music discovery algorithms. Specifically, propensity scores help ensure that features like genre and instrumentation do not disproportionately affect the representation of male or female artists in music recommendations.

    The method works by estimating the likelihood of a track having certain features (such as instrumentation, genre, or other covariates) and then checking whether those features directly influence our Similarity Search results when we put our algorithms to the test. In this way, we investigate spurious correlations that are directly related to gender bias in our dataset and that partly stem from societal biases.

    We want to reach a point where genders are represented equally across all kinds of music. This understanding allows us to fine-tune the model’s behavior to ensure more equitable outcomes and further improve our algorithms.

    Conclusion: Gender Balance 

    In conclusion, our comparative analysis of artist gender representation in music discovery algorithms highlights the importance of music recommendation fairness in machine learning models.

    Cyanite 2.0 demonstrates a more balanced representation, as evidenced by a near-equal presence of female and male vocals across various propensity score ranges.

    If you’re interested in using Cyanite’s AI to find similar songs or learn more about our technology, feel free to reach out via mail@cyanite.ai.

    You can also try our free web app to analyze music and experiment with similarity searches without needing any coding skills.

    Music CMS Solutions Compatible with Cyanite: A Case Study


    In today’s digital age, efficiently managing vast amounts of content is crucial for businesses, especially in the music industry. For those who decide not to build their own library environment, music Content Management Systems (CMS) have become indispensable tools. At Cyanite, we integrate our AI-powered analysis and search algorithms with these systems – helping you create music moments.

    In this blog post, we’ll delve into Cyanite’s compatibility with various CMS platforms. We’ll provide an overview of the features Cyanite offers for each platform, recommend the ideal user types for each CMS, and include relevant examples.

    Additionally, you’ll find information on how to use Cyanite via each of these providers.

    A Spreadsheet giving an overview of what Cyanite features are implemented into which content management system.

      Synchtank

      Synchtank provides cutting-edge SaaS solutions specifically designed to simplify and streamline asset and rights management, content monetization, and revenue processing. 

      It is trusted by some of the world’s leading music and media companies, including NFL, Peermusic, Warner Music, and Warner Bros. Discovery, to drive efficiency and boost revenue.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions
      • Similarity Search

      Recommended for

      • Music Publishers
      • Record Labels
      • Production Music Libraries
      • Broadcast Media/Entertainment Companies
      A Screenshot showing United Masters Sync's website using the CMS Synchtank

      Synchtank in United Masters Sync

      How to use Cyanite via Synchtank

      Cyanite is directly integrated into Synchtank.

      If you want to use Cyanite with Synchtank, please get in touch with a member of the Synchtank team or schedule a call with us to learn more via the button below.

      Reprtoir

      Reprtoir is a France-based CMS offering solutions for asset management, playlists, contacts, contracts, accounting, and analytics – providing supported data formats for various music platforms, distributors, music techs, and collective management organizations.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions
      • Similarity Search
      • Free Text Search
      • Visualizations

      Recommended for

      • Record Labels
      • Music Publishers
      • Production Music Libraries
      • Sync Teams
      A screen recording of Reprtoir, a music content management system. It provides a brief overview of Cyanite's integration into the platform.
      Screen Recording of Reprtoir with Cyanite

      How to use Cyanite via Reprtoir

      Cyanite is directly integrated into Reprtoir.

      If you want to use Cyanite with Reprtoir, please get in touch with a member of the Reprtoir team or schedule a call with us to learn more via the button below.

      Source Audio

      US-based Source Audio is a CMS that features built-in music distribution and offers access to broadcasters and streaming networks. While it offers its own AI tagging and search functions, larger catalogs in particular will find deeper, more accurate tagging necessary to navigate their repertoire effectively.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions

      Recommended for

      • Production Music Libraries
      • TV-Networks and Streaming Services
      A Screenshot showing the Interface of the Music CMS Source Audio

      How to use Cyanite via Source Audio

      Cyanite is directly integrated into Source Audio.

      If you want to use Cyanite inside Source Audio, send us an email or schedule a call below.

      Harvest Media

      Harvest Media is an Australian cloud-based music business service. Founded in 2008, they offer catalog management, licensing, and distribution tools based on standardized metadata and music search engines.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions
      • Similarity Search
      • Free Text Search

      Recommended for

      • Production Music Libraries
      • Music Publishers
      • Music Licensing & Subscription Services
      • Record Labels
      • TV Production, Broadcast and Entertainment Companies
      A screen recording of Human Librarian's interface, based on the CMS Harvest Media. It provides a brief overview of Cyanite's integration into the platform.

      Screen Recording of Harvest Media in Human Librarian

      How to use Cyanite via Harvest Media

      Cyanite is directly integrated into Harvest Media.

      If you want to use Cyanite inside Harvest Media, send us an email or schedule a call below.

      MusicMaster

      MusicMaster is the industry-standard software for professional music scheduling. It offers flexible rule-based planning, seamless integration with automation systems, and scalable tools for managing music programming across single stations or complex broadcast networks.

      Cyanite Features Available

      • Auto-Tagging
      • Visualizations

      Recommended for

      • Broadcast radio groups
      • FM/AM radio stations
      • Satellite radio networks
      A screenshot of the MusicMaster scheduling software interface.

      Screenshot of MusicMaster Scheduling Software

      How to use Cyanite via MusicMaster

      Cyanite is directly integrated into MusicMaster.

      If you want to use Cyanite inside MusicMaster, send us an email or schedule a call below.

      Cadenzabox

      Cadenzabox is a UK-based music Content Management System built by Idea Junction – a full-service digital creative studio. It offers tagging, search, and licensing tools as a white-label service, enabling brand-specific designs and a deep level of customization.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions
      • Similarity Search
      • Free Text Search

      Recommended for

      • Production Music Libraries
      • Music Publishers
      A screen recording of Music Mind Co., a music library using the content management system Cadenzabox. It provides a brief overview of Cyanite's integration into the platform.

      Screen Recording of Cadenzabox in MusicMind Co.

      How to use Cyanite via Cadenzabox

      Cyanite is directly integrated into Cadenzabox.

      If you want to use Cyanite inside Cadenzabox, send us an email or schedule a call below.

      Tunebud

      UK-based Tunebud offers an easy, no-code music library website-building solution complete with extensive file delivery features, music search, playlist creation, e-commerce solutions, watermarking, and bulk downloads. It’s an all-in-one music library website solution, suitable both for individual composers wanting to showcase their works and for music publishers and labels looking for a music sync solution for catalogs of up to 500k tracks.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions
      • Similarity Search
      • Free Text Search

      Recommended for

      • Musicians
      • Composers
      • Music Publishers
      • Record Labels
      • Music Library and SFX Library Operators
      A Screenshot showing an example website using the CMS Tunebud
      Tunebud with Cyanite’s similarity search

      How to use Cyanite via Tunebud

      Cyanite is directly integrated into Tunebud.

      If you want to use Cyanite with Tunebud, please get in touch with a member of the Tunebud team or schedule a call with us to learn more via the button below.

      Supported CMS

      DISCO

      DISCO is an Australia-based sync pitching tool to manage, share, and receive audio files. While DISCO offers its own audio tagging, catalogs north of 10,000 songs in particular may prefer Cyanite’s deeper, more accurate tagging to organize and browse their repertoire.

      Cyanite Features Available

      • Auto-Tagging
      • Auto-Descriptions

      Recommended for

      • Music Publishers
      • Record Labels
      • Sync Teams
      A Screenshot of the Music CMS DISCO

      DISCO

      How to use Cyanite via DISCO

      All you need to do is reach out to your DISCO customer success manager and ask for a CSV spreadsheet of your catalog, including mp3 download links. We’ll download, analyze, and tag your music according to your requirements, and you can effortlessly upload the updated spreadsheet back to DISCO.

      You decide which tags to use, which to keep, and which to replace.
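      As a rough illustration, merging the returned tags back into a DISCO export could look like the sketch below. The file and column names are assumptions for illustration, not DISCO’s or Cyanite’s actual field names.

```python
# Hypothetical sketch: merging analysis tags into a DISCO catalog export.
# File names and columns ("Track ID", "Genres", "Moods") are placeholders.
import pandas as pd

catalog = pd.read_csv("disco_export.csv")   # CSV export from DISCO, one row per track
tags = pd.read_csv("cyanite_tags.csv")      # analysis results keyed by track id

updated = catalog.merge(tags[["Track ID", "Genres", "Moods"]], on="Track ID", how="left")
updated.to_csv("disco_upload.csv", index=False)  # re-upload this file to DISCO
```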

      Are you missing any music Content Management Systems? Feel free to chat with us and share your thoughts!

      Haven’t decided on a CMS yet? Contact us for free testing periods.

      Your Cyanite Team.

      Music Genre Finder – Biggest Web App Update Ever!


      Biggest Web App Update Ever!

      We’re thrilled to announce our biggest web app update since the app first launched. We’ve been listening to your feedback and have updated the web app with the world’s most accurate genre model, over 40 new instruments, and new library views.

      Best of all: It’s free for all our users.

      The World’s Best Music Genre Finder: Free Genres

      We know that finding the right genres for your music can be a challenge – and with all the feedback we have received over the months, we have developed the most accurate genre auto-tagging algorithm in the world – with over 1,500 genres. Missing something? Let us know.

      You can find our new free genres next to the web app’s main and subgenres.

      40+ New Instruments

      We also added 40 new instruments to our web app – on top of our existing instrument tags. Everything you need – from Steel Drums to Glockenspiel.

      You can find the new advanced instrument tags right next to the regular instruments in your library.

      New Library Views

      With over 35 tagging classifiers, metadata can be overwhelming at times. That’s why we chose to let you decide how deeply you want to explore your song data.

      You can choose between three different views of your Cyanite library just by clicking the “view” drop-down in your library. Select between:

      • Compact View (ideal to get an overview)
      • Full View (dive deep into all the Cyanite metadata for your tracks)
      • API View (the full view with our API classifier names)

      Curious? Try it out!

      All of these new features are free for all our users. Don’t have a Cyanite account yet? Click here or the button below to get started. 5 analyses per month are on us! Start using the world’s best genre finder now.

       

      Your Cyanite Team.