
Cyanite Advanced Search (API only)

Last updated on March 6th, 2025 at 02:14 pm

Unlock the Power of Advanced Search – A Glimpse into the New Cyanite

We’re excited to introduce Advanced Search, the biggest upgrade to Similarity and Free Text Search since launch. With this release, we’re offering a sneak preview into the power of the new Cyanite system.

Advanced Search brings next-level precision, scalability, and usability – all designed to supercharge your discovery workflows. From advanced filtering to more nuanced query controls, this feature is built for music teams ready to move faster and smarter.

Advanced Search Feature Overview

  • Top Scoring Segments
  • Percentage Scores
  • Customer Metadata Pre-Filtering
  • Multiple Search Inputs
  • Multilingual Prompt Translation
  • Up to 500 Search Results

🎯 Top Scoring Segments: Zoom in on the Best Parts

We’re not just showing you results – we’re showing you their strongest moments. Each track now highlights its Top Scoring Segments for both Similarity and Free Text queries. It’s an instant way to jump to the most relevant slice of content without scrubbing through an entire track.

🔢 Percentage Scores: Total Clarity, Total Control

Now each result comes with a clear percentage Score, helping you quickly evaluate how close a match really is – both for the overall track and for each Top Scoring Segment. It’s a critical UX improvement that helps users better understand and trust the search results at a glance.

🗂️ Customer Metadata Pre-filtering: Smarter Searches Start with Smarter Filters

Upload your own metadata to filter results before the search even begins. Want only pre-cleared tracks? Looking for music released after 2020? With Customer Metadata Pre-filtering, you can target exactly what you need, making your search dramatically more efficient.
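To make the idea concrete, here is a minimal sketch of how a metadata pre-filter narrows the candidate pool before any similarity or free-text scoring runs. The track records and field names (pre_cleared, release_year) are illustrative assumptions for this example, not Cyanite’s schema.

```python
# Minimal sketch of metadata pre-filtering before a search runs.
# The track records and field names (pre_cleared, release_year) are
# illustrative assumptions, not Cyanite's actual schema.

catalog = [
    {"id": "t1", "title": "Night Drive", "pre_cleared": True,  "release_year": 2023},
    {"id": "t2", "title": "Old Gold",    "pre_cleared": False, "release_year": 2018},
    {"id": "t3", "title": "Neon Rain",   "pre_cleared": True,  "release_year": 2021},
]

def pre_filter(tracks, only_pre_cleared=True, released_after=2020):
    """Narrow the candidate pool before any similarity scoring happens."""
    return [
        t for t in tracks
        if (not only_pre_cleared or t["pre_cleared"])
        and t["release_year"] > released_after
    ]

candidates = pre_filter(catalog)
# Only the filtered candidates would then be passed to Similarity or
# Free Text Search, which keeps the search fast and on-target.
print([t["id"] for t in candidates])  # ['t1', 't3']
```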

🌍 Multilingual Prompt Translation

Got a prompt in Japanese, Spanish, or Turkish? No problem. Our search engine now translates prompts from any language into English, so your creative ideas don’t get lost in translation. Say what you want, how you want – our engine will understand.

Still on the previous version?

Advanced Search is available exclusively for v7-compliant users. Migrating now not only gives you access to this feature but unlocks the full potential of the new Cyanite architecture. It’s the perfect time to make the switch.

If you’re ready to upgrade or want to learn more about v7, click the button below – we’re here to help.

Let’s take discovery to the next level. Together.

The Evolution of Electronic Music (2022-2024) – AI Data Analysis with RA’s Top Tracks

Vincent

Marketing Intern @ Cyanite

The landscape of electronic music is constantly changing, driven by artistic innovation, technological breakthroughs, and cultural trends. To examine these changes methodically, Cyanite’s AI thoroughly evaluated Resident Advisor’s Top Tracks of 2022, 2023, and 2024 in a detailed AI data analysis.

Such analyses are valuable because they provide data-driven insights into listening behavior and musical trends, confirming or challenging existing assumptions. A good example of this is Cyanite’s Club Sounds Analysis, which examined trends in club music and uncovered clear patterns in tempo, energy, and emotional shifts over time.

One of the most prominent examples of such analyses is Spotify Wrapped, which has shown how data-backed insights about user listening habits generate interest and engagement, offering artists, labels, and listeners a deeper understanding of musical developments. Cyanite’s AI-driven approach brings the same level of clarity to the ever-evolving electronic music landscape, making implicit trends measurable and comparable over time.

Most importantly, Cyanite’s AI delivers an objective perspective on music, which opens up many possibilities for deeper analysis.

This data story finds notable changes in voice predominance, emotional tone, and genre diversity using Cyanite’s machine learning models that can differentiate between more than 2,500 genres and offer in-depth mood and compositional evaluations.

The findings indicate a progressive fragmentation of electronic music, an increasing integration of vocal elements, and a marked shift towards darker, more introspective moods.

1. Increasing Prominence of Vocals and the Decline of Instrumental Tracks

A notable trend observed in the analysis is the diminishing presence of instrumental compositions alongside an increase in male vocals.

Key Findings:

  • Male vocals have become increasingly prominent, suggesting a shift towards vocal-driven electronic music.

  • The overall balance between instrumental and vocal compositions has changed, with lyric-based narratives gaining a stronger foothold in the genre, while instrumental tracks have seen a significant decline between 2022 and 2024.

This trend suggests a convergence between electronic and vocal-centric musical styles, potentially influenced by developments in popular music consumption patterns and the growing demand for more emotionally direct musical expressions.

2. Mood Data Analysis: A Shift Toward Darker, More Introspective Compositions

Over the last three years, there has been a noticeable shift in the emotional terrain of electronic music. Cyanite’s AI-generated mood classifications show an increase in darker, more ambiguous emotional tones and a decline in upbeat and joyful musical elements.

Key Findings:

  • Reduction in the prevalence of “happy” and “uplifting” moods.

  • Growth in moods classified as “mysterious,” “weird,” and “strange”, reflecting an increasing tendency toward introspection and abstraction.

  • Energetic and determined moods remain stable, indicating continuity in the genre’s dynamic core.

These findings align with broader sociocultural shifts, where uncertainty, complexity, and experimentation are becoming more prominent themes in contemporary artistic expression.

3. Genre Expansion and Increased Diversification 

One of the most significant discoveries pertains to the increasing diversification of genre influences. Our AI, which is capable of differentiating between thousands of genres, has identified a 40% increase in distinct genre influences between 2023 and 2024.

This increased hybridization implies that the limits of electronic music are opening up more and more, allowing for the incorporation of non-traditional influences into the genre.

Key Findings:

  • Techno and house music are losing ground to more experimental subgenres.

  • Subgenres such as Breakbeat, IDM, and bass music have gained prominence.

  • Genres previously outside the electronic domain—such as indie pop, shoegaze, and noise pop—are increasingly integrated into electronic compositions.

This genre fragmentation suggests that electronic music is moving toward greater stylistic pluralism, potentially leading to a subcultural diversification within the broader electronic music ecosystem.
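As a rough illustration of how such a genre-diversity comparison can be computed, the snippet below counts distinct genre tags per year on made-up data. The real analysis runs on Cyanite’s tags for the RA Top Tracks, so the numbers here are purely illustrative.

```python
# Minimal sketch of the year-over-year genre diversity comparison from a
# tag table. The data below is made up for illustration; the real analysis
# runs on Cyanite's genre tags for the RA Top Tracks of 2022-2024.
import pandas as pd

tags = pd.DataFrame({
    "year":  [2023, 2023, 2023, 2024, 2024, 2024, 2024],
    "track": ["a", "b", "c", "d", "e", "f", "g"],
    "genre": ["techno", "house", "techno", "idm", "breakbeat",
              "shoegaze", "house"],
})

# Count how many distinct genre influences appear per year ...
distinct_per_year = tags.groupby("year")["genre"].nunique()

# ... and express the change as a percentage increase.
growth = (distinct_per_year.loc[2024] / distinct_per_year.loc[2023] - 1) * 100
print(distinct_per_year.to_dict())                   # {2023: 2, 2024: 4}
print(f"{growth:.0f}% more distinct genres in 2024")  # 100% in this toy data
```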

Implications for the Future of Electronic Music

These findings have significant implications for artists, producers, and industry professionals seeking to understand and anticipate the trajectory of electronic music.

Key Takeaways:

  • The integration of vocals into electronic music is increasing, signaling a shift away from purely instrumental compositions.
  • Mood expressions are evolving, with a growing emphasis on introspection, complexity, and abstraction.
  • Electronic music is becoming increasingly hybrid, incorporating elements from a diverse range of musical traditions.
  • The rate of subgenre fragmentation is increasing, which raises questions about how electronic music communities and their audiences will develop in the future.

Future Research Directions

Given these findings, further research could explore:

  • The relationship between sociopolitical factors and musical mood shifts.
  • The extent to which AI-generated insights can predict future genre evolution.
  • How these trends correlate with streaming and consumption behaviors in digital music platforms.

Tagging Beyond Music Discovery – A Strategic Tool

Beyond pure music discovery, this data story highlights how tagging and metadata analysis are expanding into strategic decision-making. As previously discussed in the Synchblog, structured tagging not only helps with search and recommendation but also shapes business strategies.

For example, one German music publisher used Cyanite’s insights to identify a critical gap in their catalog: While epic and cinematic music remains highly relevant for sync licensing, they had almost none of it in their repertoire. By shifting from gut feeling to data-driven content acquisition, they were able to adjust their catalog strategy accordingly.

AI Data Analysis for labels, publishers, and music libraries:

Data-driven insights generally provide a competitive advantage by optimizing key business areas:

  • Strategic Content Acquisition: Identify gaps in the catalog (e.g., missing genres or moods) and align acquisitions with data-driven demand trends.

  • Licensing & Sync Optimization: Prioritize metadata tagging to improve discoverability and match content to industry needs (e.g., film, gaming, advertising).

  • Market Positioning & Trend Monitoring: Track shifts in listener preferences, adjust marketing strategies, and ensure the catalog aligns with emerging industry trends.

  • A&R & Artist Development: Use genre and mood insights to guide signings and support artists in exploring high-demand styles.

These insights help catalog owners make informed, strategic decisions, replacing gut feeling with actionable market data.


Conclusion

Cyanite’s AI data analysis of Resident Advisor’s Top Tracks (2022–2024) provides compelling evidence of a rapidly evolving electronic music landscape. With vocals becoming increasingly integral, emotional expressions growing darker, and genre boundaries dissolving, the industry is entering a phase of heightened complexity and innovation.

For artists, labels, and curators, understanding these shifts is crucial for adapting to the changing demands of audiences and staying at the forefront of musical development.

By leveraging advanced AI-driven music analysis, we can gain deeper insights into the intricate mechanisms shaping the future of sound.

PR: DAACI and Cyanite Announce Strategic Collaboration

Last updated on February 12th, 2025 at 05:14 pm

PRESS RELEASE

DAACI and Cyanite Announce Strategic Collaboration to Enhance AI-Driven Music Discovery and Adaptive Editing for the Sync Community

London/Berlin, February 12, 2025 – DAACI, a leader in adaptive AI music technology, and Cyanite, a pioneering provider of AI-powered music tagging and search solutions, are proud to announce a strategic partnership designed to advance the sync industry. Even Music Ally reported on it. The partnership aims to simplify creative sync by combining the task of finding the right song quickly with the delivery of perfectly edited music, cut directly to any video or gaming asset.

Connecting DAACI’s patented Natural Edits technology with Cyanite’s advanced tagging and search capabilities, this partnership offers an API-first approach to revolutionizing the workflows of music libraries, publishers, and sync platforms. These technologies allow platforms to streamline how sync professionals find, customize, and license tracks—all while ensuring seamless integration into existing infrastructures.

Natural Edits, part of DAACI’s Natural Series, enables platforms to offer their users customizable editing options for original tracks—such as looping background beds, precise cuts for specific narratives, or snippets for advertising. Coupled with Cyanite’s AI-driven tagging and search technology, platforms can empower their clients to efficiently discover and adapt music to meet a variety of creative briefs, from film and TV placements to advertising campaigns and gaming, reducing what was once a day-long process to a matter of minutes.

“Our partnership with DAACI is focused on enhancing what music platforms can offer to the sync community,” says Markus Schwarzer, CEO of Cyanite. “Through our API integration, we’re giving music libraries and publishers the ability to make music discovery and customization intuitive, scalable, and efficient for their clients. This is about enabling the industry to evolve while keeping workflows seamless.”

Dr. Joe Lyske, Co-founder of DAACI, adds, “The sync ecosystem thrives on innovation, and our partnership with Cyanite ensures that music platforms can adapt to the changing needs of their clients. With Natural Edits and Cyanite’s tagging and search capabilities, we’re providing music libraries and sync platforms with tools to unlock the full potential of their catalogs, offering tailored solutions for professionals in the field.”

“We’re excited to see this new collaboration between DAACI and Cyanite bring AI-driven music discovery and adaptive editing to the forefront,” said Jeff Perkins, CEO of Soundstripe, a client of both DAACI and Cyanite. “We’re always looking for ways to help content creators move faster while ensuring they have access to the highest-quality music. This partnership makes it even easier for our users to find and customize the perfect track in record time.”

This collaboration signifies a pivotal moment for the sync industry, where advanced AI capabilities are integrated directly into the platforms relied upon by sync professionals. By bridging music discovery and adaptive editing through API-driven solutions, DAACI and Cyanite are equipping the sync industry with the tools it needs to stay competitive and impactful.

ENDS

About DAACI:

DAACI develops next-gen smart and AI creative music tools.

Our series of patented technologies empower music makers to meet the rapidly growing demand for personalised music. Our technologies encompass tools that supercharge the creative process dynamically composing new music in real-time, and smart editing systems that seamlessly adapt existing tracks. Built by a world-class team of musicians and composers, DAACI’s technology is based on over 30 years of research. Incorporating a growing portfolio of 79 granted patents and supported by partnerships with the UKRI Centre for Doctoral Training in Artificial Intelligence and Music at Queen Mary University of London and the innovative Abbey Road Red incubator, DAACI is the go-to solution for creators who make and use music. With our series of pioneering plugins and tools, creators can benefit from our unique approach and our understanding of deep music theory, giving them multiple lifetimes’ worth of musical experience at their fingertips. For more information on DAACI: https://www.daacigroup.com

About Cyanite:

Cyanite helps music companies transform their catalogs into personalized, AI-powered music libraries with advanced search and recommendation features. Based in Mannheim and Berlin, Germany, Cyanite develops software that enables efficient keywording and music discovery for the entertainment and advertising industries. Trusted by leading companies like BMG, Epidemic Sound, and Warner Chappell, Cyanite offers API and no-code solutions to streamline music organization and discovery. Recognized with the VIA 2023 Award for Best New Music Business, Cyanite aims to become the universal intelligence that understands, connects, and recommends music globally.

Press Contacts:

Gemma Robinson

OLEX Communications

gemma@olexcommunications.co.uk

+44 7854 813 153

AI Search Tool for Music Publishing: Best 3 Ways

In the ever-evolving landscape of sync and music publishing, leveraging advanced technology is essential for staying competitive. Cyanite offers an AI search tool for music publishing – enhancing workflows and maximizing your catalog’s potential. 

Here are three of the best ways to utilize Cyanite as a music publisher.

1. Using Cyanite’s Web App as an Internal AI Search Tool for Sync

Cyanite’s web app can serve as an AI search tool for music publishing, allowing publishers to quickly locate the right tracks for sync briefs. This streamlines the entire creative sync process:

  • Leverage reference tracks: Use reference tracks through Cyanite’s Similarity Search to swiftly scan your catalog for songs with similar sounds and vibes.

  • Utilize Free Text Search: Enter full briefs, scene descriptions, and other prompts (find examples here) to discover suitable music.

  • Enhance Your Pitches with Visualizations: Enrich your presentations with objective data visualizations to persuade even the most data-driven clients.

All of this not only saves time but also lets anyone on your team quickly work with your entire repertoire. It increases the likelihood of securing sync placements and strengthens your company’s reputation for finding surprising yet fitting songs.

With the help of Cyanite’s AI tags and the outstanding search results, we were able to find forgotten gems and give them a new life in movie productions. Without Cyanite, this might never have happened.

Miriam Rech

Sync Manager, Meisel Music

2. Enriching Your DISCO Library or Source Audio with Cyanite Tags

Integrating Cyanite’s tagging capabilities into your DISCO or Source Audio library can significantly enhance your catalog’s discoverability. By automatically tagging tracks with detailed descriptors such as mood, tempo, genre, and lyrical themes, Cyanite enriches your library with objective and consistent language. This ensures you, your team, and your clients find the right music.

This enriched tagging not only improves the user experience but also increases the chances of placements by ensuring that the right tracks are easily searchable. Furthermore, providing your team with tools that deliver meaningful insights contributes to improved employee satisfaction, making their work more efficient and enjoyable.

Read more on how to upload Cyanite tags to your DISCO and Source Audio library.

When integrating catalogs from new signings, acquisitions, or sub-publishing deals, using Cyanite ensures we have consistent & unified tagging across all of our repertoire, regardless of its origin. Both on and off DISCO.

Aaron Mendelsohn

Digital Asset Manager, Reservoir Media

3. Leveraging Music CMS with Cyanite

Cyanite seamlessly integrates with various music content management systems (CMS) such as Reprtoire, Synchtank, Cadenzabox, and Harvest Media, providing music publishers with an AI search tool within their preferred platforms. This integration streamlines catalog management and enhances search functionalities, allowing publishers to efficiently find and manage their music assets.

Cyanite also offers an API for publishers who have developed their own software solutions. This enables direct access to our powerful AI music search features, allowing for customized integration and automation tailored to specific business needs.
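For orientation, a minimal sketch of what such an integration could look like follows. The endpoint URL, query shape, and field names are assumptions made for this example, not Cyanite’s documented API – the official API documentation is the reference for a real integration.

```python
# Minimal sketch of calling a music-search API from a publisher's own
# software. The endpoint, query shape, and field names are illustrative
# assumptions for this example, not Cyanite's documented schema -- consult
# the official API docs before integrating.
import os
import requests

API_URL = "https://api.example-music-search.ai/graphql"  # placeholder endpoint
API_TOKEN = os.environ["MUSIC_SEARCH_API_TOKEN"]          # keep credentials out of code

QUERY = """
query FreeTextSearch($text: String!, $first: Int!) {
  freeTextSearch(text: $text, first: $first) {
    edges { node { id title score } }
  }
}
"""

response = requests.post(
    API_URL,
    json={
        "query": QUERY,
        "variables": {"text": "uplifting indie track for a road-trip ad", "first": 10},
    },
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=30,
)
response.raise_for_status()

for edge in response.json()["data"]["freeTextSearch"]["edges"]:
    print(edge["node"]["title"], edge["node"]["score"])
```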

By leveraging these integration options, music publishers can optimize their workflows, generate data-driven insights, and respond swiftly to client demands, ultimately enhancing their overall operational efficiency.

We are committed to using AI technologies to optimize our revenues so we can speed the flow of royalties to artists and songwriters. We are delighted to be working with Cyanite to enhance our Synch services.

Gaurav Mittal

CTO, BMG

Conclusion

Cyanite’s AI search tool for music publishing & sync offers publishers powerful tools to optimize their workflows, enhance catalog discoverability, and improve sync licensing processes. By using Cyanite’s web app for internal searches, enriching DISCO and Source Audio libraries with AI-generated tags, and leveraging the CMS or API for seamless integration, publishers can stay ahead in a competitive industry.

Contact us today to learn more about our services and explore the opportunity to try Cyanite for free—no strings attached.

AI Music Search Algorithms: Gender Bias or Balance?

This is part 1 of 2. To dive deeper into the data we analyzed, click here to check out part 2.

Gender Bias in AI Music: An Introduction

Gender Bias in AI Music Search is often overlooked. With the upcoming release of Cyanite 2.0, we aim to address this issue by evaluating gender representation in AI music algorithms, specifically comparing male and female vocal representation across both our current and updated models.

Finding music used to be straightforward: you’d search by artist name or song title. But as music catalogs have grown, professionals in the industry need smarter ways to navigate vast libraries. That’s where Cyanite’s Similarity Search comes in, offering an intuitive way to discover music using reference tracks. 

In our evaluation, we want to focus not only on perceived similarity but also on the potential gender bias of our algorithm. In other words, we want to ensure that our models not only meet qualitative standards but are also fair—especially when it comes to gender representation.

In this article, we evaluate both our currently deployed algorithm, Cyanite 1.0, and the upcoming Cyanite 2.0 to see how they perform in representing artists of different genders, using a method called propensity score estimation.

Cyanite 2.0, scheduled for November 1st, 2024, will bring an updated version of Cyanite’s Similarity and Free Text Search, scoring higher in blind tests that measure the similarity of recommended tracks to the reference track.

Why Gender Bias and Representation Matter in Music AI

In machine learning (ML), algorithmic fairness ensures automated systems aren’t biased against specific groups, such as by gender or race. For music, this means that AI music search should equally represent both male and female artists when suggesting similar tracks.

An audio search algorithm can sometimes exhibit gender bias as an outcome of a Similarity Search. For instance, if an ML model is trained predominantly on audio tracks with male vocals, it may be more likely to suggest audio tracks that align with traditionally male-dominated artistic styles and themes. This can result in the underrepresentation of female artists and their perspectives.

The Social Context Behind Artist Representation

Music doesn’t exist in a vacuum. Just as societal biases influence various industries, they also shape music genres and instrumentation. Certain instruments—like the flute, violin, and clarinet—are more often associated with female artists, while the guitar, drums, and trumpet tend to be dominated by male performers. These associations can extend to entire genres, like country music, where studies have shown a significant gender bias, with a decline in female artist representation on radio stations over the past two decades.

What this means for AI Music Search models is that if they aren’t built to account for these gendered trends, they may reinforce existing gender and other biases, skewing the representation of female artists.

How We Measure Fairness in Similarity Search

At Cyanite, we’ve worked to make sure our Similarity Search algorithms reflect the diversity of artists and their music. To do this, we regularly audit and update our models to ensure they represent a balanced range of artistic expressions, regardless of gender.

But how do we measure whether our models are fair? That’s where propensity score estimation comes into play.

What Are Propensity Scores?

In simple terms, propensity scores measure the likelihood of a track having certain features—like specific genres or instruments—that could influence whether male or female artists are suggested by the AI. These scores help us analyze whether our models are skewed toward one gender when recommending music.

By applying propensity scores, we can see how well Cyanite’s algorithms handle gender bias. For example, if rock music and guitar instrumentation are more likely to be associated with male artists, we want to ensure that our AI still fairly recommends tracks with female vocals in those cases.
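As a rough sketch, propensity scores of this kind can be estimated with a simple classifier over genre and instrumentation features. The toy data and feature names below are made up for illustration and are not Cyanite’s production pipeline, which works on much richer audio embeddings.

```python
# Minimal sketch of propensity score estimation: predict the probability
# that a track features female vocals from genre/instrument features alone.
# The toy data and feature names are made up for illustration.
import pandas as pd
from sklearn.linear_model import LogisticRegression

tracks = pd.DataFrame({
    "genre_rock":    [1, 1, 0, 0, 0, 1, 0, 0],
    "genre_pop":     [0, 0, 1, 1, 0, 0, 1, 1],
    "has_guitar":    [1, 1, 0, 0, 1, 1, 0, 0],
    "has_piano":     [0, 0, 1, 1, 0, 0, 1, 1],
    "female_vocals": [0, 0, 1, 1, 0, 1, 1, 0],  # observed label
})

X = tracks.drop(columns="female_vocals")
y = tracks["female_vocals"]

# The fitted probabilities are the propensity scores: how likely a track
# with these genre/instrument features is to carry female vocals.
model = LogisticRegression().fit(X, y)
tracks["propensity"] = model.predict_proba(X)[:, 1]
print(tracks[["female_vocals", "propensity"]])
```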

[Figure: Bar chart comparing average female vocal presence across Cyanite 1.0 (blue) and Cyanite 2.0 (green), binned by likelihood of female vocals, with a dashed purple line at 50% marking gender parity.]

Picture 1: We aim for gender parity in each bin, meaning the percentage of tracks with female vocals should be approximately 50%. The closer we are to that horizontal purple dashed line, the better our algorithm performs in terms of gender fairness.

Comparing Cyanite 1.0 and Cyanite 2.0

To evaluate our algorithms, we created a baseline model that predicts the likelihood of a track featuring female vocals, relying solely on genre and instrumentation data. This gave us a reference point to compare with Cyanite 1.0 and Cyanite 2.0.

Take a blues track featuring a piano. Our baseline model would calculate the probability of female vocals based only on these two features. However, this model struggled with fair gender representation, particularly for female artists in genres and instruments dominated by male performers. The lack of diverse gender representation in our test dataset for certain genres and instruments made it difficult for the baseline model to account for societal biases that correlate with these features.

The Results

The baseline model significantly underestimated the likelihood of female vocals in tracks with traditionally male-associated characteristics, like rock music or guitar instrumentation. This shows the limitations of a model that only considers genre and instrumentation, as it lacks the capacity to handle high-dimensional data, where multiple layers of musical features influence the outcome.

In contrast, Cyanite’s algorithms utilize rich, multidimensional embeddings to make more meaningful connections between tracks, going beyond simple genre and instrumentation pairings. This allows our models to provide more nuanced and accurate predictions.

Despite its limitations, the baseline model was useful for generating a balanced test dataset. By calculating likelihood scores, we paired male vocal tracks with female vocal tracks that had similar characteristics using a nearest-neighbour approach. This helped eliminate outliers, such as male vocal tracks without clear female counterparts, and resulted in a balanced dataset of 2,503 tracks with male and female vocals equally represented.

When we grouped tracks into bins based on the likelihood of female vocals, our goal was a near-equal presence of female vocals across all bins, with 50% representing the ideal gender balance. We conducted this analysis for both Cyanite 1.0 and Cyanite 2.0.
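The pairing and binning steps could look roughly like the sketch below, again on made-up data: female-vocal tracks are matched to the male-vocal tracks with the closest propensity scores, and the share of female vocals is then checked per probability bin against the 50% parity target. The matching here is simplified (with replacement) and purely illustrative of the idea.

```python
# Minimal sketch of the balancing and parity check on made-up data:
# pair each female-vocal track with the male-vocal track whose propensity
# score is closest, then measure the female share per probability bin.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "female_vocals": rng.integers(0, 2, 400),
    "propensity":    rng.uniform(0, 1, 400),  # scores from the previous step
})

female = df[df["female_vocals"] == 1].reset_index(drop=True)
male = df[df["female_vocals"] == 0].reset_index(drop=True)

# Nearest-neighbour matching on the propensity score (1-D, with replacement
# for simplicity; the real pairing is stricter).
matched_male_idx = [
    (male["propensity"] - p).abs().idxmin() for p in female["propensity"]
]
balanced = pd.concat([female, male.loc[matched_male_idx]], ignore_index=True)

# Group into probability bins and check how close each bin is to 50% parity.
balanced["bin"] = pd.cut(balanced["propensity"], bins=np.linspace(0, 1, 6))
parity = balanced.groupby("bin", observed=True)["female_vocals"].mean() * 100
print(parity.round(1))  # ideally each bin sits near 50%
```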

The results were clear: Cyanite 2.0 produced the fairest and most accurate representation of both male and female artists. Unlike the baseline model and Cyanite 1.0, which showed fluctuations and sharp declines in female vocal predictions, Cyanite 2.0 consistently maintained balanced gender representation across all probability ranges.

For a deeper explanation of how propensity scores can help address gender bias in AI music and balance the gender gap, check out part 2 of this article.

Conclusion: A Step Towards Fairer Music Discovery

Cyanite’s Similarity Search has applications beyond ensuring gender fairness. It helps professionals to:

  • Use reference tracks to find similar tracks in their catalogs.
  • Curate and optimize playlists based on similarity results.
  • Increase the overall discoverability of a catalog.

Our comparative evaluation of artist gender representation highlights the importance of algorithmic fairness in music AI. With Cyanite 2.0, we’ve made significant strides in delivering a balanced representation of male and female vocals, making it a powerful tool for fair music discovery.

However, it’s crucial to remember that societal biases—like those seen in genres and instrumentation—don’t disappear overnight. These trends influence the data that AI music search models and genAI models are trained on, and we must remain vigilant to prevent them from reinforcing existing inequalities.

Ultimately, providing fair and unbiased recommendations isn’t just about gender—it’s about ensuring that all artists are represented equally, allowing catalog owners and music professionals to explore the full spectrum of musical talent. At Cyanite, we’re committed to refining our models to promote diversity and inclusion in music discovery. By continuously improving our algorithms and understanding the societal factors at play, we aim to create a more inclusive music industry—one that celebrates all artists equally.

If you’re interested in using Cyanite’s AI to find similar songs or learn more about our technology, feel free to reach out via mail@cyanite.ai.

You can also try our free web app to analyze music and experiment with similarity searches without needing any coding skills.