Why AI labels and metadata now matter in licensing

A new industry report from Cyanite, MediaTracks, and Marmoset reveals how professionals are navigating the rise of AI-generated music. Read here.

AI’s move to the mainstream has changed what people expect from music catalogs. Licensing teams now look for clearer data about the music they review. They want to know whether it’s human-made or AI-generated, and they also look for details that help place the music in the right creative or cultural setting. Many check these cues first, then move on to mood or tone.

At Cyanite, we partnered with MediaTracks and Marmoset to understand the level of transparency and cultural context music licensing professionals expect when reviewing AI-generated music. MediaTracks and Marmoset surveyed 144 people across their professional communities—including music supervisors, filmmakers, advertisers, and producers—and we worked with them to interpret the findings and publish this report.

The responses revealed that most people want clear labeling when AI is involved. Yet despite this shared desire for transparency, only about half of respondents said they would work exclusively with human-made music.

The full study goes deeper into these findings and shows how they play out in real licensing work.

Why we ran this study

We wanted a clear view of how people make decisions when AI enters the picture. The conversation around AI in music moves fast, and many teams now ask for context that helps them explain their selections to clients. This study aimed to find out which parts of the metadata give them that confidence.

It also looked at how origin details and creator context guide searches and reviews. We wanted to see where metadata supports the day-to-day licensing process and where there are gaps.

Transparency is now a baseline expectation

97% of respondents said they want AI-generated music to be clearly labeled, and 37% used the word “transparency” in their written responses. They want a straightforward read on what they’re listening to. Some tied this to copyright worries. One person put it simply: 

“I’m concerned that if it were AI-generated, where did the AI take the themes or phrases from? Possible copyright infringement issues.”

Transparency doesn’t just apply to the AI label. We found that respondents also see context as part of that clarity—knowing who made the music and where it comes from. This information helps them assess whether the music is a good fit for the project. They use it during searches to filter for cultural background or anything else that’s relevant to the brief.

What these findings mean for the industry

These findings show how much clarity now shapes day-to-day work in music catalogs. People expect AI music to be labeled as such, and they lean on context to move through searches and briefs without second-guessing their choices. Human-made music is still highly valued. The real change is in how teams use origin details to feel confident about their selections.

This sets a new bar for how catalogs present their music. Teams want dependable information, including context that helps them avoid missteps in projects that depend on cultural accuracy or narrative alignment.

This finding ties into how Cyanite supports catalogs today. Our audio-first analysis gives people a clear read of the music itself, which sits alongside the cultural or creative context they already rely on. It helps teams search with more clarity and meet the expectations that are now shaping the industry.

How Cyanite’s advanced search fits in

The study showed how important cultural background and creator context are when people review music. Teams often keep their own notes and metadata for this reason. Cyanite’s Advanced Search supports that need by letting catalogs add and use their own custom information in the search.

Custom Metadata Upload – one of many features of our new Advanced Search – lets you upload your own tags, such as cultural or contextual details that don’t come from the audio analysis, and use them as filters. You can set your own metadata criteria first, and the system will search only within the tracks that match those inputs.

When you then run a Similarity or Free Text Search, the model evaluates musical similarity inside that filtered subset. As a result, search and discovery reflect both the sound of a track and the context around it.

You can search your catalog for “upbeat indie rock,” but you can also search for “upbeat indie rock, human-produced, female-led, one-stop cleared, independent.”
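
To make that two-stage behavior concrete, here is a minimal sketch in Python. It is purely illustrative – the track fields, filter keys, and cosine ranking below are stand-ins invented for this example, not Cyanite’s actual data model or API.

```python
# Conceptual sketch only – not Cyanite's implementation or API.
# Step 1: keep only tracks whose custom metadata matches the filters.
# Step 2: rank the remaining tracks by musical similarity to the query.
from dataclasses import dataclass, field

@dataclass
class Track:
    title: str
    embedding: list[float]                       # stand-in for an audio embedding
    custom: dict = field(default_factory=dict)   # your own uploaded metadata

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    norm = (sum(x * x for x in a) ** 0.5) * (sum(y * y for y in b) ** 0.5)
    return dot / norm if norm else 0.0

def advanced_search(catalog, query_embedding, metadata_filters, limit=10):
    subset = [t for t in catalog
              if all(t.custom.get(k) == v for k, v in metadata_filters.items())]
    subset.sort(key=lambda t: cosine(t.embedding, query_embedding), reverse=True)
    return subset[:limit]

catalog = [
    Track("Song A", [0.9, 0.1], {"origin": "human-produced", "clearance": "one-stop"}),
    Track("Song B", [0.8, 0.2], {"origin": "ai-generated", "clearance": "one-stop"}),
]
# Only human-produced, one-stop cleared tracks are even considered for ranking.
results = advanced_search(catalog, [1.0, 0.0],
                          {"origin": "human-produced", "clearance": "one-stop"})
print([t.title for t in results])   # -> ['Song A']
```

The point is the order of operations: your custom metadata narrows the pool first, and musical similarity is only computed inside that pool.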

Read the full report

The survey drew responses from people who license music often as part of their work and feel the impact of unclear metadata. Their answers show how they think about AI involvement, creator background, and the context they need when they search.

The full report brings these findings together with information about the study—who took part, how often they search, the questions they answered, and how responses differed by role. It also includes partner insights from MediaTracks and Marmoset, along with charts and quotes that show how transparency and context shape real choices in licensing.

You can read the full study here.

The Power of Automatic Music Tagging with AI

Ready to transform how you manage your catalog? Start auto-tagging music with Cyanite

We know managing a large music catalog can feel overwhelming. When metadata is inconsistent or incomplete, tracks become difficult to find and hard to work with. The result is a messy catalog that you have to sort out manually—unless you use AI auto-tagging.

Read more to see how automatic music tagging reduces friction and helps you organize your catalog more accurately.

What is automatic music tagging?

Automatic music tagging is an audio analysis process that identifies a song’s mood, genre, energy, tempo, instrumentation, and other core attributes. A music analyzer AI listens to the track and applies these labels with consistent logic, providing stable metadata across your catalog.

AI tagging supports teams that depend on fast, accurate search. For example, if you run a sync marketplace that needs to respond to briefs quickly, you can surface the right tracks in seconds when the metadata aligns with the sound you’re looking for. If you work at a production library with thousands of incoming submissions, you can review new material more efficiently when the system applies consistent labels from day one. The same applies to music-tech platforms that want stronger discovery features without having to build their own models.

Benefits of auto-tagging for music professionals

AI auto-tagging brings value across the music industry. When tracks enter your system with clear, predictable metadata, teams can work with more confidence and fewer bottlenecks, supporting smoother catalog operations overall.

  • Faster creative exploration: Sync and production teams can filter and compare tracks more quickly during pitches, making it easier to deliver strong options under time pressure.

  • More reliable handoffs between teams: When metadata follows the same structure, creative, technical, and rights teams work from the same information without needing to reinterpret tags.

  • Improved rights and version management: Publishers benefit from predictable metadata when preparing works for licensing, tracking versions, and organizing legacy catalogs.

  • Stronger brand alignment in audio branding: Agencies working on global campaigns can rely on mood and energy tags that follow the same structure across regions, helping them maintain a consistent brand identity.

  • Better technical performance for music platforms: When metadata is structured from the start, product and development teams see fewer ingestion issues, more stable recommendations, and smoother playlist or search behavior.

  • Greater operational stability for leadership: Clear, consistent metadata lowers risk, supports scalability, and gives executives more confidence in the long-term health of their catalog systems.

Why manual music tagging fails at scale

There’s a time and a place for running a music catalog manually: if your track selection is small and your team has the capacity to listen to each song one by one and label it carefully. But as your catalog grows, that process starts to break down.

Tags can vary from person to person, and different editors will likely use different wording. Older metadata rarely matches newer entries. Some catalogs even carry information from multiple systems and eras, which makes the data harder to trust and use.

Catalog managers are not the only ones feeling this pain. This inconsistent metadata slows down the search for creative teams. Developers are also affected when this unreliable data disrupts user-facing recommendation and search features. So the more music you manage, the more this manual-tagging bottleneck grows.

When human collaboration is still needed

While AI can provide consistent metadata at scale, creative judgment still matters. People add the cultural context and creative insight that go beyond automated sound analysis. Also, publishers sometimes adapt tags for rights considerations or for more targeted sync opportunities.

The goal of AI auto-tagging is not to replace human input, but to give your team a stable foundation to build on. With accurate baseline metadata, you can focus on adding the context that carries strategic or commercial value.

Cyanite has maybe most significantly improved our work with its Similarity Search that allows us to enhance our searches objectively, melting away biases and subjective blind spots that humans naturally have.

William Saunders

Co-Owner & Creative Director, MediaTracks

How does AI music tagging work at Cyanite?

At Cyanite, our approach to music analysis is fully audio-based. When you upload a track, our AI analyzes only the sound of the file—not the embedded metadata. Our model listens from beginning to end, capturing changes in mood, instrumentation, and energy across the full duration.

We start by converting the MP3 audio file into a spectrogram, which turns the sound into a visual pattern of frequencies over time. This gives our system a detailed view of the track’s structure. From there, computer vision models analyze the spectrogram to detect rhythmic movement, instrument layers, and emotional cues across the song. After the analysis, the model generates a set of tags that describe these characteristics. We then refine the output through post-processing to keep the results consistent, especially when working with large or fast-growing catalogs.
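
As a rough illustration of that first step, the snippet below converts an audio file into a log-scaled mel spectrogram using the open-source librosa library. It is a generic sketch, not our internal pipeline, and the parameter choices are arbitrary.

```python
# Illustrative only – a generic spectrogram conversion, not Cyanite's pipeline.
import librosa
import numpy as np

def audio_to_spectrogram(path: str) -> np.ndarray:
    y, sr = librosa.load(path, sr=22050, mono=True)              # decode the audio file
    mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=128)  # frequencies over time
    return librosa.power_to_db(mel, ref=np.max)                   # log-scaled "image" of the sound

spec = audio_to_spectrogram("track.mp3")
print(spec.shape)  # (mel bands, time frames) – the kind of 2D input a vision model can analyze
```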

This process powers our music tagging suite, which includes two core products:

  • Auto-Tagging: identifies core musical attributes such as genre, mood, instrumentation, energy level, movement, valence–arousal position, and emotional dynamics. Each label is generated through consistent audio analysis, which helps maintain stable metadata across new and legacy material.

  • Auto-Descriptions: complement tags with short summaries that highlight the track’s defining features. These descriptions are created through our own audio models, without relying on any external language models. They give you an objective snapshot of how the music sounds, which supports playlisting, catalog review, and licensing workflows that depend on fast context.

Inside Cyanite’s tagging taxonomy

Here’s a taste of the insights our music auto-tagging software can generate for you: 

  • Core musical attributes: BPM, key, meter, voice gender
  • Main genres and free genre tags: high-level and fine-grained descriptors
  • Moods and simple moods: detailed and broad emotional categories
  • Character: the expressive qualities related to brand identity
  • Movement: the rhythmic feel of the track
  • Energy level and emotion profile: overall intensity and emotional tone
  • Energy and emotional dynamics: how intensity and emotion shift over time
  • Valence and arousal: positioning in the emotional spectrum
  • Instrument tags and presence: what instruments appear and how consistently
  • Augmented keywords: additional contextual descriptors
  • Most significant part: the 30-second segment that best represents the song
  • Auto-Description: a concise summary created by Cyanite’s models
  • Musical era: a high-level temporal categorization

Learn more: Check out our full auto-tagging taxonomy here.
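
To picture how these categories come together for a single track, here is a hypothetical output shape in Python. The field names and values are invented for illustration and do not mirror Cyanite’s exact schema.

```python
# Hypothetical example of per-track tag output – field names and values are
# illustrative only and do not reflect Cyanite's exact schema.
track_metadata = {
    "bpm": 112,
    "key": "A minor",
    "voice_gender": "male",
    "genres": ["funk", "disco"],
    "moods": ["uplifting", "groovy"],
    "energy_level": "high",
    "instruments": ["bass", "drums", "electric guitar"],
    "valence_arousal": {"valence": 0.7, "arousal": 0.6},
    "most_significant_part": {"start_sec": 45, "end_sec": 75},
    "auto_description": "An energetic, groove-driven track with warm bass and bright vocals.",
}
```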

To show how these elements work together, we analyzed Jungle’s “Back On 74” using our auto-tagging system. The table below reflects the exact values our model generated.

Visualization of an auto-tagging example song.

Step-by-step Cyanite Auto-Tagging integration

You can get started with Cyanite through our web app or by connecting directly to the Auto-Tagging API. The process is straightforward and designed to fit into both creative and technical workflows.

1. Sign up and verify your account

  • Create a Cyanite account and verify your email address.
  • Verification is required before you can create an integration or work with the API.
  • Once logged in, you’ll land in the Library view, where all uploaded tracks appear with their generated metadata.
A screenshot of a music library with tags

2. Upload your music

You can add music to your Library by:

  • Dragging MP3 files into the Library
  • Clicking Select files to browse your device
  • Pasting a YouTube link and importing the audio

Analysis starts automatically, and uploads are limited to 15 minutes per track.

A screenshot of a music library with an upload window

3. Explore your tags in the web app

Once a file is processed, you can explore all its tags inside the Library. In this view, you can discover:

  • Your songs’ full tag output
  • The representative segment, full-track view, or a custom interval
  • Similarity Search with filters for genre, BPM, or key
  • Quick navigation through your catalog using the search bar

This helps you evaluate your catalog quickly before integrating the results into your own systems.


4. Create an API integration (for scale and automation)

If you want to connect Cyanite directly to your internal tools, you can set up an API integration. Just note that coding skills are required at this stage.

  1. Open the Web App dashboard.
  2. Go to Integrations.
  3. Select Create New Integration.
  4. Select a title.
  5. Fill out the webhook URL and generate or create your own webhook secret.
  6. Click the Create Integration button.

After you create the integration, we generate two credentials:

  • Access token: used to authenticate API requests
  • Webhook secret: used to verify incoming events

Test your integration credentials by following this link.

You must store the access token and webhook secret securely. You can regenerate credentials at any time, but lost credentials cannot be retrieved.

Pro tip: We have a sample integration available on GitHub to help you get started.
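
For orientation, here is a minimal sketch of a webhook endpoint that verifies incoming events with the webhook secret before trusting them. The header name and HMAC scheme are assumptions made for this example – the sample integration and the API documentation show the exact format.

```python
# Minimal webhook receiver sketch (Flask). The "Signature" header name and
# HMAC-SHA512 scheme are assumptions for illustration – check the Cyanite docs
# and sample integration for the exact verification format.
import hashlib
import hmac
import os

from flask import Flask, abort, request

app = Flask(__name__)
WEBHOOK_SECRET = os.environ["CYANITE_WEBHOOK_SECRET"]  # store securely, never hard-code

@app.post("/cyanite-webhook")
def cyanite_webhook():
    received = request.headers.get("Signature", "")
    expected = hmac.new(WEBHOOK_SECRET.encode(),
                        request.get_data(), hashlib.sha512).hexdigest()
    if not hmac.compare_digest(received, expected):
        abort(401)                      # reject events that don't match the secret
    event = request.get_json()          # the completed analysis payload
    print("Received analysis result:", event)
    return "", 204
```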

5. Start sending audio to the API

  • Use your access token to send MP3 files to Cyanite for analysis.

Pro tip: For bulk uploads (>1,000 files), we recommend using an S3 bucket upload to speed up ingestion.
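
As a sketch of what this step can look like in code, the snippet below shows token-based authentication against the API. The endpoint and the placeholder query are assumptions for illustration – follow the API documentation for the real requests that upload files and start analyses.

```python
# Sketch only – authenticating requests with your access token. The URL and
# request body below are placeholders, not the documented endpoint or schema.
import requests

ACCESS_TOKEN = "YOUR_INTEGRATION_ACCESS_TOKEN"   # from the integration you created

def call_cyanite_api(payload: dict) -> dict:
    response = requests.post(
        "https://api.cyanite.ai/graphql",                     # assumed endpoint
        json=payload,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},  # token-based auth
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Placeholder smoke-test request – replace with the documented calls for
# uploading a file and enqueueing its analysis.
print(call_cyanite_api({"query": "{ __typename }"}))
```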

6. Receive your tagging results

  • Your webhook receives the completed metadata as soon as the analysis is finished.
  • If needed, you can also export results as CSV or spreadsheet files.
  • This makes it easy to feed the data into playlisting tools, catalog audits, licensing workflows, or internal search systems.

7. Start using your metadata

Once results are flowing, you can integrate them into the workflows that matter most:

  • Search and recommendation tools
  • Catalog management systems
  • Playlist and curation workflows
  • Rights and licensing operations
  • Sync and creative pipelines
  • Internal music discovery dashboards

Read more: Check out Cyanite’s API documentation

Auto-tag your tracks with Cyanite

AI auto-tagging helps you bring structure and consistency to your catalog. By analyzing the full audio, our models capture mood changes, instrumentation, and energy shifts that manual tagging often misses. The result is metadata you can trust across all your songs.

Our tagging system is already widely adopted; over 150 companies are using it, and more than 45 million songs have been tagged. The system gives teams the consistency they need to scale their catalogs smoothly, reducing manual cleanup, improving search and recommendation quality, and giving you a clearer view of what each track contains.

If you want to organize your catalog with more accuracy and less effort, start tagging your tracks with Cyanite.

FAQs

Q: What is a tag in music?

A: A tag is metadata that describes how a track sounds, such as its mood, genre, energy, or instrumentation. It helps teams search, filter, and organize music more efficiently.

Q: How do you tag music automatically?

A: Automatic tagging uses AI trained on large audio datasets. The model analyzes the sound of the track, identifies musical and emotional patterns, and assigns metadata based on what it hears.

Q: What is the best music tagger?

A: The best auto-tagging music software is the one that analyzes the full audio and delivers consistent results at scale. Cyanite is widely used in the industry because it captures detailed musical and emotional attributes directly from the sound and stays reliable across large catalogs.

Q: How specific can you get when tagging music with Cyanite?

A: Cyanite captures detailed attributes such as mood, simple mood, genre, free-genre tags, energy, movement, valence–arousal, emotional dynamics, instrumentation, and more. Discover the full tagging taxonomy here.

What is Music Prompt Search? ChatGPT for music?

How Music Prompt Search Works & Why It’s Only Part of the Puzzle

Alongside our Similarity Search, which recommends songs that are similar to one or many reference tracks, we’ve built an alternative to traditional keyword searches. We call it Free Text Search – our prompt-based music search. 

Imagine describing a song before you’ve even heard it:

Dreamy, with soft piano, a subtle build-up, and a bittersweet undertone. Think rainy day reflection.

This is the kind of prompt that Cyanite can turn into music suggestions – not based on genre or mood tags, but on the actual sound of the music. 

Music Prompt Search Example with Cyanite’s Free Text Search

What Is Music Prompt Search?

Prompt search allows you to enter a natural language description (e.g. uplifting indie with driving percussion and a nostalgic feel) and get back music that matches that idea sonically. 

We developed this idea in 2021 and were the first to launch a music search based on pure text input in 2022. Since then, we’ve been improving and refining this kind of AI-powered search so that it accurately translates text into sound. That way, you get the closest result to your prompt that your catalog allows.

We are not searching for certain keywords that appear in a search. We directly map text to music. We make the system understand which text description fits a song. This is what we call Free Text Search.

Roman Gebhardt

CAIO & Founder, Cyanite
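
A toy sketch of that idea: the prompt and every track live in the same embedding space, and results are ranked by how close they sit to the prompt. The vectors below are made up for illustration – in the real system, dedicated in-house models produce them from the prompt text and from the audio itself.

```python
# Toy illustration of prompt-to-music matching in a shared embedding space.
# The vectors are invented; real embeddings come from dedicated text and audio models.
import numpy as np

def rank_by_prompt(prompt_vec: np.ndarray,
                   tracks: dict[str, np.ndarray]) -> list[tuple[str, float]]:
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(title, cosine(prompt_vec, vec)) for title, vec in tracks.items()]
    return sorted(scored, key=lambda item: item[1], reverse=True)

prompt_vec = np.array([0.8, 0.2, 0.1])           # e.g. "dreamy, soft piano, bittersweet"
tracks = {
    "rainy_day_piano.mp3": np.array([0.7, 0.3, 0.1]),
    "festival_techno.mp3": np.array([0.1, 0.2, 0.9]),
}
print(rank_by_prompt(prompt_vec, tracks))        # the piano track ranks first
```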

Built with ChatGPT? Not All Prompts Are Created Equal

More recently, other companies have entered the field of prompt-based music search, using large language models like ChatGPT as a foundation. These models are strong at interpreting natural language, but they cannot understand music the way we do.

They generate tags from the text input and then search those tags. So in reality, these algorithms work like a traditional keyword search – they merely translate natural-language prompts into keywords.

When Prompt Search Shines

Prompt search is a game-changer when:

  • You have a specific scene or mood in mind
  • You’re working with briefs from film, games, or advertising
  • You want to match the energy or emotional arc of a moment

This is ideal for music supervisors, marketers, and creative producers.

Note: Our Free Text Search just got better!

With our latest update, Free Text Search is now:

✅ Multilingual – use prompts in nearly any language

✅ Culturally aware – understands references like “Harry Potter” or “Mario Kart”

✅ Significantly more accurate and intuitive

It’s available free for all API users on V7 and for all web app accounts created after March 15. Older accounts can request access via email.

Why We Build Our Own Models

We chose to develop every model in-house – not only for data security and IP protection, but because music deserves a dedicated algorithm.

Few things are as complex and deep as the world of music. General-purpose AI doesn’t understand the nuance of tempo shifts, the subtle timbre of analog synths, or the emotional trajectory of a song.

Our models are trained on the sound itself. That means:

    • More precise results
    • Higher musical integrity
    • More confidence when recommending or licensing tracks

If you want to learn more about how our models work, check out this blog article and interview with our CAIO Roman Gebhardt.

Want to try our Free Text Search on your own music catalog?

Sync Music Matching with AI-powered Metadata | A Case Study with SyncMyMusic

Sync Music Matching with AI-powered Metadata | A Case Study with SyncMyMusic

The Problem

The sync licensing industry faces a fundamental information asymmetry problem. With hundreds of production music libraries operating globally, producers struggle to identify which companies are actively placing their style of music. Jesse Josefsson, a veteran of 10,000+ sync placements, identified this gap as a core market inefficiency.

Genres were wrong, moods were wrong. Just not even close to what I would think as acceptable answers for an auto tagging model.

Jesse Josefsson

Founder, SyncMyMusic

Key Challenges:

    • Producers pitching to inappropriate libraries for years without results
    • Manual research taking days or weeks per opportunity
    • Inaccurate tagging solutions creating more problems than they solve
    • Industry professionals “flying blind” when making strategic decisions

The Solution

“One of the members said it was so accurate, it was almost spooky because it got things and it labeled things that even they wouldn’t have probably thought of themselves.” – Jesse Josefsson

After evaluating multiple auto-tagging solutions, SyncMyMusic selected Cyanite based on accuracy standards and industry reputation. The platform architecture combines TV placement data with AI-powered music metadata analysis to deliver targeted recommendations.

Why Cyanite:

    • Industry-leading accuracy in genre and mood classification
    • Partnership credibility through SourceAudio integration
    • Responsive customer support with sub-2-hour response times
    • Seamless API integration capabilities

The Implementation

“I’m what they would probably call a ‘vibe coder’. I don’t have coding skills, but if I can do this, you can do this.” – Jesse Josefsson

Jesse built the entire SyncMatch platform using AI tutoring (ChatGPT/Grok) and automation tools (make.com) without traditional coding experience. The implementation took 2.5 months from concept to MVP, demonstrating how modern no-code approaches can deliver enterprise-grade solutions.


Cyanite Advanced Search (API only)

Ready to supercharge your discovery workflows? Try out the Advanced Search API.

We’re excited to introduce Advanced Search, the biggest upgrade to Similarity and Free Text Search since we launched. With this release, we’re offering a sneak preview of the power of the new Cyanite system.

Advanced Search brings next-level precision, scalability, and usability – all designed to supercharge your discovery workflows. From advanced filtering to more nuanced query controls, this feature is built for music teams ready to move faster and smarter.

Note: Advanced Search is an API-only feature intended for teams with developer resources who want to integrate Cyanite’s intelligence directly into their own systems.

Advanced Search Feature Overview

  • Multi-Track Search – multiple search inputs for playlist magic
  • Similarity Scores – total clarity, total control
  • Most Relevant Segments – zoom in on the best parts
  • Custom Metadata Filters – smarter searches start with smarter filters
  • Up to 500 Search Results – sometimes more is more

Similarity Scores: Total Clarity, Total Control

Now each result comes with a clear percentage score, helping you quickly evaluate how close a match really is—both for the overall track and for each top scoring segment. It’s a critical UX improvement that helps users better understand and trust the search results at a glance.

Most Relevant Segments – zoom in on the best parts

We’re not just showing you results, we’re showing you their strongest moments. Each track now highlights its Most Relevant Segments for both Similarity and Free Text queries. It’s an instant way to jump to the most relevant slice of content without scrubbing through an entire track. 

Custom Metadata Filters – smarter searches start with smarter filters

Upload your own metadata to filter results before the search even begins. Want only pre-cleared tracks? Looking for music released after 2020? With Custom Metadata Filtering, you can target exactly what you need, making your search dramatically more efficient.

Up to 500 Search Results – sometimes more is more

Tired of hitting a ceiling with limited search returns? Now, Similarity Search and Free Text Search deliver up to 500 results, giving you a much broader snapshot of what’s out there. Whether you’re refining a vibe or exploring diverse sonic textures, you’ll have a fuller landscape to work with.
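
Put together, a request that combines these options might look roughly like the sketch below. This is a hypothetical shape written as Python dictionaries, not the actual API schema – the field names are invented to show how filters, segment information, and the larger result limit fit together.

```python
# Hypothetical request/response shape – field names are invented for
# illustration and do not match the actual Advanced Search schema.
request = {
    "query": "upbeat indie rock",             # Free Text Search prompt
    "customMetadataFilters": {                # applied before similarity ranking
        "clearance": "one-stop",
        "release_year_after": 2020,
    },
    "limit": 500,                             # up to 500 results per search
}

example_result = {
    "trackId": "abc123",
    "similarityScore": 0.92,                  # overall closeness of the match
    "mostRelevantSegments": [                 # strongest moments for this query
        {"start_sec": 34, "end_sec": 64, "score": 0.95},
    ],
}
```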

Testing Advanced Search free for a month gave us the confidence we needed to update our search and tagging systems. The integration was smooth, and we were able to ship several exciting features right away – but we’ve only scratched the surface of its full capabilities!

Jack Whitis

CEO, Wavmaker

Ready to level up your catalog search?

Advanced Search introduces a more powerful way to work with your catalog. It is most useful for teams who already understand our core music discovery tools. If you have not yet tried Similarity Search or Free Text Search, sign up to Cyanite and start finding tracks that match the musical references or creative direction you’re working with. 

When you’re ready to take it a step further, explore a track’s strongest moments or enhance your metadata with custom tags using Advanced Search. Make sure you are operating on Cyanite’s v7 architecture, since it enables the full capabilities of the new system.