How To Prompt: The Guide to Using Cyanite’s Free Text Search

Ready to search your catalog in natural language? Try Free Text Search.

Do you have trouble translating your vision for music into precise keywords? If so, this guide on how to prompt using Cyanite’s Free Text Search is for you.

It’s a more natural way to search your music catalog and discover tracks. You can use complete sentences to describe soundscapes, film scenes, daily situations, activities, or environments. Prompts can be written in different languages and can include cultural references, so you’re not forced to reduce your idea to a fixed set of tags.

Before you explore what Free Text Search can do, keep in mind that prompt-based search works best when your input is specific. The clearer you are, the easier it is to find what you’re looking for. 

Read more: What is music prompt search?

Why music catalogs struggle with discovery

Most large catalogs contain inconsistent metadata. Many were built before modern tagging standards, then expanded over time through different workflows. New music arrives faster than metadata teams can standardize it, especially with the volume from UGC and AI-generated releases, while older tracks remain described in ways that don’t always support how music is searched for today.

Traditional search relies on tags and keyword logic. This approach can be effective for many searches, but it has limits when ideas are already highly specific, like with a detailed creative brief or a particular scene description. Translating concrete, nuanced needs into tags often loses critical details and context.

That’s where natural language search makes a difference. Instead of defining a specific vision in terms of available tags, you can describe what you need directly or even paste a brief into the search bar. The system interprets intent, mood, and context in ways that complement tag-based discovery.

This helps sync and licensing teams work faster with detailed requests, and gives catalog teams another tool to surface relevant music, especially from underused parts of the catalog.

Read more: How to use AI music search for your music catalog

How Free Text Search amplifies music discovery

Free Text Search lets you look for music in the way you would naturally describe it. Write detailed prompts in full sentences, and Cyanite’s AI interprets the meaning behind your words to match intent with how tracks actually sound in your catalog.

This type of search is designed for situations where intent doesn’t translate cleanly into keywords. Tag-based searches work well when attributes are fixed and clearly defined, and Similarity Search is useful when you already have a reference track and want to find music that sounds close to it. Teams often get good results when they search in their own words first, then move into other search modes to refine the selection.

How to use Free Text Search effectively

In real-life workflows, searches rarely begin from the same place. Sometimes you’ll start with sound, sometimes with a scene, and sometimes with context. 

Not every idea can be reduced to tags or tied to a specific track. Choosing music is a creative process, so the way people search is often creative too. Free Text Search meets users where they are, allowing them to describe intent in natural language and shape discovery around how they think. 

1. Describing sound

With Free Text Search, you can add context and even cultural references to your search, making it possible to find the perfect soundtrack for your project and get the most out of your music catalog. 

This approach is commonly used when responding to sync briefs that describe musical detail and tone.

Sound-focused prompts should name what musical elements are present, then add how those elements are played or arranged. An extra cue about character or attitude can be included when it helps clarify intent.

[Instruments or sound sources] + [how they are played or arranged] + [optional: character or stylistic cue]

  • “Trailer with sparse repetitive piano and dramatic drum hits with Star-Wars-style orchestra themes”
  • “Laid-back future bass with defiant female vocal”
  • “Staccato strings with a piano playing only single notes”
  • “Solo double bass played dramatically with a bow”

These prompts work because they are specific, but not rigid. That level of detail helps surface relevant tracks faster and reduces reliance on perfectly maintained tags, which is especially valuable in large or uneven catalogs.
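The prompt pattern above can be sketched as a small helper that assembles the three parts into one sentence. This is purely illustrative: the function and its argument names are made up for this example and are not part of Cyanite's product or API.

```python
def build_sound_prompt(sources, performance, character=None):
    """Assemble a sound-focused prompt from the template:
    [instruments or sound sources] + [how they are played or arranged]
    + [optional: character or stylistic cue]."""
    parts = [sources, performance]
    if character:
        parts.append(character)
    # Join the parts into a natural-language description, not a tag list.
    return " with ".join(parts)

prompt = build_sound_prompt(
    "staccato strings",
    "a piano playing only single notes",
)
print(prompt)  # staccato strings with a piano playing only single notes
```

Keeping the prompt as one readable sentence (rather than a stack of keywords) matches how Free Text Search is designed to interpret input.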

Common mistakes to avoid

  • Staying too abstract: Words like “cinematic” or “emotional” on their own don’t give enough information to form a clear sound.
  • Listing elements without context: Naming instruments or genres without describing how they are played or arranged often leads to broad results.
  • Overloading the prompt: Packing too many ideas into one sentence can blur intent and pull results in different directions.
  • Writing like a tag list: Free Text Search works best when the prompt reads like a description, not a stack of keywords.

Read more: AI search tool for music publishing: best 3 ways

2. Describing film scenes

Film scenes can evoke a wide range of emotions and visuals. When using Free Text Search for this purpose, consider whether your prompt captures objective elements of the scene or your own interpretation of it.

Publishers often use scene-based prompts to explore deeper parts of their catalog and surface music suited to narrative use cases beyond obvious genre labels.

You can reference popular movies or shows like Pirates of the Caribbean or Stranger Things in your search prompts.

It helps to think like a director. Focus on the action or moment in the scene and what the viewer is experiencing. The clearer the image you describe, the easier it is for the search to interpret what kind of music belongs there, without needing a list of musical traits.

[Action or moment] + [optional: setting or situation] + [optional: stylistic cue]

  • “Riding a bike through Paris”
  • “Thriller score with Stranger-Things-style synths”
  • “Tailing the suspect through a Middle Eastern bazaar”
  • “The football team is getting ready for the game”

An example result for the prompt: “Riding a bike through Paris”

These prompts work because they describe a cinematic moment rather than a list of musical characteristics. A scene like “riding a bike through Paris” suggests a certain musical style and progression, which helps frame how the music should unfold. That context gives Free Text Search a clearer sense of what the track needs to communicate.

To fine-tune your search, add different keywords, like “orchestral,” “industrial rock,” or “hip-hop,” to steer it in the direction you want.

Common mistakes to avoid

  • Writing scenes that only make sense to you personally: Prompts should be interpretable without extra explanation.
  • Dropping the visual context: Turning a scene into a genre description removes what makes this approach effective.
  • Using obscure references: If the reference is not widely known, it may not clarify the scene.

3. Describing activities, situations, and moods

Free Text Search empowers you to be as specific as your project demands. You can describe when and where music will be heard, and what it should communicate. Combining activity, situation, and mood helps direct discovery toward abstract or niche ideas that don’t translate cleanly into tags, making it easier to surface music that fits its intended use.

When writing the prompts, focus on how the music will be used and what it needs to communicate in that situation. Providing clear usage context helps the search narrow results without requiring detailed musical instruction.

[Style or sound] + [intended use or context] + [optional: tone or functional role]

  • “Latin trap for fitness streaming catalog”
  • “Mellow California rock for sports highlight content”
  • “Colorful pop music for lifestyle brand campaign”
  • “Subtle ambient textures for background use”

Example result for the prompt: “Mellow California rock for a road trip”

Common mistakes to avoid

  • Leaving out the use case: Mood alone often leads to broad results without direction.
  • Mixing conflicting contexts: Background use and high-impact language can work against each other.
  • Lack of clarity: When the prompt doesn’t include enough context, results stay generic.

Free Text Search is available in the Cyanite web app. You can test prompts, explore results, and refine searches in minutes.

Using prompts to improve discovery

With Free Text Search, you can explore your music catalog using detailed descriptions. This lets you search based on how music is described in real projects, making it easier to find tracks that fit a specific brief, scene, or use case.

Whether you’re pitching music to sync teams, artists, or labels, looking to underscore a film scene, or setting the mood for an activity, Free Text Search empowers you to explore music in a whole new way.

As you craft your prompts, try to be specific and objective, as this will return better results. Use concrete details like instruments, playing styles, and specific scenes or activities. 

You already have the resources in your catalog. Free Text Search helps you access them more effectively.

Best of Music Similarity Search: Find Similar Songs With the Help of AI

Jakob

CMO at Cyanite

Want a faster way to find tracks with similar sound profiles? Explore Similarity Search in Cyanite.

Searching for similar tracks by typing out what you need in the search bar can limit what a large catalog shows you. When sound isn’t a factor in the search, it can be easy to overlook songs, even if they are a great fit for a brief or playlist.

Our similar song finder AI complements our Free Text Search. It’s an alternative search method that lets you use reference tracks to search your catalog rather than text input.

Similarity Search is designed for music catalogs and platforms that need to navigate large libraries efficiently, whether for sync, marketing, playlisting, or discovery. It’s built to meet the needs of professional catalog workflows, but individual creators and artists can also use it.

In this guide, find out how Similarity Search can help you get the most out of your catalog and uncover matches with more clarity.

How does Cyanite’s Music Similarity Search work?

Similarity Search, Cyanite’s AI similar song finder, compares a reference track’s audio with the rest of your catalog. It’s available in Cyanite through the API or web app.

You can get started by using a track from your library, a YouTube link, or a Spotify preview as the reference. Library and YouTube tracks are analyzed in full, while Spotify previews use 30-second snippets.

This audio analysis is especially useful when a song’s metadata is incomplete or when the qualities you’re matching are hard to describe in words.

Unlike consumer-facing recommendation systems, Similarity Search is built to operate on entire catalogs, giving teams consistent results across thousands or millions of tracks.

You can start Similarity Search in two ways:

1. From the library

  • Select a track and click “Similarity.”
  • Choose the part of the reference you want to use: Representative Segment, Complete Track, or Custom Interval.

2. From the Search tab

  • Open the Search tab and select Similarity Search.
  • Add a reference track from your Library or an external source.

In both cases, you can review the results and switch between Library or Spotify suggestions, then refine the output using filters like genre, key, BPM, or voice presence to guide the search in a specific direction.

Important note: The similar results from Spotify in the web app solely function as a showcase and cannot be used for commercial purposes.

 

How our AI identifies similarity between tracks

The most common search function in music catalogs is tagging, which relies on accurate metadata to surface the right tracks. But to use tags, you need to have a few keywords that describe at least the mood you’re looking for. 

Similarity Search was designed for the moments when words are not the best starting point. Cyanite’s AI compares the audio of one track with the audio of another. It analyzes measurable elements inside a song’s spectrogram, such as rhythm, harmony, instruments, timbre, and movement, and places each song in relation to the reference. Tracks with closer matches in sound are considered more similar.

For instance, Similarity Search is the perfect feature for tackling briefs that start with a reference track. Instead of relying on tags or descriptions, Similarity Search compares the sound of the reference directly to the rest of your catalog and surfaces close matches.

Another search method is using prompts to find the right track. Cyanite’s Free Text Search lets you describe the sound you’re looking for in natural language. This is useful when you don’t have a specific reference song and you want to express a mix of mood, instrumentation, pacing, or context in one query. In that case, rigid tags may be too limiting for you to find what you’re looking for.

Use Similarity Search when:

  • You have a reference track. 
  • You want to surface tracks that may not appear when searching with keywords.
  • You’re reviewing back catalog areas where descriptors vary or are incomplete.
  • You want to find tracks that may be near-duplicates or versions of the same recording.
  • You want to find sound-adjacent tracks that help build clearer audience segments for marketing or promotion.

Use Free Text Search when:

  • The qualities you need are simple to describe.
  • You want to filter by specific attributes.
  • You want to include your own tags or catalog-specific terms in the query.
  • You’re shaping a query that mixes mood, instrumentation, or context in a way that benefits from natural language.
  • You need flexibility to search in several languages.

Use tag-based search with Auto-Tagging when:

  • You want full control and transparency over why tracks appear in the results.
  • You need reliable, repeatable filtering across a catalog.
  • You’re working with defined attributes, such as genre, mood, tempo, or instrumentation.
  • You want to include or exclude specific characteristics with precision.
  • You’re preparing exports, deliverables, or structured shortlists where consistency matters.

Use cases for Similarity Search

Similarity Search is used across many workflows where sound-based matching is essential.

1. Executing sync and music briefs

Sync work often involves working to short timelines while still needing a precise sound match. Similarity Search supports this by allowing teams to compare the sound of a reference track directly against the catalog, reducing the time spent translating musical intent into tags or keywords. This makes it easier to build focused, sound-accurate shortlists efficiently, without diluting the brief through broad genre or mood labels.

Unlike Spotify’s “Similar Artists” feature, Cyanite’s Similarity Search analyzes the sound itself. That makes our tool better suited for precise sync work.

2. Uncovering catalog blind spots

In large catalogs, attention naturally concentrates on a small subset of tracks, while others quietly fall out of circulation. Similarity Search helps rebalance selection by reconnecting less prominent material to tracks that are already in use, based on sound. This allows overlooked parts of the catalog to surface naturally in real workflows, without relying on re-tagging or manual curation.

With the help of Cyanite’s AI tags and the outstanding search results, we were able to find forgotten gems and give them a new life in movie productions. Without Cyanite, this might never have happened.

Miriam Rech

Sync Manager, Meisel Music

3. Finding duplicates and versions

Large catalogs often contain duplicate or near-duplicate tracks, such as alternate exports or slightly different versions. Similarity Search helps teams identify and manage these overlaps, improving search quality and keeping the catalog consistent.

4. Supporting marketing and audience segmentation

When promoting a new artist, it helps to understand which established artists they genuinely sound similar to. Similarity Search identifies those musical similarities, so marketing teams can target fans of those artists more precisely and align campaigns with listener expectations.

This leads to more relevant targeting, stronger engagement, and less wasted ad spend, without relying on guesswork or genre labels alone.

Read more: Custom audiences for pre-release music campaigns

5. Pitching and optimizing playlists at scale

In playlisting, a single track often sets the sonic frame for the rest of the selection. Similarity Search allows labels, artists, and curators to find other songs that fit the same direction before pitching or publishing.

Matching tracks by sound rather than genre labels alone results in playlists that feel more coherent and pitches that are more likely to resonate.

6. Offering customized recommendations based on user behavior

Many music businesses already run their own recommendation systems based on how people interact with their catalog. Cyanite’s Advanced Search fits into this setup as an API-based sound filter that connects to existing infrastructure.

Teams can use reference tracks and custom metadata filters to generate a sonically coherent set of results, which their own systems then rank or adapt based on user behavior. This keeps sound similarity consistent while allowing each platform to control how recommendations are applied.

7. Similarity Search for creators and artists

Sound matching with Similarity Search can support creative decisions in individual workflows:

  • Determining type beats: Beat producers often create “type beats” that mimic the style of popular artists, and Similarity Search can check how closely a beat matches the reference sound.
  • Optimizing DJ crates: Creators can surface tracks that mix well with a reference song and use key and harmonic filters to build crates with smoother transitions and consistent energy.
  • Finding samples: Creators can start from a sample they already use and find alternatives that match in rhythm, groove, or harmonic feel. Then, they can narrow options by key and BPM to fit a project directly.

Read more: Optimizing playlists and DJ sets


Take music search to the next level with Advanced Search

For even more intuitive catalog searches, try Advanced Search, our search add-on. It works with multiple reference tracks and custom metadata filters, and also lets you include your own tags as part of the search.

Each result includes a score that reflects the full track’s likeness to your reference, and it points out the moments in the audio where that similarity is strongest. When you need a broader set of tracks to review, the mode can return up to 500 results. It also accepts prompts in any language.

Once enabled, you can use Advanced Search by:

  • Adding one or more reference tracks from your Library, YouTube, or Spotify
  • Setting custom or existing tags as filters for Similarity Search and Free Text Search
  • Seeing the segments most relevant to your search highlighted
  • Reviewing the similarity scores for both the track and its top-scoring segments

Note: Advanced Search is an API-only feature intended for teams with developer resources who want to integrate Cyanite’s intelligence directly into their own systems.

Bringing sound and text into one search system

Finding the right music often depends on how clearly you can define what you’re looking for. When you have a reference track, Similarity Search offers the most direct route through a catalog, comparing sound to sound and surfacing close matches without relying on labels, genres, or interpretation. This makes it especially effective for large libraries, where great tracks can be missed when search depends on text alone.

Text-based methods still play an important role. Our Free Text Search lets you explore a catalog from a descriptive starting point.

The Advanced Search add-on builds on both approaches, giving you more control through custom metadata filters, multiple reference tracks, and similarity scoring that explains why each result appears.

Create a free account and start testing Cyanite’s search algorithms to see how this works firsthand.

FAQs

Q: How can Cyanite help me find similar music by song?

A: You can select any track in your catalog and run Similarity Search to find similar music by song. The system compares the audio of your reference with the rest of your library and surfaces tracks that are sonically similar.

Q: How accurate is Cyanite’s Similarity Search compared to Spotify’s recommendations?

A: Unlike Spotify, which relies on user behavior, Cyanite focuses on the track’s actual sound. This makes our matches more sonically accurate for use cases where the song’s tonality is crucial.

Q: Can I use Similarity Search without coding skills?

A: Yes! Our free web app lets you analyze music and run similarity searches without any coding knowledge.

Q: How does Similarity Search help in marketing campaigns?

A: Similarity Search can help you identify which artists and tracks share a similar sound to the music you’re planning to promote, helping you understand the musical landscape and pinpoint your target audience. With this insight, you can target fans of those sonically similar artists on social media and streaming platforms, making your campaigns more precise and effective.

Q: How can I use Similarity Search on my own platform?

A: You can easily connect your platform to our API and offer Similarity Search to your users within your own service. Similarity Search is also available in most music catalog management systems, such as SourceAudio, HarvestMedia, and Cadenzabox.

Q: What’s the difference between Similarity Search and Free Text Search?

A: Similarity Search compares audio and surfaces tracks that sit sonically close to a reference. Free Text Search interprets your wording and returns music that aligns with your description.

Music Analysis API: AI-Powered Tagging & Search

Ready to bring AI search and tagging into your platform? Start integrating with the Cyanite API.

Modern music platforms manage thousands of tracks, yet many still rely on metadata systems that weren’t built for the speed or complexity of today’s catalogs. As libraries grow, teams must deal with missing tags, inconsistent descriptors, and limited search options. 

Product teams and catalog managers need faster and deeper ways to search for and analyze music. They also need those capabilities to live inside their own product, not in a disconnected external workflow.

The Cyanite API was built for this reality. Keep reading to discover what it enables and how companies across the industry use it in production.

How the Cyanite API improves your catalog workflows

If you manage music on your own platform, the Cyanite Music Analysis API gives you a reliable way to bring our music intelligence into your product. You can integrate the features your users need with full control over how they experience them.

  • Ease of use: With our API, you can upload tracks from your system to Cyanite and get fast, accurate tagging and similarity search results. The integration is fully embedded—your users stay within your platform while Cyanite processes the audio in the background. If you already have a Cyanite account, you can access the API for free to run a small test analysis.
  • Fast deployment and integration: We use a GraphQL API, so you can query only the data you need and shape responses to fit your workflows. This flexibility makes it easier to adapt your integration over time as you learn how your users interact with Cyanite. Soon, we will also offer a REST API.
  • Quality of support: We keep our API documentation structured and easy to follow, with clear step-by-step instructions and real examples. When you need guidance, we support you directly so you can integrate Cyanite into your system smoothly.

A screenshot showing the first steps needed to create an API integration
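Because the API is GraphQL, a request is a single POST containing a query document and variables. The sketch below only builds that HTTP payload; the query's field names (`track`, `tags`) are hypothetical placeholders for illustration and should be checked against the actual schema in Cyanite's API documentation.

```python
import json

# Illustrative GraphQL query document. Field names are placeholders,
# not guaranteed to match Cyanite's real schema.
QUERY = """
query TrackTags($id: ID!) {
  track(id: $id) {
    title
    tags { genre mood instruments }
  }
}
"""

def build_request(track_id, api_token):
    """Build the JSON body and headers for a GraphQL POST request."""
    body = json.dumps({"query": QUERY, "variables": {"id": track_id}})
    headers = {
        "Authorization": f"Bearer {api_token}",
        "Content-Type": "application/json",
    }
    return body, headers

body, headers = build_request("track-123", "MY_TOKEN")
print(json.loads(body)["variables"])  # {'id': 'track-123'}
```

One benefit of GraphQL here is visible in the query document itself: you list exactly the fields you want back, so responses stay as small as your workflow requires.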

What you gain by integrating the Cyanite API

Everything you see in the Cyanite Web App is available through the API. You can integrate the same analysis and search capabilities into your own system and tailor them to your platform’s architecture.

AI-powered music tagging

Cyanite automatically tags your music based on its audio content. Our system delivers a rich set of tags for the full track and for every 15-second segment, helping you map changes in energy, mood, instrumentation, and other key attributes.

We convert each track into a spectrogram, apply computer vision to understand its musical structure, and refine the results through post-processing. This gives you a consistent metadata layer that supports curation, licensing, and catalog navigation at scale.

Learn more: Explore our tagging taxonomy to see how our model works

The Cyanite API includes an interactive Query Builder that helps developers work with this metadata efficiently. You can use it to test queries, explore the full schema, and view example JSON outputs for every tag and attribute (global and segment-based). This makes it easy to map our tagging fields directly to your internal data model.
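That mapping step can be sketched as follows. The analysis payload shape and its field names (`genreTags`, `moodTags`, `segments`) are invented for this example; the real field names come from the Query Builder's example JSON outputs.

```python
from dataclasses import dataclass, field

@dataclass
class TrackRecord:
    """Minimal internal model that tag data is mapped into."""
    track_id: str
    genres: list = field(default_factory=list)
    moods: list = field(default_factory=list)
    segment_moods: list = field(default_factory=list)  # one entry per segment

def map_analysis(track_id, analysis):
    """Map a (hypothetical) analysis payload onto the internal record,
    keeping both full-track tags and per-segment tags."""
    return TrackRecord(
        track_id=track_id,
        genres=analysis.get("genreTags", []),
        moods=analysis.get("moodTags", []),
        segment_moods=[s.get("moodTags", []) for s in analysis.get("segments", [])],
    )

example = {
    "genreTags": ["indie"],
    "moodTags": ["nostalgic"],
    "segments": [{"moodTags": ["calm"]}, {"moodTags": ["uplifting"]}],
}
record = map_analysis("trk-1", example)
print(record.segment_moods)  # [['calm'], ['uplifting']]
```

Keeping segment-level tags alongside the global tags preserves the changes in energy and mood across the track, which is what makes the 15-second segmentation useful downstream.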

Music Similarity Search


Cyanite delivers similar-sounding songs for any reference audio file or YouTube link. You can upload a file or paste a link, and the system analyzes the track before comparing it to your music library. We also store the reference song in your library so you can choose which part of the track you want to base the search on. This lets you explore different segments—for example, the chorus and the verse—to find the best match.

If you work with Spotify, you can also use Spotify track IDs. Once we’ve analyzed the standard 30-second preview, the results stay stored for faster retrieval in future searches.

Read more: Best of music similarity search: find similar songs with the help of AI

Free Text Search


With Free Text Search, you can write full sentences and let Cyanite interpret their meaning. The system understands the semantics of natural language, whether you describe a musical idea or the mood of a scene. This approach removes the usual constraints of music search and makes it easy for anyone to find the right tracks.

Here are some example prompts:

  • “dark atmospheric strings with slow build-up”
  • “warm indie guitar with nostalgic mood”
  • “energetic Latin pop for a dance scene”

Read more: How to prompt: the guide to using Cyanite’s Free Text Search

Advanced Search (Similarity & Free Text Search add-on)


Advanced Search helps you and your users find the ideal track in your catalog based on your meticulously curated criteria and exact specifications. It’s an extended feature set that enhances Similarity and Free Text Search with custom filtering, ranking, and multi-reference capabilities. Advanced Search is ideal for platforms needing precision, ranking logic, hybrid search models, or complex business rules. Some of the key capabilities include:

  • Custom metadata upload: Upload your own metadata fields (regions, rights, subcatalogs, cultural tags), and use them as filters in search queries.
  • Similarity scores: Retrieve normalized similarity scores for ranking and help users better understand and trust the search results at a glance.
  • Multi-track search: Provide up to 50 reference tracks at once. The system merges their sonic profiles to deliver more accurate creative matches.
  • Up to 500 ranked results: Get deep catalog visibility—ideal for editorial teams and large platforms.

  • Most similar segments: Retrieve segment-level match information between reference and result.
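To make the combination of custom metadata filters and normalized similarity scores concrete, here is a client-side sketch with made-up data. In practice Advanced Search applies these filters server-side, and the response format shown here is illustrative, not Cyanite's actual schema.

```python
def filter_and_rank(results, allowed_regions, limit=500):
    """Keep results whose custom 'region' metadata field is allowed,
    then rank by normalized similarity score, highest first."""
    eligible = [r for r in results if r["region"] in allowed_regions]
    eligible.sort(key=lambda r: r["score"], reverse=True)
    return eligible[:limit]  # Advanced Search returns up to 500 results

results = [
    {"id": "a", "score": 0.91, "region": "EU"},
    {"id": "b", "score": 0.97, "region": "US"},
    {"id": "c", "score": 0.88, "region": "EU"},
]
top = filter_and_rank(results, allowed_regions={"EU"})
print([r["id"] for r in top])  # ['a', 'c']
```

The normalized score is what lets platforms layer their own ranking or business rules on top: a track that clears the rights filter but scores low can still be deprioritized consistently.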

Crates

A screenshot of Cyanite's API documentation showing the creation of crates.

Crates complement Advanced Search and help keep large, complex catalogs manageable.

Use them to define focused subsets within your library so you can manage specific parts of the catalog with more precision. They also help you control access based on rights, support teams who only work with a particular portion of the catalog, and keep curated subcatalogs separate when needed. You can also run similarity searches within a crate, which is useful for segmented or specialized discovery workflows.

How leading platforms have used our API

The Cyanite API is already embedded into the workflows of leading music platforms, helping them power search, discovery, and catalog navigation. Companies like Epidemic Sound, Marmoset, Musicbed, Reelworld, and MAIA Universe use our intelligence directly in their own interfaces to deliver faster, more intuitive music experiences.


Cyanite has maybe most significantly improved our work with its Similarity Search that allows us to enhance our searches objectively, melting away biases and subjective blind spots that humans naturally have.

Alex Paguirigan

Product Manager, Marmoset

Ready to start your integration?

The Cyanite API gives you detailed tagging and intelligent search that understands both sound and language, so you can upgrade your catalog experience on your own terms. Build as lightly or as deeply as your product needs—you have full control.

If you’re ready to start coding, sign up to Cyanite and explore the API. You can begin shaping the integration in your environment right away.

FAQs – API Integration

Q: How long does the integration process take?

A: Cyanite’s API integration is typically completed on our side within just a few days. However, the time required for front-end implementation and customization depends on the complexity and scope of your project. Based on our experience, a full integration – including testing, optimization, and deployment – usually takes 2 to 6 weeks to achieve a seamless, fully functional interface.

Q: What Cyanite features are available via API?

A: All features that we offer in our Web App are available via the API, including all of our latest search and tagging algorithms. It is also possible to retrieve insights for your catalog as a whole via the API. To learn more about catalog insights, read this article.

Q: How much does the API cost?

A: The API usage fee is 290€/month. However, the total price of the subscription depends on your catalog size and requested features. Please fill out this Typeform and we will get back to you with a quote.

Q: I am using a third-party catalog management system. How can I get Cyanite’s results into that?

A: Cyanite is fully integrated with Cadenza Box, Harvest Media, Music Master, Reprtoir, Synchtank, and Tune Bud for Auto-Tagging and Search. DISCO and SourceAudio customers can also easily upload Cyanite’s Auto-Tagging and Auto-Descriptions to their libraries. Just reach out to business@cyanite.ai and we’ll review the format requirements of your library system together.

The Importance of Music Auto-Tagging for Content Strategies

An Introduction

By Jakob Höflich, Co-Founder and CMO of Cyanite

When I was 19, I worked at community radio 4ZZZ in Brisbane, tasked with digitizing daily CD deliveries, tagging their genre, and sorting them in the library. It was a tedious and challenging task – every mistake could persist in the library until corrected. And let’s face it, that was rarely the case. This was one of the experiences that motivated me to found Cyanite many years later: to help catalog owners tag their catalogs with AI and to eradicate the legacy of tagging mistakes made over the past 25 years of digitization.

While Auto-Tagging to create a clean, more searchable library has become a commodity – music companies worldwide use it to relieve their tagging teams and create more space for creative work – one underappreciated use case has recently grown in significance: using Auto-Tagging data on a global catalog basis to derive actionable insights for your content strategy.

From Hunches to Data-Driven Insights

If you own or work with a music catalog, you likely have a solid understanding of its character. But what if the number of songs runs into the tens of thousands or beyond? How confident can you be that you know the profile of the catalog and what it stands for? When making important decisions about the creative direction of your catalog, especially with multiple stakeholders involved, this ‘feeling’ tends to be subjective and leaves room for guesswork. It’s music’s subjective and “magical” nature that makes it hard to quantify and discuss.

That’s why keywords remain crucial when managing and developing a catalog – and this is where AI can be so helpful to your work. By providing data insights, AI can turn these hunches into consistent, concrete knowledge.

Here are three benefits of leveraging music Auto-Tagging for your content strategy.

1. Deep Understanding of a Catalog’s Character

AI music Auto-Tagging dives deep into the sound character of a catalog. By translating the complexity of music into concrete data points such as genres and moods, it allows for a shared, objective, and consistent understanding across your team or company. Imagine having a precise breakdown of your catalog’s characteristics at your fingertips: What percentage of my repertoire is Rock, Funk, or Disco? Does it stand for upbeat or more melancholic tones? Do I perhaps have a gender imbalance, favoring one vocal gender over the other? This not only enhances internal cohesion but also aligns strategies and decisions.

Pro Tip: Our Auto-Tagging focuses on creative metadata, extracting information such as genres, BPM, key, and energy. Curious? Check out our full taxonomy. It does not extract copyright or performance metadata. A smart move here is to pair the AI-driven insights with other data pools. For example, combining AI insights with performance and sales data can reveal things like: only 2% of my catalog is Hip Hop, yet this content has a 200% higher performance rate.

2. Uncovering Blind Spots and Highlighting Trends

AI’s ability to uncover blind spots and highlight trends within a catalog is another significant benefit. This data-driven approach can reveal underutilized niches or, when the data is placed on a timeline, emerging trends. Whether it’s identifying a resurgence of a particular genre or pinpointing areas with high sync opportunities, AI insights shed light on the hidden corners of a catalog. For sync teams at companies without a distinct genre profile in particular, a balanced catalog is beneficial for answering as many briefs as possible with adequate content.

3. Informed Decisions for Catalog Acquisition

Lastly, AI-driven insights are not limited to managing your existing catalog. They are invaluable when evaluating catalogs you are considering acquiring. While frontline repertoire might be familiar, B-sides and deep cuts often remain mysterious territory. By thoroughly analyzing these lesser-known tracks, AI can add a creative due-diligence aspect, providing a comprehensive understanding that informs better acquisition decisions. This ensures you’re investing in a catalog that truly complements your existing one.

Conversely, if you want to sell a catalog, comprehensive tagging data on your repertoire can help you identify the perfect acquirer or demonstrate the future longevity of your catalog to drive up the multiple.

A Real-World Example

A German publisher utilizing Cyanite’s AI insights discovered previously underappreciated genres, allowing them to optimize their catalog strategy effectively. The analysis showed that the genres Hip Hop, Funk, and RnB and the mood Epic were underrepresented, even though these have been extremely valuable qualities for successful sync placements in recent years.

Visual representations, such as pie charts and graphs, can further show how AI can dissect and categorize catalog elements, providing clear, actionable insights.

All data above can be retrieved via our API.

Conclusion: Embracing the Future with AI

AI music Auto-Tagging can be a great help for developing content strategies in the music industry. These actionable insights provide a deep, data-driven understanding of catalogs, uncover blind spots, highlight trends, and inform strategic decisions for catalog acquisitions.

Undoubtedly, AI can’t and shouldn’t replace the final decision-making process, as it can’t anticipate the future the way we humans do. But it is a great tool for navigating this process with data that makes it easier – and often more convincing – to talk about the magic of music.

As we live in a time when content production is at an all-time peak, providing the sync market with more opportunities than ever before, every song in a catalog should have the same chance of being discovered. Having a well-organized and indexed catalog is key to that.

 

How to Use AI Music Search for Your Music Catalog

Ready to level up your search workflows? Try AI-powered music search in Cyanite.

Even the most carefully organized catalog reaches a point where text metadata can no longer support effective search on its own. Genres blur, moods can overlap, and large libraries hold thousands of tracks that look similar on paper but sound different when you listen. When you’re working on a brief, your search method needs to reflect the sound itself—not just the words attached to it.

AI music search enables your catalog to reveal more. By working with audio alongside the metadata, it returns search results that match the intent behind a brief rather than the exact words used in a query. You get a shortlist faster and surface strong tracks that would otherwise stay buried.

We see this need showing up across the catalogs we serve, so we put together this guide to outline how AI music search works in Cyanite and how it supports faster, more intuitive discovery in real-world workflows.

Learn more: See how AI music tagging works in Cyanite and how it supports large catalogs.

What is AI music search?

Traditional catalog search depends heavily on how consistently tracks are described. It works well when metadata is uniform and when everyone searches in the same way. But this is rarely the case in practice. Different people use different language, and many musical qualities are easier to hear than to articulate precisely.

AI music search approaches the problem by analysing the sound itself. This allows the system to understand rhythm, harmony, instrumentation, intensity, and voice presence. These sonic attributes are then used alongside existing metadata to guide search results.

Instead of matching exact keywords, the system focuses on musical similarity and intent. That means you can start a search from a reference track or a descriptive sentence without losing nuance along the way.

AI music search does not replace structured tagging. Instead, it builds on it as an additional way to explore a catalog when sound, context, or creative intent are easier to hear than describe.

At the same time, well-structured tagging remains the baseline to navigate a catalog in many day-to-day scenarios. AI-driven search becomes most valuable when teams need to move beyond fixed labels or explore music from a different angle.

How different types of AI music search work together

In practice, AI music search is most effective when it supports multiple ways of thinking about music. These are three ways we enable catalog music search in Cyanite:

  1. Audio-based search
  2. Prompt-based search
  3. Customizable advanced search features

These tools are designed to work together. Audio gives a clear view of how a track moves, text helps describe what you’re looking for, and advanced filters narrow the field to traits that matter for the request. Using them together keeps the catalog flexible and reduces the chance of great tracks being missed.

Exploring your catalog through Similarity Search

Similarity Search starts from sound. Cyanite analyzes a reference track’s audio and compares it with the rest of your catalog, returning tracks with a similar shape or mood. 

The reference can come from within your library or from an external source, such as Spotify, YouTube, or an uploaded audio file. You can also choose which part of the reference track to use, such as the chorus, the intro, or a specific section that best represents the desired direction.

This approach is especially useful when a brief comes with a musical example rather than a written description. Instead of translating sound into words and back again, you can search directly from what you hear. If you work with multiple reference tracks or an entire playlist, the Advanced Search features below are here to help.

Read more: Similar song finder AI for catalogs: Use Cyanite to search your library by sound

Searching with language using Free Text Search

Not every search starts with a reference track. Free Text Search allows users to describe music in natural language, using full sentences rather than rigid keywords.  

Prompts can reference mood, pacing, instrumentation, scene context, or use case. They can also include cultural references and be written in different languages. The system interprets the prompt’s meaning and matches it against the audio-based understanding of the catalog, without relying on external language models.

This makes search accessible to a wider range of users, including those who may not be familiar with a catalog’s internal tagging conventions.

Read more: How to prompt: the guide to using Cyanite’s Free Text Search

Advanced Search

For more specific searches, you often need additional control. Advanced Search builds on Similarity and Free Text Search by adding structured filters and deeper insight into why tracks appear in the results.

This mode allows teams to:

  • View similarity scores that show how closely results align with a reference or prompt
  • Run similarity searches using up to 50 reference tracks at once
  • Upload custom metadata and use it as additional filters
  • Identify the most similar segments within each track

“Testing Advanced Search free for a month gave us the confidence we needed to update our search and tagging systems. The integration was smooth, and we were able to ship several exciting features right away—but we’ve only scratched the surface of its full capabilities!”

Jack Whitis, CEO at Wavmaker

Read more: How to level up your AI search with Advanced Search features

AI music search: build vs buy

Organizations considering AI search often face the choice between building a system internally and integrating an existing solution. The right path typically depends on the time, cost, and ongoing work you can take on.

Building an in-house system can make sense for teams with significant machine-learning expertise and long-term resources. It typically requires a dedicated engineering team, a large and well-structured training dataset, and ongoing investment to maintain and improve model quality as catalogs and user needs evolve.

However, for most catalogs, integrating a tested system is the more practical path. Cyanite offers AI music search through a web app, an API, and integrations with major catalog management systems. Teams can adopt advanced search capabilities without taking on the long-term cost and complexity of maintaining their own models.

Smaller teams can start with the web app and scale usage over time. Larger organizations can integrate search directly into their own platforms, with pricing that aligns more predictably with catalog size.

Cyanite’s approach to AI music search

Cyanite is built to help teams understand their catalog through sound. We bring audio, language, and filters into one place so you can move through briefs without switching tools.

Audio-first analysis

Cyanite listens to the full track from beginning to end and captures how it develops in instrumentation, energy, and mood. This audio-first approach drives Similarity Search, Free Text Search, and Advanced Search. Because the focus stays on the audio rather than popularity and text-only metadata, you reach tracks that often get overlooked.

Data security and model ownership

Your audio remains within Cyanite’s environment.

  • Audio analysis and search models are built and maintained in-house.
  • No files are sent to external AI providers.
  • All processing meets GDPR requirements.

Teams with specific copyright needs can use upload workflows designed for internal and client-facing work.

Built for catalog scale

Full tracks are analysed in depth, with thousands of sonic details compared. This means large libraries can be processed quickly without search performance slowing as the catalog grows. Search performance remains steady at high volume, which makes it easier to bring new material into the library without disrupting ongoing work. 

Search that adapts to the workflow

Similarity Search, Free Text Search, and Advanced Search all draw from the same audio analysis, which makes it easy to move between a reference track, a written prompt, or a set of filters in a single workflow. Advanced Search adds scoring and segment highlights when you need more context, while the other modes help you move quickly through creative requests. Together, these tools support different working styles and keep results consistent across teams and briefs.

Try AI music recognition with your own tracks

AI music search helps catalogs stay workable as they grow. By reading the audio and supporting both reference-based and prompt-based queries, it reduces search time and brings more of the catalog into play.

Want to see how this works with your own tracks? You can test Similarity Search and Free Text Search in the web app, or explore Advanced Search through the API.

FAQs – AI Music Search

Q: How does AI music recognition work in a catalog?

A: AI music recognition interprets patterns in the audio and compares them across the catalog. This reduces reliance on metadata wording and supports searches that begin with a reference track or a natural-language prompt.

Q: Is Cyanite the same as an AI music finder or consumer music search engine?

A: No. Consumer-facing music search and recommendation systems are typically driven by listening behavior and user interaction data. Cyanite focuses on sound-based analysis and metadata, making it suitable for professional catalog search, editorial workflows, and internal systems.

Streaming platforms use Cyanite to complement behavioral data with objective audio understanding, especially for catalog organization, discovery, and editorial use cases.

Q: Can Cyanite be used in my CMS for music?

A: Cyanite is fully integrated with SourceAudio, Cadenzabox, Harvest Media, Music Master, Reprtoir, Synchtank, and TuneBud. DISCO users can also import Cyanite’s Auto-Tagging and Auto-Descriptions into their libraries. These integrations support a wide range of Cyanite use cases across catalog management systems.

Q: Who uses Cyanite?

A: Music publishers, production libraries, sync teams, audio branding agencies, and music-tech platforms use Cyanite for tagging, search, playlist building, onboarding, and catalog analysis. Artists and producers use the web app for fast tagging and discovery.

Q: Can I integrate AI search into my own platform?

A: Yes. The API supports Similarity Search, Free Text Search, Advanced Search, and audio analysis, making it possible to add AI-powered discovery directly into your product.

How to Write a Song Bio for DSPs with Cyanite+ChatGPT

A step-by-step guide: how to write a song bio for DSPs like Spotify, YouTube, Bandcamp, SoundCloud, Tidal, Deezer, and Pond5

In one of our latest blog posts, we showcased the best ways to utilize Cyanite as an artist or producer. With AI on the rise, you won’t be surprised that you can also make use of these tools to write a song bio.

Cyanite + ChatGPT: How to elevate your pitches for various DSPs.

Distributors and DSPs want to know as much as they can about your song to ensure they can place it in the right playlists and target the right audience – and we all know how challenging it can be to find the right words to describe our own music. Writing a good song bio is crucial to get playlisted on DSPs.

When uploading my songs to Pond5, this process used to take me around 2 hours per track and now it is just 20 minutes with Cyanite + ChatGPT.

Guillermo Pareja, Cyanite User

We are going to give you a step-by-step guide on how to streamline this process.

1. The Power of AI Tagging – Log into your Cyanite Account

First things first: If you don’t have an account yet, sign up here. Cyanite provides a ton of metadata for your song, and for up to 5 songs per month, it’s free. Need more? We’ve got you covered. Just upgrade to a subscription for an additional 15 monthly analyses, or get some extra one-time credits.

2. Upload your song to Cyanite

Head to “Library” and upload the song you want to use. For this use case, we need the genre, advanced moods, character, and movement tags.

To get the best results, choose the right tags from each “mean” section, which shows a percentage for every tag. Leave out any you don’t personally agree with, but be careful not to be too subjective!

For even better results, use Cyanite’s Augmented Keywords and AI Descriptions to enhance your prompt in the next step (available to subscription users only).

3. The Almighty ChatGPT

For the next step, you need to log into your ChatGPT account and start writing your prompt. Be as specific as possible about the platform you want to use the description and keywords for.

Let ChatGPT know what you are looking for – this could be anything from whole text descriptions or keywords to a brand new song title.

Then copy the genre, character, mood, and movement tags and paste them into the chatbot. 

For a better result, add Cyanite’s AI Description to the prompt.

4. Pimp Up Your Prompt

You might have already found what you’re looking for. But there’s potentially more. Just like with copying homework, you want to be careful not to have the same results as everyone else on Spotify.

Try to enrich your ChatGPT prompt with personal details. This could be anything from musical inspirations to the place where the song was recorded.

Pretty much anything you can think of.
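The tag-pasting and personal-detail steps above can be sketched as a small Python helper that assembles a ChatGPT prompt from your Cyanite results. The helper function, its parameters, and the example tag values are made up for illustration; substitute the actual tags and AI Description from your own analysis.

```python
def build_bio_prompt(platform, genres, moods, characters, movements,
                     ai_description=None, personal_notes=None):
    """Assemble a ChatGPT prompt from Cyanite tag lists (illustrative helper)."""
    lines = [
        f"Write a song bio and a list of keywords for {platform}.",
        f"Genres: {', '.join(genres)}",
        f"Moods: {', '.join(moods)}",
        f"Character: {', '.join(characters)}",
        f"Movement: {', '.join(movements)}",
    ]
    if ai_description:  # Cyanite's AI Description, if available on your plan
        lines.append(f"AI description: {ai_description}")
    if personal_notes:  # personal context, e.g. inspirations or recording place
        lines.append(f"Personal context: {personal_notes}")
    return "\n".join(lines)

# Example with made-up tag values:
prompt = build_bio_prompt(
    platform="Spotify",
    genres=["Indie Pop", "Folk"],
    moods=["uplifting", "dreamy"],
    characters=["warm", "organic"],
    movements=["flowing"],
    personal_notes="Recorded in a cabin by a lake.",
)
print(prompt)
```

Keeping the prompt in one reusable function makes it easy to regenerate bios for each DSP by changing only the `platform` argument and the personal details.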

5. Refine your Results

When writing your song bio, there’s no magic solution that fits every case, so treat the result as a source of inspiration rather than a finished text description or song title.

So it’s best to rewrite the text in your own words.

In any case, you’ve surely obtained results that will help you distribute your track on a variety of platforms.

Curious how the song we used sounds?

In case you discover further interesting use-cases for Cyanite, feel free to reach out to us here.

A big thank you to Guillermo Pareja for supporting us on this post. He got in touch to let us know about this use case and how beneficial it has been to him.

Your Cyanite Team.