The Power of Automatic Music Tagging with AI

Ready to transform how you manage your catalog? Start auto-tagging music with Cyanite

We know managing a large music catalog can feel overwhelming. When metadata is inconsistent or incomplete, tracks become difficult to find and hard to work with. The result is a messy catalog that you have to sort out manually—unless you use AI auto-tagging.

Read more to see how automatic music tagging reduces friction and helps you organize your catalog more accurately.

What is automatic music tagging?

Automatic music tagging is an audio analysis process that identifies a song’s mood, genre, energy, tempo, instrumentation, and other core attributes. A music analyzer AI listens to the track and applies these labels with consistent logic, providing stable metadata across your catalog.

AI tagging supports teams that depend on fast, accurate search. For example, if you run a sync marketplace that needs to respond to briefs quickly, you can surface the right tracks in seconds when the metadata aligns with the sound you’re looking for. If you work at a production library with thousands of incoming submissions, you can review new material more efficiently when the system applies consistent labels from day one. The same applies to music-tech platforms that want stronger discovery features without having to build their own models.

Benefits of auto-tagging for music professionals

AI auto-tagging brings value across the music industry. When tracks enter your system with clear, predictable metadata, teams can work with more confidence and fewer bottlenecks, supporting smoother catalog operations overall.

  • Faster creative exploration: Sync and production teams can filter and compare tracks more quickly during pitches, making it easier to deliver strong options under time pressure.

  • More reliable handoffs between teams: When metadata follows the same structure, creative, technical, and rights teams work from the same information without needing to reinterpret tags.

  • Improved rights and version management: Publishers benefit from predictable metadata when preparing works for licensing, tracking versions, and organizing legacy catalogs.

  • Stronger brand alignment in audio branding: Agencies working on global campaigns can rely on mood and energy tags that follow the same structure across regions, helping them maintain a consistent brand identity.

  • Better technical performance for music platforms: When metadata is structured from the start, product and development teams see fewer ingestion issues, more stable recommendations, and smoother playlist or search behavior.

  • Greater operational stability for leadership: Clear, consistent metadata lowers risk, supports scalability, and gives executives more confidence in the long-term health of their catalog systems.

Why manual music tagging fails at scale

There’s a time and a place for running a music catalog manually: when your track selection is small and your team has the capacity to listen to each song one by one and label it carefully. But as your catalog grows, that process will start to break down.

Tags can vary from person to person, and different editors will likely use different wording. Older metadata rarely matches newer entries. Some catalogs even carry information from multiple systems and eras, which makes the data harder to trust and use.

Catalog managers are not the only ones feeling this pain. This inconsistent metadata slows down the search for creative teams. Developers are also affected when this unreliable data disrupts user-facing recommendation and search features. So the more music you manage, the more this manual-tagging bottleneck grows.

When human collaboration is still needed

While AI can provide consistent metadata at scale, creative judgment still matters. People add the cultural context and creative insight that go beyond automated sound analysis. Also, publishers sometimes adapt tags for rights considerations or for more targeted sync opportunities.

The goal of AI auto-tagging is not to replace human input, but to give your team a stable foundation to build on. With accurate baseline metadata, you can focus on adding the context that carries strategic or commercial value.

Cyanite has maybe most significantly improved our work with its Similarity Search that allows us to enhance our searches objectively, melting away biases and subjective blind spots that humans naturally have.

William Saunders

Co-Owner & Creative Director, MediaTracks

How does AI music tagging work at Cyanite?

At Cyanite, our approach to music analysis is fully audio-based. When you upload a track, our AI analyzes only the sound of the file—not the embedded metadata. Our model listens from beginning to end, capturing changes in mood, instrumentation, and energy across the full duration.

We start by converting the MP3 audio file into a spectrogram, which turns the sound into a visual pattern of frequencies over time. This gives our system a detailed view of the track’s structure. From there, computer vision models analyze the spectrogram to detect rhythmic movement, instrument layers, and emotional cues across the song. After the analysis, the model generates a set of tags that describe these characteristics. We then refine the output through post-processing to keep the results consistent, especially when working with large or fast-growing catalogs.
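
To make the first step concrete, here is a minimal sketch of what "turning sound into a visual pattern of frequencies over time" means. This is a simplified illustration using NumPy, not Cyanite’s production pipeline: it computes a plain magnitude spectrogram, while the real system likely uses a more elaborate representation.

```python
import numpy as np

def spectrogram(signal, frame_size=1024, hop=512):
    """Magnitude spectrogram: frequency content of each overlapping frame."""
    window = np.hanning(frame_size)
    frames = [
        signal[start:start + frame_size] * window
        for start in range(0, len(signal) - frame_size + 1, hop)
    ]
    # One FFT per frame; rows = time steps, columns = frequency bins
    return np.abs(np.fft.rfft(np.array(frames), axis=1))

# A 2-second, 22,050 Hz sine tone stands in for a decoded audio file
sr = 22050
t = np.linspace(0, 2, 2 * sr, endpoint=False)
audio = np.sin(2 * np.pi * 440 * t)
spec = spectrogram(audio)
print(spec.shape)  # (num_frames, frame_size // 2 + 1)
```

Each row of the resulting matrix is one moment in time, which is what lets vision-style models read rhythm and instrumentation as patterns in an image.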

This process powers our music tagging suite, which includes two core products:

  • Auto-Tagging: identifies core musical attributes such as genre, mood, instrumentation, energy level, movement, valence–arousal position, and emotional dynamics. Each label is generated through consistent audio analysis, which helps maintain stable metadata across new and legacy material.

  • Auto-Descriptions: complement tags with short summaries that highlight the track’s defining features. These descriptions are created through our own audio models, without relying on any external language models. They give you an objective snapshot of how the music sounds, which supports playlisting, catalog review, and licensing workflows that depend on fast context.

Inside Cyanite’s tagging taxonomy

Here’s a taste of the insights our music auto-tagging software can generate for you: 

  • Core musical attributes: BPM, key, meter, voice gender
  • Main genres and free genre tags: high-level and fine-grained descriptors
  • Moods and simple moods: detailed and broad emotional categories
  • Character: the expressive qualities related to brand identity
  • Movement: the rhythmic feel of the track
  • Energy level and emotion profile: overall intensity and emotional tone
  • Energy and emotional dynamics: how intensity and emotion shift over time
  • Valence and arousal: positioning in the emotional spectrum
  • Instrument tags and presence: what instruments appear and how consistently
  • Augmented keywords: additional contextual descriptors
  • Most significant part: the 30-second segment that best represents the song
  • Auto-Description: a concise summary created by Cyanite’s models
  • Musical era: a high-level temporal categorization

Learn more: Check out our full auto-tagging taxonomy here.
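
To illustrate the valence and arousal entry above, here is a minimal sketch of how a (valence, arousal) position can be read as a coarse emotional quadrant. The labels follow the common circumplex-model reading and are illustrative, not Cyanite’s own taxonomy terms.

```python
def emotion_quadrant(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) position to a coarse mood quadrant.

    Quadrant labels are illustrative placeholders, not Cyanite's tags.
    """
    if valence >= 0:
        return "happy / excited" if arousal >= 0 else "calm / content"
    return "angry / tense" if arousal >= 0 else "sad / melancholic"

print(emotion_quadrant(0.7, 0.6))    # happy / excited
print(emotion_quadrant(-0.4, -0.5))  # sad / melancholic
```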

To show how these elements work together, we analyzed Jungle’s “Back On 74” using our auto-tagging system. The table below reflects the exact values our model generated.

Visualization of an auto-tagging example song.

Step-by-step Cyanite Auto-Tagging integration

You can get started with Cyanite through our web app or by connecting directly to the Auto-Tagging API. The process is straightforward and designed to fit into both creative and technical workflows.

1. Sign up and verify your account

  • Create a Cyanite account and verify your email address.
  • Verification is required before you can create an integration or work with the API.
  • Once logged in, you’ll land in the Library view, where all uploaded tracks appear with their generated metadata.
A screenshot of a music library with tags

2. Upload your music

You can add music to your Library by:

  • Dragging MP3 files into the Library
  • Clicking Select files to browse your device
  • Pasting a YouTube link and importing the audio

Analysis starts automatically, and uploads are limited to 15 minutes per track.

A screenshot of a music library with an upload window

3. Explore your tags in the web app

Once a file is processed, you can explore all its tags inside the Library. In this view, you can discover:

  • Your songs’ full tag output
  • The representative segment, full-track view, or a custom interval
  • Similarity Search with filters for genre, BPM, or key
  • Quick navigation through your catalog using the search bar

This helps you evaluate your catalog quickly before integrating the results into your own systems.

4. Create an API integration (for scale and automation)

If you want to connect Cyanite directly to your internal tools, you can set up an API integration. Just note that coding skills are required at this stage.

  1. Open the Web App dashboard.
  2. Go to Integrations.
  3. Select Create New Integration.
  4. Select a title.
  5. Fill out the webhook URL and generate or create your own webhook secret.
  6. Click the Create Integration button.

After you create the integration, we generate two credentials:

  • Access token: used to authenticate API requests
  • Webhook secret: used to verify incoming events

Test your integration credentials by following this link.

Store the access token and webhook secret securely. You can generate new credentials at any time, but existing credentials cannot be retrieved once lost.
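
As a sketch of why the webhook secret matters: a common way for a receiving server to verify that an event really came from the sender is to recompute an HMAC over the raw request body and compare it to a signature header. The SHA-256/hex scheme below is an assumption for illustration; check Cyanite’s API documentation for the exact signature format.

```python
import hashlib
import hmac

def verify_webhook(secret: str, raw_body: bytes, signature: str) -> bool:
    """Recompute the HMAC of the raw body and compare in constant time."""
    expected = hmac.new(secret.encode(), raw_body, hashlib.sha256).hexdigest()
    # compare_digest avoids timing attacks on the comparison itself
    return hmac.compare_digest(expected, signature)

# Simulated incoming event: payload shape and secret are made up
body = b'{"event": "analysis_finished", "trackId": "123"}'
sig = hmac.new(b"my-webhook-secret", body, hashlib.sha256).hexdigest()
print(verify_webhook("my-webhook-secret", body, sig))  # True
```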

Pro tip: We have a sample integration available on GitHub to help you get started.

 

5. Start sending audio to the API

  • Use your access token to send MP3 files to Cyanite for analysis.

Pro tip: For bulk uploads (more than 1,000 audio files), we recommend using an S3 bucket upload to speed up ingestion.
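
Since the API is GraphQL, a request is a JSON payload containing a query string and variables, authenticated with the access token. The mutation and field names below are placeholders, not Cyanite’s actual schema (use the API docs and Query Builder for the real names); the sketch only shows the general shape of an authenticated GraphQL request using Python’s standard library.

```python
import json
from urllib import request

API_URL = "https://api.cyanite.ai/graphql"

# Hypothetical mutation — field names are illustrative, not the real schema
QUERY = """
mutation AnalyzeTrack($input: TrackInput!) {
  analyzeTrack(input: $input) { id status }
}
"""

def build_request(access_token: str, track_id: str) -> request.Request:
    """Assemble an authenticated GraphQL POST without sending it."""
    payload = json.dumps({
        "query": QUERY,
        "variables": {"input": {"id": track_id}},
    }).encode()
    return request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            # The access token authenticates every API request
            "Authorization": f"Bearer {access_token}",
        },
    )

req = build_request("YOUR_ACCESS_TOKEN", "track-123")
print(req.get_header("Content-type"))  # application/json
```

Sending is then a matter of `request.urlopen(req)`; the sketch stops short of that so it stays runnable offline.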

6. Receive your tagging results

  • Your webhook receives the completed metadata as soon as the analysis is finished.
  • If needed, you can also export results as CSV or spreadsheet files.
  • This makes it easy to feed the data into playlisting tools, catalog audits, licensing workflows, or internal search systems.
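
As an illustration of the CSV export path, a webhook consumer might flatten the received tag objects into a spreadsheet-friendly file. The field names below are hypothetical stand-ins for whatever your webhook actually receives.

```python
import csv
import io

# Hypothetical tag payloads as they might arrive from the webhook
results = [
    {"title": "Track A", "genre": "indie", "mood": "nostalgic", "bpm": 112},
    {"title": "Track B", "genre": "latin pop", "mood": "energetic", "bpm": 124},
]

def to_csv(rows):
    """Serialize a list of flat tag dicts into CSV text."""
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(to_csv(results).splitlines()[0])  # title,genre,mood,bpm
```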

7. Start using your metadata

Once results are flowing, you can integrate them into the workflows that matter most:

  • Search and recommendation tools
  • Catalog management systems
  • Playlist and curation workflows
  • Rights and licensing operations
  • Sync and creative pipelines
  • Internal music discovery dashboards

Read more: Check out Cyanite’s API documentation

Auto-tag your tracks with Cyanite

AI auto-tagging helps you bring structure and consistency to your catalog. By analyzing the full audio, our models capture mood changes, instrumentation, and energy shifts that manual tagging often misses. The result is metadata you can trust across all your songs.

Our tagging system is already widely adopted; over 150 companies are using it, and more than 45 million songs have been tagged. The system gives teams the consistency they need to scale their catalogs smoothly, reducing manual cleanup, improving search and recommendation quality, and giving you a clearer view of what each track contains.

If you want to organize your catalog with more accuracy and less effort, start tagging your tracks with Cyanite.

FAQs

Q: What is a tag in music?

A: A tag is metadata that describes how a track sounds, such as its mood, genre, energy, or instrumentation. It helps teams search, filter, and organize music more efficiently.

Q: How do you tag music automatically?

A: Automatic tagging uses AI trained on large audio datasets. The model analyzes the sound of the track, identifies musical and emotional patterns, and assigns metadata based on what it hears.

Q: What is the best music tagger?

A: The best auto-tagging music software is the one that analyzes the full audio and delivers consistent results at scale. Cyanite is widely used in the industry because it captures detailed musical and emotional attributes directly from the sound and stays reliable across large catalogs.

Q: How specific can you get when tagging music with Cyanite?

A: Cyanite captures detailed attributes such as mood, simple mood, genre, free-genre tags, energy, movement, valence–arousal, emotional dynamics, instrumentation, and more. Discover the full tagging taxonomy here.

Music Analysis API: AI-Powered Tagging & Search

Ready to bring AI search and tagging into your platform? Start integrating with the Cyanite API.

Modern music platforms manage thousands of tracks, yet many still rely on metadata systems that weren’t built for the speed or complexity of today’s catalogs. As libraries grow, teams must deal with missing tags, inconsistent descriptors, and limited search options. 

Product teams and catalog managers need faster and deeper ways to search for and analyze music. They also need those capabilities to live inside their own product, not in a disconnected external workflow.

The Cyanite API was built for this reality. Keep reading to discover what it enables and how companies across the industry use it in production.

How the Cyanite API improves your catalog workflows

If you manage music on your own platform, the Cyanite Music Analysis API gives you a reliable way to bring our music intelligence into your product. You can integrate the features your users need with full control over how they experience them.

  • Ease of use: With our API, you can upload tracks from your system to Cyanite and get fast, accurate tagging and similarity search results. The integration is fully embedded—your users stay within your platform while Cyanite processes the audio in the background. If you already have a Cyanite account, you can access the API for free to run a small test analysis.
  • Fast deployment and integration: We use a GraphQL API, so you can query only the data you need and shape responses to fit your workflows. This flexibility makes it easier to adapt your integration over time as you learn how your users interact with Cyanite. Soon, we will also offer a REST API.
  • Quality of support: We keep our API documentation structured and easy to follow, with clear step-by-step instructions and real examples. When you need guidance, we support you directly so you can integrate Cyanite into your system smoothly.
A screenshot showing the first steps needed to create an API integration

What you gain by integrating the Cyanite API

Everything you see in the Cyanite Web App is available through the API. You can integrate the same analysis and search capabilities into your own system and tailor them to your platform’s architecture.

AI-powered music tagging

Cyanite automatically tags your music based on its audio content. Our system delivers a rich set of tags for the full track and for every 15-second segment, helping you map changes in energy, mood, instrumentation, and other key attributes.

We convert each track into a spectrogram, apply computer vision to understand its musical structure, and refine the results through post-processing. This gives you a consistent metadata layer that supports curation, licensing, and catalog navigation at scale.
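
One simple way to use the per-segment tags is to roll them up into a track-level summary by frequency. This is an illustrative sketch with made-up segment moods, not Cyanite’s own aggregation logic.

```python
from collections import Counter

# Hypothetical moods for one track's consecutive 15-second segments
segments = ["calm", "calm", "uplifting", "energetic", "energetic", "energetic"]

def track_level_mood(segment_moods, top_n=2):
    """Summarize per-segment tags by keeping the most frequent ones."""
    counts = Counter(segment_moods)
    return [mood for mood, _ in counts.most_common(top_n)]

print(track_level_mood(segments))  # ['energetic', 'calm']
```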

Learn more: Explore our tagging taxonomy to see how our model works

The Cyanite API includes an interactive Query Builder that helps developers work with this metadata efficiently. You can use it to test queries, explore the full schema, and view example JSON outputs for every tag and attribute (global and segment-based). This makes it easy to map our tagging fields directly to your internal data model.

Music Similarity Search

Cyanite delivers similar-sounding songs for any reference audio file or YouTube link. You can upload a file or paste a link, and the system analyzes the track before comparing it to your music library. We also store the reference song in your library so you can choose which part of the track you want to base the search on. This lets you explore different segments—for example, the chorus and the verse—to find the best match.

If you work with Spotify, you can also use Spotify track IDs. Once we’ve analyzed the standard 30-second preview, the results stay stored for faster retrieval in future searches.

Read more: Best of music similarity search: find similar songs with the help of AI

Free Text Search

With Free Text Search, you can write full sentences and let Cyanite interpret their meaning. The system understands the semantics of natural language, whether you describe a musical idea or the mood of a scene. This approach removes the usual constraints of music search and makes it easy for anyone to find the right tracks.

Here are some example prompts:

  • “dark atmospheric strings with slow build-up”
  • “warm indie guitar with nostalgic mood”
  • “energetic Latin pop for a dance scene”

Read more: How to prompt: the guide to using Cyanite’s Free Text Search

Advanced Search (Similarity & Free Text Search add-on)

Advanced Search helps you and your users find the ideal track in your catalog based on your meticulously curated criteria and exact specifications. It’s an extended feature set that enhances Similarity and Free Text Search with custom filtering, ranking, and multi-reference capabilities. Advanced Search is ideal for platforms needing precision, ranking logic, hybrid search models, or complex business rules. Some of the key capabilities include:

  • Custom metadata upload: Upload your own metadata fields (regions, rights, subcatalogs, cultural tags), and use them as filters in search queries.
  • Similarity scores: Retrieve normalized similarity scores for ranking, helping users understand and trust the search results at a glance.
  • Multi-track search: Provide up to 50 reference tracks at once. The system merges their sonic profiles to deliver more accurate creative matches.
  • Up to 500 ranked results: Get deep catalog visibility—ideal for editorial teams and large platforms.

  • Most similar segments: Retrieve segment-level match information between reference and result.
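
Cyanite’s similarity model is proprietary, but the idea behind a normalized similarity score can be sketched generically: compare two embedding vectors with cosine similarity and rescale the result into a display-friendly range. Everything here, including the 0–100 rescaling, is an assumption for illustration.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def normalized_score(sim):
    """Rescale from [-1, 1] to a 0-100 score for display and ranking."""
    return round((sim + 1) / 2 * 100)

# Made-up three-dimensional embeddings for a reference and a candidate
ref = [0.2, 0.9, 0.1]
candidate = [0.25, 0.85, 0.05]
print(normalized_score(cosine_similarity(ref, candidate)))
```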

Crates

A screenshot of Cyanite's API documentation showing the creation of crates.

Crates complement Advanced Search and help keep large, complex catalogs manageable.

Use them to define focused subsets within your library so you can manage specific parts of the catalog with more precision. They also help you control access based on rights, support teams who only work with a particular portion of the catalog, and keep curated subcatalogs separate when needed. You can also run similarity searches within a crate, which is useful for segmented or specialized discovery workflows.

How leading platforms have used our API

The Cyanite API is already embedded into the workflows of leading music platforms, helping them power search, discovery, and catalog navigation. Companies like Epidemic Sound, Marmoset, Musicbed, Reelworld, and MAIA Universe use our intelligence directly in their own interfaces to deliver faster, more intuitive music experiences.

You can explore their implementations above.

Cyanite has maybe most significantly improved our work with its Similarity Search that allows us to enhance our searches objectively, melting away biases and subjective blind spots that humans naturally have.

Alex Paguirigan

Product Manager, Marmoset

Ready to start your integration?

The Cyanite API gives you detailed tagging and intelligent search that understands both sound and language, so you can upgrade your catalog experience on your own terms. Build as lightly or as deeply as your product needs—you have full control.

If you’re ready to start coding, sign up to Cyanite and explore the API. You can begin shaping the integration in your environment right away.

FAQs – API Integration

Q: How long does the integration process take?

A: Cyanite’s API integration is typically completed on our side within just a few days. However, the time required for front-end implementation and customization depends on the complexity and scope of your project. Based on our experience, a full integration – including testing, optimization, and deployment – usually takes 2 to 6 weeks to achieve a seamless, fully functional interface.

Q: What Cyanite features are available via API?

A: All features that we offer in our Web App are available via API. This includes all of our latest search and tagging algorithms. You can also derive insights for your catalog as a whole from the data delivered via the API. To learn more about catalog insights, read this article.

Q: How much does the API cost?

A: The API usage fee is 290€/month. However, the total price of the subscription depends on your catalog size and requested features. Please fill out this Typeform and we will get back to you with a quote.

Q: I am using a third-party catalog management system. How can I get Cyanite’s results into that?

A: Cyanite is fully integrated with Cadenza Box, Harvest Media, Music Master, Reprtoir, Synchtank, and Tune Bud for Auto-Tagging and Search. Also, DISCO and Source Audio customers can easily upload Cyanite’s Auto-Tagging and Auto-Descriptions to their libraries. Just reach out to business@cyanite.ai and we’ll go over the format requirements of your library system together.

Case Study: How Syncvault uses Cyanite’s AI Tagging To Unlock the Power of Music Promotion

Introduction

In the vast landscape of music tools for artists, London-based company SyncVault stands out as a reliable platform, empowering artists and brands to promote their music, products, and services. 

With an engaged community of social media influencers and content creators, SyncVault opens doors to new opportunities in the world of music promotion. 

To amplify their impact, SyncVault sought a state-of-the-art solution to unlock the full potential of their curated music catalog. This is where Cyanite entered the picture, offering AI-powered music analysis and tagging technology.

 

Defining the Challenge: Enhancing Music Metadata Insight

SyncVault aimed to extract deeper insights and data from their diverse repertoire of songs. 

Unlike conventional licensing providers with extensive libraries, SyncVault offers a small, highly curated selection of tracks. It required a solution capable of accurately generating multi-genre metadata and assigning an appropriate weighting to each genre to improve music search and data insight.

 

Discovering the Suitable Partner: Cyanite

SyncVault found an ideal partner in Cyanite, which was recommended by their own network and whose product offering aligned seamlessly with SyncVault’s objectives. 

First, Cyanite’s comprehensive and accurate music analysis and tagging technology met their specific requirements. Cyanite’s taxonomy, which offers various tags in over 25 different classes, won over the team after a free tagging trial of 100 songs.

Second, SyncVault was impressed by Cyanite’s transparent, scalable, and competitive pricing model.

 

The Transformation: Streamlined Efficiency and Accuracy

After signing an agreement and booking a 1-year subscription, SyncVault seamlessly integrated Cyanite’s solutions into their workflow in just a few weeks. 

Picture 1: Mood-based keywords and search results on SyncVault platform

Additionally, Cyanite’s AI technology enhanced SyncVault’s music analytics, providing valuable insights into song structure, tempo, genre, key, mood, and more.

Empowering Team and Users: Elevating the SyncVault Experience

Cyanite’s auto-tagging capabilities significantly improved SyncVault’s efficiency and productivity, enabling its small team to categorize their repertoire faster and more consistently.

Furthermore, users experienced an enhanced music search, allowing them to filter and find the perfect soundtrack for their creative needs more quickly. The partnership with Cyanite transformed SyncVault’s platform, fostering a thriving community where music resonates with listeners.

Picture 2: A look at how Syncvault’s curation team uses Cyanite tags in the backend.

A Promising Future: Expanding Horizons

SyncVault is experiencing a steady expansion of its service as it adds more tracks to the Content ID management system. Its catalogue is growing month on month, creating more opportunities to license tracks to its brand partners.

SyncVault envisions extending its music promotion services to Content ID clients, creating more opportunities for brands to discover the ideal songs for their creative campaigns.

As SyncVault continues its expansion, Cyanite’s AI search and recommendation tools, such as Similarity Search or Free Text Search, would work seamlessly with their catalogue, further enhancing the customer experience and forging new frontiers in music promotion. Integrating auto-tagging was just the first step towards an even deeper partnership between two music-enthusiastic companies.

If you want to learn more about SyncVault, you can check out their platform here: https://syncvault.com/

If you want to learn more about our API services, check out our docs here: https://api-docs.cyanite.ai/

PR: Cyanite and Pond5 sign new partnership

PRESS RELEASE

Cyanite and Pond5 sign new partnership to use AI tagging across Pond5’s 2 million-strong music catalogue

Berlin/Mannheim/New York – June 27, 2023 – AI-powered music tagging and search company Cyanite has announced a new partnership with Pond5, the world’s largest marketplace for royalty-free stock videos. Pond5, a Shutterstock subsidiary, also offers a vast collection of music tracks, sound effects, images, and more.

Under the new partnership, Cyanite will leverage its AI technology to improve tagging of Pond5’s growing 2 million-strong catalogue. The goal is to enrich existing metadata with a more consistent and objective music language in order to improve search recall and relevance. This will in turn drive increased user engagement and conversion. Creators from around the world will be able to quickly and easily discover the perfect music to be used in film and TV production, social media, advertising, gaming, and much more.

Pond5 is known industry-wide for the breadth and depth of its expansive music collection. With an artist-led pricing model, creators can source tracks for diverse needs from a myriad of genres and price points. Previously, Pond5 surfaced tracks for interested buyers by leaning primarily on contributor-generated tags. To consistently provide the right music for search queries, it needed tagging that was more generalisable and objective. The improved tagging strategy powered by Cyanite has allowed Pond5 to categorize content more accurately whilst enabling users to search for it efficiently.

Markus Schwarzer, CEO, Cyanite, comments:
“If you start an AI tagging company, you dream of the chance to categorize a catalogue of 2 million songs where you can really show what the technology is capable of. It’s not a trivial task in terms of the technical processing AND accuracy of AI tagging. A lot of things can go wrong, potentially even harming Pond5’s ability to execute its core business. We are very happy that Pond5 put their trust in us to take on this important task, and even more so that the results are so convincing. With this collaboration, we aim to simplify the complexity of music search, empowering content creators and customers to quickly and efficiently discover the perfect music for their needs.”

Ben Remetz, VP Product, Pond5 says:
“At Pond5, we’re passionate about empowering content creators and leading the industry through innovative technology and a commitment to quality. Our partnership with Cyanite is just one example of how we’re always striving to provide our customers with the best possible experience.”

Over 2.5 million assets across all media are uploaded to Pond5 each month, and its collection includes 36 million licensable video clips. More than 10 million customers license content from 185 countries, which has allowed Pond5 to pay out over $100 million in royalties to artists. The platform’s customers include major media companies such as Netflix, Disney, NBC, BBC, Discovery Channel, and The Wall Street Journal.

Cyanite was launched in 2019 in Germany and has already tagged over 15 million songs worldwide for customers such as BMG, Synchtank, and APM Music. Cyanite recently launched Free Text Search, the first search engine that can instantly translate complex text input into its closest musical equivalents. The partnership with Pond5 is another major milestone toward the company’s vision of creating a universal intelligence for music.

ENDS

About Cyanite

Cyanite helps music companies to turn their catalogs into their own personal Spotify – powering music libraries with the simplicity, visibility, and functionality to perform how they and their users expect them to.

From its offices located in Berlin and Mannheim, Cyanite builds powerful AI-based analysis and recommendation solutions to efficiently tag and search music. This enables music, entertainment, and advertising companies to quickly and cost-effectively deliver the right songs for their customers’ search queries.

Cyanite supports some of the most renowned and innovative players in the music and advertising industry via API or no-code solutions. Among the music companies using Cyanite are the production music libraries APM Music and Far Music (RTL), the music publishers BMG, Nettwerk Music Group, NEUBAU Music and Schubert Music, and the sound branding agencies amp sound branding, Universal Music Solutions, and Human Worldwide.

Cyanite’s vision is to become the universal intelligence that understands, connects, and recommends the world’s music – an intelligence that can translate music into anything and anything into music.

Website: https://cyanite.ai/

Web App: https://app.cyanite.ai/register

API: https://api-docs.cyanite.ai/

LinkedIn: Cyanite.ai

About Pond5

Pond5 is the world’s largest video-first content marketplace, with over 36 million royalty-free video clips, plus millions of music tracks, sound effects, images, and more. The company was founded in 2006 and was acquired by Shutterstock in 2022.

Pond5 strives to create world-class storytellers by providing creators of all types with the content they need to tell stories, share knowledge, and inspire audiences. Driven by a commitment to its passionate and growing global community of more than 100,000 professional visual and audio artists, Pond5 provides a platform where creative work can flourish. The marketplace serves the needs of creators across industries—from individual users to major corporations—with competitive pricing and an array of purchase options including a unique pay-per-item model. Purchases are backed by a broad and flexible license, a best price guarantee, and a dedicated team standing by to provide expert assistance.

Website: https://pond5.com/

 

Guest post for Hypebot: How AI can generate new revenue for existing music catalogs?

Our CEO Markus Schwarzer has published a guest post on UK-based music industry medium Hypebot.

In this guest post, our CEO Markus elaborates on how AI can be used to resurface, reuse, and monetize long-forgotten music, addressing concerns about its impact on the music industry. By leveraging AI-driven curation and tagging capabilities, music catalog owners can extract greater value from their collections, enabling faster search, diverse curation, and the discovery of hidden music, while still protecting artists and intellectual property rights.

You can read the full guest post below or head over to Hypebot via this link.


by Markus Schwarzer, CEO of Cyanite

AI-induced anxiety is ever-growing.

Whether it’s the fear that machines will evolve capabilities beyond their coders’ control, or the more surreal case of a chatbot urging a journalist to leave his wife, paranoia that artificial intelligence is getting too big for its boots is building. One oft-cited concern, voiced in an open letter from a group of AI experts and researchers organized by the Future of Life Institute calling for a pause in AI development, is whether, alongside mundane donkeywork, we risk automating more creative human endeavors.

It’s a question being raised in recording studios and music label boardrooms. Will AI begin replacing flesh and blood artists, generating music at the touch of a button?

While some may discount these anxieties as irrational and accuse AI skeptics of being dinosaurs who are failing to embrace the modern world, the current developments must be taken seriously.

AI poses a potential threat to the livelihood of artists, and in the absence of new copyright laws that specifically deal with the new technology, the music industry will need to find ways to protect its artists.

We all remember when AI versions of songs by The Weeknd and Drake hit streaming services and went viral. Their presence on streaming services was short-lived, but it’s a very real example of how AI can potentially destabilise the livelihood of artists. Universal Music Group quickly put out a statement asking the music industry “which side of history all stakeholders in the music ecosystem want to be on: the side of artists, fans and human creative expression, or on the side of deep fakes, fraud and denying artists their due compensation.”

“there are vast archives of music of all genres lying dormant and thousands of forgotten tracks”

However, there are ways that AI can deliver real value to the industry – and specifically to the owners of large music catalogues. Catalogue owners often struggle with how to extract the maximum value out of the human-created music they’ve already got.

But we can learn from generative AI approaches. Prompt-based search experiences, popularised by AI systems like Midjourney, ChatGPT and Riffusion, are quickly becoming part of everyday user behavior. Instead of falling back on bleak replicas of human-created images, texts, or music, AI engines can give music catalogue owners the power to build comparable search experiences, with the advantage of surfacing well-crafted, great-sounding songs with a real human and a real story behind them.

There are vast archives of music of all genres lying dormant, and thousands of forgotten tracks within existing collections, that could be generating revenue via licensing deals for film, TV, advertising, trailers, social media clips and video games; from licences for sampling; or even as a USP for investors looking to purchase unique collections. It’s not a coincidence that litigation over plagiarism is skyrocketing. With hundreds of millions of songs around, there is a growing likelihood that the perfect song for any use case already exists and just needs to be found rather than mass-generated by AI.

With this in mind, the real value of AI to music custodians lies in its search and curation capabilities, which enable them to find new and diverse ways for the music in their catalogues to work harder for them.

How AI music curation and AI tagging work

To realize the power of artificial intelligence to extract value from music catalogues, you need to understand how AI-driven curation works.

Simply put, AI can do most things a human archivist can do, but much, much faster: processing vast volumes of content, and tagging, retagging, searching, cross-referencing and generating recommendations in near real-time. It can surface the perfect track – the one you’d forgotten, didn’t know you had, or would never have considered for the task in hand – in seconds.

This is because AI is really good at auto-tagging, a job few humans relish. It can categorise entire music libraries by likely search terms, tagging each recording by artist and title, and also by genre, mood, tempo and language. As well as taking on a time-consuming task, AI removes the subjectivity of a human tagger, while still being able to identify the sentiment in the music and make complex links between similar tracks. AI tagging is not only consistent and objective (it has no preference for indie over industrial house), it also offers the flexibility to retag as often as needed.
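The consistency described above comes from the fact that a model maps the same audio to the same confidence scores, and a fixed rule turns those scores into tags. As a minimal illustrative sketch (the tag names, score values, and threshold here are invented for illustration, not Cyanite’s actual pipeline):

```python
# Hypothetical sketch: turning per-tag model scores into labels.
# A real tagging model would compute these scores from the audio itself.

TAG_THRESHOLD = 0.5  # fixed cutoff: identical scores always yield identical tags

def scores_to_tags(scores: dict) -> list:
    """Convert per-tag confidence scores into a sorted, reproducible tag list."""
    return sorted(tag for tag, score in scores.items() if score >= TAG_THRESHOLD)

# Invented model output for one track:
track_scores = {"energetic": 0.91, "indie": 0.72, "sad": 0.12, "acoustic": 0.55}
print(scores_to_tags(track_scores))  # ['acoustic', 'energetic', 'indie']
```

Because the threshold is fixed and the output is sorted, re-running the tagger on the same track never produces a different result – which is exactly the objectivity a human tagger cannot guarantee.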

The result is that, no matter how dusty and impenetrable a back catalogue, all its content becomes accessible for search and discovery. AI has massively improved both identification and recommendation for music catalogues. It can surface a single song using semantic search, which identifies the meaning of the lyrics. Or it can pick out particular elements in the complexities of music in your library which make it sound similar to another composition (one that you don’t own the rights to, for example). This allows AI to use reference songs to search through catalogues for comparable tracks.
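Reference-song search of this kind is commonly built on audio embeddings: each track is mapped to a vector, and tracks whose vectors point in a similar direction are considered similar-sounding. Below is a minimal sketch under that assumption, using invented three-dimensional vectors and cosine similarity (production systems use learned embeddings with hundreds of dimensions):

```python
import math

# Hypothetical catalogue: track name -> embedding vector.
# In a real system these vectors come from a trained audio model.
catalogue = {
    "track_a": [0.9, 0.1, 0.3],
    "track_b": [0.2, 0.8, 0.5],
    "track_c": [0.85, 0.15, 0.25],
}

def cosine_similarity(u, v):
    """Similarity of two vectors: 1.0 means same direction, 0.0 means unrelated."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def most_similar(reference, catalogue):
    """Rank catalogue tracks by similarity to a reference embedding."""
    return sorted(catalogue,
                  key=lambda t: cosine_similarity(reference, catalogue[t]),
                  reverse=True)

reference = [0.88, 0.12, 0.28]  # embedding of the reference song
print(most_similar(reference, catalogue))  # ['track_a', 'track_c', 'track_b']
```

The same ranking mechanism works whether the reference vector comes from a song you own, one you don’t hold the rights to, or a free-text prompt encoded into the same vector space.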

The power of AI music catalog search

The ability of AI to slice and dice back catalogs in these ways is of considerable value for companies that produce and license audio for TV, film, radio and multimedia projects. The ability to intelligently search their archives at high speed means they can deliver exactly the right recording to any given movie scene or gaming sequence.

Highly customisable playlists culled from a much larger catalogue are another benefit of AI-assisted search. While its primary function is to allow streaming services such as Spotify to deliver ‘you’ll like this’ playlists to users, for catalogue owners it means extracting infinitely refinable sub-sets of music which can demonstrate the archive’s range and offer a sonic smorgasbord to potential clients.

“the extraction of ‘hidden’ music”

Another major value-add is the extraction of ‘hidden’ music. The ability of AI to make connections based on sentiment and even lyrical hooks and musical licks, as well as tempo, instruments and era, allows it to match the right music to any project with speed and precision only the most dedicated catalogue curator could fulfil. With its capacity to search vast volumes of content, AI opens the entirety of a given library to every search, and surfaces obscure recordings. Rather than just making money from their most popular tracks, therefore, the owners of music archives can make all of their collection work for them.

The tools to do all of this already exist. Our own solution is a powerful AI engine that tags and searches an entire catalogue in minutes with depth and accuracy. Meanwhile, AudioRanger is an audio recognition AI which identifies the ownership metadata of commercially released songs in music libraries. And PlusMusic is an AI that makes musical pieces adaptive for in-game experiences: as the in-game situation changes, the same song adapts to it.

Generative AI – time for careful reflection

The debate on the role of generative AI in the music industry won’t be solved anytime soon and it shouldn’t. We should reflect carefully on the incorporation of any technology that might potentially reshape our industry. We should ask questions such as: how do we protect artists? How do we use the promise of generative AI to enhance human art? What are the legal and ethical challenges that this technology poses? All of these issues must be addressed in order for the industry to reap the benefits of generative AI.

Adam Taylor, President and CEO of the American production music company APM Music, shared with me that he believes it is vital to safeguard intellectual property rights, including copyright, as generative AI technologies grow across the world. As he puts it: “While we are great believers in the power of technology and use it throughout our enterprise, we believe that all technology should be used in responsible ways that are human-centric. Just as it has been throughout human history, we believe that our collective futures are intrinsically tied to and dependent on retaining the centrality of human-centered art and creativity.”

The debate around the role of generative AI models will continue to play out as we look for ways to embrace new technologies and protect artists, and naturally there are those like Adam who will wish to adopt a cautious approach. But while many are reluctant to wholeheartedly embrace generative AI models, many more are willing to embrace analysis and search AI to protect their catalogues and make them more efficient and searchable.

Ultimately, it’s down to the industry to take control of this issue, find a workable level of comfort with AI capabilities, and build AI-enhanced music environments that will vastly improve the searchability – and therefore usefulness – of existing, human-generated music.

If you want more of Markus’ views on the music industry, you can connect with him on LinkedIn here.

 

More Cyanite content on AI and music