
Introducing: Cyanite’s Keyword Cleaning System for Music Libraries

In this article, we present a common challenge in music databases: inconsistent keyword tagging. We discuss what causes these problems and how Cyanite developed a Keyword Cleaning system to solve them automatically. We then present five use cases for our Keyword Cleaning system and the potential impact it can have on music businesses.

Introduction of the problem

The way we perceive music is highly individual, and so is the way we describe it. What makes for lively dinner conversation becomes something to be keenly aware of when handling large volumes of musical pieces professionally.

To leverage diverse monetization opportunities with musical assets, many music companies sort music catalogs by assigning keyword tags to all the audio files in their music database. These tags may describe the mood and genre of a song or categorize its instruments or tempo. This way music companies ensure accessibility and searchability of any musical asset even in very large music catalogs.

These tags follow each company’s individual understanding of music – its catalog language. The specific nature of a catalog language can be understood under two aspects:

1. Objective catalog language (tagging): the set of keywords and tags, often described as a taxonomy or tag ontology (quantity, classes, and wording). “Which tags do I use?”

2. Subjective catalog language (understanding of tagging): the understanding of tags and their connection to certain sound qualities. “When do I assign a certain tag?”

Objective catalog language is inherent to the music catalog or the company that owns it. Subjective catalog language, however, is inherent to every individual person that tags the music.

Having a consistent catalog language leads to a brilliant search experience and is the perfect condition for the thorough exploitation of your assets. A lot of work can go into building and maintaining a catalog language of your own. However, three main events can quickly erode it, and with it the quality and meaningfulness of your tagging:

Event 1: Catalog acquisitions or integrations.

Event 2: Day-to-day variations in the judgment of tagging staff.

Event 3: The hiring of new tagging staff.

Not being aware of this can undo decades of work. Songs can no longer be found and revenue streams can no longer be realized as before, seriously harming a company’s ability to execute its business model.

More importantly, music-searching staff lose trust in the music search. This leads them to build highly individual systems of workarounds for finding suitable music, or to rely on a very limited “go-to catalog” of songs they use again and again rather than drawing on the entire music catalog.

Aaron Chavez © Unsplash

Our solution

Addressing these issues, Cyanite developed a way to bring together (translate) two catalog languages – both objective and subjective – with minimum information loss and maximum speed, using AI.

We base our approach on a measure we denote as keyword similarity, describing the degree of semantic similarity of a pair of tags. To give an example, the keywords “enthusiastic” and “euphoric” should have a rather similar meaning when used for the description of a musical mood. We would therefore expect a high degree of keyword similarity. On the contrary, “enthusiastic” and “gloomy” represent a quite contrary pair of descriptive attributes which should point towards a low degree of keyword similarity.

Most music catalogs use a multi-label tagging scheme, meaning a single piece of music can be assigned multiple tags. We make use of this fact and focus on the track-wise co-occurrence of tags, hypothesizing that the frequent joint attribution of a tag pair indicates a high degree of interrelation and, thus, keyword similarity.
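To make the co-occurrence idea concrete, here is a minimal Python sketch. The toy catalog and the choice of pointwise mutual information (PMI) as the score are purely illustrative assumptions, not our production method:

```python
from collections import Counter
from itertools import combinations
import math

def keyword_similarity(tag_lists):
    """Score tag pairs by pointwise mutual information (PMI):
    tags that co-occur on tracks more often than chance score high."""
    n_tracks = len(tag_lists)
    tag_counts = Counter()
    pair_counts = Counter()
    for tags in tag_lists:
        unique = sorted(set(tags))
        tag_counts.update(unique)
        pair_counts.update(combinations(unique, 2))
    sims = {}
    for (a, b), joint in pair_counts.items():
        p_joint = joint / n_tracks
        p_a, p_b = tag_counts[a] / n_tracks, tag_counts[b] / n_tracks
        sims[(a, b)] = math.log(p_joint / (p_a * p_b))
    return sims

# A toy catalog: each entry is the tag list of one track.
catalog = [
    ["enthusiastic", "euphoric", "upbeat"],
    ["enthusiastic", "euphoric"],
    ["gloomy", "dark"],
    ["gloomy", "dark", "slow"],
    ["enthusiastic", "upbeat"],
]
sims = keyword_similarity(catalog)
```

Here, “enthusiastic” and “euphoric” get a high score because they appear together frequently, while “enthusiastic” and “gloomy” never co-occur at all.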

We developed a natural language processing (NLP) AI system capable of learning the semantic interrelation of keywords in any library. With this, we are able to derive a quantitative measure for any combination of keywords contained in one or several music catalogs. This analysis is the basis for a variety of groundbreaking use cases to overcome challenges many music companies are struggling with.

 

Use Case 1: Catalog language translation

This challenge arises when two (or more) differently tagged music catalogs need to be merged (potentially after a catalog acquisition, or when choosing a different distribution outlet). Manually translating tags is tedious and may lead to significant information loss, as the same tags are not always used in the same way (see “subjective catalog language” above).

Our system learns how every tag relates to every other tag. It does this for both taxonomies, understanding each catalog language. In a second step, it maps the two catalog languages onto each other, drawing direct relations between tags and how they are used. The third step is the translation of each song’s tagging from one catalog language into the language of the catalog it is being integrated into. The system automatically re-tags every song in the new catalog language.
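A simplified sketch of the translation step: given cross-catalog similarity scores (however they were learned), each source tag is mapped to its closest counterpart in the target taxonomy. All names, scores, and the threshold below are made up for illustration:

```python
def translate_tags(song_tags, cross_similarity, target_taxonomy, threshold=0.5):
    """Re-tag a song from the source catalog language into the target one,
    keeping only translations above a similarity threshold."""
    translated = set()
    for tag in song_tags:
        # Pick the most similar target tag for this source tag.
        best = max(target_taxonomy, key=lambda t: cross_similarity.get((tag, t), 0.0))
        if cross_similarity.get((tag, best), 0.0) >= threshold:
            translated.add(best)
    return sorted(translated)

# Hypothetical similarity scores between source and target tags.
cross_similarity = {
    ("euphoric", "enthusiastic"): 0.9,
    ("euphoric", "gloomy"): 0.05,
    ("somber", "gloomy"): 0.85,
    ("somber", "enthusiastic"): 0.1,
}
target = ["enthusiastic", "gloomy"]
result = translate_tags(["euphoric"], cross_similarity, target)
```

A song tagged “euphoric” in the source catalog would come out tagged “enthusiastic” in the target catalog language.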

Use Case 2: Keyword Cleaning of inconsistent keyword tagging

Companies with high fluctuation in tagging staff face this challenge – as may a company with a particularly large catalog (>100,000 songs) that has picked up some legacy over the years: inconsistencies in keyword tagging. This is one of the biggest problems a catalog can face, as it seriously diminishes searchability and the search experience, leading to mistrust of the system, individual workarounds, and eventually the loss of the customer for good. Alternatively, it pushes customers to contact the library’s sales team and search staff directly, which harms your business’s ability to scale.

After learning your catalog language, our Cyanite Keyword Cleaning system can detect tags with low keyword similarity that may contradict a song’s other tags, and flag the respective songs. To assess whether a tag was wrongly assigned (or is missing), we offer an audio-based tagging solution for these anomalies that checks whether the tag is suitable. If it is not, the tag is deleted.
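The flagging step could look roughly like this: a tag whose average similarity to the song’s other tags falls below a threshold is treated as an anomaly. Again, the scores and the threshold are illustrative assumptions:

```python
def flag_outlier_tags(song_tags, similarity, threshold=0.2):
    """Flag tags whose mean similarity to the song's other tags falls
    below a threshold -- candidates for review or deletion."""
    flagged = []
    for tag in song_tags:
        others = [t for t in song_tags if t != tag]
        if not others:
            continue
        mean_sim = sum(similarity.get(frozenset((tag, t)), 0.0) for t in others) / len(others)
        if mean_sim < threshold:
            flagged.append(tag)
    return flagged

# Hypothetical symmetric similarity scores between tags.
similarity = {
    frozenset(("euphoric", "enthusiastic")): 0.9,
    frozenset(("euphoric", "upbeat")): 0.8,
    frozenset(("enthusiastic", "upbeat")): 0.85,
    frozenset(("gloomy", "euphoric")): 0.02,
    frozenset(("gloomy", "enthusiastic")): 0.03,
    frozenset(("gloomy", "upbeat")): 0.01,
}
flagged = flag_outlier_tags(["euphoric", "enthusiastic", "upbeat", "gloomy"], similarity)
```

On a song tagged “euphoric, enthusiastic, upbeat, gloomy”, only “gloomy” would be flagged for review.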

Use Case 3: Taxonomy Cleaning. Detection of redundancies and blind spots.

Languages change over time – and with them, catalog languages. Some catalogs have 15,000+ different keywords in their taxonomy. It should come as no surprise that songs with older keyword tags are found less often. Moving to a slimmer taxonomy can improve the searchability and overall search experience of a catalog.

This raises the question of whether all tags are necessary and meaningful. To test this, our Cyanite system scans your keyword tagging and detects tags that are equal in meaning. It then consolidates these redundancies, condensing the taxonomy to a set of meaningful, disjoint keyword classes.
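One simple way to sketch this consolidation is a union-find pass that merges every tag pair scoring above a similarity threshold into one class. The tags and scores are made up; a real system would also pick a canonical keyword per class:

```python
def consolidate_taxonomy(tags, similarity, threshold=0.8):
    """Merge near-synonymous tags into classes via union-find over
    high-similarity pairs."""
    parent = {t: t for t in tags}

    def find(t):
        while parent[t] != t:
            parent[t] = parent[parent[t]]  # path compression
            t = parent[t]
        return t

    for (a, b), sim in similarity.items():
        if sim >= threshold and a in parent and b in parent:
            parent[find(a)] = find(b)
    classes = {}
    for t in tags:
        classes.setdefault(find(t), []).append(t)
    return list(classes.values())

# Hypothetical similarity scores among four mood tags.
similarity = {
    ("happy", "joyful"): 0.92,
    ("joyful", "cheerful"): 0.88,
    ("happy", "gloomy"): 0.03,
}
classes = consolidate_taxonomy(["happy", "joyful", "cheerful", "gloomy"], similarity)
```

Here “happy”, “joyful”, and “cheerful” collapse into a single class, while “gloomy” stays disjoint.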

Use Case 4: Open search

If you rely on customers handing in sync briefings and then search your catalog yourself, your business will lack scalability. So you might want to open up your catalog search to every potential client. For this, you want to make sure that you deliver the right music for every music search and every individual understanding of music – you need to speak the language of each of your customers.

To achieve this, our Cyanite Keyword system can translate a vast number of keywords into semantically related tags. This means that if you only tag the keyword “euphoric” on very upbeat, outgoing, and happy songs, but a client searches for “enthusiastic”, our Cyanite Keyword system understands the query and presents the suitable songs from your catalog. This is especially important for keywords that were tagged significantly less often in your catalog, so that searches still return a good variety of music.
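At its core this is query expansion: the incoming search term is mapped to the closest tags that actually exist in the catalog’s taxonomy. A minimal sketch, with invented similarity scores:

```python
def expand_query(query, similarity, catalog_tags, top_k=3, min_sim=0.5):
    """Expand a free-text search keyword into the semantically closest
    tags that exist in the catalog's taxonomy."""
    scored = [(similarity.get((query, t), 0.0), t) for t in catalog_tags]
    scored = [(s, t) for s, t in scored if s >= min_sim]
    return [t for s, t in sorted(scored, reverse=True)[:top_k]]

# Hypothetical similarities between a query word and catalog tags.
similarity = {
    ("enthusiastic", "euphoric"): 0.9,
    ("enthusiastic", "upbeat"): 0.75,
    ("enthusiastic", "gloomy"): 0.05,
}
expanded = expand_query("enthusiastic", similarity, ["euphoric", "upbeat", "gloomy"])
```

A search for “enthusiastic” would then also retrieve songs tagged only “euphoric” or “upbeat”.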

Use Case 5: Automatic tagging in your own catalog language.

Let’s say your clients and customers have gotten used to your specific keyword tagging – your catalog language. That catalog language is then an integral part of your platform’s stickiness and keeps customers coming back to your service. If you introduce automatic tagging through deep learning systems such as the Cyanite Tagging system, you want to keep the automatic tags in your catalog language so that your customers keep finding the right music.

To achieve this, our Cyanite Keyword system and the Cyanite Tagging system work together on translating our auto-tags into your catalog language. Your customers won’t even notice that you switched to AI-tagging.

How to get started!

If the approach of Cyanite’s Keyword Cleaning resonates with you, the first step is to have a look into your metadata. For that, please reach out to sales@cyanite.ai. Together, we will dive into your tagging scheme and assess the possibility of a Keyword Cleaning project. 

WISE Panel Video: AI – Musician’s Friend or Foe?

WISE hosted a virtual panel moderated by Kalam Ali (Co-Founder, Sound Obsessed) to connect music industry experts in an open discussion about the adoption of AI technologies by artists. Among the guests were Rania Kim (Creative Director, Sound Obsessed & Portrait XO), Harry Yeff/Reeps One (Director, Composer, and Artist, R1000 Studios), Heiko Hoffmann (VP Artist, Beatport) and Markus Schwarzer (CEO, Cyanite).

 

All united by their interest in music and its future, they shared their views on the different entry points for AI to be embraced for what it is in the bigger picture: a solution to improve performances, enhance the UX, and provide inspiration in music production.

Education is the means to ensure a deeper understanding of this technology, which is still widely questioned as damaging to the connection people have with music. A realistic assessment of the opportunities AI offers artists and, at the same time, of the risks of improper use can break down these fear barriers.

Finding a middle ground between human control and the autonomy of AI is key, especially in these days when a digital approach is often the only feasible way to make life feel as normal as it should.

The extended video of the talk is available on YouTube.

 

 

 


Electronic music experts reveal 4 essential factors on AI-tech adoption for SMEs

We are excited to publish a recent study by Laura Callegaro, a master researcher from the Berlin School of Economics and Law, a longtime electronic music expert and co-founder of the Berlin-based techno label JTseries.

In this guest article, Laura shares 4 essential factors on adoption of AI solutions within the industry based on her research. 

Originally written by Laura Callegaro

 

In electronic music, original scenes are being challenged: small-scale events burgeon into festivals, market growth and fan bases develop, DJ cultures become celebrity cultures – with luxury brands like Porsche signing up female DJs – and electronic music events become cultural experiences. For a market that has quickly turned from niche to mainstream, it may come as no surprise that the week’s top 100 most-played Spotify tracks are almost entirely dominated by electronic music production.

A survey presented at the International Music Summit in 2019 ranks electronic music as the world’s 3rd most popular genre, with an estimated 1.5 billion people typically listening to it.

Source: IFPI Music Consumer Insight Report 2018

“The introduction of mature AI allows creatives and corporations alike to reimagine the creative process.” 

 

This snapshot of contemporary popularity works just as another clear indicator of a new mainstream – in which electronic music has become more central – within the global popular music market. AI is having a significant impact on those roles that are currently most systematic and routine in nature: search, audit and elements of due diligence. The introduction of mature AI allows creatives and corporations alike to reimagine the creative process, target new fans, and identify the next set of musical stars with greater accuracy and precision than we ever imagined.

The research highlights the challenges and opportunities created by AI in this booming industry, focusing on the “what and why” of SMEs’ managerial processes from a new angle. Many academic studies have analyzed the cultural, political, and social dynamics of this field, but very few have analyzed its economics. Through semi-structured interviews with both sets of actors – providers and users of AI music marketing tools – combined with qualitative analysis of primary data, the study relies on the so-called TOE framework. This is one of the most insightful frameworks for IT and systems adoption research; it groups the factors of adoption into three dimensions of enterprise context: technological, organizational, and environmental.

 

FACTOR 01: Trust the machine

The data analyzed shows overall trust in AI systems – 90% positive sentiment – which are perceived as free of bias. In fact, just one out of four users pointed out that these machines are programmed by humans, and that it is therefore impossible for AI to be 100% free of bias – a point that has unfortunately already been proven in practice. In relation to the research, this factor shows the essential need for tight collaboration between humans and AI-powered machines, which allows us to produce ingenious results by analyzing vast amounts of data in a matter of seconds.

 

It’s essential that humans and AI-powered machines collaborate.

FACTOR 02: Agility wins

Firm size divided the respondents’ opinions widely. The variance between answers was based mainly on the agility of the decision-making process and the financial resources of the organization analyzed. A common view among all interviewees is that the failure of a new technology would have less impact on larger firms, which are normally the ones with larger financial resources. On the other hand, 50% of them recognized that the agility of smaller companies simplifies the adoption process and makes it more efficient.

Based on the findings, it can be argued that technical skills and financial resources are connected. There is also variation between replies regarding the role played by financial resources, where the responses depended heavily on the cost of the specific system the respondents had worked with.

Agility fosters efficiency

FACTOR 03 – Tech-savviness is not key

Technical skills and financial resources are strictly connected in the organizational context, and in the adoption phase they can sometimes turn into constraints, especially for SME users. Surprisingly, 70% of providers and 50% of users don’t see tech skills or financial resources among personnel as an important factor in the adoption process. Background and expertise in AI technology are not as necessary as understanding how to employ it within the company, and the huge benefit of implementing such technology outweighs the cost.

Providers of AI music solutions likewise do not perceive tech skills as an essential factor, pointing out that sales directors normally have limited knowledge of marketing technology tools and have not used this type of innovative solution in the past. Consequently, digital tools are not often at the top of sales managers’ priority lists, though they recognize the value of adopting them.

Surprisingly, 70% of providers and 50% of users don’t see tech skills or financial resources among personnel as an important factor in the adoption process. Background and expertise in AI technology are not as necessary as understanding how to employ it within the company, and the huge benefit of implementing such technology outweighs the cost.
Laura Callegaro

Researcher & Music Industry Expert

FACTOR 04 – Shift of power

Knowledge through data is more and more accessible

It is clear that there is a broader trend towards technologies that can analyze various industry data points on up-and-coming artists and predict who the next big stars may be. What this study has brought to light – and what the interviewees (especially the providers) confirmed – is that we are witnessing a major shift of power: from managers, booking agents, and label owners straight into the artists’ hands. Thanks to new technology applied to marketing and the manifold new ways of music consumption, potentially everyone can be their own manager.

However, at the moment, this could be a false hope since marketing and managerial skills are still required.

For the music business, AI may serve as one of the most influential tools for growth as we enter a new era where humans – from artists and songwriters to A&Rs (artists and repertoire) and digital marketers at labels – will be complemented by AI in various forms and to different extents. This study, and the global challenges the industry is facing, are just further proof of the essential need for AI in this ever-evolving industry.

“We enter a new era where humans will be complemented by AI in various forms and to different extents.”

About the author

Laura Callegaro conducted this study during her master’s at the Berlin School of Economics and Law. She is a longtime electronic music expert and a real marketing wizard. As co-founder of the Berlin-based techno label JTseries and the music-arts collective ENIGMA, Laura is actively contributing to revamping the music industry.

 

3 Reasons Music Catalogues Should Embrace AI Innovation During The Corona Quarantine

Since the Coronavirus outbreak, the music industry has taken a major economic hit. But despite the notable impact on the live music scene, demand for content is still high, and even likely to increase, as people spend more time indoors. A change in the environment places new demands on us, as people and businesses, to adapt in new ways. Music catalogue owners, for example, can improve their offering through AI innovation – and make it easy for users to find songs that match a specific emotion, mood or context.

Finding things is easier when there’s a way to recognise them. This rings true for every database, whether a physical building housing volumes of books or a digital library comprising thousands of songs. Thankfully, human beings (resourceful as we are) created taxonomies: classification systems that help keep things organised. A taxonomy is the greatest asset your music catalogue can have (aside from great musical assets, of course).

Using a taxonomy for your content library is like having SEO keywords for your website. Your customers can only find you through Google if your website includes those words relevant to their needs. Similarly, the only way to retrieve a “happy” song from your library is if it’s correctly tagged to match that description.

Setting up an efficient taxonomy that uses the right tags will improve your customer’s search experience – and, ultimately, lead to your business performing better. To learn how to do this, check out our free guide: “How to find the right taxonomy for your music catalogue”.

But where does artificial intelligence feature in this? Simply put, an algorithm reads your taxonomy and produces the search results your customer is looking for. This type of AI innovation is a reliable, long-term improvement to your catalogue that you can implement immediately.

Let’s explore three reasons music catalogues should tap into AI innovation – both now and post-quarantine.

AI innovation – an opportunity

Every crisis, no matter how severe, presents opportunities. The old ways of doing things are challenged, and quickly become outdated if they can’t cope with the pressure. Instead, new methods are introduced. We’ve seen this with the Coronavirus pandemic. Around the globe, businesses are banding together, turning to emerging technologies and forming new approaches that will serve towards establishing a brighter future. Startups are 3D-printing medical supplies and ventilators, while remote working tools become part of the daily work-life.

Some of mankind’s most useful contributions have come directly from history’s worst periods. The Black Plague decimated most of Europe. But it also led to better health and safety standards, a greater demand for knowledge and less labor-intensive work. During the First World War, Marie Curie invented the first mobile X-ray machines to help diagnose wounded soldiers – tools which are widely used today in emergency rooms and ICUs.

The time to explore new technologies is now. AI is already gaining widespread adoption across multiple industries, including music and entertainment. From technical adjustments on mixes to creative music generation, it’s gradually becoming an ever-present feature. This was the case pre-Coronavirus, but will likely continue moving forward.

AI innovation – resistance

Innovation is a natural response to change, no matter how disastrous or difficult. It’s how humans survived plagues, wars, famine, disease and any number of horrifying scenarios (Norman Borlaug genetically modified wheat to save the world from a food shortage in the 1970s). As we’re forced to overcome something, we strive to find a solution; we innovate, until things get better.

This innovation includes the technologies we choose. And desperate times call for reliable technologies. That’s because people have their hands full day-to-day, dealing with the crisis in front of them. It’s not just the threat of job safety and personal health, but also that of family members, friends and colleagues.

Where humans are overburdened, AI is always dependable. In fact, it’s currently being used to diagnose humans for illness and can potentially detect future epidemics before they strike.

AI can be added to your business to keep the basics running. This leaves your staff more time to care for your customers, as personal contact becomes more important now than ever. Let’s say you have a large music library: AI can cover the repetitive groundwork of basic tagging, so supervisors have time for the exciting high-level tagging. Turning to AI technology reduces work pressure, acts as a form of resistance to uncertainty and – paradoxically – frees you up to appear more human to potential clients.

Learn on your downtime.

The Coronavirus crisis is keeping us physically isolated, but digitally connected. As a result, the way we do business appears to be changing. If remote work was already becoming more commonplace over the last years, it’s now a necessity. Tools like Slack, Teams and Zoom enable communication and collaboration from anywhere. The freedom to choose between home, your favourite neighbourhood café, co-working spaces or (if you insist) the office, might’ve been labeled a luxury. But living on lock-down means remote work becomes a critical infrastructure for the modern workforce in maintaining the global economy.

Being at home means you’ve got more time, and more control over how you spend it. By not commuting to work, attending social events or taking part in your usual outdoor activities, you’ve got the opportunity to research and try out new ideas. It’s tempting to watch funny YouTube videos or indulge in Netflix with all that extra time (and you should; that’s one thing the Internet has perfected). But choosing to get familiar with AI means you’ll be future-proofing your business, and better preparing yourself no matter how long it takes to return to normal.

Quarantine or not – AI is a technology that’s here to stay. And so are the world, and the people fighting to preserve it. Now is the time to embrace change. By getting your taxonomy right, you’ll permanently improve your music catalogue and your users’ search experience going forward.

If you’d like to supercharge your music catalogue with AI, schedule a free 15-minute call with Cyanite co-founder Jakob.

 

AI Music Now: 3 Ways how AI can be used in the Music Industry

Mention “AI music” and most people seem to think of AI-generated music. In other words, they picture a robot, machine or application composing, creating and possibly performing music by itself; essentially what musicians already do very well. First, let’s address every industry professional’s worst Terminator-induced fears (should they have any): AI will never replace musicians.

Even if music composed and generated by AI is currently riding a rising wave of hype, we’re far from a scenario where humans aren’t in the mix. The perception of AI infiltrating the industry comes from a lack of attention to what AI can actually do for music professionals. That’s why it’s important to cut through the noise and discuss the different use cases possible right now.

Let’s look at three ways to use AI in the music industry and why they should be embraced.

AI-based Music Generation

 

The most popular application of AI in music is in the field of AI-generated music. You might have heard about AIVA and Endel (which sound like the names of a pair of northern European fairy-tale characters). AIVA, the first AI to be recognized as a composer by the music world, writes entirely original compositions. Last year, Endel, an AI that creates ambient music, signed a distribution deal with Warner Music. Both of these projects signal a shift towards AI music becoming mainstream.

Generative music systems are built on machine learning algorithms and data. The more data you have, the more examples an algorithm can learn from, leading to better results after it has completed the learning process – known in AI circles as ‘training’. Although AI generation doesn’t deliver supremely high quality yet, some of AIVA’s compositions stack up well against those of modern composers.

If anything, it’s the chance for co-creation that excites today’s musicians. Contemporary artists like Taryn Southern and Holly Herndon use AI technology to varying degrees, with drastically different results. Southern’s pop-ready album, I AM AI, was released in 2018. It was produced with the help of AI music-generating tools such as IBM’s Watson and Google’s Magenta.

Magenta is included in the latest Ableton Live release, a widely-used piece of music production software. As more artists begin to play with AI-music tools like these, the technology becomes an increasingly valuable creative partner.


AI-based Music Editing

Before the music arrives for your listening pleasure, it undergoes a lengthy editing process. This includes everything from mixing the stems – the different grouped elements of a song, like vocals and guitars – to mastering the finished mixdown (the rendered audio file of the song made by the sound engineer after they’ve tweaked it to their liking).

This whole song-editing journey is filled with many hours of attentive listening and considered action. Because of the amount of choices involved, having an AI to assist in making technical suggestions can speed things up. Equalization is a crucial editing step, which is as much technical as it is artistic. This refers to an audio engineer balancing out the specific frequencies of a track’s sounds, so they complement rather than conflict with each other. Using an AI to perform these basic EQ functions can provide an alternative starting point for the engineer.

Another example of fine-tuning music for consumption is the mastering process. Because published music must conform to strict formatting standards for radio, TV, or film, it needs to be mastered. This final step before release usually requires a mastering engineer. They basically make the mix sound as good as possible, so it’s ready for playback on any platform.

Some of the technical changes mastering engineers make are universal. For example, they need to make every mixdown loud enough to match the standard of the music that’s already out there, or even to match the other songs on an album. Because these techniques are universal, AI can help: there are established practices it can learn from. These practices can then be applied automatically and tailored to the song.
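As a toy illustration of one such universal step – bringing a mixdown up to a target loudness – here is a sketch in Python with NumPy. The target RMS value is an arbitrary assumption; real mastering tools use far more sophisticated loudness models (e.g. LUFS) and limiting:

```python
import numpy as np

def match_rms_loudness(audio, target_rms=0.1):
    """Scale a mixdown so its RMS level matches a target -- one of the
    universal, learnable steps an AI mastering tool can automate."""
    rms = np.sqrt(np.mean(audio ** 2))
    if rms == 0:
        return audio  # silence: nothing to scale
    return audio * (target_rms / rms)

# A quiet 440 Hz test tone standing in for a mixdown.
quiet_mix = 0.02 * np.sin(2 * np.pi * 440 * np.linspace(0, 1, 44100))
mastered = match_rms_loudness(quiet_mix, target_rms=0.1)
```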

Companies like LANDR and Izotope are already on board. LANDR offers an AI-powered mastering service that caters to a variety of styles, while Izotope developed a plugin that includes a “mastering assistant“. Once again, AI can act as a useful sidekick for those spending hours in the editing process.

AI-based Music Analysis

Analysis means breaking something down into smaller parts; in AI music terms, it’s the process of breaking a song down into its component parts. Let’s say you’ve got a library full of songs and you’d like to identify all the exciting orchestral music (maybe you’re making a trailer for the next Avengers-themed Marvel movie). Through AI, analysis can be performed to highlight the most relevant music for your trailer based on your selected criteria (exciting; orchestral).

There are two types of analysis that make this magic possible: symbolic analysis and audio analysis. While symbolic analysis gathers musical information about a song from the score – including the rhythm, harmony and chord progressions, for example – audio or waveform analysis considers the entire song. This means understanding what’s unique about the fully-rendered wave (like those you see when you hit play on SoundCloud) and comparing it against other waves. Audio analysis enables the discovery of songs based on genre, timbre or emotion.

Both symbolic and audio analysis use feature extraction. Simply put, this is when you pull numbers out of a dataset. The better your data – meaning quality, well-organized and clearly tagged – the easier it is to pick up on ‘features’ of your music. These could be ‘low-level’ features like loudness, how much bass is present or the type of rhythms common in a genre. Or they could be ‘high level’ features, referring more broadly to the artist’s style, based on lyrics and the combination of musical elements at play.
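A minimal sketch of what pulling two such ‘low-level’ features out of a raw waveform could look like. The specific features and formulas here are illustrative, not a description of any particular analysis system:

```python
import numpy as np

def extract_low_level_features(audio, sample_rate=44100):
    """Extract two simple low-level features from a waveform:
    RMS loudness and zero-crossing rate (a rough brightness proxy)."""
    rms = float(np.sqrt(np.mean(audio ** 2)))
    # Count sign changes between consecutive samples.
    zero_crossings = int(np.count_nonzero(np.diff(np.signbit(audio))))
    zcr = zero_crossings / (len(audio) / sample_rate)  # crossings per second
    return {"rms_loudness": rms, "zero_crossing_rate": zcr}

# A 1-second 440 Hz sine wave standing in for real audio.
t = np.linspace(0, 1, 44100, endpoint=False)
features = extract_low_level_features(np.sin(2 * np.pi * 440 * t))
```

A pure 440 Hz tone crosses zero about 880 times per second, so the numbers behave as expected; on real music, such features feed the genre and mood models described above.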

AI-based music analysis makes it easier to understand what’s unique about a group of songs. If your algorithm learns the rhythms unique to Drum and Bass music, it can discover those songs by genre. And if it learns how to spot the features that make a song “happy” or “sad”, then you can search by emotion or mood. This allows for better sorting and finding exactly what you pictured. Better sorting means faster, more reliable retrieval of the music you need, making your project process more efficient and fun.

With Cyanite, we offer music analysis services via an API solution for tackling large music databases, or via the ready-to-use web app CYANITE. Create a free account to test AI-based tagging and music recommendations.

5 Technology Trends for Catalog Owners – How Technology is Changing the Music Industry?

The music industry is technology-driven. As new technologies become mainstream, the way customers use them affects how music industry players organize their catalogs. Even though traditional structures make it a challenge for music labels, publishing houses, and distribution companies to adapt quickly, truly monetizing the potential value of a music catalog means addressing a continuously evolving market.

This article explores the state of technology in the music industry and outlines 5 emerging technologies that are disrupting the field.

The Current State of Technology in the Music Industry

Digital technology has been affecting the music industry for many years. Nowadays, professional musicians can record music at home, and control over distribution channels lies mainly in the hands of digital platforms. These developments, plus the proliferation of social media and video channels, mark the democratization of the music industry.

The pandemic made live performances impossible, which propelled digital technology to even more growth. TikTok reached peak popularity around the same time, and its easily discoverable, bite-sized music has been embraced by younger music fans.

In 2022 the market continues to develop, with new technologies in the music industry emerging and the center of entertainment shifting from live venues to the home and virtual reality.

Emerging Technologies in the Music Industry 

These five major technology trends affect the future of the music industry and are increasingly important for music catalog owners.

Trend 1 – New media production & consumption channels

 

@Alexander Shatov from Unsplash

User-generated content (UGC) amplifies the amount of music content created these days. The delivery and consumption of music now often happen through UGC channels such as Instagram Reels, Facebook Watch, and TikTok. Big streaming platforms are under clear pressure as social media continues to gain musical ground. The proliferation of these channels means that everyone can be a creator and produce music.

This is not a new trend. Since the launch of Spotify, the amount of music content produced and consumed has skyrocketed, fueled by the freemium approach adopted by most streaming services: users sign up for free and gain access to an endless catalog of content. As a result, artists and creators could potentially reach millions of listeners worldwide.

With this incentive, content creators have jumped on board, signing exclusive deals with these platforms. All these developments, plus the rise of UGC, have led to more music content than we can consume in a lifetime.

As further entry points appear for independent creators to offer content, the UGC floodgates open fully. Creators will also submit AI-generated music, multiplying release cadences. Trawling through all this data to categorize it becomes challenging. The music industry has responded with AI tagging and classification engines that categorize catalogs and help create more targeted campaigns for music releases across platforms. SoundCloud’s recent acquisition of Musiio – an automated tagging and playlisting engine meant to help categorize SoundCloud’s vast music library – shows how important categorization is for these platforms.

Trend 2 – Using AI to evaluate and benchmark a catalog

 

@Jeremy Bezanger from Unsplash

To cope with the constant increase in music content, AI is being used as the main tool for sorting and organizing libraries. At its most basic, such an AI tags the music in a catalog automatically so that classification stays consistent. It can also analyze the constant stream of new songs and tag them according to the catalog’s classification. AI’s ability to categorize large amounts of music data and tag on the fly keeps the catalog’s volume manageable.

AI does not only handle new content; it also helps music library owners get the most revenue out of their libraries. AI can surface the back catalog – the long tail where all the niche songs sit – and revive old music genres and subgenres. It addresses this so-called long-tail problem with a combination of tagging, which makes old and niche songs easier for search engines to discover, and similarity search algorithms that find tracks similar to popular artists based on metadata.
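The similarity-search idea can be sketched in a few lines: represent each track as a feature vector and rank the catalog by cosine similarity to a popular reference track. The track names and feature values below are invented for illustration; real systems use much higher-dimensional embeddings:

```python
import math

# Hypothetical per-track feature vectors, e.g. (acousticness, tempo_norm, brightness).
catalog = {
    "forgotten_folk_tune": (0.9, 0.4, 0.3),
    "synthwave_deep_cut":  (0.1, 0.7, 0.8),
    "indie_ballad":        (0.8, 0.35, 0.4),
}
reference = (0.85, 0.45, 0.35)  # a popular track listeners already know

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction in feature space."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# Rank the back catalog by similarity to the reference track, most similar first.
ranked = sorted(catalog, key=lambda t: cosine(catalog[t], reference), reverse=True)
print(ranked)
```

A niche back-catalog track that happens to sit close to a popular reference in feature space rises to the top of the ranking – which is exactly how similarity search helps monetize the long tail.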

A separate issue is the inability of search engines to meet customer needs, which is one of the reasons behind the rise of user-generated content. Finding fitting songs is still a challenge, as most music is either uncategorized or only manually tagged. Using AI to improve a catalog’s search function is an emerging music technology.

To read more about AI for tagging and benchmarking, see the article on the 4 Applications of AI in the Music Industry.

Trend 3 – The rise of AI-generated music

 

@marcelalaskoski from Unsplash

It is clear that AI presents manifold opportunities to music catalog owners. But what about the music itself and music creators? Although AI-generated music dates back to the Illiac Suite of 1957, it has attracted far more interest during the last decade – in 2019, the first music-making AI signed a deal with a major label.

While the quality of AI-generated music keeps improving, an algorithm that can generate Oscar-worthy film scores or emotionally riveting material remains a distant reality. Currently, AI serves more as a tool for assisting in music creation, generating ideas that producers or artists turn into tracks. Google’s Magenta, for example, provides such a tool.

That said, music catalog owners need to be aware that AI-generated music will continue to improve. Those looking for alternatives to score their projects may consider exploring it as an option. In the future, the chances are high that AI-generated music will end up in your catalog along with other tracks, which brings us back to the question of proper classification and music search. While AI-generated music is definitely an opportunity for the music industry, it raises several problems, including copyright issues and classification.

Trend 4 – Music for Extended Reality

 

A new wave of technology trends brings new forms of media content. The two applications most relevant for music catalog owners are Augmented Reality (AR) and Virtual Reality (VR).

Both rely on immersion, which refers to how believable the experience is for the user. Music is used to increase this believability. Just like the movie score creates an emotional connection with the viewer, music in AR and VR can enhance and stimulate the effect of the virtual space you’re moving around in.

The emotional and situational contexts are therefore critical. It is likely that AR and VR will follow the game industry in providing immersive music experiences. For example, adaptive soundtracks are already used in games, where the music changes based on where the character is in the game and their perspective. Apple is rumored to release an AR/VR headset at the end of 2022 in which music adapts to the environment.

For AR and VR, you’d need to identify songs that adapt to the positioning, movement, and changing emotional state of users. This would mean tagging the songs for mood and other XR-related factors if you want to increase the speed of finding the right song.

Trend 5 – Music search will work more like a Google search

AI tagging already supports a high-quality search function, but the way music is searched for is going through a transformation. The future of music search looks similar to what Google offers now: search results based on phrases or sentences the user types into the search bar. According to our research, AI’s ability to translate music into text-based descriptions is one of the most anticipated technologies of 2022.

Right now you can only search music by its meta-information, such as artist or title, or by specific descriptors such as mood or genre. In Cyanite, for example, keyword search by weights lets you select up to 10 keywords and specify each keyword’s weight from 0 to 1 to find the right fitting track. You can also use Similarity Search, which takes a reference track and returns a list of matching tracks. To see this use case in action, see the Video Interview – How Cinephonix Integrated AI Search into Their Music Library.
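A weighted keyword search of this kind can be sketched as follows. The per-track keyword confidences and the scoring rule are illustrative assumptions for the example, not Cyanite’s actual implementation:

```python
# Hypothetical per-track keyword confidences (0..1), as an AI tagger might emit them.
tracks = {
    "track_a": {"uplifting": 0.9, "electronic": 0.8, "calm": 0.1},
    "track_b": {"uplifting": 0.3, "electronic": 0.2, "calm": 0.9},
}

def score(track_tags, query):
    """Weighted keyword score: sum of (user weight x tagger confidence)."""
    return sum(w * track_tags.get(kw, 0.0) for kw, w in query.items())

# The user picks keywords (up to 10) and assigns each a weight from 0 to 1.
query = {"uplifting": 1.0, "electronic": 0.5}

best = max(tracks, key=lambda t: score(tracks[t], query))
print(best)  # track_a
```

Raising a keyword’s weight pulls tracks that score highly on that keyword toward the top of the results, which is what lets the weights express how important each descriptor is to the search.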

AI-based text descriptions take many characteristics of a song into account, so simply typing “richly textured grand orchestral anthem featuring a lusty tenor and mezzo-soprano” will return a list of songs that correspond to the search query.

How the music business will change in the next 5-10 years

The development of technology has always challenged the music industry. First, artists and labels lost their regular income from CD sales; then the pandemic shut down live venues.

AI is set to bring even more disruption. Users and AI generate an avalanche of new content, making music professionals worry about the quality of music and the loss of the human element attached to it. At the same time, these technologies develop at an overwhelming speed, producing an enormous amount of content that needs to be classified and sorted.

On the other hand, labels and managers use AI as a tool to automate repetitive tasks so they can focus on more complex goals. So these emerging technologies not only disrupt the industry but also help music industry players adapt to the ever-changing landscape. AI-assisted tagging, AI text descriptions for search, and new channels of distribution such as AR and VR represent revenue drivers and new ways of monetization for everyone involved.

I want to try out Cyanite’s AI platform – how can I get started?

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

Contact us with any questions about our frontend and API services via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.