
WISE Panel Video: AI – Musician’s Friend or Foe?

WISE hosted a virtual panel moderated by Kalam Ali (Co-Founder, Sound Obsessed) to connect music industry experts for an open discussion about the adoption of AI technologies by artists. The guests were Rania Kim (Creative Director, Sound Obsessed & Portrait XO), Harry Yeff/Reeps One (Director, Composer, and Artist, R1000 Studios), Heiko Hoffmann (VP Artist, Beatport) and Markus Schwarzer (CEO, Cyanite).

 

United by their interest in music and its future, the panelists shared their views on the different entry points through which AI can be embraced for what it is in the bigger picture: a solution to improve performances, enhance the user experience and provide inspiration in music production.

Education is the means to ensure a deeper understanding of this technology, which is still widely questioned as damaging to the connection people have with music. A realistic assessment of the opportunities AI offers artists, and at the same time of the risks of improper use, can break down these fear barriers.

Finding a middle ground between humans and the autonomy of AI is key, especially in these days when a digital approach is often the only feasible way to make life feel as normal as it should.

The extended video of the talk is available on YouTube.

 

 

 


Electronic music experts reveal 4 essential factors on AI-tech adoption for SMEs

We are excited to publish a recent study by Laura Callegaro, a master’s researcher at the Berlin School of Economics and Law, longtime electronic music expert and co-founder of the Berlin-based techno label JTseries.

In this guest article, Laura shares 4 essential factors in the adoption of AI solutions within the industry, based on her research.

Originally written by Laura Callegaro

 

In electronic music, original scenes are being challenged: small-scale events burgeon into festivals, markets and fan bases grow, DJ cultures become celebrity cultures – with luxury brands like Porsche signing up female DJs – and electronic music events turn into cultural experiences. For a market that has quickly moved from niche to mainstream, it may not be a surprise that the week’s top 100 most played Spotify tracks are almost entirely dominated by electronic music production.

A recent survey presented at the International Music Summit in 2019 ranks electronic music as the world’s third most popular genre, with an estimated 1.5 billion people typically listening to it.

Source: IFPI Music Consumer Insight Report 2018

“The introduction of mature AI allows creatives and corporations alike to reimagine the creative process.” 

 

This snapshot of contemporary popularity is another clear indicator of a new mainstream – one in which electronic music has become more central – within the global popular music market. AI is having a significant impact on the roles that are currently most systematic and routine in nature: search, audit and elements of due diligence. The introduction of mature AI allows creatives and corporations alike to reimagine the creative process, target new fans, and identify the next set of musical stars with greater accuracy and precision than we ever imagined.

The research highlights the challenges and opportunities brought about by AI in this booming industry, focusing on the “what and why” of SMEs’ managerial processes from a new angle. Many academic studies have analyzed the cultural, political and social dynamics of this field, but very few have analyzed its economics. Based on semi-structured interviews with both sides – providers and users of AI music marketing tools – combined with qualitative analysis of primary data, the study relies on the so-called TOE framework. This is one of the most insightful frameworks for IT and systems adoption research, and it groups the different adoption factors into three dimensions of enterprise context: technological, organizational and environmental.

 

FACTOR 01: Trust the machine

The data shows overall trust in AI systems – 90% positive sentiment – which are widely perceived as free of bias. In fact, only one in four users pointed out that these machines are programmed by humans, and that it is therefore impossible for an AI to be 100% free of bias – something real-world cases have unfortunately already demonstrated. For the research, this factor underlines the essential need for close collaboration between humans and AI-powered machines, which can deliver remarkable results by analyzing vast amounts of data in a matter of seconds.

 

It’s essential that humans and AI-powered machines collaborate.

FACTOR 02: Agility wins

Firm size divided the respondents’ opinions widely. The variance between the answers was based mainly on the agility of the decision-making process and the financial resources of the organization analyzed. A common idea among all the interviewees is that the failure of a new technology would have less impact on larger firms, which normally have larger financial resources. On the other hand, 50% of them recognized that the agility of smaller companies simplifies the adoption process and makes it more efficient.

Based on the findings, it can be argued that technical skills and financial resources are connected. There is also variation between the replies regarding the role played by financial resources, where the responses depended heavily on the cost of the exact system the respondents had worked with.

Agility fosters efficiency

FACTOR 03 – Tech-savviness is not key

Technical skills and financial resources are closely connected in the organizational context, and in the adoption phase they can sometimes turn into constraints, especially for SME users. Surprisingly, 70% of providers and 50% of users do not see tech skills or financial resources among personnel as an important factor in the adoption process. Background and expertise in AI technology are not as necessary as understanding how to employ it within the company, and the benefits of implementing such technology outweigh the costs.

Providers of AI music solutions do not perceive tech skills as an essential factor either, pointing out that sales directors usually have limited knowledge of marketing technology tools and have not used this type of innovative solution in the past. Consequently, digital tools are often not at the top of sales managers’ priority lists, even though they recognize the value of adopting them.

Surprisingly, 70% of providers and 50% of users do not see tech skills or financial resources among personnel as an important factor in the adoption process. Background and expertise in AI technology are not as necessary as understanding how to employ it within the company, and the benefits of implementing such technology outweigh the costs.
Laura Callegaro

Researcher & Music Industry Expert

FACTOR 04 – Shift of power

Knowledge through data is more and more accessible

It is clear that there is a larger trend towards technologies that can analyze industry data points on up-and-coming artists and predict who the next big stars may be. What this study has brought to light – and what was confirmed by the interviewees, especially the providers – is that we are witnessing a big shift of power: from managers, booking agents and label owners straight into the artists’ hands. Thanks to new marketing technology and the manifold new ways of music consumption, potentially everyone could become their own manager.

However, at the moment, this could be a false hope since marketing and managerial skills are still required.

For the music business, AI may serve as one of the most influential tools for growth as we enter a new era where humans – from artists and songwriters to A&Rs (artists and repertoire) and digital marketers at labels – will be complemented by AI in various forms and to different extents. This study, and the global challenges the industry is facing, are further proof of the essential role of AI in this ever-evolving industry.

“We enter a new era where humans will be complemented by AI in various forms and to different extents.”

About the author

Laura Callegaro conducted this study during her master’s at the Berlin School of Economics and Law. She is a longtime electronic music expert and a real marketing wizard. As co-founder of the Berlin-based techno label JTseries and the music-arts collective ENIGMA, Laura actively contributes to revamping the music industry.

 

3 Reasons Music Catalogues Should Embrace AI Innovation During The Corona Quarantine

Since the Coronavirus outbreak, the music industry has taken a major economic hit. But despite the notable impact on the live music scene, demand for content is still high, and even likely to increase, as people spend more time indoors. A change in the environment places new demands on us, as people and businesses, to adapt in new ways. Music catalogue owners, for example, can improve their offering through AI innovation – and make it easy for users to find songs that match a specific emotion, mood or context.

Finding things is easier when there’s a way to recognise them. This rings true for every database, whether a physical building housing volumes of books or a digital library containing thousands of songs. Thankfully, human beings (resourceful as we are) created taxonomies: classification systems that help keep things organised. A taxonomy is the greatest asset your music catalogue can have (aside from great musical assets, of course).

Using a taxonomy for your content library is like having SEO keywords for your website. Your customers can only find you through Google if your website includes those words relevant to their needs. Similarly, the only way to retrieve a “happy” song from your library is if it’s correctly tagged to match that description.

Setting up an efficient taxonomy that uses the right tags will improve your customer’s search experience – and, ultimately, lead to your business performing better. To learn how to do this, check out our free guide: “How to find the right taxonomy for your music catalogue”.

But where does artificial intelligence feature in this? Simply put, an algorithm reads your taxonomy and produces the search results your customer is looking for. This type of AI innovation is a reliable, long-term improvement to your catalogue that you can implement immediately.
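As a rough illustration of that idea, the sketch below models a catalogue as a list of tagged tracks and retrieves everything matching the requested tags. The titles, tags and function names are invented for the example and are not Cyanite’s data model.

```python
# Invented catalogue and tags; the search simply filters on taxonomy tags.
catalogue = [
    {"title": "Sunrise Drive", "tags": {"happy", "uplifting", "pop"}},
    {"title": "Night Rain",    "tags": {"melancholic", "ambient"}},
    {"title": "Festival Peak", "tags": {"happy", "energetic", "edm"}},
]

def search(catalogue, required_tags):
    """Return every track whose tags include all of the requested tags."""
    required = set(required_tags)
    return [track for track in catalogue if required <= track["tags"]]

print(search(catalogue, ["happy"]))          # both "happy" tracks
print(search(catalogue, ["happy", "edm"]))   # only "Festival Peak"
```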

Let’s explore three reasons music catalogues should tap into AI innovation – both now and post-quarantine.

AI innovation – an opportunity

Every crisis, no matter how severe, presents opportunities. The old ways of doing things are challenged, and quickly become outdated if they can’t cope with the pressure. Instead, new methods are introduced. We’ve seen this with the Coronavirus pandemic. Around the globe, businesses are banding together, turning to emerging technologies and forming new approaches that will help establish a brighter future. Startups are 3D-printing medical supplies and ventilators, while remote working tools become part of daily working life.

Some of mankind’s most useful contributions have come directly from history’s worst periods. The Black Plague devastated much of Europe, but it also led to better health and safety standards, a greater demand for knowledge and less labor-intensive work. During the First World War, Marie Curie developed the first mobile X-ray machines to help diagnose wounded soldiers – tools which are widely used today in emergency rooms and ICUs.

The time to explore new technologies is now. AI is already gaining widespread adoption across multiple industries, including music and entertainment. From technical adjustments on mixes to creative music generation, it’s gradually becoming an ever-present feature. This was the case pre-Coronavirus, but will likely continue moving forward.

AI innovation – resistance

Innovation is a natural response to change, no matter how disastrous or difficult. It’s how humans survived plagues, wars, famine, disease and any number of horrifying scenarios (Norman Borlaug genetically modified wheat to save the world from a food shortage in the 1970s). As we’re forced to overcome something, we strive to find a solution; we innovate, until things get better.

This innovation includes the technologies we choose. And desperate times call for reliable technologies. That’s because people have their hands full day-to-day, dealing with the crisis in front of them. It’s not just the threat to their own jobs and health, but also to those of family members, friends and colleagues.

Where humans are overburdened, AI remains dependable. In fact, it’s currently being used to diagnose illness and can potentially detect future epidemics before they strike.

AI can be added to your business to keep basic things running. This leaves more time for your staff to care for your customers, as keeping a personal connection becomes more important now than ever. Let’s say you have a large music library. AI can cover the repetitive groundwork of basic tagging, so supervisors have time to do the exciting high-level tagging. Turning to AI technology can reduce work pressure, act as a form of resistance to uncertainty and – paradoxically – free you up to appear more human to a potential client.

Learn in your downtime

The Coronavirus crisis is keeping us physically isolated, but digitally connected. As a result, the way we do business appears to be changing. If remote work was already becoming more commonplace over the last few years, it’s now a necessity. Tools like Slack, Teams and Zoom enable communication and collaboration from anywhere. The freedom to choose between home, your favourite neighbourhood café, co-working spaces or (if you insist) the office might once have been labeled a luxury. But living in lockdown means remote work becomes critical infrastructure for the modern workforce in maintaining the global economy.

Being at home means you’ve got more time, and more control over how you spend it. By not commuting to work, attending social events or taking part in your usual outdoor activities, you’ve got the opportunity to research and try out new ideas. It’s tempting to watch funny YouTube videos or indulge in Netflix with all that extra time (and you should; that’s one thing the Internet has perfected). But choosing to get familiar with AI means you’ll be future-proofing your business, and better preparing yourself no matter how long it takes to return to normal.

Quarantine or not – AI is a technology that’s here to stay. And so are the world, and the people fighting to preserve it. Now is the time to embrace change. By getting your taxonomy right, you’ll permanently improve your music catalogue and your users’ search experience going forward.

If you’d like to supercharge your music catalogue with AI, schedule a free 15-minute call with Cyanite co-founder Jakob.

 

AI Music Now: 3 Ways how AI can be used in the Music Industry

Mention “AI music” and most people seem to think of AI-generated music. In other words, they picture a robot, machine or application composing, creating and possibly performing music by itself; essentially what musicians already do very well. First, let’s address every industry professional’s worst Terminator-induced fears (should they have any): AI will never replace musicians.

Even if music composed and generated by AI is currently riding a rising wave of hype, we’re far from a scenario where humans aren’t in the mix. The perception of AI infiltrating the industry comes from a lack of attention to what AI can actually do for music professionals. That’s why it’s important to cut through the noise and discuss the different use cases possible right now.

Let’s look at three ways to use AI in the music industry and why they should be embraced.

AI-based Music Generation

 

The most popular application of AI in music is in the field of AI-generated music. You might have heard about AIVA and Endel (which sound like the names of a pair of northern European fairy-tale characters). AIVA, the first AI to be recognized as a composer by the music world, writes entirely original compositions. Last year, Endel, an AI that creates ambient music, signed a distribution deal with Warner Music. Both these projects signal a shift towards AI music becoming mainstream.

Generative music systems are built on machine learning algorithms and data. The more data you have, the more examples an algorithm can learn from, leading to better results after it’s completed the learning process – known in AI circles as ‘training’. Although AI generation doesn’t deliver supremely high quality yet, some of AIVA’s compositions stack up well against those of modern composers.

If anything, it’s the chance for co-creation that excites today’s musicians. Contemporary artists like Taryn Southern and Holly Herndon use AI technology to varying degrees, with drastically different results. Southern’s pop-ready album, I AM AI, was released in 2018 and produced with the help of AI music-generating tools such as IBM’s Watson and Google’s Magenta.

Magenta is included in the latest Ableton Live release, a widely-used piece of music production software. As more artists begin to play with AI-music tools like these, the technology becomes an increasingly valuable creative partner.


AI-based Music Editing

Before the music arrives for your listening pleasure, it undergoes a lengthy editing process. This includes everything from mixing the stems – the different grouped elements of a song, like vocals and guitars – to mastering the finished mixdown (the rendered audio file of the song made by the sound engineer after they’ve tweaked it to their liking).

This whole song-editing journey is filled with many hours of attentive listening and considered action. Because of the number of choices involved, having an AI to assist with technical suggestions can speed things up. Equalization is a crucial editing step, which is as much technical as it is artistic. It refers to an audio engineer balancing out the specific frequencies of a track’s sounds, so they complement rather than conflict with each other. Using an AI to perform these basic EQ functions can provide an alternative starting point for the engineer.
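To make the idea concrete, here is a deliberately simple sketch – not a real AI EQ assistant – that measures how a track’s energy is spread across a few broad frequency bands, the kind of basic measurement an automated suggestion could start from. The band edges and the synthetic test signal are assumptions for the example.

```python
import numpy as np

def band_energies(signal, sample_rate,
                  bands=((20, 250), (250, 2000), (2000, 8000), (8000, 16000))):
    """Total spectral energy per frequency band (band edges are arbitrary examples)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    return {f"{lo}-{hi} Hz": float(spectrum[(freqs >= lo) & (freqs < hi)].sum())
            for lo, hi in bands}

# One second of a synthetic, bass-heavy test signal at 44.1 kHz.
sr = 44100
t = np.linspace(0, 1, sr, endpoint=False)
signal = np.sin(2 * np.pi * 80 * t) + 0.2 * np.sin(2 * np.pi * 3000 * t)
print(band_energies(signal, sr))   # most energy falls in the 20-250 Hz band
```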

Another example of fine-tuning music for consumption is the mastering process. Because published music must stick to strict formatting requirements for radio, TV or film, it needs to be mastered. This final step before release usually requires a mastering engineer. They basically make the mix sound as good as possible, so it’s ready for playback on any platform.

Some of the technical changes mastering engineers make are universal. For example, they need to make every mixdown louder to match the standard of music that’s out there; or even to match the other songs on an album. Using universal techniques means AI can help, because you’ve got practices it can learn from. These practices can then be automatically applied and tailored to the song.
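As a minimal sketch of one such universal step, the snippet below scales a mixdown so its RMS level hits a target value. Real mastering works with LUFS metering, limiting and much more; this only illustrates why a rule-based loudness adjustment is easy to automate. The target level and test signal are made up.

```python
import numpy as np

def match_rms(signal, target_rms=0.1):
    """Scale a mixdown so its RMS level matches the target (naive, no limiting)."""
    current_rms = np.sqrt(np.mean(signal ** 2))
    if current_rms == 0:
        return signal
    gain = target_rms / current_rms
    return np.clip(signal * gain, -1.0, 1.0)   # crude clipping instead of a proper limiter

quiet_mix = 0.02 * np.random.randn(44100)      # stand-in for a quiet one-second mixdown
louder = match_rms(quiet_mix, target_rms=0.1)
print(np.sqrt(np.mean(louder ** 2)))           # close to 0.1
```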

Companies like LANDR and iZotope are already on board. LANDR offers an AI-powered mastering service that caters to a variety of styles, while iZotope developed a plugin that includes a “mastering assistant”. Once again, AI can act as a useful sidekick for those spending hours in the editing process.

AI-based Music Analysis

Analysis is what happens when you break something down into smaller parts. In AI music terms, analysis is the process of breaking down a song into parts. Let’s say you’ve got a library full of songs and you’d like to identify all the exciting orchestral music (maybe you’re making a trailer for the next Avengers-themed Marvel movie). Through AI, analysis can be performed to highlight the most relevant music for your trailer based on your selected criteria (exciting; orchestral).

There are two types of analysis that make this magic possible: symbolic analysis and audio analysis. While symbolic analysis gathers musical information about a song from the score – including the rhythm, harmony and chord progressions, for example – audio or waveform analysis considers the entire song. This means understanding what’s unique about the fully-rendered wave (like those you see when you hit play on SoundCloud) and comparing it against other waves. Audio analysis enables the discovery of songs based on genre, timbre or emotion.

Both symbolic and audio analysis use feature extraction. Simply put, this is when you pull numbers out of a dataset. The better your data – meaning high-quality, well-organized and clearly tagged – the easier it is to pick up on ‘features’ of your music. These could be ‘low-level’ features like loudness, how much bass is present or the type of rhythms common in a genre. Or they could be ‘high-level’ features, referring more broadly to the artist’s style, based on lyrics and the combination of musical elements at play.
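For a concrete picture of low-level feature extraction, here is a small sketch using the open-source librosa library – an assumption for illustration, not what Cyanite uses internally – and a placeholder file path.

```python
import numpy as np
import librosa  # assumption: librosa is installed; the file path below is a placeholder

# Load the rendered waveform and pull out a few 'low-level' features.
y, sr = librosa.load("your_track.wav", sr=None, mono=True)

rms = float(np.mean(librosa.feature.rms(y=y)))                             # overall loudness proxy
centroid = float(np.mean(librosa.feature.spectral_centroid(y=y, sr=sr)))   # timbral "brightness" in Hz
tempo, _ = librosa.beat.beat_track(y=y, sr=sr)                             # global tempo estimate

print({"rms": rms,
       "spectral_centroid_hz": centroid,
       "tempo_bpm": float(np.atleast_1d(tempo)[0])})
```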

AI-based music analysis makes it easier to understand what’s unique about a group of songs. If your algorithm learns the rhythms unique to Drum and Bass music, it can discover those songs by genre. And if it learns how to spot the features that make a song “happy” or “sad”, then you can search by emotion or mood. This allows for better sorting, and finding exactly what you pictured. Better sorting means faster, more reliable retrieval of the music you need, making your projects more efficient and fun.
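A toy sketch of that learning step, using scikit-learn and entirely synthetic feature vectors (the numbers loosely mirror the loudness / brightness / tempo features above and are not real data):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

# Each row is a (loudness, brightness in Hz, tempo in BPM) vector; labels are mood tags.
X = np.array([
    [0.12, 2800, 128],   # loud, bright, fast  -> tagged "happy"
    [0.10, 2500, 124],
    [0.03,  900,  70],   # quiet, dark, slow   -> tagged "sad"
    [0.04, 1100,  65],
])
y = ["happy", "happy", "sad", "sad"]

model = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
print(model.predict([[0.11, 2600, 120]]))  # expected output: ['happy']
```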

With Cyanite, we offer music analysis services either via an API solution to tackle large music databases or via the ready-to-use web app. Create a free account to test AI-based tagging and music recommendations.

5 Technology Trends for Catalog Owners – How Technology is Changing the Music Industry?

The music industry is technology-driven. As new technologies become mainstream, the way customers use them affects how music industry players organize their catalogs. Even though traditional structures make it a challenge for music labels, publishing houses, and distribution companies to adapt quickly, a continuously evolving market needs to be addressed to truly monetize the potential value of a music catalog.

This article explores the state of technology in the music industry and outlines 5 emerging technologies that are disrupting the field.

The Current State of Technology in the Music Industry

Digital technology has been affecting the music industry for many years. Nowadays, professional musicians can record music at home, and control over distribution channels lies mainly in the hands of digital platforms. These developments, plus the proliferation of social media and video channels, mark the democratization of the music industry.

The pandemic made live performances impossible, which in turn propelled digital technology to even more growth. Around the same time, TikTok rose to popularity, and its easily discoverable, bite-sized music has been celebrated by younger music fans.

In 2022, the market continues to develop, with new music industry technology emerging and the center of entertainment shifting from live venues to the home and virtual reality.

Emerging Technologies in the Music Industry 

These five major technology trends affect the future of the music industry and are increasingly important for music catalog owners.

Trend 1 – New media production & consumption channels

 

@Alexander Shatov from Unsplash

User-generated content (UGC) amplifies the amount of music content created these days. The delivery and consumption of music now often happen through UGC channels such as Instagram Reels, Facebook Watch, and TikTok. Big streaming platforms are under clear pressure as social media continues to gain further musical ground. The proliferation of these channels means that everyone can be a creator and produce music.

This is not a new trend. Since the launch of Spotify, the amount of music content produced and consumed has skyrocketed. It was fueled by the freemium approach adopted by most streaming services: users sign up for free and have access to an endless catalog of content. As a result, artists and creators could potentially reach millions of listeners worldwide.

With this incentive, content creators have jumped on board, signing exclusive deals with these platforms.  All these developments plus the rise of UGC have led to more music content than we can consume in our lifetime. 

As further entry points continue to appear for independent creators to offer content, the UGC floodgates open fully. AI-generated music will also be submitted by creators, which multiplies release cadences exponentially. Trawling through all the data to categorize it becomes challenging. The music industry has responded to these challenges with AI tagging and classification engines that can categorize a catalog and help create more targeted campaigns for music releases on various platforms. Just recently, SoundCloud acquired Musiio – an automated tagging and playlisting engine – to help categorize SoundCloud’s vast music library, which shows how important categorization is for these platforms.

Trend 2 – Using AI to evaluate and benchmark a catalog

 

@Jeremy Bezanger from Unsplash

To respond to the constant increase in the amount of music content, AI is being used as the main tool for sorting and organizing the library. The basic thing such an AI does is tag music in the catalog automatically so the classification is consistent. It can also analyze the constant stream of new songs and tag them according to the catalog’s classification. The ability of AI to categorize large amounts of music data, as well as do the tagging on the fly, keeps the catalog’s volume manageable.

Not only does AI work with new content, but it also helps music library owners get the most out of the library in terms of revenue. AI is used to bring the back catalog – where all the niche songs sit in the tail – to light and to revive old music genres and subgenres. It addresses the so-called long-tail problem with a combination of tagging, which makes old and niche songs easier for search engines to discover, and similarity search algorithms that find tracks similar to popular artists based on metadata.
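A generic sketch of how such a similarity search can work – not any specific product’s algorithm – is to represent each track as a vector of analysis features and rank the catalog by cosine similarity to a reference track. All titles and feature values below are invented.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Invented feature vectors for a few back-catalog tracks.
catalog = {
    "Back-catalog gem A": np.array([0.90, 0.10, 0.70]),
    "Back-catalog gem B": np.array([0.20, 0.80, 0.30]),
    "Deep cut C":         np.array([0.85, 0.15, 0.65]),
}
reference = np.array([0.88, 0.12, 0.70])   # e.g. the analysis of a popular artist's track

# Rank the catalog by similarity to the reference track.
for title, vec in sorted(catalog.items(), key=lambda kv: cosine(reference, kv[1]), reverse=True):
    print(title, round(cosine(reference, vec), 3))
```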

A separate issue is the inability of search engines to respond to customers’ needs, which is one of the reasons behind the rise of user-generated content. Finding fitting songs is still a challenge, as most music remains uncategorized or only manually tagged. Using AI to improve a catalog’s search function is a new music technology that is coming to the fore.

To read more about AI for tagging and benchmarking, see the article on the 4 Applications of AI in the Music Industry.

Trend 3 – The rise of AI-generated music

 

@marcelalaskoski from Unsplash

It is clear that AI presents manifold opportunities to music catalog owners. But what about the music itself and music creators? Although AI-generated music dates back to the Illiac Suite of 1957, it has attracted more interest during the last decade – in 2019, the first music-making AI signed a deal with a major label.

While the quality of AI-generated music keeps improving, an algorithm that can generate Oscar-worthy film scores or emotionally riveting material is still a distant reality. Currently, AI is used more as a tool for assisting in music creation, generating ideas that producers or artists turn into tracks. Google’s Magenta, for example, provides such a tool.

That said, music catalog owners need to be aware that AI-generated music will continue to improve. Those looking for alternatives to score their projects may consider exploring it as an option. In the future, the chances are high that AI-generated music will end up in your catalog along with other tracks, which brings us back to the question of proper classification and music search. While AI-generated music is definitely an opportunity for the music industry, it also raises several problems, including copyright issues and classification.

Trend 4 – Music for Extended Reality

 

A new wave of technology trends brings new forms of media content. The two applications most relevant for music catalog owners are Augmented Reality (AR) and Virtual Reality (VR).

Both rely on immersion, which refers to how believable the experience is for the user. Music is used to increase this believability. Just like the movie score creates an emotional connection with the viewer, music in AR and VR can enhance and stimulate the effect of the virtual space you’re moving around in.

The emotional and situational contexts are therefore critical. It is likely that AR and VR will follow the game industry in providing immersive music experiences. For example, adaptive soundtracks are already used in games, where the music changes based on where the character is in the game and their perspective. Apple is rumored to release an AR/VR headset at the end of 2022 in which music adapts to the environment.

For AR and VR, you’d need to identify songs that adapt to the positioning, movement, and changing emotional state of users. This would mean tagging the songs for mood and other XR-related factors if you want to increase the speed of finding the right song.

Trend 5 – Music search will be assisted by Google-like technologies

The quality of search supported by AI tagging is already high, but the way music is searched for is going through a transformation. The future of music search looks similar to what Google offers now: results based on the user typing phrases or sentences into the search bar. According to our research, the ability of AI to translate music into text-based descriptions is one of the most anticipated technologies of 2022.

Right now you can only search music by its meta-information, such as the artist or title, or by specific descriptors, for example mood or genre. In Cyanite, for instance, keyword search by weights allows you to select up to 10 keywords and specify their weights from 0 to 1 to find the right-fitting track. You can also use Similarity Search, which takes a reference track and gives you a list of tracks that match it. To see this use case in action, see the Video Interview – How Cinephonix Integrated AI Search into Their Music Library.
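To illustrate the general idea of weighted keyword search – this is not Cyanite’s actual scoring logic – a track can simply be ranked by the summed weights of the query keywords it is tagged with. Catalog and query below are invented.

```python
# Invented catalog and query; weights run from 0 to 1 as described above.
catalog = [
    {"title": "Slow Burn",   "tags": {"dark", "cinematic", "slow"}},
    {"title": "Neon Nights", "tags": {"energetic", "synthwave", "dark"}},
    {"title": "Open Fields", "tags": {"uplifting", "acoustic", "calm"}},
]
query = {"dark": 1.0, "cinematic": 0.7, "slow": 0.4}   # keyword -> weight

def score(track, query):
    """Sum the weights of the query keywords present in the track's tags."""
    return sum(w for keyword, w in query.items() if keyword in track["tags"])

for track in sorted(catalog, key=lambda t: score(t, query), reverse=True):
    print(track["title"], score(track, query))   # "Slow Burn" ranks highest (2.1)
```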

The AI-based text descriptions take into account many characteristics of the song, so simply typing “richly textured grand orchestral anthem featuring a lusty tenor and mezzo soprano” will return a list of songs that correspond to the search query.

How the music business will change in the next 5-10 years

The development of technologies has always been challenging for the music industry. First, artists and labels lost their regular sources of income from CD sales; then the pandemic devastated live venues.

AI is set to bring even more disruption. Users and AI generate an avalanche of new content that makes music professionals worried about the quality of music and the loss of a human element that is attached to it. At the same time, the speed of development of these technologies is overwhelming as they produce a crazy amount of content that needs to be classified and sorted. 

On the other hand, AI as a tool is used by labels and managers to automate repetitive tasks so they can focus on more complex goals. So these emerging technologies not only disrupt the industry but also help music industry players adapt to the ever-changing landscape. AI-assisted tagging, AI text descriptions for search, and new channels of distribution such as AR and VR represent revenue drivers and new ways of monetization for everyone involved.

I want to try out Cyanite’s AI platform – how can I get started?

If you want to get a first grip on how Cyanite works, you can register for our free web app to analyze music and try out similarity searches without any coding needed.

Contact us with any questions about our frontend and API services via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

How to Build Trust with the Music Industry Gatekeepers – 3 Working Ways

Let’s face it: your success as an artist is significantly influenced by the amount of love you get from the gatekeepers of the music industry. But who are the gatekeepers in the music industry? Music gatekeepers are the people and entities that control access to your audience. Namely, these are bloggers, playlist curators, magazines, social media influencers, A&R’s, radio DJs, etc. They have the keys to the doors or “gates” which provide access to the audience and sales. They can also become a bottleneck on your way to music fans.

Usually, a track reaches the audience through a complicated chain from the artist to a record label, radio station, press, retail stores, and event booking agents. So gatekeeping in music is a regular and widely accepted practice. In recent years, some of the gatekeepers have been replaced by streaming platforms and music curators. These “new” gatekeepers combine human and algorithmic editorial practices to form playlists and influence music listeners. One source estimates that there are around 400 music curators worldwide who work for Spotify, Apple Music, Deezer, and Google Play Music. There are also personal playlists and music blogs’ playlists on all the platforms.

The key to their heart is: 

  • Understanding how they got there
  • Helping them stay relevant to their audience.

Gatekeepers in music are not that different from you. It took them hard work and time to establish trust with their audience, which is why they have so much influence. Knowing that the gatekeepers will deliver what the audience wants makes people stick to a blog, playlist, or radio station. The listeners know what to expect. For music industry gatekeepers, this means they have to deliver fresh content with a good level of continuity – both in time and style.

So in reality, the music industry gatekeepers are dependent on YOU and your music! This is important to understand. Yet, you are not the only artist around. This means they constantly have to choose between several songs or artists, and you need to make it easy for them to decide in your favor.

 YOU DO THIS BY EARNING THEIR TRUST.

 

  1. Show them that you mean it.
  2. Show them they are not alone in supporting you.
  3. Show them that their audience will like the track.
Tip 1. Show them that you mean it
Gatekeepers want to endorse an artist for a longer period of time. For this, they need to know that you will stick around. Is there only one EP and then you’re done, or are you organically building a sustainable career? It reflects badly on a gatekeeper if the artists they keep recommending to their audience are out of the picture soon after. Naming a strong team behind you (producer, label, photographer, etc.) or a couple of tour dates can help you prove that you are in the industry for the long run.

A brand image that revolves around your music also shows a serious level of professionalism. Don’t ignore your branding as an artist, as it helps create a consistent identity that the audience gravitates toward. This read by Spinnup sums up the topic very well.

Tip 2. Show them they are not alone in supporting you
Your social media and Spotify game are key to this. Followers, engagement, listeners, growth! Use all the stats you have access to. If you have no Social Media presence yet, use anything that might make the music gatekeeper trust that your career is growing. This can be blog features, record reviews, or previous playlist rankings. 

Some gatekeepers will only take your song if they see that you’re actively promoting it on your side. You can do that by advertising on Facebook, Instagram, or Twitter. For a clever hack on social media advertising, visit the guide How to Create Custom Audiences for Pre-Release Music Campaigns in Facebook, Instagram, and Google.

Tip 3. Show them that their audience will like the track
This is especially important if you’re just starting your career and have nothing to show yet. As pointed out earlier, gatekeepers must keep some degree of consistency in the musical styles they recommend. Show them that your music goes in line with their previous music picks. You can achieve this by doing your homework:

1. Listen carefully to all their music, radio shows, playlists, mixes, etc., and find common characteristics.

 

2. Make use of music analysis tools to support your pitching with objective data.

The analysis tools give some sort of neutral perspective on your music and the gatekeeper’s offerings and serve as an objective advocate. A solution like Cyanite displays the emotional profile and musical style of your songs. When you analyze the playlist that you want to be placed in and the profiles match, chances are high that the playlist curator will consider your song.

Let’s explore an example. Below you see two screenshots taken from Cyanite: the first one is the emotional profile of the song Too Much by the British band Pale, and the second one is the independently curated playlist Chill Indie Rock. Imagine you are pitching the song to the curator of the playlist. You can add the “emotional fit” as another piece to your story.

However, if it doesn’t match, find a playlist that does. We created a more detailed guide on how to pitch playlists in music streaming – read it before you start pitching Spotify curators.
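For illustration only, a rough way to quantify such an “emotional fit” is to compare a track’s mood profile with the average mood profile of the target playlist. The mood categories and all numbers below are invented, not Cyanite’s actual analysis of these tracks.

```python
import numpy as np

# Invented mood profiles (fractions roughly summing to 1); not actual Cyanite output.
moods = ["happy", "calm", "dark", "energetic", "sad"]
track_profile    = np.array([0.15, 0.60, 0.10, 0.05, 0.10])   # stand-in for the pitched song
playlist_profile = np.array([0.20, 0.55, 0.05, 0.10, 0.10])   # stand-in for the playlist average

# Cosine similarity as a simple "emotional fit" score: 1.0 means identical profiles.
fit = float(np.dot(track_profile, playlist_profile) /
            (np.linalg.norm(track_profile) * np.linalg.norm(playlist_profile)))
print(f"emotional fit: {fit:.2f}")

# Per-mood differences show where the track diverges from the playlist.
gaps = {m: round(float(t - p), 2) for m, t, p in zip(moods, track_profile, playlist_profile)}
print("per-mood difference:", gaps)
```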

Screenshot 1: Track Mood Analysis of Pale – Too much


Screenshot 2: Playlist Chill Indie Rock


Networking with the music industry gatekeepers
This point relates mainly to music blogs and the press. These industry gatekeepers receive thousands of emails each day from artists and managers pitching their music. 

Networking in the music industry involves being active on social media and commenting on and sending supportive tweets to blog owners. A mutual connection by email or LinkedIn is ideal, as it increases trust, but it is hard to come by. By supporting the gatekeeper and being an active fan, you can make sure it is not a completely cold approach.

While it is important to network with the gatekeepers through social media channels and email, it is more important to understand the angle of the media and how well your music fits into it. To see how you can identify the right match see the article How to Write Press Releases and Music Pitches with Cyanite.

Conclusion

Gatekeepers have hundreds of people contacting them every day but only a limited amount of time to sift through the endless stream of music. For the most part, they have a very distinct kind of music/artist they cover. ‘Musical style’ is only a small part of it – they also consider the stage of your career, looks, affiliation to other artists, political affiliation and engagement, and your lifestyle.

If you are not at least close to a perfect match, don’t bother writing to them – it is better to spend that time creating new music and/or building up your social media game and the brand image around your music. With the gatekeepers that do match your music, make sure you use a warm approach, show them that you are serious about your music, and make use of the analytics and AI tools out there to enrich your story with objective data that will help the gatekeepers make a positive decision.

About the Author

Markus is the Co-Founder and CEO of CYANITE. Before that, he co-founded the boutique label Serve & Volley Rec. and worked at the music promotion agency Shoot Music.

I want to use Cyanite to reach the music gatekeepers – how can I get started?

Please contact us with any questions about our Cyanite AI via sales@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

If you want to get a first grip on Cyanite’s technology, you can also register for our free web app to analyze music and try similarity searches without any coding needed.