
Cyanite API Update – New Instruments and More

We are proud to announce our latest API update. It includes 8 new instrument classifiers, improved BPM, key, and time signature detection, and an improved Similarity Search. For details, see the full documentation here: https://api-docs.cyanite.ai/blog/2021/05/06/changelog-2021-05-06/

Now let’s take a closer look at it!

New Instrument Classifiers

Now, in addition to our percussion classifier, we expose the synth, piano, acousticGuitar, electricGuitar, strings, bass, bassGuitar, and brassWoodwinds classifiers.

We also deprecated the AudioAnalysisV6Result.instruments field.

The new field AudioAnalysisV6Result.instrumentTags exposes a list of detected instruments throughout the track.
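As a sketch of how the new field might be consumed, the snippet below builds a GraphQL query string and pulls instrumentTags out of a sample response. The query shape and the response structure are assumptions based on the field names in this post; the linked changelog has the authoritative schema.

```python
# Hypothetical GraphQL query using the new field name from this post.
# The surrounding query structure is an assumption; consult the changelog
# for the authoritative schema.
INSTRUMENT_QUERY = """
query AnalysisQuery($id: ID!) {
  libraryTrack(id: $id) {
    ... on LibraryTrack {
      audioAnalysisV6 {
        ... on AudioAnalysisV6Finished {
          result {
            instrumentTags
          }
        }
      }
    }
  }
}
"""

def extract_instrument_tags(response: dict) -> list:
    """Pull the instrumentTags list out of a response in the assumed shape."""
    result = response["data"]["libraryTrack"]["audioAnalysisV6"]["result"]
    return result.get("instrumentTags", [])

# Sample response in the assumed shape, not real API output.
sample = {
    "data": {
        "libraryTrack": {
            "audioAnalysisV6": {
                "result": {
                    "instrumentTags": ["percussion", "electricGuitar", "synth"]
                }
            }
        }
    }
}

print(extract_instrument_tags(sample))  # ['percussion', 'electricGuitar', 'synth']
```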

New Key and BPM Classifiers

The FullScaleMusicalAnalysis and FastMusicalAnalysis types are now deprecated and will be removed from the API on 15 June 2021.

The new AudioAnalysisV6Result.key and AudioAnalysisV6Result.bpm fields expose improved key and BPM values compared to the now-deprecated FullScaleMusicalAnalysis and FastMusicalAnalysis.

New Time Signature Classifier

The AudioAnalysisV6Result.timeSignature now exposes the time signature of the track (e.g. 3/4 or 4/4) as a string.
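As a small worked example of using the new values together, the duration of one bar can be derived locally from bpm and the "3/4"-style timeSignature string. This is a plain calculation, not an API call, and it assumes the BPM counts quarter-note beats:

```python
def bar_duration_seconds(bpm: float, time_signature: str) -> float:
    """Duration of one bar in seconds, assuming bpm counts quarter-note beats."""
    beats, unit = (int(part) for part in time_signature.split("/"))
    quarter_note = 60.0 / bpm  # one quarter note in seconds
    return beats * quarter_note * (4 / unit)

# A 120 BPM track in 4/4: four quarter notes of 0.5 s each.
print(bar_duration_seconds(120, "4/4"))  # 2.0
# The same tempo in 3/4 yields a shorter bar.
print(bar_duration_seconds(120, "3/4"))  # 1.5
```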

Improved Similarity Search

We improved the performance of our similarity search based on audio features. In addition, we now expose a single new field that allows searching for similar tracks among both library and Spotify tracks, with additional filters for BPM, key, and genre.

Go ahead and start coding

Contact us with any questions about our API services via mail@cyanite.ai. Give us a shout-out on Twitter, LinkedIn or wherever you feel like. Don’t hold back with feedback on what we can improve.

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

If you are a coder and want to join the ride, please send your application to careers@cyanite.ai.

Introducing: Cyanite’s Keyword Cleaning System for Music Libraries

In this article, we present the common challenge of inconsistent keyword tagging in music databases. We discuss what causes these problems and how Cyanite developed a Keyword Cleaning system to solve them automatically. We then present five use cases for our Keyword Cleaning system and the potential impact it may have on music businesses.

Introduction to the problem

The way we perceive music is highly individual, and so is the way we describe it. What makes for lively dinner conversation becomes an important thing to be aware of when handling larger amounts of music professionally.

To leverage diverse monetization opportunities with musical assets, many music companies sort music catalogs by assigning keyword tags to all the audio files in their music database. These tags may describe the mood and genre of a song or categorize its instruments or tempo. This way music companies ensure accessibility and searchability of any musical asset even in very large music catalogs.

These tags follow each company’s individual understanding of music – its catalog language. The specific nature of a catalog language can be understood under two aspects:

1. Objective catalog language (tagging): the set of keywords and tags, often described as a taxonomy or tag ontology (quantity, classes, and wording). “Which tags do I use?”

2. Subjective catalog language (understanding of tagging): the understanding of tags and their connection to certain sound qualities. “When do I assign a certain tag?”

Objective catalog language is inherent to the music catalog or the company that owns it. Subjective catalog language, however, is inherent to every individual person that tags the music.

Having a consistent catalog language leads to a brilliant search experience and is the perfect condition for thorough exploitation of your assets. A lot of work can go into building and maintaining your own catalog language. However, three main events can quickly erode it, and with it the quality and meaningfulness of your tagging:

Event 1: Catalog acquisitions or integrations.

Event 2: Day-to-day variations in the performance of tagging staff.

Event 3: The hiring of new tagging staff.

Not being aware of this can undo decades of work. Songs can no longer be found and revenue streams can no longer be realized as before, seriously harming a company’s ability to execute its business model.

More importantly, music-searching staff stop trusting the music search, which leads them to build highly individual systems of workarounds for finding suitable music, or a very limited “go-to catalog” of songs they use over and over rather than drawing on the entire music catalog.


Our solution

Addressing these issues, Cyanite developed a way to bring together (translate) two catalog languages – objective and subjective – with minimal information loss and maximum speed, using AI.

We base our approach on a measure we denote as keyword similarity, describing the degree of semantic similarity of a pair of tags. To give an example, the keywords “enthusiastic” and “euphoric” should have a rather similar meaning when used for the description of a musical mood. We would therefore expect a high degree of keyword similarity. On the contrary, “enthusiastic” and “gloomy” represent a quite contrary pair of descriptive attributes which should point towards a low degree of keyword similarity.

Most music catalogs use a multi-label tagging scheme, meaning a single piece of music can be assigned multiple tags. We make use of this fact and focus on the track-wise co-occurrence of tags, hypothesizing that frequent joint attribution of a tag pair indicates a high degree of interrelation and, thus, keyword similarity.
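This co-occurrence idea can be sketched with plain counting: represent each tag by the set of tracks it appears on and compare tags via the cosine similarity of their occurrence vectors. The snippet below is a toy illustration of the principle with made-up catalog data, not Cyanite’s actual NLP system:

```python
from math import sqrt

# Toy tagged catalog: track -> assigned tags (illustrative data).
catalog = {
    "track1": {"enthusiastic", "euphoric", "upbeat"},
    "track2": {"enthusiastic", "euphoric"},
    "track3": {"gloomy", "dark"},
    "track4": {"enthusiastic", "upbeat"},
    "track5": {"gloomy", "euphoric"},  # noisy tagging happens
}

def keyword_similarity(tag_a: str, tag_b: str, catalog: dict) -> float:
    """Cosine similarity of two tags' binary track-occurrence vectors."""
    tracks_a = {t for t, tags in catalog.items() if tag_a in tags}
    tracks_b = {t for t, tags in catalog.items() if tag_b in tags}
    if not tracks_a or not tracks_b:
        return 0.0
    return len(tracks_a & tracks_b) / sqrt(len(tracks_a) * len(tracks_b))

# Tags that frequently co-occur score high; tags that never do score low.
print(keyword_similarity("enthusiastic", "euphoric", catalog))  # ~0.667
print(keyword_similarity("enthusiastic", "gloomy", catalog))    # 0.0
```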

We developed a natural language processing (NLP) AI system capable of learning the semantic interrelation of keywords in any library. With this, we are able to derive a quantitative measure for any combination of keywords contained in one or several music catalogs. This analysis is the basis for a variety of groundbreaking use cases to overcome challenges many music companies are struggling with.

 

Use Case 1: Catalog language translation

This challenge arises when two (or more) differently tagged music catalogs need to be merged (for example after a catalog acquisition or when choosing a different distribution outlet). Manually translating tags is tedious and may lead to significant information loss, since the same tags are not always used in the same way (see “subjective catalog language” above).

Our system is able to understand and map every tag in relation to the others. First, it learns both taxonomies, understanding each respective catalog language. Second, it maps the two catalog languages onto each other, drawing direct relations between tags and their meanings. Third, it translates each song’s tagging from one catalog language into the one the catalog is to be integrated into: the system automatically re-tags every song in the new catalog language.

Use Case 2: Keyword Cleaning of inconsistent keyword tagging

Companies with high fluctuation in tagging staff face this challenge – as may a company with a particularly large catalog (>100,000 songs) that picked up some legacy over the years: inconsistencies in keyword tagging. This is one of the biggest problems a catalog can face, as it seriously diminishes searchability and the search experience, leading to mistrust of the system, individual workarounds, and eventually losing the customer for good. Or it leads customers to contact the library’s sales team and search staff directly, which harms your business’s ability to scale.

After understanding your catalog’s language, our Cyanite Keyword Cleaning system can detect tags with low keyword similarity that may contradict the other tags and flag the respective songs. To assess whether a tag was wrongfully assigned (or is missing), we offer an audio-based tagging solution for these anomalies that detects whether the tag is suitable; if it is not, the tag is deleted.

Use Case 3: Taxonomy Cleaning. Detection of redundancies and blind spots.

Languages change over time – and with them, catalog languages. Some catalogs have 15,000+ different keywords in their taxonomy. It should come as no surprise that songs with older keyword tags are found less often. Choosing a slimmer taxonomy can elevate the searchability and overall search experience of a catalog.

This raises the question of whether all tags are necessary and meaningful. To test this, our Cyanite system scans through your keyword tagging and detects tags that are equal in meaning. It then consolidates redundancies, condensing the taxonomy to only meaningful, disjoint keyword classes.
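The consolidation step can be pictured as grouping tags whose pairwise similarity clears a threshold. The sketch below uses hand-filled, illustrative similarity scores and a simple greedy grouping; Cyanite’s actual system is more sophisticated:

```python
# Illustrative pairwise similarity scores between taxonomy tags.
# These are made-up values, not real Cyanite output.
PAIR_SIM = {
    frozenset({"euphoric", "enthusiastic"}): 0.93,
    frozenset({"euphoric", "upbeat"}): 0.88,
    frozenset({"enthusiastic", "upbeat"}): 0.85,
    frozenset({"gloomy", "dark"}): 0.91,
    frozenset({"euphoric", "gloomy"}): 0.04,
}

def consolidate(tags: list, threshold: float = 0.8) -> list:
    """Greedily merge tags whose pairwise similarity exceeds the threshold."""
    groups = []
    for tag in tags:
        for group in groups:
            # Join a group only if the tag is similar to every member.
            if all(PAIR_SIM.get(frozenset({tag, member}), 0.0) >= threshold
                   for member in group):
                group.add(tag)
                break
        else:
            groups.append({tag})
    return groups

print(consolidate(["euphoric", "enthusiastic", "upbeat", "gloomy", "dark"]))
# [{'euphoric', 'enthusiastic', 'upbeat'}, {'gloomy', 'dark'}]
```

Each resulting group is a candidate redundancy class that can be collapsed into a single keyword.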

Use Case 4: Open search

If you rely on customers handing in sync briefings and then search your catalog yourself, your business will lack scalability. So you might want to open up your catalog search to every potential client. For this, you want to make sure that you deliver the right music for every music search and every individual understanding of music – you need to speak the language of each of your customers.

To achieve this, our Cyanite Keyword system can translate a vast number of keywords into semantically related tags. This means that if you only tag very upbeat, outgoing, and happy songs with the keyword “euphoric”, but a client searches for “enthusiastic”, our Cyanite Keyword system understands the query and presents the suitable songs from your catalog. This is especially important for keywords that were tagged significantly less often in your catalog, so that searches can still surface a good variety of music.
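Conceptually, this is query expansion: a search term is mapped to its semantically related catalog tags before the search runs. A minimal sketch with an illustrative, hand-filled similarity table (the scores are made up for the example):

```python
# Illustrative similarity scores: search term -> catalog tag -> score.
SIMILARITY = {
    "enthusiastic": {"euphoric": 0.92, "upbeat": 0.81, "gloomy": 0.05},
}

def expand_query(term: str, threshold: float = 0.5) -> list:
    """Return the search term plus catalog tags above the similarity threshold,
    strongest match first."""
    related = SIMILARITY.get(term, {})
    expanded = [tag for tag, score in related.items() if score >= threshold]
    return [term] + sorted(expanded, key=lambda tag: -related[tag])

# "gloomy" falls below the threshold and is excluded.
print(expand_query("enthusiastic"))  # ['enthusiastic', 'euphoric', 'upbeat']
```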

Use Case 5: Automatic tagging in your own catalog language

Let’s say your clients and customers have gotten used to your specific keyword tagging – your catalog language. That makes your catalog language an integral part of your platform’s stickiness, and it keeps customers coming back to your service. If you introduce automatic tagging through deep learning systems such as the Cyanite Tagging system, you want to keep the automatic tags in your catalog language so that your customers keep finding the right music.

To achieve this, our Cyanite Keyword system and the Cyanite Tagging system work together on translating our auto-tags into your catalog language. Your customers won’t even notice that you switched to AI-tagging.

How to get started!

If the approach of Cyanite’s Keyword Cleaning resonates with you, the first step is to have a look into your metadata. For that, please reach out to sales@cyanite.ai. Together, we will dive into your tagging scheme and assess the possibility of a Keyword Cleaning project. 

Cyanite API Update – Version 6 now live!

After months of hard work, our new API update is finally live! The new classifier generation includes our 13 new moods, EDM sub-genres, percussion detection, and the sound-based musical era of a song. Here you can find the new API generation’s full documentation: https://api-docs.cyanite.ai/blog/2021/02/05/changelog-2020-02-05

The new API update includes:

• 13 new moods

• 17 different genres

• 8 EDM sub-genres

• Voice – male/female/instrumental

• Voice presence

• Percussion

• Musical era

• Experimental keywords

We also set up versioning, because we do not want to force anyone to upgrade to the latest generation immediately. Each new generation is now introduced as a set of separate GraphQL fields.

Further, we added a new webhook format that is more flexible and consistent.

Now let’s take a closer look at it!

Mood

The mood multi-label classifier provides the following labels:

aggressive, calm, chilled, dark, energetic, epic, happy, romantic, sad, scary, sexy, ethereal, uplifting

Each label has a score ranging from 0 to 1, where 0 (0%) indicates that the track is unlikely to represent a given mood and 1 (100%) indicates a high probability that it does.

Since the mood of a track might not always be properly described by a single tag, the mood classifier can predict multiple moods for a given song instead of only one. A track could be classified as dark (score: 0.9) while also being classified as aggressive (score: 0.8).
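In practice, multi-label scores like these are usually consumed by thresholding: every mood whose score clears a cutoff counts as a predicted label. A small sketch with illustrative scores (the cutoff value is an assumption, not an API recommendation):

```python
def predicted_moods(scores: dict, threshold: float = 0.5) -> list:
    """Return all mood labels whose score clears the threshold, strongest first."""
    return sorted((mood for mood, score in scores.items() if score >= threshold),
                  key=lambda mood: -scores[mood])

# Illustrative scores, mirroring the dark/aggressive example above.
scores = {"dark": 0.9, "aggressive": 0.8, "happy": 0.05, "calm": 0.1}
print(predicted_moods(scores))  # ['dark', 'aggressive']
```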

The mood can be retrieved both averaged over the whole track and segment-wise over time with 15s temporal resolution. In addition to the scores, the API also exposes a list of the most likely moods, or the term ambiguous in case the audio does not properly reflect any of our mood tags.

Genre

The genre multi-label classifier provides the following labels:

ambient, blues, classical, country, electronicDance, folk, indieAlternative, jazz, latin, metal, pop, punk, rapHipHop, reggae, rnb, rock, singerSongwriter

Each label has a score ranging from 0 to 1, where 0 (0%) indicates that the track is unlikely to represent a given genre and 1 (100%) indicates a high probability that it does.

Since music can break genre borders, the genre classifier can predict multiple genres for a given song instead of only one. A track could be classified as rapHipHop (score: 0.9) but also reggae (score: 0.8).

The genre can be retrieved both averaged over the whole track and segment-wise over time with 15s temporal resolution. In addition to the scores, the API also exposes a list of the most likely genres.

EDM Sub-Genre

If a track’s genre is classified as electronicDance, the EDM sub-genre classifier provides a deeper analysis layer with the following labels:

breakbeatDrumAndBass, deepHouse, electro, house, minimal, techHouse, techno, trance

Each label has a score ranging from 0 to 1, where 0 (0%) indicates that the track is unlikely to represent a given sub-genre and 1 (100%) indicates a high probability that it does.

The EDM sub-genre can be retrieved both averaged over the whole track and segment-wise over time with 15s temporal resolution. In addition to the scores, the API also exposes a list of the most likely EDM sub-genres.

Voice

The voice classifier categorizes the audio as female or male singing voice or instrumental (non-vocal).

Each label has a score ranging from 0 to 1, where 0 (0%) indicates that the track is unlikely to contain the given voice elements and 1 (100%) indicates a high probability that it does.

The voice classifier results can be retrieved both averaged over the whole track and segment-wise over time with 15s temporal resolution.

Voice Presence

This label describes the amount of singing voice throughout the full duration of the track and may be none, low, medium, or high.

Percussion

The instrument classifier currently only predicts the presence of a percussive instrument, such as drums, drum machines, or similar. The result is displayed under the label percussion.

The label has a score ranging from 0 to 1, where 0 (0%) indicates that the track is unlikely to contain a given instrument and 1 (100%) indicates a high probability that it does.

The instrument classifier result can be retrieved both averaged over the whole track and segment-wise over time with 15s temporal resolution.

Musical Era

The musical era classifier describes the era the audio was likely produced in, or which the sound of production suggests.

Experimental Keywords

An experimental taxonomy that can be associated with the audio. The data is experimental and expected to change. Access must be requested from the Cyanite sales team.

Example keywords:

uplifting, edm, friendly, motivating, pleasant, happy, energetic, joy, bliss, gladness, auspicious, pleasure, forceful, determined, confident, positive, optimistic, agile, animated, journey, party, driving, kicking, impelling, upbeat

Go ahead and start coding

Contact us with any questions about our API services via mail@cyanite.ai. Give us a shout-out on Twitter, LinkedIn or wherever you feel like. Don’t hold back with feedback on what we can improve.

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

If you are a coder and want to join the ride, please send your application to careers@cyanite.ai.

PR: BPM Supreme integrates Cyanite’s algorithms

PRESS RELEASE

 

BPM Supreme is the first digital record pool worldwide to integrate Cyanite’s artificial intelligence for individual music recommendation

 The digital record pool BPM Supreme from San Diego will soon use the algorithms of the Berlin and Mannheim-based technology company Cyanite. The AI enables BPM Supreme to suggest music according to moods and to provide users with individualized music suggestions. BPM Supreme is one of the world’s first online music services for DJs that integrates algorithms to enhance the user experience.

Mannheim/Berlin/San Diego, December 1st, 2020 – BPM Supreme is one of the world’s leading digital record pools. For a monthly fee, DJs get unlimited access to the entire catalog featuring thousands of new releases and exclusive remixes. The BPM Supreme brand also includes a new online sample library for producers and music makers, BPM Create, as well as a record pool specialized in Latino music, BPM Latino. BPM Latino will also integrate Cyanite’s algorithms.

Supported by Cyanite’s Deep Learning technology, the music search on BPM Supreme will be made even more intuitive, e.g. by introducing moods as search categories. BPM Supreme users will be able to find suitable music for their DJ sets and playlists easily and optimize them with the help of intelligent recommendations.

With the cooperation with BPM Supreme, Cyanite has taken on board its first major customer in the United States. In addition, SWR, Mediengruppe RTL, NEUBAU Music, and Meisel Music as well as production music providers Soundtaxi, Filmmusic io, and RipCue Music already use Cyanite’s technology.

Jakob Höflich, founder and co-director of Cyanite: “BPM Supreme embodies the ability to break through the industry and quickly adapt to the constantly evolving market. They have proven many times that they have the spirit to pioneer the industry through new business models and technologies. We are very proud that they have chosen Cyanite as their AI partner to go this crucial step into the future with.”

Angel “AROCK” Castillo, Founder and CEO of BPM Supreme said: “Together with Cyanite we will enter the next phase of BPM Supreme towards an AI driven future and enable our users to find music even better with state-of-the-art discovery functions.” 

Anyone wishing to try Cyanite’s technology can register for the free Web App and upload music or integrate Cyanite into an existing database system via an API.

Try Cyanite for free: https://app.cyanite.ai/login

Full press release and material here: https://drive.google.com/drive/folders/1X9Ug29ISA-QdOBOHUMQFLOaqelL9HrXl?usp=sharing

 

BPM Supreme’s music search interface

Background to BPM Supreme:
BPM Supreme is a leading digital music service for professional DJs delivering an extensive selection of new releases and exclusive tracks through a user-friendly platform and mobile app. With an innovative approach to music discovery, the company’s mission is to be the most trusted source for DJ-ready content. BPM Supreme names many notable DJs as users of the platform, such as DJ Jazzy Jeff, Z-Trip, A-Trak, The Chainsmokers, and DJ Snoopadelic. Over the past ten years, BPM Supreme has partnered with some of the music industry’s most prominent companies, including Sony Music Entertainment, Universal Music Group, Empire Records, Dim Mak Records, Mad Decent, Roland, Pioneer DJ, Denon DJ, and Serato.
Learn more at
www.bpmsupreme.com

 

Background to Cyanite:
Cyanite believes that state-of-the-art technology should not be exclusive to big tech companies. The start-up is one of Europe’s leading independent innovators in the field of music-AI and supports some of the most renowned and innovative players in the music and audio industry. Customers and music companies using Cyanite include the Mediengruppe RTL, the radio station SWR, the music publishers NEUBAU Music and Meisel Music, and the production music libraries Soundtaxi, RipCue Music, and filmmusic.io. Cyanite’s mission is to help music companies make the transition to the age of AI without spending expensive resources on tech-innovation. The 13-person team from Mannheim and Berlin operates at the interface between the music industry, data science, and software engineering. The founding team emerged from the Popakademie Baden-Württemberg – Germany’s top-university for the music business. They are completed by a team of data scientists from one of the world’s most renowned chairs for Music Information Retrieval at the Technical University of Berlin. The company, known from the magazines Musikwoche and Music Ally among others, has received numerous awards from TechCrunch, Google, the German Government, Business Punk, Music WorX, German Accelerator and is currently a participant in the Accelerator Marathon LABS of the music company Marathon Artists in London. Cyanite is supported by the city of Mannheim and various business angels, such as Germany’s Business Angel of the Year 2019, Dr. Andrea Kranzer.

Press contact
Jakob Höflich
Co-Founder
+49 172 447 0771
Jakob (@) cyanite.ai 

Headquarter Mannheim
elceedee UG (haftungsbeschränkt)

Badenweiler Str. 4
68239 Mannheim

Office Berlin
Cyanite
Gneisenaustraße 44/45
10961 Berlin

Website: https://www.cyanite.ai/
LinkedIn: Cyanite.ai 
Twitter: Cyanite.ai

Cyanite Update 2020 🥁 The new Library and updated Similarity Search

Introducing Cyanite’s new features

We have built feedback from Cyanite users around the world into our latest version and are more than excited to finally launch it. This version includes AI tagging for your own songs, sonic similarity search for your own database, and a brand-new, refined detail view for better communicating and comparing your music.

Library: Manage & tag your music

Drag and drop your music into our new library view and have it tagged in minutes. Automatically analyze your music on various features like mood, genre, bpm, key, voice, energy, and mood dynamics.

Similarity Search: Find similar songs

Find similar songs in your own library in seconds. Our improved Similarity Search lets you search your own database with any reference track from Spotify, and lets you filter the results by mood, genre, voice, and timbre.

Detail view: Deep dive into a song

Understand your music at a glance. Use the data-driven interface to find the best song parts in seconds and communicate your music better in any pitch, from Spotify to sync.

I want to try out Cyanite’s AI platform – how can I get started?

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

Contact us with any questions about our frontend and API services via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

PR: RTL integrates Cyanite’s algorithms into its music library FAR MUSIC

PRESS RELEASE:

 

 

RTL integrates Cyanite’s recommendation algorithms into its own production music library FAR MUSIC

 

The Mannheim-based technology company Cyanite enriches the in-house production music library FAR MUSIC of Mediengruppe RTL GmbH with its innovative algorithms for music analysis and recommendation. Cyanite’s AI will make it easier for editors and journalists to quickly and intuitively find the right music for their film clips. The Mediengruppe RTL GmbH is the largest customer so far on Cyanite’s mission to accompany music and media companies into the AI age.

Mannheim/Cologne, August 18th, 2020 – The Mediengruppe RTL GmbH is one of the largest German media companies. It comprises the TV channels RTL, RTL II, VOX, and n-tv as well as the music publisher i2i Music, whose catalogue is available to all RTL editors for all types of use via the music platform FAR MUSIC.

With the help of the Cyanite deep learning technology, the music search on FAR MUSIC will be made easier and more intuitive. This will increase user-friendliness, ensure high-quality content, and reduce the costs for licensing third-party copyrights.

Specifically, RTL / i2i Music will use Cyanite’s algorithms for automatic music keywording and for detecting acoustic similarities in songs. In addition to musicological factors such as BPM or key signature, genre, mood, vocals, energy, and instruments are recognized and correspondingly indexed. Furthermore, RTL editors will be able to find similar-sounding pieces on FAR MUSIC via reference songs from YouTube or Spotify.

The Mediengruppe RTL GmbH is Cyanite’s largest customer to date. They join SWR, NEUBAU Music and Meisel Music, and the production music libraries Soundtaxi, Filmmusic.io and RipCue Music, who already use Cyanite’s technology.  

Markus Schwarzer, co-founder and CEO of Cyanite: “Having such a large and renowned company as RTL / i2i Music among our customers marks a milestone for us. It shows how more and more companies are relying on the new possibilities of artificial intelligence to optimize processes and take advantage of new business opportunities.”

Lutz Fassbender, Director Copyright Affairs, Mediengruppe RTL GmbH: “In Cyanite, we have found the perfect AI partner who equips us with important, innovative future technology for our music platform and whom we trust to develop future models.”

Anyone who would like to see Cyanite’s technology for themselves can register for the Web App free of charge and upload music, or integrate Cyanite into an existing database system via a programming interface.

 

Try Cyanite for free: https://app.cyanite.ai/login

Full press release and material here: https://drive.google.com/drive/folders/1fIzNSMHSviLPQq4q97AipWcffGfhJB9s?usp=sharing

 

 

 

The current interface of FAR MUSIC

Background to Mediengruppe RTL / i2i Music / FAR MUSIC:

i2i Music is an interface and service provider between producers, editors, and marketing experts on the one hand and composers on the other. The publishing house distributes commissioned compositions for film, television, and radio and has music produced for the advertising sector. The core tasks of the publishing house include the financing of music projects as well as their complete administrative implementation: the creation of music lists, the control of GEMA documentation, and the worldwide royalty collection. The production music offer of i2i Music is called FAR MUSIC and is aimed at filmmakers, editors, and producers of trailers, advertising, and online content of the Mediengruppe RTL GmbH. The platform offers a wide variety of musical styles and provides tracks of all genres for download. Music seekers can use keyword or filter searches, combine music and sound effects in playlists and download their favourites. The FAR MUSIC catalogue includes international labels from Germany, Great Britain, and the USA.

 

Background to Cyanite:

Cyanite believes that state-of-the-art technology should not be exclusive to big tech companies. The start-up is one of Europe’s leading independent innovators in the field of music-AI and supports some of the most renowned and innovative players in the music and audio industry. Customers and music companies using Cyanite include the radio station SWR, the music publishers NEUBAU Music and Meisel Music, and the production music libraries Soundtaxi, RipCue Music, and filmmusic.io. Cyanite’s mission is to help music companies make the transition to the age of AI without spending expensive resources on tech-innovation. The 10-person team from Mannheim and Berlin operates at the interface between the music industry, data science, and software engineering. The founding team emerged from the Popakademie Baden-Württemberg – Germany’s top-university for the music business. They are completed by a team of data scientists from one of the world’s most renowned chairs for Music Information Retrieval at the Technical University of Berlin. The company, known from the magazines Musikwoche and Music Ally among others, has already received numerous awards from TechCrunch, Google, the German Government, Business Punk, Music WorX, German Accelerator and is currently a participant in the Accelerator Marathon LABS of the music company Marathon Artists in London. Cyanite is financed by the city of Mannheim and various business angels, such as Germany’s Business Angel of the Year 2019, Dr. Andrea Kranzer.

Press contact
Jakob Höflich
Co-Founder
+49 172 447 0771
Jakob (@) cyanite.ai 

Headquarter Mannheim
elceedee UG (haftungsbeschränkt)

Badenweiler Str. 4
68239 Mannheim

Office Berlin
Cyanite
Gneisenaustraße 44/45
10961 Berlin

Website: https://www.cyanite.ai/
LinkedIn: Cyanite.ai 
Twitter: Cyanite.ai