
Cyanite API Update – New Sub-Genres and More

We are proud to announce our latest API update. This update includes new sub-genres, improved genre classifiers, an improved Similarity Search, and more. For details, see the full documentation here.

Now let’s take a closer look at it!

New Sub-Genre Classifiers

We added a ton of new sub-genres to the existing ones. Among them: blues-rock, folk-rock, hard rock, indie alternative, psychedelic progressive rock, punk, rock-and-roll, pop-soft-rock, abstract IDM left field, breakbeat DnB, deep house, electro, house, minimal, synth-pop, tech house, techno, trance, contemporary RnB, gangsta, jazzy hip-hop, pop-rap, trap, black metal, death metal, doom metal, heavy metal, metalcore, nu-metal, disco, funk, gospel, neo-soul, soul, big band swing, bebop, contemporary jazz, easy listening, fusion, Latin jazz, smooth jazz, country, and folk.
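If you want to pull these predictions over the API, a sub-genre is just another field on the analysis result. Below is a minimal Python sketch of such a GraphQL call; the endpoint URL, the libraryTrack query shape, and the genreTags/subgenreTags field names are assumptions for illustration, so check the documentation linked above for the exact schema.

```python
import requests

# Assumed endpoint and field names -- consult the API docs for the real schema.
API_URL = "https://api.cyanite.ai/graphql"  # assumed endpoint
ACCESS_TOKEN = "<your-access-token>"

QUERY = """
query TrackGenres($id: ID!) {
  libraryTrack(id: $id) {            # assumed query shape
    ... on LibraryTrack {
      audioAnalysisV6 {
        ... on AudioAnalysisV6Finished {
          result {
            genreTags     # hypothetical field name
            subgenreTags  # hypothetical field name
          }
        }
      }
    }
  }
}
"""

def fetch_genres(track_id: str) -> dict:
    """POST the GraphQL query and return the raw JSON response."""
    response = requests.post(
        API_URL,
        json={"query": QUERY, "variables": {"id": track_id}},
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    print(fetch_genres("your-track-id"))
```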

Improved Genre Classifier

The genre classifiers have also been improved to give more accurate results. The country, folk, and punk genres are now deprecated and replaced by the corresponding country, folk, and punk sub-genres.

Improved Similarity Search

The algorithm powering the Similarity Search has been improved and now yields better results when searching for similar tracks based on musical features.

Removal of old GraphQL API types

Finally, we completely removed everything InDepthAnalysis-related from our GraphQL schema.

To see the full list of changes, head over here.

Go ahead and start coding

Contact us with any questions about our API services via mail@cyanite.ai. Give us a shout-out on Twitter, LinkedIn or wherever you feel like. Don’t hold back with feedback on what we can improve.

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

If you are a coder and want to join the ride, please send your application to careers@cyanite.ai.

PR: APM Music Partners with Cyanite to Enhance Music Tagging

PRESS RELEASE

APM Music partners with Cyanite to Enhance Music Tagging

Mannheim/Berlin/Los Angeles, September 9, 2021 – APM Music, the largest production music library in North America, and Cyanite, a technology company developing a suite of AI-powered music search products, today announced a strategic partnership that will provide users with improved tagging and metadata to enhance their search queries.

A superior music discovery experience begins with content that is comprehensively, consistently, and accurately tagged. With an ever-growing music library such as the one APM Music has been providing the marketplace for nearly four decades, maintaining high-quality tagging and precise metadata at a large scale is a primary concern. Incorporating Cyanite’s AI will allow APM Music to introduce human-assisted auto-tagging to the music submission and review process, thus increasing the quality and consistency of tagging.

APM Music’s President/CEO Adam Taylor comments: “For APM Music, accurate and reliable music tagging has always been of the utmost importance and we are aligned with Cyanite on this constant strive for quality. Markus and the team have proven that they are able to quickly react to our feedback and improve their algorithms at a rapid speed. We are excited to integrate artificial intelligence into APM and create the best possible support for our team.”

Cyanite’s artificial intelligence listens to and categorizes songs, helping to deliver the right music content, no matter the use case. Integrating this technology will benefit the end user by ensuring search queries continue to yield accurate and tailored results as APM’s music library expands in depth and breadth.

Markus Schwarzer, CEO of Cyanite: “We look forward to working with such a prestigious and renowned partner as APM Music. Everyone on their team is a unique expert in their field. Being the chosen AI partner for the important and extensive transition into this new age of music distribution fills our entire team with pride.”

The addition of Cyanite’s technology extends APM’s commitment to continually increasing the quality and performance of its search engine, thereby delivering a superior quality discovery experience to match its richness of catalog.

Anyone wishing to try Cyanite’s technology can register for the Web App free of charge and upload music or integrate Cyanite into an existing database system via their API.

The full press material, including the German press release, can be found via this link.

Background to APM Music:
APM Music is located in Hollywood, California, and is the premier go-to source for production music. Founded in 1983, APM Music is the largest production music library in North America. To find out more, please visit www.apmmusic.com. 

Background to Cyanite:
Cyanite believes that state-of-the-art technology should not be exclusive to big tech companies. The start-up is one of Europe’s leading independent innovators in the field of music AI and supports some of the most renowned and innovative players in the music and audio industry. Among the music companies using Cyanite are the Mediengruppe RTL, the record pool BPM Supreme, the radio station SWR, the music publishers NEUBAU Music and Schubert Music, and the sound branding agencies Universal Music Solutions, TAMBR, and amp sound branding.

Press Inquiries 

Jakob Höflich

Co-Founder

+49 172 447 0771

jakob@cyanite.ai

Headquarters Mannheim

elceedee UG (haftungsbeschränkt)

Badenweiler Straße 4

68239 Mannheim

Berlin Office

Cyanite

Gneisenaustraße 44/45

10961 Berlin

The New Detail View is LIVE!

Introducing Cyanite’s Detail View

If you like to explore music data in a visual format, we have news for you. Our Detail View is now live, and it may well be the world’s most advanced AI-driven graphical interface for music.

The level of granularity is truly amazing, with each visualization showing all the data points for an in-depth analysis of your music.

Here is what you get in the Detail View:

  • colorful graphs for each section such as Genre, Mood, Voices, Instruments, and more
  • ability to see the dynamic changes in a piece of music
  • ability to customize graphs and data for a clearer and better view.

To show you how it works, we analyzed one of the songs we find most interesting from a data standpoint – ‘Bohemian Rhapsody’ by Queen. Keep reading to see the detailed analysis or check out the Detail View yourself in the Cyanite app.

An in-depth analysis of ‘Bohemian Rhapsody’

‘Bohemian Rhapsody’ has quite a lot going on. It starts a cappella, carries on as a ballad, then progresses to almost an opera, then there is a rock bit, and finally the ending. But that is just at first glance.

Here is what our detailed AI analysis reveals:

Genre 

The main genre is rock. But at some points, the AI’s confidence in rock drops as low as 0.05, and other genres, such as classical and pop, take over.

Genres in Detail View

Mood

The three dominant moods are energetic, chilled, and sad. The track is saddest at the beginning, but then it picks up the pace and alternates between energetic and uplifting, before ending on another chilled note.

The least present moods are sexy and happy. It is a very heavy song after all.

Mood in Detail View

Instruments

As you might have guessed, bass, bass guitar, electric guitar, and percussion are the most present instruments in this song. But it is the piano that carries the song through almost all of Freddie Mercury’s solo parts and toward the end. The piano was actually Freddie Mercury’s most favored instrument, though his vocals quickly overshadowed his piano skills.

Instruments in Detail View

Augmented Keywords

Finally, here is a snapshot of the augmented keywords for ‘Bohemian Rhapsody’, which provide additional insight into the emotions and moods the song can evoke in listeners.

Augmented Keywords

If you enjoyed this read, check out the Detail View for yourself and see what insights you can discover about your favorite tracks.

I want to try out Cyanite’s AI platform – how can I get started?

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

Contact us with any questions about our frontend and API services via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

Explore New Cyanite Features – June 2021

Introducing Cyanite’s new features

We are constantly updating and adding new features to Cyanite based on your feedback. In June we released features that will make you even more efficient at analyzing and discovering music.

Our new features include an updated library view, more opportunities for deep analysis with genre and mood values, musical era tags, and improved Similarity Search with Custom Interval scanning and up to 100 search results.

New Library View

The new library view is a feast for your eyes with the following new features:  

  • You can now enjoy the library view with colored tabs highlighting results in Energy Level, Emotional Profile, and other columns. The color scheme helps you understand the song’s character right away.
New Library View
  • Find the exact numerical values for each genre and mood and get a granular analysis of the song. You’ve never experienced such a detailed music analysis before.
Mood Numerical Value
  • Additionally, have a look at the musical era column. The AI can now determine the decade the song sounds like.

Similarity Search

The Similarity Search lets you search your own database using a reference track from Spotify, YouTube, or your own library, and filter the results by genre, BPM, and key.

  • We have now improved the performance of the Similarity Search so you can get more relevant and precise results.
  • Choose the exact part of your reference song you want to find similar tracks for using the Custom Interval.
Custom Interval
  • You can now also display up to 100 songs in the search results.
Search Results

Augmented Keywords

If there are keywords you don’t see in Cyanite but would like to use, we have additional keywords for you. Augmented keywords are now available and can be unlocked upon request. Explore genres, brand values, moods, and ‘music fors’ such as “urban”, “mellow”, “party”, “sports”, “contemplative”, and “confident”.

I want to try out Cyanite’s AI platform – how can I get started?

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

Contact us with any questions about our frontend and API services via mail@cyanite.ai. You can also directly book a web session with Cyanite co-founder Markus here.

Cyanite API Update – New Instruments and More

We are proud to announce the latest API update. This update includes 8 new instrument classifiers, improved BPM, Key, and Time Signature detection, and an improved Similarity Search. For details, see the full documentation here: https://api-docs.cyanite.ai/blog/2021/05/06/changelog-2021-05-06/

Now let’s take a closer look at it!

New Instrument Classifiers

In addition to our existing percussion classifier, we now expose the synth, piano, acousticGuitar, electricGuitar, strings, bass, bassGuitar, and brassWoodwinds classifiers.

We also deprecated the AudioAnalysisV6Result.instruments field.

The new field AudioAnalysisV6Result.instrumentTags exposes a list of the instruments detected throughout the track.

New Key and BPM Classifiers

The FullScaleMusicalAnalysis and FastMusicalAnalysis types are now deprecated and will be removed from the API on 15 June 2021.

The new AudioAnalysisV6Result.key and AudioAnalysisV6Result.bpm fields expose improved key and BPM values over the now-deprecated FullScaleMusicalAnalysis and FastMusicalAnalysis.

New Time Signature Classifier

AudioAnalysisV6Result.timeSignature now exposes the time signature of the track (e.g. 3/4 or 4/4) as a string.
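Since all of these live on the same AudioAnalysisV6Result type, one query can fetch them together. Here is a minimal sketch of the selection set; instrumentTags, key, bpm, and timeSignature are the field names from this changelog, while the surrounding libraryTrack/AudioAnalysisV6Finished query shape is an assumption, so verify it against the documentation linked above.

```python
# GraphQL selection covering the new fields, to be POSTed to the API
# like any other query. Only the four result fields are confirmed by
# this changelog; the query shape around them is assumed.
QUERY = """
query TrackMusicalData($id: ID!) {
  libraryTrack(id: $id) {
    ... on LibraryTrack {
      audioAnalysisV6 {
        ... on AudioAnalysisV6Finished {
          result {
            instrumentTags   # e.g. ["percussion", "electricGuitar"]
            key              # improved key value
            bpm              # improved BPM value
            timeSignature    # e.g. "4/4", exposed as a string
          }
        }
      }
    }
  }
}
"""
```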

Improved Similarity Search

We improved the performance of our similarity search based on audio features. In addition, we exposed a single new field that allows searching for similar tracks across both library and Spotify tracks, with additional filters for BPM, key, and genre.
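As a rough sketch of what a call to that unified field could look like (the similarTracks name and the filter shape are hypothetical placeholders, since the changelog does not spell out the schema):

```python
# Hypothetical similarity query -- field and argument names are
# illustrative placeholders, not the literal schema.
QUERY = """
query SimilarTracks($id: ID!) {
  libraryTrack(id: $id) {
    ... on LibraryTrack {
      similarTracks(filter: { bpmMin: 100, bpmMax: 130 }) {
        ... on SimilarTracksConnection {
          edges {
            node { id title }
          }
        }
      }
    }
  }
}
"""
```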

Go ahead and start coding

Contact us with any questions about our API services via mail@cyanite.ai. Give us a shout-out on Twitter, LinkedIn or wherever you feel like. Don’t hold back with feedback on what we can improve.

If you want to get a first grip on how Cyanite works, you can also register for our free web app to analyze music and try out similarity searches without any coding needed.

If you are a coder and want to join the ride, please send your application to careers@cyanite.ai.

Introducing: Cyanite’s Keyword Cleaning System for Music Libraries

In this article, we present the common challenge of inconsistent keyword tagging in music databases. We discuss what causes these problems and how Cyanite developed a Keyword Cleaning system to solve them automatically. We then present five use cases for our Keyword Cleaning system and the potential impact it may have on music businesses.

Introduction of the problem

The way we perceive music is highly individual, and so is the way we describe it. What fuels many a dinner conversation becomes a serious concern when handling large amounts of music professionally.

To leverage diverse monetization opportunities with musical assets, many music companies sort their catalogs by assigning keyword tags to all the audio files in their database. These tags may describe the mood and genre of a song or categorize its instruments or tempo. This way, music companies ensure the accessibility and searchability of any musical asset, even in very large catalogs.

These tags follow a company’s individual understanding of music – its catalog language. The specific nature of a catalog language can be understood under two aspects:

1. Objective catalog language (tagging): the entirety of keywords and tags, often described as a taxonomy or tag ontology (quantity, classes, and wording). “Which tags do I use?”

2. Subjective catalog language (understanding of tagging): the understanding of tags and their connection to certain sound qualities. “When do I assign a certain tag?”

Objective catalog language is inherent to the music catalog or the company that owns it. Subjective catalog language, however, is inherent to every individual person who tags the music.

Having a consistent catalog language leads to a brilliant search experience and is the perfect condition for the thorough exploitation of your assets. A lot of work can go into building and maintaining your own catalog language. However, three main events can quickly erode it, and with it the quality and meaningfulness of your tagging:

Event 1: Catalog acquisitions or integrations.

Event 2: Day-to-day variations in the performance of tagging staff.

Event 3: The hiring of new tagging staff.

Not being aware of this can destroy decades of work. Songs can’t be found and revenue streams can’t be realized as before, seriously harming a company’s ability to execute its business model.

More importantly, music search staff stop trusting the music search, which leads them to build highly individual systems of workarounds for finding suitable music, or to fall back on a very limited “go-to catalog” of songs they use over and over rather than drawing on the entire music catalog.


Our solution

Addressing these issues, Cyanite developed a way to bring together (translate) two catalog languages – objective or subjective – with minimum information loss and maximum speed, using AI.

We base our approach on a measure we call keyword similarity, describing the degree of semantic similarity of a pair of tags. To give an example, the keywords “enthusiastic” and “euphoric” should have a rather similar meaning when used to describe a musical mood, so we would expect a high degree of keyword similarity. By contrast, “enthusiastic” and “gloomy” represent a quite contrary pair of descriptive attributes, which should point towards a low degree of keyword similarity.

Most music catalogs use a multi-label tagging scheme, meaning a single piece of music can be assigned multiple tags. We make use of this fact and focus on the track-wise co-occurrence of tags, hypothesizing that a frequent joint attribution of a tag pair indicates a high degree of interrelation and, thus, keyword similarity.

We developed a natural language processing (NLP) AI system capable of learning the semantic interrelation of keywords in any library. With this, we are able to derive a quantitative measure for any combination of keywords contained in one or several music catalogs. This analysis is the basis for a variety of groundbreaking use cases to overcome challenges many music companies are struggling with.
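Our production system is an NLP model, but the underlying intuition – tags that frequently co-occur on the same tracks are semantically close – can be sketched in a few lines of Python. The following toy illustration (not our actual implementation) scores keyword similarity as the cosine similarity of tag co-occurrence profiles:

```python
from collections import defaultdict
from itertools import combinations
import math

# Toy catalog: each track carries multiple tags (multi-label tagging).
tracks = [
    {"enthusiastic", "euphoric", "uplifting"},
    {"enthusiastic", "euphoric", "party"},
    {"gloomy", "dark", "sad"},
    {"gloomy", "sad", "melancholic"},
    {"euphoric", "party", "uplifting"},
]

# Count how often each pair of tags is assigned to the same track.
cooc = defaultdict(int)
for tags in tracks:
    for a, b in combinations(sorted(tags), 2):
        cooc[(a, b)] += 1
        cooc[(b, a)] += 1

vocab = sorted({t for tags in tracks for t in tags})

def profile(tag):
    """Co-occurrence profile of a tag over the whole vocabulary."""
    return [cooc[(tag, other)] for other in vocab]

def keyword_similarity(a, b):
    """Cosine similarity of two tags' co-occurrence profiles."""
    va, vb = profile(a), profile(b)
    dot = sum(x * y for x, y in zip(va, vb))
    norm = math.sqrt(sum(x * x for x in va)) * math.sqrt(sum(x * x for x in vb))
    return dot / norm if norm else 0.0

print(keyword_similarity("enthusiastic", "euphoric"))  # noticeably higher...
print(keyword_similarity("enthusiastic", "gloomy"))    # ...than this (0.0 here)
```

The sketches in the use cases below reuse this keyword_similarity() function.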

       

Use Case 1: Catalog language translation

This challenge arises when two (or more) differently tagged music catalogs are to be integrated into each other (for example after a catalog acquisition, or when choosing a different distribution outlet). Manually translating tags is tedious and may lead to significant information loss, as the same tags are not always used in the same way (see “subjective catalog language” above).

Our system first learns both taxonomies, understanding each catalog language and how every tag relates to the others. In a second step, it maps both catalog languages onto each other, drawing direct relations between tags and how they are understood. The third step is the translation of each song’s tagging from one catalog language into the language of the catalog it is being integrated into. The system automatically re-tags every song in the new catalog language, as sketched below.
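A toy sketch of steps two and three, under the simplifying assumption that both taxonomies can be embedded in one shared co-occurrence space (for instance via tracks present in both catalogs), reusing keyword_similarity() from above:

```python
def translate_tag(tag, target_vocab):
    """Map a tag onto the most similar tag in the target catalog language."""
    return max(target_vocab, key=lambda t: keyword_similarity(tag, t))

def retag(track_tags, target_vocab):
    """Re-tag a whole track into the target catalog language."""
    return {translate_tag(t, target_vocab) for t in track_tags}

# Both source tags map onto "enthusiastic" in the target taxonomy.
print(retag({"euphoric", "party"}, target_vocab={"enthusiastic", "gloomy"}))
```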

Use Case 2: Keyword Cleaning of inconsistent keyword tagging

Companies with high fluctuation in tagging staff face this challenge, as may companies with a particularly large catalog (>100,000 songs) that picked up some legacy over the years: inconsistencies in keyword tagging. This is one of the biggest problems a catalog can face, as it seriously diminishes its searchability and search experience, leading to mistrust of the system, individual workarounds, and eventually the loss of the customer for good. Alternatively, it leads customers to contact the library’s sales team and search staff directly, which harms your business’s ability to scale.

After understanding your catalog’s language, our Cyanite Keyword Cleaning system can detect tags with low keyword similarity that may contradict the other tags, and flag the respective songs. To assess whether a flagged tag was wrongfully assigned (or whether one is missing), we run an audio-based tagging pass on these anomalies to detect whether the tag is suitable; if it is not, the tag is deleted. A simple version of the flagging step is sketched below.
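As a rough illustration of that flagging step (again a toy sketch, not the production system): score each tag on a track by its average keyword similarity to the track’s other tags and flag the outliers.

```python
def flag_suspicious_tags(track_tags, threshold=0.1):
    """Flag tags whose average similarity to the track's other tags is low.

    Reuses keyword_similarity() from the sketch above; the threshold is
    an illustrative value, not a Cyanite parameter.
    """
    flags = []
    for tag in track_tags:
        others = [t for t in track_tags if t != tag]
        if not others:
            continue
        score = sum(keyword_similarity(tag, o) for o in others) / len(others)
        if score < threshold:
            flags.append((tag, score))
    return flags

# A "gloomy" tag on an otherwise euphoric track gets flagged for review.
print(flag_suspicious_tags({"enthusiastic", "euphoric", "gloomy"}))
```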

Use Case 3: Taxonomy Cleaning – detection of redundancies and blind spots

Languages change over time – and with them, catalog languages. Some catalogs have 15,000+ different keywords in their taxonomy. It should come as no surprise that songs carrying older keyword tags are found less and less often. Moving to a slimmer taxonomy can elevate the searchability and overall search experience of a catalog.

This raises the question of which tags are actually necessary and meaningful. To answer it, our Cyanite system can detect tags that are equal in meaning by scanning through your keyword tagging. It then consolidates the redundancies, condensing the taxonomy to only meaningful, disjoint keyword classes – roughly as sketched below.
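One simple way to picture that consolidation (a toy sketch reusing keyword_similarity() from above, with an illustrative threshold): greedily group tags whose pairwise similarity exceeds a threshold, then keep one representative per group.

```python
def consolidate_taxonomy(tags, threshold=0.4):
    """Greedily cluster near-synonymous tags; returns {representative: members}."""
    clusters = {}
    for tag in sorted(tags):
        for rep in clusters:
            if keyword_similarity(tag, rep) >= threshold:
                clusters[rep].append(tag)  # tag is redundant with rep
                break
        else:
            clusters[tag] = [tag]  # tag opens a new keyword class
    return clusters

# On the toy catalog this yields one "dark/sad" and one "euphoric" class.
print(consolidate_taxonomy(vocab))
```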

Use Case 4: Open search

If you rely on customers handing in sync briefings and then searching your catalog yourself, your business will lack scalability. So you might want to open up your catalog search to every potential client. For this, you want to make sure that you deliver the right music for every search and every individual understanding of music – you need to speak the language of each of your customers.

To achieve this, our Cyanite Keyword system can translate a vast number of keywords into semantically related tags. This means that if you only use the keyword “euphoric” for very upbeat, outgoing, and happy songs, but a client searches for “enthusiastic”, our Cyanite Keyword system understands this and presents the suitable songs from your catalog. This is especially important for keywords that were tagged significantly less often in your catalog, so the search can still surface a good variety of music. A sketch of this kind of query expansion follows below.
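The same toy machinery from above can illustrate this query expansion: take an incoming search keyword and expand it into its nearest neighbors in the catalog’s tag space.

```python
def expand_query(keyword, top_k=3):
    """Return the catalog tags most similar to an incoming search keyword.

    Reuses keyword_similarity() and vocab from the sketch above.
    """
    ranked = sorted(
        ((keyword_similarity(keyword, t), t) for t in vocab if t != keyword),
        reverse=True,
    )
    return [tag for score, tag in ranked[:top_k] if score > 0]

# A search for "enthusiastic" also surfaces the euphoric/party/uplifting tags.
print(expand_query("enthusiastic"))
```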

Use Case 5: Automatic tagging in your own catalog language

Let’s say your clients and customers have gotten used to your specific keyword tagging – your catalog language. That makes your catalog language an integral part of the stickiness of your platform, one that keeps customers coming back to your service. If you introduce automatic tagging through deep-learning systems such as the Cyanite Tagging system, you want to keep the automatic tags in your catalog language so that your customers keep finding the right music.

To achieve this, our Cyanite Keyword system and the Cyanite Tagging system work together to translate our auto-tags into your catalog language. Your customers won’t even notice that you switched to AI tagging.

How to get started!

If the approach of Cyanite’s Keyword Cleaning resonates with you, the first step is to have a look at your metadata. For that, please reach out to sales@cyanite.ai. Together, we will dive into your tagging scheme and assess the possibility of a Keyword Cleaning project.