
Transformers in music recommendation

Aug 20, 2024 - news.bensbites.com
The article describes the music recommendation ranking system behind YouTube Music, which uses a Transformer model to capture the sequential nature of user actions in their current context. The system learns from actions such as skips, likes, and dislikes to infer a user's musical preferences and serve more accurate recommendations. It works in three stages: item retrieval, item ranking, and filtering. Because the Transformer architecture is well suited to processing sequences of input data, it is used to improve the ranking stage.
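The three stages above can be sketched as a simple pipeline. Everything here is illustrative: the catalog, genre-based retrieval, and affinity scores are invented for the example and are not YouTube Music's actual logic.

```python
# Hypothetical sketch of the three-stage pipeline the article describes:
# retrieve a candidate pool, rank it, then filter. All data and scoring
# rules are invented for illustration.

def retrieve(catalog, user_genres, k=5):
    """Stage 1: pull up to k candidate tracks matching the user's genres."""
    return [t for t in catalog if t["genre"] in user_genres][:k]

def rank(candidates, affinity):
    """Stage 2: order candidates by a per-track affinity score, highest first."""
    return sorted(candidates, key=lambda t: affinity.get(t["id"], 0.0), reverse=True)

def filter_items(ranked, disliked_ids):
    """Stage 3: drop tracks the user has explicitly disliked."""
    return [t for t in ranked if t["id"] not in disliked_ids]

catalog = [
    {"id": "a", "genre": "jazz"},
    {"id": "b", "genre": "rock"},
    {"id": "c", "genre": "jazz"},
]
recs = filter_items(rank(retrieve(catalog, {"jazz"}), {"a": 0.2, "c": 0.9}), {"a"})
# "b" never enters the pool, "a" is filtered despite retrieval, so "c" remains.
```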

The Transformer is combined with an existing ranking model to learn a combined ranking that best blends user actions with listening history: the model takes into account the user's previous actions and the music they are currently listening to. Adding the Transformer significantly improved the ranking model's performance, reducing skip rate and increasing the time users spend listening to music. Future work includes adapting the technique to other parts of the recommendation system and incorporating non-sequential features within the Transformer.
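A minimal sketch of that blending idea, assuming much that the article does not specify: a single-head self-attention pass (the core Transformer operation, here without learned projections) summarizes a user's recent action embeddings, and the resulting affinity is mixed with the existing model's score via a blend weight. The embeddings, dimensions, and `alpha` are all invented for illustration.

```python
import math

# Illustrative only: self-attention over a sequence of user-action
# embeddings, blended with an existing ranking model's score. This is
# not YouTube Music's published architecture.

def attention(seq):
    """Scaled dot-product self-attention with Q = K = V = seq (no projections)."""
    d = len(seq[0])
    out = []
    for q in seq:
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d) for k in seq]
        m = max(scores)                      # subtract max for numerical stability
        exps = [math.exp(s - m) for s in scores]
        z = sum(exps)
        weights = [e / z for e in exps]      # softmax over the sequence
        out.append([sum(w * v[i] for w, v in zip(weights, seq)) for i in range(d)])
    return out

def blended_score(action_seq, track_vec, base_score, alpha=0.5):
    """Mix a sequence-derived affinity with the existing model's score."""
    context = attention(action_seq)[-1]      # last position summarizes the history
    seq_affinity = sum(c * t for c, t in zip(context, track_vec))
    return alpha * seq_affinity + (1 - alpha) * base_score

# Toy 2-d action embeddings, e.g. [skip_signal, like_signal]:
history = [[1.0, 0.0], [0.0, 1.0], [0.0, 1.0]]   # skip, like, like
score = blended_score(history, track_vec=[0.0, 1.0], base_score=0.4)
```

Because the history leans toward likes, the attention context aligns with the like-flavored track vector and lifts the final score above the base model's 0.4.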

Key takeaways:

  • YouTube Music's recommendation ranking system uses Transformer models to capture the sequential nature of user actions in the current user context.
  • The recommendation pipeline has three key stages: item retrieval, item ranking, and filtering; Transformers make sense of sequences of input data, capturing relationships between user actions.
  • Combining the Transformer with the existing ranking model blends user actions with listening history, reducing skip rate and increasing the time users spend listening to music.
  • Future work includes adapting the technique to other parts of the recommendation system, such as retrieval models, and incorporating non-sequential features within the Transformer for improved self-attention across sequential and non-sequential features.
