Human vs. Machine: The Music Curation Formula
Recreating human recommendations at scale in the digital sphere is a problem being actively solved across verticals, but no one has quite found the perfect formula. Music is the vertical where the issue is most pressing. The current challenge is integrating human-curated data with machine data and algorithms to generate personalized recommendations that mirror the nuances of human curation. That formula is the holy grail.
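One way to picture the integration described above is a simple weighted blend of algorithmic scores and human curation signals. The sketch below is purely illustrative; the function, weights, and track data are hypothetical, not any panelist's actual system.

```python
# Illustrative sketch: blending machine similarity scores with
# human-curated picks into one ranked recommendation list.
# All names, weights, and data here are hypothetical.

def blend_recommendations(machine_scores, human_picks, human_weight=0.4):
    """Combine per-track machine scores (0-1) with a set of tracks
    flagged by human curators; a curated track receives a fixed
    boost, weighted against the algorithmic score."""
    blended = {}
    for track, score in machine_scores.items():
        boost = 1.0 if track in human_picks else 0.0
        blended[track] = (1 - human_weight) * score + human_weight * boost
    return sorted(blended, key=blended.get, reverse=True)

machine_scores = {"track_a": 0.9, "track_b": 0.5, "track_c": 0.7}
human_picks = {"track_b"}  # a curator vouched for this one
print(blend_recommendations(machine_scores, human_picks))
# → ['track_b', 'track_a', 'track_c']
```

Even with a strong algorithmic score for `track_a`, the curator's pick rises to the top, which is the kind of nuance a pure machine ranking would miss.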
Additional Supporting Materials
- What mechanisms and methods are you currently using to optimize curation: human, machine, or wizardry [both]?
  - Cross-learning: what have we learned from each other?
  - Humans + big data: finding data wizards and implementing algorithms
- How do you solve predictability with machines?
  - Machine listening: computers that listen and can pick out qualities and details
  - Machines trying to keep up with the speed of new music
- How do you solve scale and diversity with human curation?
  - Managing human-generated data
  - The human ability to detect subtle nuances
  - The human capacity to factor in musical and cultural history
- Recommending by genre is easy. How do you curate and recommend on the track level?
- The future: what can we anticipate in the near term, and what are your predictions?
- Marc Ruxin, Founder & CEO, TastemakerX
- Tim Quirk, Head of Global Content Programming, Apple
- Ian Rogers, CEO, Beats By Dre
- Jim Lucchese, CEO, Echo Nest