This paper presents a selection of the outcomes of a large-scale UK-funded research project that ran from mid-2014 to the end of 2019. During the project, known by its acronym FAST, a team of nearly 30 researchers across three universities (Queen Mary University of London, Oxford University, University of Nottingham) investigated how digital technologies that rely on metadata could enhance the workflows and data flows of the recorded music industry. The project brought together expertise in Digital Signal Processing, Machine Learning, Artificial Intelligence, Data Science, Music Theory, Musicology, Lutherie, Human-Machine Interaction, Design, Ethnography, Linked Data, Ontologies, Acoustics, Audio Engineering and many other fields.
The paper focuses on several of the sub-projects within FAST: the Recording Studio demonstrators, which explored how both symbolic AI and Machine (Deep) Learning can be used to assist Audio Engineering processes, and the Music Discovery demonstrators, which explored and exploited the relationships between music artists using AI, Linked Data, DSP and Data Science. Some more speculative, preliminary work in Music Data Science is also presented.