About Me

I am a Lecturer in Sound and Music Computing at the University of Plymouth, working on AI and signal processing, and a collaborator on the RadioMe project, building real-time radio remixing for people with dementia. Previously, I worked as a postdoctoral researcher at Queen Mary University of London (QMUL), where, as part of the team of researchers working on the FAST project, I researched Intelligent Music Production. This included techniques for automatically mixing audio tracks, machine learning and signal processing approaches to music mixing, the application of semantic web technologies to music production, and ethnographic study of music production processes.

My research interests include audio production technology, artificial intelligence, real-time and live mixing tools, DSP, machine learning and deep learning, sound synthesis, procedural audio, music information retrieval (MIR), source separation, audio engineering technology, genetic algorithms, audio effects, and real-time analysis and manipulation of audio, as well as perceptual, qualitative and objective measures and metrics for evaluating audio technologies.

I completed my PhD, also at QMUL, working within the sound synthesis research team on the perceptual evaluation of synthesised sound effects. My aim was to evaluate the state of the art in sound synthesis and objectively identify what makes a particular sound effect realistic.

I received my MSc from Queen Mary University of London in 2014, where my master's project, “Microphone Bleed Reduction and Dereverberation”, was undertaken within the Audio Engineering research group in the Centre for Digital Music.

I grew up in Edinburgh and graduated from Edinburgh University with a BSc in Artificial Intelligence and Computer Science in 2011. During my time at Edinburgh, I focused on Audio Processing, Intelligent Systems and Autonomous Networks. Despite this, I spent the majority of my time in theatre and live events, working on technical production, set construction, event management and design. This inspired my undergraduate project, “Beat Tracking with Style Specific Heuristics”: applications of real-time beat tracking in live event production include synchronising lighting and sound, ideally with near-zero latency. The project was deemed a success when I showed that music genre classification could improve existing beat tracking algorithms, particularly when identifying beats in jazz music.

Following this, I became a freelance sound and lighting technician, working on events across Scotland and the wider UK, including live music performances, festivals, theatre, dance, opera, TV production, art installations and conferences.

