While the project is still in development, the research team plans to improve user-model communication, enhance the app's interface, and expand the composition database. The researchers also aim to add orchestral music to the app's repertoire. By making music accessible to people regardless of their musical background or physical capabilities, the app could open up a world of possibilities for musicians and enthusiasts alike.
Key takeaways:
- A team of researchers has developed an innovative app that allows users to control musical experiences using their voice, facial expressions, or gestures.
- The app uses an AI model trained on a range of piano compositions to process notated music and predict performance characteristics such as tempo, note position, duration, and loudness (a rough sketch of this kind of prediction follows this list).
- Users interact with the model through the app and can modify the rendition via video or audio recordings, voice commands, or facial expressions.
- While the project is still evolving, future plans include enhancing the user-model communication, improving the app's interface, expanding the composition database, and incorporating orchestral music into the app's repertoire.
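To make the prediction step described above more concrete, the sketch below shows one way such a model could be framed: a multi-output regressor that maps per-note score features to expressive performance attributes like local tempo, onset timing, duration, and loudness. The feature set, example values, and model choice are illustrative assumptions, not a description of the team's actual system.

```python
# Hypothetical sketch: map notated-score features to expressive performance
# attributes with a multi-output regressor. Features, targets, and model
# choice are assumptions for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.multioutput import MultiOutputRegressor

# Score features per note: MIDI pitch, nominal duration (beats), beat position in bar
X_score = np.array([
    [60, 1.0, 0.0],
    [64, 0.5, 1.0],
    [67, 0.5, 1.5],
    [72, 2.0, 2.0],
])

# Performance targets per note: local tempo (BPM), onset offset (s),
# played duration (s), loudness (MIDI velocity)
y_performance = np.array([
    [96.0,  0.00, 0.62, 64],
    [98.0,  0.01, 0.30, 72],
    [97.0, -0.01, 0.29, 70],
    [92.0,  0.02, 1.30, 80],
])

model = MultiOutputRegressor(RandomForestRegressor(n_estimators=100, random_state=0))
model.fit(X_score, y_performance)

# Predict how a new notated note might be rendered
new_note = np.array([[65, 1.0, 3.0]])
tempo, onset_offset, duration, velocity = model.predict(new_note)[0]
print(f"tempo={tempo:.1f} BPM, onset offset={onset_offset:+.3f}s, "
      f"duration={duration:.2f}s, velocity={velocity:.0f}")
```

In a real system the training data would come from aligned score-performance recordings, and user input (voice, facial expressions, or gestures) would then steer or re-weight these predicted attributes before playback.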