In cognitive neuroscience, sound sequences serve as abstracted models for temporal and sensorimotor processing, both in individuals and in multi-agent interactions. In particular, the use of musical stimuli permits the study of perception-action coupling as well as joint action and entrainment via reciprocal prediction and adaptation. In engineering, Brain-Computer Interface (BCI) and neurofeedback applications have been developed to provide patients with alternative pathways of communication and interaction, as well as innovative neurorehabilitation treatment protocols. Bridging these two fields, the goal of this project is the interdisciplinary development of a Brain-Computer Interface platform that allows human subjects to interact directly and continuously with synthesized sound and music stimuli. Sound synthesis will be informed by the contemporary understanding of how specific parameters (e.g. rhythm, harmonics, and timbre) shape entrainment. The platform will thus allow the human brain to be studied in closed loop, e.g. probing the neural dynamics underlying specific aspects of entrainment such as temporal prediction and anticipation, synchronization, and adaptation. Beyond the technical development of the BCI itself, an important part of the project will be its validation with respect to these research questions in a series of studies. The project is expected to contribute to human-machine interaction as well as neurorehabilitation.
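At its simplest, the closed loop described above can be thought of as a mapping from a neural feature to a sound-synthesis parameter. The following Python snippet is a minimal sketch of that idea only; the choice of feature (alpha-band power), the band limits, and the tempo mapping are illustrative assumptions, not the project's actual design:

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Naive DFT estimate of signal power in the band [f_lo, f_hi] Hz.
    (Illustrative; a real pipeline would use an optimized spectral estimator.)"""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        if f_lo <= f <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

def map_to_tempo(feature, baseline, lo_bpm=60.0, hi_bpm=120.0):
    """Map a neural feature (relative to a resting baseline) onto a tempo
    for the sound synthesizer; the linear mapping is an assumption."""
    ratio = max(0.0, min(2.0, feature / baseline))  # clamp to [0, 2]
    return lo_bpm + (hi_bpm - lo_bpm) * ratio / 2.0

# Demo: a synthetic 10 Hz "alpha" oscillation sampled at 250 Hz for 1 s.
fs, n = 250, 250
signal = [math.sin(2 * math.pi * 10 * t / fs) for t in range(n)]
alpha = band_power(signal, fs, 8.0, 12.0)
tempo = map_to_tempo(alpha, baseline=alpha)  # ratio = 1 -> midpoint, 90 bpm
```

In an actual closed-loop experiment this mapping would run continuously on streamed EEG, so that the stimulus parameters and the listener's neural dynamics can mutually adapt.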