TASCAR is a toolbox for the creation and rendering of dynamic acoustic scenes that allows direct user interaction and was developed for applications in hearing aid research. This paper describes the simulation methods and presents, as examples, two research applications that combine TASCAR with motion tracking. The first study investigated to what extent individual head-movement strategies can be identified across different listening tasks. The second study investigated the effect of presenting dynamic acoustic cues on the postural stability of listeners.