Implementation of Virtual Acoustics for a Computational Concert Hall Model

Abstract

In this thesis, a system that produces a virtual acoustic environment for any modeled space is implemented. The virtual acoustic environment is produced by modeling the sound source, the room acoustics, and the listener, and it can be listened to through headphones or loudspeakers.

The thesis starts with an overview of the methods of computational room acoustics, which approximate the behaviour of sound in a modeled space. In this work, a time-domain hybrid method, consisting of the image-source method and artificial reverberation, is used to obtain a real-time interactive system. The fundamentals and implementation aspects of the image-source method are introduced, and the requirements of the real-time interactive auralization process are presented.
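The core idea of the image-source method mentioned above can be illustrated with a minimal sketch: a specular reflection off a wall is modeled by mirroring the source across the wall plane and treating the resulting image source as a direct sound path with its own delay and distance attenuation. All names and values below are illustrative assumptions, not the thesis implementation.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature

def mirror_across_plane(src, normal, d):
    """Mirror point src across the plane n.x + d = 0 (normal must be unit-length)."""
    dist = sum(s * n for s, n in zip(src, normal)) + d
    return tuple(s - 2.0 * dist * n for s, n in zip(src, normal))

def path_delay_and_gain(image, listener):
    """Propagation delay (seconds) and 1/r amplitude gain for one image source."""
    r = math.dist(image, listener)
    return r / SPEED_OF_SOUND, 1.0 / max(r, 1e-9)

# Source 2 m above a floor (plane z = 0, normal (0, 0, 1), d = 0):
image = mirror_across_plane((0.0, 0.0, 2.0), (0.0, 0.0, 1.0), 0.0)
delay, gain = path_delay_and_gain(image, (4.0, 0.0, 2.0))
```

Higher-order reflections follow by mirroring each image source again across the remaining surfaces, which is what makes the method attractive for the early part of the response, while late reverberation is handled by the artificial reverberator of the hybrid approach.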

In a real-time interactive virtual acoustic environment, all time-varying calculation parameters must be interpolated to obtain a continuous output. The interpolation methods used are introduced, and the modeling methods for sound sources, room acoustics, and the listener are presented. As an application of the realized system, a model of a concert hall named Marienkirche is demonstrated with an animated video.
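The need for parameter interpolation can be sketched as follows: when the listener or source moves, quantities such as an image source's gain change between audio blocks, and jumping directly to the new value would produce audible clicks; ramping linearly across the block keeps the output continuous. The function name and block length here are illustrative assumptions.

```python
def interpolate_param(old, new, n_steps):
    """Linearly ramp a parameter from old to new over n_steps samples."""
    return [old + (new - old) * i / (n_steps - 1) for i in range(n_steps)]

# Gain of one reflection path cross-faded over a (tiny, illustrative) block:
ramp = interpolate_param(0.5, 0.25, 5)
# the ramp starts at the old value and ends exactly at the new one
```

In practice the same ramping idea applies per audio block to every varying parameter, such as delays, gains, and filter coefficients.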

The results presented in this thesis have been applied in the DIVA project (Digital Interactive Virtual Acoustics), which aims at producing an immersive virtual experience with modeled musical instruments, players, and room acoustics.


Tapio.Lokki@hut.fi