One of these applications is related to virtual studios for TV productions, an extension of the traditional blue-box (blue-screen) technology. The purpose is to create the impression that the moderator is moving and speaking in a virtual world; the virtual sets can be generated with 3D modeling tools. One current limitation of state-of-the-art virtual studio techniques is the seamless integration of the synthetic and real worlds: the production and integration processes should run in real time, and the resulting images must meet the high picture-quality requirements of professional broadcasting. The integration process is still mainly based on a simple mixing technique that does not account for the spatial and physical relationships between the scene elements. Virtual studios offer a large number of new options. For example, animation can be integrated in real time to create more dynamic situations, and connections can be made to interactive interfaces that directly influence the set. To support such features, tracking the position of the moderator(s) in the real studio would allow a more realistic integration of the moderator into the generated 3D virtual scene. Our goal here is a PC-based, low-cost virtual studio system that broadens the field of application by adding real-time tracking of the moderator's trajectory in the 3D scene.
The second application is related to the sound system based on wave field synthesis recently developed at the Fraunhofer Institute for Integrated Circuits IIS in Ilmenau, Germany. The theory of wave field synthesis was developed at the Delft University of Technology (Netherlands). The new process developed by Fraunhofer IIS researchers records not only the sound, but also the acoustic characteristics of the surrounding space and information about the spatial arrangement of the acoustic sources. This makes it possible to reproduce a more natural spatial sound that covers a wide area of the theater. The cinema "Lindenlichtspiele" in Ilmenau is already equipped with the new system.
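The core idea of wave field synthesis can be illustrated with a much-simplified driving function: each loudspeaker of the array reproduces the source signal with a delay and attenuation determined by its distance to the virtual source. The function below is a minimal delay-and-sum sketch under free-field assumptions (the function name, the fixed speed of sound, and the plain 1/r attenuation are illustrative choices; real WFS driving functions also include a spectral pre-filter and array window weights).

```python
import math

def wfs_delays(source_pos, speaker_positions, c=343.0):
    """For each loudspeaker, compute (delay in seconds, amplitude gain)
    for rendering a virtual point source at source_pos.

    Simplified delay-and-sum model: delay = distance / speed of sound,
    amplitude = 1/r spherical spreading. Real WFS adds pre-filtering
    and tapering; this only shows the geometric core of the method.
    """
    out = []
    for sp in speaker_positions:
        d = math.dist(source_pos, sp)            # source-to-speaker distance
        out.append((d / c, 1.0 / max(d, 1e-6)))  # (delay, attenuation)
    return out
```

Feeding each loudspeaker the source signal shifted by its computed delay and scaled by its gain makes the array's wavefronts superpose as if emitted from the virtual source position, which is why tracking the 3D position of each acoustic source during recording matters.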
In order to produce sound that takes into account the spatial arrangement of the acoustic sources, these sources must be identified and their positions tracked during movie recording. For the time being, most methods for positioning the sound sources rely heavily on manual work. Automating as much of this process as possible would reduce the time and costs of the post-production phase. Here again, the goal is a PC-based, low-cost, real-time tracking system.
Given the applications we are targeting, several constraints are imposed on the tracker configuration. For television stations, an important consideration is the real-time capability of the hardware and software. Synchronization between the different recording devices is also vital. Movie productions typically involve recording scenes against various types of backgrounds. In the case of virtual studios, most existing recording studios are based on the traditional blue-box technology, and the moderators are segmented out of the studio images with chroma keyers. There is a need, however, to extend this technology to more general backgrounds.
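The chroma-keying step mentioned above can be sketched as a per-pixel classification: a pixel belongs to the blue-box background when its blue channel clearly dominates the other channels. The function below is a minimal difference-keying sketch (the function name and the `blue_gain` threshold are illustrative; production chroma keyers produce soft alpha mattes and suppress blue spill rather than a hard binary mask).

```python
def chroma_key_mask(pixels, blue_gain=1.2):
    """Return a binary foreground mask for an RGB image given as a
    flat list of (r, g, b) tuples.

    A pixel is classified as blue-screen background when its blue
    value exceeds blue_gain times the larger of red and green.
    1 = foreground (moderator), 0 = background (blue box).
    """
    mask = []
    for r, g, b in pixels:
        is_background = b > blue_gain * max(r, g)
        mask.append(0 if is_background else 1)
    return mask
```

The limitation the text points to is visible in this sketch: the rule depends entirely on the background being a saturated, known color, which is exactly what breaks down for general backgrounds and motivates a tracker that does not rely on chroma keying.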
The tracking system should therefore be able to cope with various types of background, as well as with clutter and partial occlusions. It should also be able to detect and track various kinds of objects. Within the virtual studio the moderator is of prime importance, but other kinds of objects can also appear in the scenarios; moreover, when tracing acoustic sources within movie settings, not only people (the characters) but also other object types must be taken into account. The tracker we want to develop should therefore be able to follow general types of objects. Particular design issues consequently need to be addressed for the main steps of the tracking system: moving object detection, tracking, and 3D localization. The system is currently in the development and testing stage.