Visualization System for the Control of the Spatialization


General Description

This visualization interface aims to provide useful information to the user while they control the spatialization of sources in a sound scene. The key idea is to exploit the background area of the interactive zones to superimpose information that is valuable to the user.

A first example of background representation consists in superimposing the radiation pattern of the sound sources. This indicates to the user the positions of the listening reference point where the sources will be audible and the positions where they cannot be heard.
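
The sketch below illustrates how such an audibility indication could be derived from a radiation pattern. It is a minimal example, assuming a simple frequency-independent first-order (cardioid-like) directivity; the function names (directivity_gain, is_audible) and the parameters pattern_order and threshold_db are hypothetical and not part of the described system.

import numpy as np

def directivity_gain(source_pos, source_orientation, listener_pos, pattern_order=1.0):
    """Gain of a simple first-order radiation pattern at a listening point.

    pattern_order blends between omnidirectional (0.0) and cardioid (1.0).
    """
    direction = np.asarray(listener_pos, dtype=float) - np.asarray(source_pos, dtype=float)
    distance = np.linalg.norm(direction)
    if distance == 0.0:
        return 1.0
    direction /= distance
    axis = np.asarray(source_orientation, dtype=float)
    axis /= np.linalg.norm(axis)
    cos_angle = float(np.dot(direction, axis))
    # First-order directivity: g(theta) = (1 - a) + a * cos(theta)
    return (1.0 - pattern_order) + pattern_order * cos_angle

def is_audible(source_pos, source_orientation, listener_pos, threshold_db=-20.0):
    """Mark a listening reference point as audible when the directivity
    gain stays above a (hypothetical) threshold expressed in dB."""
    gain = max(directivity_gain(source_pos, source_orientation, listener_pos), 1e-6)
    return 20.0 * np.log10(gain) > threshold_db

Evaluating is_audible over the listening positions of the interaction area yields the audible/inaudible zones that are then drawn behind the source icons.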

However, this tool goes beyond a simple representation of the state of the system: it evaluates in advance the effect of modifying the position of a sound source according to a given criterion and to a number of streamed metadata. The effect of such a modification on the resulting sound scene is then represented as a two-dimensional map, drawn within the interaction area as a background shade of grey. Generally, a light background indicates an area where the criterion is optimized and a dark background indicates an area where the criterion is minimized.
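
As an illustration of this principle, the following sketch evaluates an arbitrary criterion over a grid covering the interaction area and normalizes the result to [0, 1], so that it can be displayed as a grey-scale background (light = optimized, dark = minimized). The function criterion_map, the area dimensions and the example criterion are assumptions made for the sake of the example.

import numpy as np

def criterion_map(criterion, area_size=(10.0, 10.0), resolution=64):
    """Evaluate a criterion for each candidate source position on a grid
    and normalize it to [0, 1] for grey-scale display
    (1.0 = light = criterion optimized, 0.0 = dark = criterion minimized)."""
    width, height = area_size
    xs = np.linspace(0.0, width, resolution)
    ys = np.linspace(0.0, height, resolution)
    values = np.array([[criterion((x, y)) for x in xs] for y in ys])
    span = values.max() - values.min()
    if span == 0.0:
        return np.full_like(values, 0.5)
    return (values - values.min()) / span

if __name__ == "__main__":
    # Example criterion: favour candidate positions far from an
    # already-placed source at (3, 3), e.g. to limit masking.
    existing_source = np.array([3.0, 3.0])
    shading = criterion_map(lambda p: np.linalg.norm(np.asarray(p) - existing_source))
    print(shading.shape, shading.min(), shading.max())

The resulting array can be drawn directly as the background image of the interaction area with any grey-level colormap.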

This system can be applied to a number of perceptual criteria. The representation of these criteria guides the user towards a consistent auditory result. Each criterion can be based on metadata that is either calculated in advance or estimated in real time. Examples of such metadata are the loudness value of each sound source, its global attenuation (resulting from the Spat perceptual parameters), or general contextual parameters such as the rendering setup description and the rendering algorithm used.
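
To show how such metadata could feed a perceptual criterion, here is a hedged sketch of one possible criterion that combines per-source loudness with the global attenuation applied by the renderer. The SourceMetadata structure, the effective_level helper and the balance_criterion itself are hypothetical examples, not the criteria actually implemented in the system.

from dataclasses import dataclass

@dataclass
class SourceMetadata:
    """Per-source metadata assumed by this sketch: a loudness estimate
    (computed offline or in real time) and the global attenuation
    resulting from the perceptual parameters."""
    loudness_db: float      # estimated loudness of the source, in dB
    attenuation_db: float   # global attenuation applied by the renderer

def effective_level(meta: SourceMetadata) -> float:
    """Level of a source once its global attenuation is taken into account."""
    return meta.loudness_db + meta.attenuation_db

def balance_criterion(sources: dict) -> float:
    """Hypothetical criterion: the scene is considered better balanced when
    the spread of effective levels across sources is small, so the criterion
    returns the negated spread (higher value = better balance)."""
    levels = [effective_level(m) for m in sources.values()]
    return -(max(levels) - min(levels)) if levels else 0.0

Such a criterion can then be evaluated for each candidate position of the moved source (updating its metadata accordingly) and rendered as the grey-scale map described above.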

Responsible partner/team: IRCAM Room Acoustics team

Figure: 2D representation and interaction space for a sound scene
Figure: Superimposition of the radiation pattern of sound sources