
The Importance of Spatialized Sound in Virtual Reality

  • julienicoloff07
  • Jun 19
  • 6 min read
Illustration produced via ChatGPT

To mark the Fête de la Musique (Music Day in France), this article briefly looks at the importance of sound in immersive 3D experiences.


LS GROUP, formerly known as Light & Shadows, has been offering its expertise since 2009 in showcasing industrial 3D data and its diverse use cases. As our name suggests, we "shine a light" on our clients' digital scans or digital twins in sectors such as aerospace, automotive, energy, luxury, and retail. This allows them to enhance sales, train more effectively, and design better production systems.


Initially focused on visuals, this enhancement takes the form of websites, mobile apps, serious games for dedicated PCs, or applications for the virtual and mixed reality headsets that have been on the consumer market for over a decade.


However, to create more engaging and immersive experiences, we must consider the entire human sensory system, especially hearing. Sound is obviously one of the signals that our brain constantly picks up and interprets. Just as we create 3D visual stimuli to immerse a user in a virtual world, we can also generate a virtual soundscape that reinforces and supports the user in this new environment.

 

Immersion, Sound, and Cognitive Psychology 


Why do we go to theaters instead of watching a film on our TV or smartphone? To strengthen that feeling of "immersion" that envelops us in dark theaters, where everything is designed to direct our visual and auditory focus solely on the cinematic content.


Why play a video game using a Virtual Reality (VR) headset instead of a handheld console? For the exact same reasons: immersive systems help us focus on the virtual experience by removing real-world distractions and stimuli.


And why are mixed reality headsets increasingly replacing previous-generation VR headsets? Because they smartly blend virtual stimuli with the real-world elements we might not want to cut off completely, such as colleagues, nearby obstacles, or visual and geographic context, bringing a more social and workplace-friendly experience, especially in industrial settings.


From an evolutionary point of view, it is above all mankind's powerful visual system that has enabled us to survive and evolve in our environment: to spot prey in the distance before we eat, and to see predators coming before we are eaten! But hearing is no less important, as it lets us pick up signals and dangers before they enter our field of vision. Sound therefore makes up a significant share of the sensory information the human brain is constantly processing. Next come touch and, to a lesser extent in humans, taste and smell.


Thanks to this hierarchy of senses and the technological maturity of audio equipment, we can now create truly convincing and useful virtual auditory stimuli that deliver increasingly valuable and unique experiences.


 

The Functions of Sound in Immersive Experiences 


There are two main types of sound:


  • Realistic immersive sounds, recorded, generated, or transmitted, which enhance the user's immersion and make the scene more convincing. For instance: the sound of a car engine accelerating, or birdsong during a virtual nature walk.

    Note that even without accompanying 3D visuals, such sounds can be enough to “transport” someone, like a meditation podcast would transport you for a moment to a riverside.

    Immersive voice chat also falls into this category (such as Microsoft Teams Immersive Spaces), where participants hear each other spatially based on the position of their avatars in 3D space: their voices are transmitted, processed, and spatialized accordingly.


  • Functional, generated sounds, which help users understand the system's state, like the “beep” from a washing machine indicating the Start button has been pressed. This is known as the sonification of 3D virtual interfaces: the combined graphic and auditory design of UI elements, which lets users better understand button states, parameters, notifications, and so on.

    In real life, think of the audible feedback from a metal detector or a Geiger counter, where the sound reflects a physical quantity, helping users interpret its variations or locate the sensor in space.

    Sonification is particularly important in virtual reality interface design, as immersive systems often do not take every sense into account. For example, sound can replace the “haptic” feedback (i.e., vibration) that you would normally get on a smartphone when you press a button. If well designed, this substitution of one feedback modality for another, although unnatural, can be fully accepted by the user: they understand that in this (virtual) reality, when their finger makes contact with a menu, a slight “tick” confirms that their action has been successful. A minimal code sketch of such a confirmation sound appears just after this list.


Whatever the type, realistic or generated, any sound added to an immersive experience serves a specific purpose: to create atmosphere, inform, locate, alert or add powerful markers.
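
To make the sonification idea above a little more concrete, here is a minimal sketch using the browser's Web Audio API in TypeScript. It assumes a web-based experience; the playConfirmationTick helper is an illustrative name to be called from your own “button pressed” handler, and the frequency and envelope values are arbitrary starting points rather than recommendations.

```typescript
// Minimal UI sonification sketch (Web Audio API, TypeScript).
// playConfirmationTick() is a hypothetical helper: call it from your own
// "button pressed" handler in the 3D interface.
const audioCtx = new AudioContext();

function playConfirmationTick(): void {
  const now = audioCtx.currentTime;

  // Short, high-pitched sine burst: clearly audible but unobtrusive.
  const osc = new OscillatorNode(audioCtx, { type: "sine", frequency: 1200 });
  const gain = new GainNode(audioCtx, { gain: 0.0001 });

  // Fast attack and exponential decay so the sound reads as a "tick".
  gain.gain.setValueAtTime(0.0001, now);
  gain.gain.exponentialRampToValueAtTime(0.3, now + 0.005);
  gain.gain.exponentialRampToValueAtTime(0.0001, now + 0.08);

  osc.connect(gain).connect(audioCtx.destination);
  osc.start(now);
  osc.stop(now + 0.1);
}
```

Note that browsers only let an AudioContext produce sound after a user interaction, which is rarely a problem here since the tick is itself triggered by one.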

 

Some Benefits of Immersive Sound


  • Sound not only provides additional information but also greatly enhances immersion, adding authenticity and credibility to the virtual environment, especially when paired with realistic visuals.


  • With the right equipment (headphones or closed-back earphones), sound creates a real “bubble” feeling, allowing our auditory system to temporarily disconnect from real-world stimuli.


  • Precise and instantaneous localization of virtual objects around us, even when they are out of our field of vision (useful for notification systems, or in VR action games), calling on a wider and more natural set of user reflexes than sight alone.
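
As an illustration of that last point, here is a hedged sketch, again with the Web Audio API in TypeScript, that places a looping notification sound behind and to the right of the listener using HRTF panning, so it stays localizable even while out of view. The file name notification.ogg and the coordinates are placeholders.

```typescript
// Minimal spatialized-notification sketch (Web Audio API, TypeScript).
const ctx = new AudioContext();

async function playSpatializedNotification(): Promise<void> {
  // Call this from an input handler: browsers require a user gesture
  // before audio can start.
  const response = await fetch("notification.ogg"); // placeholder asset
  const buffer = await ctx.decodeAudioData(await response.arrayBuffer());

  const source = new AudioBufferSourceNode(ctx, { buffer, loop: true });

  // HRTF panning gives a convincing "behind you" cue on headphones.
  // The default listener sits at the origin facing -Z, so positive Z is
  // behind the user and positive X is to their right.
  const panner = new PannerNode(ctx, {
    panningModel: "HRTF",
    distanceModel: "inverse",
    positionX: 1.0, // one metre to the right
    positionY: 0.0,
    positionZ: 2.0, // two metres behind
  });

  source.connect(panner).connect(ctx.destination);
  source.start();
}
```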

 

Challenges of Immersive Sound 


  • Immersive sound isn't always suitable in industrial settings. It often requires headphones, which may be prohibited in factories due to safety risks.

    As a result, audio feedback may need to be excluded from solutions tailored to industrial clients.


  • In collaborative experiences, voice chat is often necessary. When users are remote, it's easy to spatially link voices to their avatars. But in local multi-user settings (several users in the same physical room), immersive audio can become problematic. There can be dissonance between actual user positions and their virtual avatars (e.g., side by side in real life but face to face virtually), and speech is heard twice: directly through the air and again via the headsets' audio system, causing echoes and discomfort.

    In such cases, the virtual audio system must either be adapted or entirely replaced with real-world verbal communication, without any equipment.

 

Some More Advanced and Original Uses of Immersive Sound 


  • Ingress, a geolocated Augmented Reality mobile game by Niantic (creators of Pokémon Go), announced in 2023 the use of AR Audio in supported headsets, using 3D head tracking. Players can locate virtual "portals" by turning their heads, with impressive precision matching the GPS map.


    See the article here: https://ingress.com/news/audio-ar-more-devices

 

 

  • In a more professional context, cities and municipalities can use immersive 3D audio simulations to evaluate the acoustic impact of real estate, urban transformation, or road construction projects. These simulations also apply to different scales, like concert hall acoustics based on seating capacity, location, stage decor, and object presence.


    Some companies specialize in audio engines for realistic acoustic simulation that account for material properties. These technologies resemble ray tracing and can be optimized to run in real time; a toy sketch of the ray-based idea appears at the end of this section.


    We notably collaborate with Noise Makers for these types of projects.


    For more on this topic, this video by Vercidium perfectly illustrates the basics of acoustic simulation techniques for games and beyond: https://www.youtube.com/watch?v=u6EuAUjq92k

 


  • As part of a project for Saint-Gobain, LS GROUP's main objective was to provide sales teams with a high-performance tool to support the sale of conservatories during their visits to potential clients. To meet this need, a white-label iPad application was developed. The app is structured into several modules covering the entire customer journey: a step-by-step configurator for designing the conservatory, a photorealistic image comparison tool to evaluate different types of glazing, an augmented reality module to project the conservatory onto the client’s home, and an innovative acoustic module.


    The latter plays a key role in the user experience: it allows users to concretely grasp the acoustic impact of the selected materials by simulating realistic weather sounds (rain, hail, wind) on different types of glazing and roofing. Acoustics, often underestimated yet essential for everyday comfort, is brought to the forefront with this feature, significantly strengthening the sales pitch by making a hard-to-describe aspect tangible. By placing the client in an immersive situation, this module promotes more informed decision-making and enhances the perceived value of the product. The tool also includes complete customer database management, facilitating commercial follow-up.


Saint-Gobain conservatory configurator project
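
To give a rough idea of the ray-based acoustic simulation mentioned a few paragraphs above, here is a deliberately simplified TypeScript sketch; it is not the approach used by Noise Makers or by any particular engine. It casts random rays from a source inside a rectangular room, bounces them off the walls, and records a delay and an energy value whenever a ray passes close to the listener. The room size, absorption factor, and detector radius are arbitrary illustrative numbers.

```typescript
// Toy acoustic ray tracer: a crude impulse-response estimate for a "shoebox" room.
type Vec3 = [number, number, number];

const ROOM: Vec3 = [8, 3, 5];       // room dimensions in metres (x, y, z)
const SPEED_OF_SOUND = 343;         // m/s
const ABSORPTION = 0.3;             // fraction of energy lost at each wall bounce
const LISTENER_RADIUS = 0.5;        // "detector" sphere around the listener

interface Echo { delaySec: number; energy: number; }

function traceRays(source: Vec3, listener: Vec3, rayCount: number, maxBounces: number): Echo[] {
  const echoes: Echo[] = [];
  for (let r = 0; r < rayCount; r++) {
    // Random direction, normalized to a unit vector.
    let dir: Vec3 = [Math.random() * 2 - 1, Math.random() * 2 - 1, Math.random() * 2 - 1];
    const len = Math.hypot(...dir) || 1;
    dir = [dir[0] / len, dir[1] / len, dir[2] / len];

    let pos: Vec3 = [...source];
    let travelled = 0;
    let energy = 1;

    for (let b = 0; b <= maxBounces; b++) {
      // Distance along the ray to the nearest wall.
      let tWall = Infinity;
      let wallAxis = 0;
      for (let a = 0; a < 3; a++) {
        if (dir[a] === 0) continue;
        const t = dir[a] > 0 ? (ROOM[a] - pos[a]) / dir[a] : -pos[a] / dir[a];
        if (t < tWall) { tWall = t; wallAxis = a; }
      }

      // Does the ray pass through the listener sphere before hitting that wall?
      const toL: Vec3 = [listener[0] - pos[0], listener[1] - pos[1], listener[2] - pos[2]];
      const tClosest = toL[0] * dir[0] + toL[1] * dir[1] + toL[2] * dir[2];
      if (tClosest > 0 && tClosest < tWall) {
        const distSq = toL[0] ** 2 + toL[1] ** 2 + toL[2] ** 2 - tClosest ** 2;
        if (distSq < LISTENER_RADIUS ** 2) {
          const pathLen = travelled + tClosest;
          echoes.push({
            delaySec: pathLen / SPEED_OF_SOUND,
            energy: energy / (1 + pathLen ** 2), // crude distance attenuation
          });
        }
      }

      // Advance to the wall and reflect specularly (flip one axis).
      pos = [pos[0] + dir[0] * tWall, pos[1] + dir[1] * tWall, pos[2] + dir[2] * tWall];
      dir[wallAxis] = -dir[wallAxis];
      travelled += tWall;
      energy *= 1 - ABSORPTION;
    }
  }
  return echoes;
}

// Example: source near a corner, listener around the middle of the room.
const impulse = traceRays([1, 1.5, 1], [4, 1.5, 2.5], 20_000, 8);
```

The resulting list of echoes could be turned into an impulse response and fed to a convolution reverb (for example a Web Audio ConvolverNode); production engines refine the idea considerably with frequency-dependent absorption, diffraction, and real-time updates.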

 

Want to Try Implementing 3D Sound? 


For technical users interested in exploring 3D audio engines for their apps, here’s a non-exhaustive list:



Also, major game engines like Unity 3D and Unreal Engine include audio systems that support VR sound spatialization.
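
Outside of a game engine, the same spatialization is available directly in the browser through the Web Audio API. As a minimal sketch of the listener side, the snippet below assumes a hypothetical HeadPose structure (position plus forward and up unit vectors, as a VR runtime such as WebXR could report each frame) and pushes it into the audio listener so that every PannerNode in the graph is heard from the user's point of view.

```typescript
// Hypothetical head-pose structure, e.g. filled from a VR runtime each frame.
interface HeadPose {
  position: [number, number, number];
  forward: [number, number, number]; // unit vector the user is facing
  up: [number, number, number];      // unit vector out of the top of the head
}

const ctx = new AudioContext();

// Push the latest head pose into the Web Audio listener so that all
// PannerNodes in the graph are spatialized relative to the user's head.
function updateListener(pose: HeadPose): void {
  const l = ctx.listener;
  const t = ctx.currentTime;

  // Recent browsers expose these as AudioParams; older ones may only
  // offer the deprecated setPosition()/setOrientation() methods.
  l.positionX.setValueAtTime(pose.position[0], t);
  l.positionY.setValueAtTime(pose.position[1], t);
  l.positionZ.setValueAtTime(pose.position[2], t);
  l.forwardX.setValueAtTime(pose.forward[0], t);
  l.forwardY.setValueAtTime(pose.forward[1], t);
  l.forwardZ.setValueAtTime(pose.forward[2], t);
  l.upX.setValueAtTime(pose.up[0], t);
  l.upY.setValueAtTime(pose.up[1], t);
  l.upZ.setValueAtTime(pose.up[2], t);
}
```

Called once per rendered frame, this keeps spatialized sources, like the notification example earlier in the article, consistent with the user's head movements.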

 

And for France's Music Day… What About Immersive Music? 


If you're curious to learn more about Immersive Music, check out this article by BPI France:


 


Author: Benjamin CAPEL

Co-author: Titouan MOTREUIL
