
Feature: A possibility to set any mesh as the global audio listener

See original GitHub issue

Hey all,

This is my first contribution, so let me start by saying how amazing BabylonJS is, and thanks to all working on this project.

Regarding this feature: currently, the only possible listener for any spatial sounds in the scene is the active camera. This is fine in all cases where the active camera is actually “you”.

The problem: In my game, the camera is always positioned at Vector3(0, 105, 105) and follows the player around. If the player encounters, say, a door that has a sound attached with a spatial max distance of 60, the camera will not be able to hear that sound, simply because it is too far away. Right now I am working around this by placing every sound at the same offset from its mesh as the camera has from the player (so the sound position is mesh.position + Vector3(0, 105, 105)). This only works while the camera is static and does not rotate with the player object.
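
For reference, the workaround currently looks roughly like the sketch below (names such as attachDoorSound, doorMesh and cameraOffset are illustrative, not taken from my actual code):

    import { AbstractMesh, Scene, Sound, Vector3 } from "babylonjs";

    // Same fixed offset the camera keeps from the player.
    const cameraOffset = new Vector3(0, 105, 105);

    function attachDoorSound(scene: Scene, doorMesh: AbstractMesh): Sound {
        const sound = new Sound("doorSound", "sounds/door.mp3", scene, null, {
            spatialSound: true,
            maxDistance: 60,
        });
        // Shift the sound away from its mesh by the camera offset so the
        // camera (the current listener) can actually hear it.
        sound.setPosition(doorMesh.position.add(cameraOffset));
        return sound;
    }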

The solution: Maybe I am naive, but after reverse engineering the audio engine I realised the solution should be very simple.

In the file https://github.com/BabylonJS/Babylon.js/blob/d7827d78296faf574984a370dbe25a0a7b93d63f/src/Audio/audioSceneComponent.ts, on line 387, we have the _afterRender function that determines the camera's global position and sets the listener position accordingly.

    private _afterRender() {
        const scene = this.scene;
        if (!this._audioEnabled || !scene._mainSoundTrack || !scene.soundTracks || (scene._mainSoundTrack.soundCollection.length === 0 && scene.soundTracks.length === 1)) {
            return;
        }

        var listeningCamera: Nullable<Camera>;
        var audioEngine = Engine.audioEngine;

        if (scene.activeCameras.length > 0) {
            listeningCamera = scene.activeCameras[0];
        } else {
            listeningCamera = scene.activeCamera;
        }

        if (listeningCamera && audioEngine.audioContext) {
            audioEngine.audioContext.listener.setPosition(listeningCamera.globalPosition.x, listeningCamera.globalPosition.y, listeningCamera.globalPosition.z);
        ...

My proposal is to add a new method setListener that sets an internal property this.customListener. In _afterRender we then simply check whether customListener is set and use its absolutePosition; if it is not set, we fall back to the camera (a sketch of the setListener method itself follows the modified _afterRender below).

    private _afterRender() {
        const scene = this.scene;
        if (!this._audioEnabled || !scene._mainSoundTrack || !scene.soundTracks || (scene._mainSoundTrack.soundCollection.length === 0 && scene.soundTracks.length === 1)) {
            return;
        }

        var listener: Nullable<Camera | Mesh | AbstractMesh>;
        var listenerPositionKey: 'globalPosition' | 'absolutePosition' = 'globalPosition';
        var audioEngine = Engine.audioEngine;

        // Prefer the custom listener mesh if one has been set and is still usable.
        if (this.customListener && !this.customListener.isDisposed() && this.customListener.isEnabled()) {
            listener = this.customListener;
            listenerPositionKey = 'absolutePosition';
        } else if (scene.activeCameras.length > 0) {
            listener = scene.activeCameras[0];
        } else {
            listener = scene.activeCamera;
        }

        if (listener && audioEngine.audioContext) {
            // Cast needed because Camera exposes globalPosition while AbstractMesh exposes absolutePosition.
            const listenerPosition = (listener as any)[listenerPositionKey] as Vector3;
            audioEngine.audioContext.listener.setPosition(listenerPosition.x, listenerPosition.y, listenerPosition.z);
        ...
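
For completeness, the new setListener method itself could be as small as the sketch below (the customListener property name comes from the proposal above; everything else is just an assumption about how it could be wired into AudioSceneComponent):

    // Sketch only; not an existing Babylon.js API.
    private customListener: Nullable<AbstractMesh> = null;

    /**
     * Sets a mesh as the global audio listener.
     * Pass null to fall back to the active camera.
     */
    public setListener(mesh: Nullable<AbstractMesh>): void {
        this.customListener = mesh;
    }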

I am very happy to implement this if you decide to move forward. I do not have a VR headset, though, so I will not be able to test the follow-up code paths that concern VR cameras; some help would unfortunately be needed there.

Thanks!

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 6 (6 by maintainers)

Top GitHub Comments

1 reaction
deltakosh commented, Aug 11, 2019

correct 😃 this way we have an easy implementation while being really flexible for the user

0 reactions
Foxhoundn commented, Aug 11, 2019

@deltakosh You mean that the user would define it as

scene.getAudioListenerPosition = () => {
    return myMesh.absolutePosition;
}

and then we would call this function inside _afterRender?
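
If so, the consuming side could look roughly like the sketch below (this assumes the hook is called getAudioListenerPosition, as in the snippet above; the final name and shape are of course up to you):

    // Inside _afterRender: prefer the user-provided position, otherwise fall back to the camera.
    if (audioEngine.audioContext) {
        const listenerPosition = scene.getAudioListenerPosition
            ? scene.getAudioListenerPosition()
            : listeningCamera && listeningCamera.globalPosition;

        if (listenerPosition) {
            audioEngine.audioContext.listener.setPosition(listenerPosition.x, listenerPosition.y, listenerPosition.z);
        }
    }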

