Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might be looking while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Separate audio and video into different HTML tags

See original GitHub issue


Thanks for your super awesome work!

I have an unusual and somewhat awkward request to submit today.

The situation is that I would like to have streaming content on a webpage (like a person talking), while also letting users browse other content and sometimes play it, which could be audio or video. However, this won’t work in Safari on iOS: the browser allows only one video with audio to play at a time. So if the user taps play on a secondary piece of content on their iPad, the stream stops and only the secondary video plays. This is a known issue on Safari on iOS; it works perfectly in any desktop browser, however.

A workaround seems to be to split the video into separate <video> and <audio> tags: the video is muted and can play as a “background” video, while the audio is played separately, and multiple audio elements are allowed to play together anyway. An example provided in the comments of the WebKit bug works that way.
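As a rough illustration of the workaround described above, here is a minimal sketch. The markup, element IDs, and source URLs are hypothetical, and the drift threshold is an assumption; the only real logic is a small clock-drift check that a page could use to keep the two elements in sync:

```javascript
// Hypothetical markup for the split-playback workaround:
//   <video id="v" muted playsinline autoplay src="stream-video-only.mp4"></video>
//   <audio id="a" autoplay src="stream-audio-only.mp4"></audio>
//
// A muted <video> is allowed to autoplay on iOS Safari, and multiple
// <audio> elements may play concurrently, so starting a second piece
// of content no longer pauses the main stream.

// Drift check: decide whether the audio clock has wandered far enough
// from the video clock that it should be snapped back. The 0.3 s
// threshold is an assumed tolerance, not a documented value.
function needsResync(videoTime, audioTime, thresholdSec = 0.3) {
  return Math.abs(videoTime - audioTime) > thresholdSec;
}

// In the page, a timer would periodically re-align the two elements:
// setInterval(() => {
//   if (needsResync(v.currentTime, a.currentTime)) {
//     a.currentTime = v.currentTime;
//   }
// }, 1000);
```

Note that this sketch only keeps the clocks roughly aligned; as the comments below point out, robust A/V sync across buffering and seeks is the hard part of actually implementing this in a player.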

It also seems to work this way with WebRTC video-conferencing solutions (Jitsi, …).

So, would it be possible to implement something like an audio/video split with OvenPlayer and OvenMediaEngine to allow a streaming and another video together?

Issue Analytics

  • State: open
  • Created 3 years ago
  • Comments:6 (3 by maintainers)

Top GitHub Comments

SangwonOh commented, Mar 23, 2021

@GregOriol Interesting. However, with the current OvenPlayer implementation, separating the video and audio tags looks like a very difficult task, because OvenPlayer was designed from the start to play audio and video through a single video tag. For now, I think we have to consider whether there are other alternative approaches.

GregOriol commented, Mar 30, 2021

That’s why I was saying it “kind of does the thing for a WebRTC stream”: it seems to work well because the syncing is done by the stream and it’s the same stream object being used by both elements. I’m not sure that would work for HLS/DASH, for example.

Read more comments on GitHub >

Top Results From Across the Web

  • Loading a mute video with a separate audio into a HTML5 ... (Stack Overflow)
  • Video and audio content - Learn web development | MDN
  • How to embed video and audio in your HTML - freeCodeCamp
  • 19 - HTML5 Audio and Video tags - HTML Full Tutorial
  • How to Embed Audio and Video in HTML? - GeeksforGeeks
