javascript - How does the video tag determine how much buffer it needs for canplay

I'm using 2-second chunks for my video. The chunks are broken up into WebSocket messages, and I'm trying to get the video player to start playing immediately rather than waiting for a whole chunk. I added a delay and can see the buffer building up with each message, but the player still waits for the whole chunk before it plays.

How much buffer does the video element need before canplay fires and playback starts, and is there a way to reduce this amount so that it plays as soon as it has any buffered data at all?
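
For illustration, a minimal sketch of this kind of pipeline looks something like the following (the WebSocket URL and codec string are placeholders, and it assumes fragmented MP4 with the initialization segment arriving in the first message):

// Sketch: WebSocket messages appended to a SourceBuffer via Media Source
// Extensions. The URL and codec string below are placeholders; this assumes
// fragmented MP4 whose first message is the initialization segment.
const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  const queue = [];

  // appendBuffer throws while an append is in flight, so drain a queue
  sourceBuffer.addEventListener("updateend", () => {
    if (queue.length > 0 && !sourceBuffer.updating) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  });

  const ws = new WebSocket("wss://example.com/video"); // placeholder URL
  ws.binaryType = "arraybuffer";
  ws.onmessage = (event) => {
    if (sourceBuffer.updating || queue.length > 0) {
      queue.push(event.data);
    } else {
      sourceBuffer.appendBuffer(event.data);
    }
  };
});

video.addEventListener("canplay", () => video.play());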

Question from: https://stackoverflow.com/questions/65929803/how-does-the-video-tag-determine-how-much-buffer-it-needs-for-canplay


1 Answer


It's probably worth saying at the start that 2 seconds is actually very good latency for IP OTT streamed video - you will struggle to get it much lower than that using streaming solutions with today's technology.

If you need real-time video, then technologies like WebRTC are the usual approach - they trade off video quality for latency.

Looking at your question in more detail: most video is streamed these days using either HLS or DASH. These streaming protocols break the video into segments or fragments, and the segments are read by an HTML5 video player, like dash.js, and fed to the browser's video element using the Media Source Extensions (MSE) mechanism. A video tag would not normally have a segment or fragment set as its src directly, as mentioned in the comments.
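
As a rough illustration, wiring dash.js up to a video element is as simple as the following (the manifest URL and element id are placeholders):

// Minimal dash.js setup: the library fetches DASH segments and feeds them
// to the <video> element through MSE. Manifest URL and element id are
// placeholders.
const player = dashjs.MediaPlayer().create();
player.initialize(
  document.querySelector("#videoPlayer"),
  "https://example.com/stream/manifest.mpd", // placeholder manifest URL
  true // autoplay once enough is buffered
);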

The main factors that determine how long a video takes to start playing are the size of the segments and the amount of buffer the player wants to have in place to ensure playback without interruption.
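
You can watch both of these from script: the element's readyState shows which playback threshold has been reached (HAVE_FUTURE_DATA corresponds to canplay), and buffered shows how much media is queued ahead of the playhead. Note that the thresholds themselves are decided by the browser and are not directly configurable on the element. A rough diagnostic sketch:

// Rough diagnostic: log readyState (3 = HAVE_FUTURE_DATA, which fires
// canplay; 4 = HAVE_ENOUGH_DATA, which fires canplaythrough) and how much
// media is buffered ahead of the current playback position.
const video = document.querySelector("video");
setInterval(() => {
  let ahead = 0;
  for (let i = 0; i < video.buffered.length; i++) {
    if (video.buffered.start(i) <= video.currentTime &&
        video.currentTime <= video.buffered.end(i)) {
      ahead = video.buffered.end(i) - video.currentTime;
    }
  }
  console.log(`readyState=${video.readyState}, buffered ahead=${ahead.toFixed(2)}s`);
}, 500);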

Most players will have some way to configure the latter - for example, dash.js has the method MediaPlayer.updateSettings, and the documentation includes an example:

player.updateSettings({
    streaming: {
        liveDelayFragmentCount: 8,  // target live delay, expressed in fragments
        abr: {
            maxBitrate: { audio: 100, video: 1000 }  // bitrate caps in kbps
        }
    }
});

(from: https://cdn.dashjs.org/latest/jsdoc/module-MediaPlayer.html)

There is also now a newer standard, CMAF, which works with both HLS and DASH and allows finer granularity, meaning that a player can start playing a video even before it has received a full segment.

CMAF essentially allows the size of a decodable chunk of video to be smaller, so less data needs to be received before the decoder and player can start their work - it is described in the MPEG standards (https://mpeg.chiariglione.org/standards/mpeg-a/common-media-application-format), and the diagram from that description gives a good illustration:

[Diagram from the MPEG CMAF description: segments subdivided into fragments and smaller CMAF chunks]

It's worth being aware that, because CMAF is relatively new, support may not yet be as widespread as you need - both across packagers on the server side and across players and devices on the client side.
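
If you do have CMAF-chunked media available, the client-side pattern is to append each chunk to the SourceBuffer as it arrives instead of waiting for a complete segment. A sketch of that pattern using a streaming fetch (URL and codec string are placeholders):

// Sketch of low-latency chunked playback: stream a fragmented-MP4 resource
// with fetch() and append each network chunk to the SourceBuffer as soon as
// it arrives. URL and codec string are placeholders.
const video = document.querySelector("video");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", async () => {
  const sb = mediaSource.addSourceBuffer('video/mp4; codecs="avc1.64001f"');
  const response = await fetch("https://example.com/live/stream.cmfv"); // placeholder
  const reader = response.body.getReader();
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    // Wait out any in-flight append before adding the next chunk
    if (sb.updating) {
      await new Promise((resolve) =>
        sb.addEventListener("updateend", resolve, { once: true })
      );
    }
    sb.appendBuffer(value);
  }
});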

