Adaptive bitrate doesn't work
Hi @pedroSG94, I'm testing BitrateAdapter and I've implemented it the same way as in the example. When I switch from high-speed WiFi to 3G, the bitrate is not adapted properly and stays very high. The same happens when I start streaming on 3G. The measured upload speed of my 3G connection is ~1.5 Mb/s, yet the adapted bitrate is ~5 Mb/s (which is my max bitrate), resulting in a laggy stream.
Maybe I'm doing something wrong, but I've checked the source code, and the bitrate you pass to the onNewBitrateRtmp(bitrate) callback may not be correct. It is taken from RtmpConnection.publishVideoData(size) just after the data is written to the socket. The socket has an internal send buffer, so the number of bytes written to its OutputStream does not have to equal the number of bytes that have actually been uploaded.
I've compared this bitrate value to TrafficStats.getTotalTxBytes(), which reports the amount of data actually transmitted by the device, and there are discrepancies between them.
Here is the code to quickly compare them:
```java
import android.net.TrafficStats;
import android.util.Log;
// BitrateAdapter and rtmpCamera2 come from the rtmp-rtsp-stream-client-java library.

private long bitrateSum = 0;
private long initialTotalUploadBytes = 0;
private long previousCheckTimeMs = 0L;
private long lastTotalUploadBytes = 0;

private BitrateAdapter adapter = new BitrateAdapter(new BitrateAdapter.Listener() {
    @Override
    public void onBitrateAdapted(int bitrate) {
        Log.d("lol2", "bitrate adapted: " + bitrate);
        rtmpCamera2.setVideoBitrateOnFly(bitrate);
    }
});

@Override
public void onConnectionSuccessRtmp() {
    adapter.setMaxBitrate(5 * 1024 * 1024);
    initialTotalUploadBytes = TrafficStats.getTotalTxBytes();
    // Start the clock here; leaving it at 0 would make the first speed sample meaningless.
    previousCheckTimeMs = System.currentTimeMillis();
}

@Override
public void onNewBitrateRtmp(long bitrate) {
    Log.d("lol", "onNewBitrate: " + bitrate);
    adapter.adaptBitrate((int) bitrate);

    // Sum of the bitrate values reported by the library (bits).
    bitrateSum += bitrate;
    Log.d("lol", "Bitrate sum so far: " + bitrateSum / 1024f / 1024f + " Mb");

    // Bytes actually transmitted by the device since the stream started.
    long uploadedBytesSoFar = TrafficStats.getTotalTxBytes() - initialTotalUploadBytes;
    Log.d("lol", "Real upload so far: " + uploadedBytesSoFar * 8f / 1024f / 1024f + " Mb");

    long bytesDiff = uploadedBytesSoFar - lastTotalUploadBytes;
    long nowMs = System.currentTimeMillis();
    float timeDiffSeconds = (nowMs - previousCheckTimeMs) / 1000f;
    if (timeDiffSeconds > 0) { // guard against division by zero on back-to-back callbacks
        float realUploadSpeed = bytesDiff * 8f / timeDiffSeconds / 1024f / 1024f;
        Log.d("lol", "Real upload speed: " + realUploadSpeed + " Mb/s");
    }
    previousCheckTimeMs = nowMs;
    lastTotalUploadBytes = uploadedBytesSoFar;
}
```
There is also one interesting thing that I don't fully understand; maybe you could explain it.
On high-speed WiFi, calling rtmpCamera2.setVideoBitrateOnFly(20 * 1024 * 1024);
determines the video bitrate but also the upload speed. That is, when I set it to 20 Mb/s as above, the upload speed is 20 Mb/s; when I set it to 2 Mb/s, the upload speed is 2 Mb/s. Is there some contract in RTMP by which the video bitrate dictates the upload speed? Can't I upload a 2 Mb/s video at a 5 Mb/s upload speed? Or does it depend on the internal buffer size?
I took a deep dive into the adaptive bitrate in this library (latest version, 2.1.4), and I was able to fix most of the adaptive bitrate problems. Here's what I found:
Problems with default BitrateAdapter
Notice: all of this applies to constant bitrate (CBR) only! Here's my version of BitrateAdapter; it accounts for the audio bitrate and packet overhead, and reacts faster to congestion (see the sketch below).
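The commenter's actual class was not preserved in this copy of the thread; as a rough illustration of the idea described above, here is a minimal sketch. The class name, the ~10% overhead factor, the step factors, and the bitrate floor are all assumptions for illustration, not the commenter's real code.

```java
// Hypothetical sketch, not the commenter's actual class. Assumes a fixed,
// known audio bitrate and a flat overhead factor for RTMP/TCP packetization.
public class AdjustedBitrateAdapter {

    public interface Listener {
        void onBitrateAdapted(int videoBitrate);
    }

    private static final float OVERHEAD_FACTOR = 0.10f; // assumed ~10% packet overhead
    private static final int MIN_VIDEO_BITRATE = 100 * 1024;

    private final int audioBitrate;      // bits per second, e.g. 128 * 1024
    private final int maxVideoBitrate;   // bits per second
    private final Listener listener;
    private int currentVideoBitrate;

    public AdjustedBitrateAdapter(int audioBitrate, int maxVideoBitrate, Listener listener) {
        this.audioBitrate = audioBitrate;
        this.maxVideoBitrate = maxVideoBitrate;
        this.listener = listener;
        this.currentVideoBitrate = maxVideoBitrate;
    }

    /**
     * measuredBitrate: total bits/s observed on the wire (video + audio + overhead).
     * Deduct overhead and audio so only the video share is adapted.
     */
    public void adaptBitrate(long measuredBitrate) {
        long availableForVideo =
                (long) (measuredBitrate / (1f + OVERHEAD_FACTOR)) - audioBitrate;
        int target;
        if (availableForVideo < currentVideoBitrate) {
            target = (int) (availableForVideo * 0.8f);        // congestion: drop quickly
        } else {
            target = (int) (currentVideoBitrate * 1.05f);     // headroom: recover slowly
        }
        target = Math.max(MIN_VIDEO_BITRATE, Math.min(target, maxVideoBitrate));
        if (target != currentVideoBitrate) {
            currentVideoBitrate = target;
            listener.onBitrateAdapted(target);
        }
    }
}
```

The asymmetry (drop by 20%, recover by 5%) is what "reacts faster to congestion" means in practice: underestimating bandwidth briefly only costs quality, while overestimating it fills the send buffer and causes lag.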
Suggestions to improve further:
@marcin-adamczewski Actually, I ended up not relying on the upload speed calculation at all in adaptive bitrate. What we do now: if there is congestion (i.e. the send queue is 15% full), we immediately drop the target bitrate by a set amount; then we re-evaluate every 2 s, and if there is no congestion we raise it by a smaller amount. Basically we add to and subtract from the same target, instead of manipulating the value derived from the upload speed calculation. A sketch of that loop follows.
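For illustration, here is a minimal sketch of that scheme. It assumes a hasCongestion() check that reports whether the send queue is filling up (the library exposes a congestion check, but the exact method and queue threshold vary by version); the step sizes and limits are made-up tuning values, not the commenter's exact ones.

```java
import android.os.Handler;
import android.os.Looper;

// Hypothetical sketch of congestion-driven adaptation; not the commenter's code.
private static final int DECREASE_STEP = 500 * 1024;   // drop fast on congestion
private static final int INCREASE_STEP = 100 * 1024;   // recover slowly
private static final int MIN_BITRATE = 300 * 1024;
private static final int MAX_BITRATE = 5 * 1024 * 1024;

private int targetBitrate = MAX_BITRATE;
private final Handler handler = new Handler(Looper.getMainLooper());

private final Runnable adaptTask = new Runnable() {
    @Override
    public void run() {
        if (rtmpCamera2.hasCongestion()) { // assumed congestion API; check your version
            // Send queue is filling up: cut the target immediately.
            targetBitrate = Math.max(MIN_BITRATE, targetBitrate - DECREASE_STEP);
        } else {
            // No congestion: creep back up toward the maximum.
            targetBitrate = Math.min(MAX_BITRATE, targetBitrate + INCREASE_STEP);
        }
        rtmpCamera2.setVideoBitrateOnFly(targetBitrate);
        handler.postDelayed(this, 2000); // re-evaluate every 2 s
    }
};

// Start after the connection succeeds, e.g. in onConnectionSuccessRtmp():
//   handler.post(adaptTask);
// and stop with handler.removeCallbacks(adaptTask) when the stream ends.
```

The design point is that queue fullness is a direct, local signal of congestion, whereas bytes-written-to-socket is not, for exactly the buffering reason discussed at the top of this issue.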