Setting cfg.rc_target_bitrate on the libvpx codec
When a REMB packet is received, the target bitrate is unpacked and set on the vpx codec's configuration, as in the lines below.
https://github.com/aiortc/aiortc/blob/0310cd8486b38d8d8f295650253224247c43ed42/aiortc/codecs/vpx.py#L324-L326
https://github.com/aiortc/aiortc/blob/0310cd8486b38d8d8f295650253224247c43ed42/aiortc/codecs/vpx.py#L262-L264
(P.S. I made sure __update_config() is called every time by forcing self.__update_config_needed to always be True.)
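As a further sanity check on that, one option (not part of aiortc) is to wrap the private method with a logging shim and watch it fire on each encode. This assumes __update_config and cfg are defined on Vp8Encoder as in the linked source; they are name-mangled, private implementation details, so treat this as a debugging sketch rather than a supported API:

import functools

from aiortc.codecs.vpx import Vp8Encoder

# Grab the name-mangled private method so it can be wrapped.
_original_update_config = Vp8Encoder._Vp8Encoder__update_config

@functools.wraps(_original_update_config)
def _logged_update_config(self, *args, **kwargs):
    result = _original_update_config(self, *args, **kwargs)
    # cfg is assumed to be the encoder's vpx_codec_enc_cfg_t, per the linked code.
    print("__update_config ran, rc_target_bitrate =", self.cfg.rc_target_bitrate)
    return result

# Any Vp8Encoder used afterwards will log each config update during encode().
Vp8Encoder._Vp8Encoder__update_config = _logged_update_config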
I wanted to see whether the vpx codec actually reflects the new cfg.rc_target_bitrate, so I saved the decoded images and re-encoded them to video to inspect the result. However, the image/video quality is always the same regardless of changes to cfg.rc_target_bitrate.
I even tried setting all the bitrate constants to just 1 kbps (as below), expecting very low-quality video,
DEFAULT_BITRATE = 1000 # 1 kbps
MIN_BITRATE = 1000 # 1 kbps
MAX_BITRATE = 1000 # 1 kbps
but the resulting image/video quality is still the same.
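One thing worth double-checking in that experiment is the units: libvpx's rc_target_bitrate field is expressed in kilobits per second, while these module constants look like plain bits per second, so the value presumably gets clamped to [MIN_BITRATE, MAX_BITRATE] and divided by 1000 on the way in (that division is my assumption, not something I verified in the source). With all three constants at 1000 bps, every incoming REMB value would collapse to a 1 kbps target, roughly like this hypothetical helper:

MIN_BITRATE = 1000  # bits per second
MAX_BITRATE = 1000  # bits per second

def to_rc_target_bitrate(bitrate_bps):
    # Clamp to the configured range, then convert to the kilobits-per-second
    # unit that libvpx's vpx_codec_enc_cfg_t.rc_target_bitrate expects.
    clamped = max(MIN_BITRATE, min(bitrate_bps, MAX_BITRATE))
    return clamped // 1000

print(to_rc_target_bitrate(5_000_000))  # -> 1, i.e. a 1 kbps target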
Can I get any help? How can I make sure vpx actually changes the bitrate according to cfg.rc_target_bitrate?
I can also confirm this. Setting self.cfg.rc_max_quantizer to a lower value like 10 increases the bitrate, but setting self.cfg.rc_target_bitrate to any value does nothing. This makes me suspect cfg isn't being used properly.
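One way to narrow this down further (assuming, as above, that the configuration lives in self.cfg and that the update happens during encode()) is to check whether the Python-side cfg actually reflects a new target after a frame has been encoded. If cfg changes but the output doesn't, the updated configuration is probably never re-applied to the already-initialized libvpx context; if cfg doesn't change, the setter path is the problem. A rough sketch, using the private attribute names visible in the linked source:

from fractions import Fraction

import numpy as np
from av import VideoFrame
from aiortc.codecs.vpx import Vp8Encoder

encoder = Vp8Encoder()
encoder.target_bitrate = 300_000  # request 300 kbps

# Encode one frame so any pending config update gets applied.
yuv = np.zeros((480 * 3 // 2, 640), dtype=np.uint8)
frame = VideoFrame.from_ndarray(yuv, format="yuv420p")
frame.pts = 0
frame.time_base = Fraction(1, 90000)
encoder.encode(frame)

print("rc_target_bitrate:", encoder.cfg.rc_target_bitrate)  # libvpx units: kilobits/s
print("rc_min_quantizer: ", encoder.cfg.rc_min_quantizer)
print("rc_max_quantizer: ", encoder.cfg.rc_max_quantizer)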
Can confirm this. We have an example test that adds bitrate configuration to the aiortc server example; the bitrate configuration works fine for h264 but doesn't work for vpx: https://github.com/Roboy/webrtc-tests/tree/main/aiortc-test