Compute the encoding bitrate on the host side for supported clients #1951
Conversation
(Branch force-pushed from 015007a to 1038f0b)
Copy of comment at commit: This proved to be a troublesome commit for me, as it forces 'video bitrate' and 'available bandwidth' very close together. Many NICs need their bandwidth to be effectively uncapped to avoid packet loss. Having a 1000 Mbit link but being capped at under 100 because you don't want outrageous video encoding bitrates is incredibly lossy for (cheap) NICs. The alternative, setting the bandwidth near the interface's max to avoid packet loss but then having to somehow force my GPU to do a 700+ Mbit encode, doesn't go over well either. Decoupling the video encoder bitrate from the capped bandwidth would be best.
But video bitrate and available bandwidth are tightly coupled; that is the nature of realtime transmission. If the video bitrate exceeds the total bandwidth, data accumulates (bufferbloat) and the stream is no longer realtime. I'm not sure what you mean by "capped bandwidth". Sunshine doesn't cap the actual network bandwidth in any way (which is actually a bug): packets are sent as fast as possible, with no packet pacing to prevent overwhelming network buffers along the path to the client. I agree that Sunshine should attempt to maximize available bandwidth regardless of the encoder bitrate, and that's exactly how it behaves today.
Thanks for the explanation. Perhaps I'm thrown off by the way the final video bitrate is calculated from the user-set bitrate in Moonlight. It seems to me that the user-set bitrate is a pooled bitrate from which small parts are later subtracted for audio, FEC, etc. But is the user-set bitrate then an attempt at setting just the video bitrate, or does it specify the available bandwidth as a whole? For example, before this commit I could build moonlight-common-c with maximumBitrateKbps and initialBitrateKbps set to, let's say, 40 Mbit/s. Starting a stream with the user-set bitrate at 800 Mbit/s then provides a very stable and smooth stream at 4K 60 fps HEVC + HDR. Dropping the user-set bitrate to 200 Mbit/s (still with the moonlight-common-c limit hardcoded at 40 Mbit/s) already starts stuttering because of dropped frames. I hope I'm making sense.
It's the latter now. The former is extremely confusing for users because it's quite difficult to predict what the actual network bandwidth usage (total bitrate) will be. The total bitrate used to be influenced by the user's FEC percentage (overshoot), video codec (undershoot for HEVC/AV1), and even whether the stream was local or remote (local would overshoot by FEC percentage). Now it's simple - what you select as your bitrate will be the total bandwidth consumed.
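The relationship described above (user-selected bitrate is the total bandwidth, from which the host derives the encoder bitrate) can be sketched with some hypothetical arithmetic. This is an illustration only, not Sunshine's actual code; it assumes FEC overhead is a percentage applied to the video stream and that audio gets a fixed allocation:

```cpp
#include <cstdint>

// Hypothetical sketch: derive the encoder's video bitrate from the
// client's configured total bitrate, so that
//   total = video * (1 + fec/100) + audio
// holds. Solving for video gives the expression below.
std::int64_t video_bitrate_kbps(std::int64_t configured_kbps,
                                int fec_percent,
                                std::int64_t audio_kbps) {
  // Reserve the audio allocation first; clamp to zero if the user's
  // configured bitrate can't even cover audio.
  std::int64_t budget = configured_kbps - audio_kbps;
  if (budget < 0) budget = 0;
  // Shrink the remaining budget so that video plus its FEC parity
  // data fits exactly within it.
  return budget * 100 / (100 + fec_percent);
}
```

With a 20 Mbit/s configured bitrate, 20% FEC, and 512 kbit/s of audio, this yields roughly 16.2 Mbit/s for the encoder, and the on-wire total lands back at the user's selection.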
I don't understand what you mean by setting
I'm sorry if I'm not explaining this in an understandable fashion. Before, when the max video bitrate was still capped to 100 Mbit by the client settings in "moonlight-common-c/src/SdpGenerator.c", I was able to change this to, let's say, 45 Mbit/s or whatever my hardware encoder could handle. But then I could still turn the "user slider"/"user-set bitrate"/configuredBitrateKbps all the way up to 900 Mbit/s without the video stream going past the max video bitrate. This gives me far better and smoother results than having these two numbers close together. Perhaps a better way to phrase it is that a definite video bitrate cap, regardless of all other variables, would be a welcome option.
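The option being requested here could be sketched as follows. This is hypothetical and does not exist in Sunshine; it shows a hard cap on the encoder's video bitrate that is independent of the user-set total bitrate, which is the decoupling the comment describes:

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical "video bitrate cap" option: clamp the derived encoder
// bitrate to a separate user-configured ceiling, while leaving the
// transport-level bandwidth figure untouched. A cap of 0 means "no cap".
std::int64_t apply_video_cap_kbps(std::int64_t derived_kbps,
                                  std::int64_t cap_kbps) {
  return cap_kbps > 0 ? std::min(derived_kbps, cap_kbps) : derived_kbps;
}
```

This would let a user set the slider to 900 Mbit/s for headroom while keeping the encoder pinned at, say, 40 Mbit/s, matching the behavior described in the comment above.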
Description
Historically, Moonlight has performed bitrate adjustments on its side to account for HEVC, HDR, and FEC overhead in various scenarios. These were very brittle and could lead to very unintuitive behavior in some situations (see the commit message of moonlight-stream/moonlight-common-c@3ed3ba6 for more details).
With this protocol update, a new SDP attribute, x-ml-video.configuredBitrateKbps, is sent from the client with the user's actual configured bitrate. This allows Sunshine to compute the encoding bitrate on its side, including things unknown to the client like the FEC percentage. It also allows the effective bitrate to exceed the 100 Mbps cap that GeForce Experience enforced.

Screenshot
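The host-side handling of the new attribute could look roughly like this. The attribute name comes from this PR; the parsing helper itself is hypothetical, not Sunshine's implementation, and it assumes the attribute appears in the SDP as a name:value pair:

```cpp
#include <cstdlib>
#include <optional>
#include <string>

// Hypothetical sketch: extract the client's configured bitrate from a raw
// SDP payload. Returns nullopt when the attribute is absent, which is how
// the host would detect an older client that doesn't send it.
std::optional<long> parse_configured_bitrate_kbps(const std::string &sdp) {
  const std::string key = "x-ml-video.configuredBitrateKbps:";
  auto pos = sdp.find(key);
  if (pos == std::string::npos) {
    return std::nullopt;  // unsupported client: fall back to legacy behavior
  }
  // Parse the integer value immediately following the separator.
  return std::strtol(sdp.c_str() + pos + key.size(), nullptr, 10);
}
```

When the attribute is missing, the host would presumably keep the pre-existing bitrate logic for backward compatibility, which matches the PR title's "for supported clients" qualifier.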
Issues Fixed or Closed
Type of Change
Checklist
Branch Updates
LizardByte requires that branches be up-to-date before merging. This means that after any PR is merged, this branch must be updated before it can be merged. You must also enable "Allow edits from maintainers".