
obs-webrtc: prototype / playground #7192

Closed
Wants to merge 1 commit.

Conversation

@DDRBoxman (Member) commented Aug 26, 2022

Description

This draft PR shows off some work on how we could structure a WebRTC plugin without needing to pull in the large monster that is libwebrtc. It provides the WebRTC output in such a way that a service plugin can handle the signaling on its end and simply request an SDP from OBS to send. It also provides a WebRTC input source to receive video feeds from a remote service; ideally this is just as simple, with a plugin handling the signaling, exchanging SDPs, and tagging which feeds are which.
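The SDP handoff described above could be sketched as a minimal interface. The trait, type, and method names below are hypothetical illustrations, not the actual OBS or webrtc.rs API:

```rust
// Hypothetical sketch of the plugin-facing contract described above:
// OBS owns the WebRTC session, the service plugin only moves SDPs around.
trait WebRtcOutput {
    /// Ask OBS to produce a local SDP offer for this output.
    fn create_offer(&self) -> String;
    /// Hand back the remote answer once the plugin's own signaling
    /// (WebSocket, HTTP, whatever the service uses) has completed.
    fn set_remote_answer(&mut self, sdp: &str);
}

struct StubOutput {
    remote_sdp: Option<String>,
}

impl WebRtcOutput for StubOutput {
    fn create_offer(&self) -> String {
        // A real implementation would ask webrtc.rs for this.
        "v=0\r\no=- 0 0 IN IP4 127.0.0.1\r\ns=-\r\n".to_string()
    }
    fn set_remote_answer(&mut self, sdp: &str) {
        self.remote_sdp = Some(sdp.to_string());
    }
}

fn main() {
    let mut out = StubOutput { remote_sdp: None };
    let offer = out.create_offer();
    // The plugin ships `offer` to its service and returns the answer:
    out.set_remote_answer("v=0\r\n(answer from service)\r\n");
    assert!(offer.starts_with("v=0"));
    assert!(out.remote_sdp.is_some());
    println!("signaling round-trip complete");
}
```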

We use https://webrtc.rs to manage the webrtc protocol. We can do this because OBS already has encoders, video and audio capture, and all the extra bits libwebrtc would provide to an app that doesn't already do these things.

This also adds basic WHIP support for the output, so that if you have a site that supports WHIP you don't have to worry about writing a plugin to handle signaling.
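For reference, WHIP boils down to a single HTTP POST of the SDP offer with `Content-Type: application/sdp`, answered with the SDP answer. Here's a std-only sketch of what that request looks like on the wire; the host, path, and token are placeholders, and a real client would use an HTTP library with TLS:

```rust
// Sketch of a WHIP request: one HTTP POST carrying the SDP offer.
// This only shows the shape of the exchange, not a production client.
fn build_whip_request(host: &str, path: &str, bearer: &str, offer_sdp: &str) -> String {
    format!(
        "POST {path} HTTP/1.1\r\nHost: {host}\r\nContent-Type: application/sdp\r\nAuthorization: Bearer {bearer}\r\nContent-Length: {}\r\n\r\n{offer_sdp}",
        offer_sdp.len()
    )
}

fn main() {
    let req = build_whip_request("whip.example.com", "/endpoint", "stream-key", "v=0\r\n");
    assert!(req.starts_with("POST /endpoint HTTP/1.1"));
    assert!(req.contains("Content-Type: application/sdp"));
    println!("{req}");
}
```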

Here's a little Rust program I was using to test things. The first commit receives data from OBS, and the second sends video feeds to OBS as a source:
https://github.com/DDRBoxman/CoolWHIP

A little diagram I threw together to show how this code would interact with an OBS plugin that handles its own signaling. We would probably want to expose APIs for WebSockets and HTTP calls so that each plugin doesn't have to bring its own WebSocket code or library.

[Diagram: how this code would interact with an OBS plugin that handles its own signaling]

Motivation and Context

We should provide an API that mirrors Chrome's WebRTC JavaScript APIs so that developers can more easily build WebRTC sources and destinations for their services.

How Has This Been Tested?

Types of changes

Checklist:

  • My code has been run through clang-format.
  • I have read the contributing document.
  • My code is not on the master branch.
  • The code has been tested.
  • All commit messages are properly formatted and commits squashed where appropriate.
  • I have included updates to all appropriate documentation.

@clone1018 (Contributor)

Hello @DDRBoxman, first off, thanks for putting this work into the public. I've got a couple of general thoughts so far, which are more focused on the positives than actual feedback :P:

  1. Firstly, I love the idea of having specific features / plugins available to be written in Rust; your point about it being able to encapsulate the heavy dependency without cluttering up the overall project is absolutely true. In addition, Rust could not only attract new developers but, if used for more error-prone things, help reduce some of the bug surface. However, I suspect OBS will need a common layer for the Rust dependency in the future, so that plugins aren't each including the same dependencies in isolation.
  2. At Glimesh, we're trying to push the latency barrier for streaming because we believe that latency is what makes a huge difference on a stream. Waiting 2 seconds to chat with your favorite streamer vs. 200 ms turns a recorded video into an actual conversation. WebRTC can help us push that barrier down even further, while removing other large projects and dependencies from the video path. In our simple testing with this PR so far, we've seen latency down to 80 ms or so (not including network latency from ingest ⇒ edge ⇒ viewer). I suspect that with a pure WebRTC video path, it could go even lower.
  3. Another improvement with WebRTC is being able to easily handle video from many sources. Video servers become less about intake / output and more about routing videos wherever they need to go. I can even imagine inviting people from your stream to join you for a chat, and all they have to do is use their browser. All of the servers, viewers, and streamers just become peers. (However, I imagine most would significantly limit who can see whom!)
  4. An idea @Sean-Der (creator of Pion) brought up is that one could also easily make a “Broadcast in a Box” project, where a streamer interfaces with a simple WebRTC video page and their friends can watch directly. That's a super common use case in my experience.
  5. I'm also super excited about the possibility of DataChannels providing streams with even faster and deeper levels of interactivity. Since WebRTC provides this out of the box, it should be even easier for services / plugins to communicate with each other, without having to poke holes in the user's firewall.

Okay, now for feedback on the PR!

  1. I think including some “OBS Standard” WHIP protocol as the default mechanism is a good idea. WHIP is still so early and unimplemented in many places that I suspect whatever OBS chooses to implement will become the de facto standard. I know many prefer WebSockets for signaling, but in most simple use cases WHIP over HTTPS works perfectly. In keeping with this idea, we'd need to add a simple stream-key-based authentication mechanism by sending the entered stream key to the WHIP server via an Authorization header. I do wonder if it would be best to use our own auth scheme, though, instead of “Bearer”, since a stream key has no relevance to a traditional Bearer token.
  2. I imagine the server URL would become the WHIP URL, and users of OBS could just paste in whatever the service gives them for a custom service. On the service side, a service could register one (or more, if they wanted) WHIP URL, since the actual video server can be negotiated through the SDP. WHIP also allows for redirecting at the HTTP level if necessary.
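The stream-key authentication from point 1 would just be a header on the WHIP POST. Both shapes below are illustrative only; the thread hasn't settled on “Bearer” vs. a custom scheme, and “StreamKey” is a made-up scheme name:

```rust
// Two candidate Authorization header shapes for the stream key.
// Neither is decided in this thread; "StreamKey" is hypothetical.
fn bearer_auth(stream_key: &str) -> String {
    format!("Authorization: Bearer {stream_key}")
}

fn custom_auth(stream_key: &str) -> String {
    format!("Authorization: StreamKey {stream_key}")
}

fn main() {
    assert_eq!(bearer_auth("abc123"), "Authorization: Bearer abc123");
    assert_eq!(custom_auth("abc123"), "Authorization: StreamKey abc123");
    println!("ok");
}
```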

With that said, I pulled down this PR locally, made a one-line modification to the WHIP URL, and was able to get a working prototype with Pion and our video infrastructure in about two hours. So for me this prototype is already a massive success, and I can't wait to see the day it's implemented in OBS! If I can help in any way from my side, let me know, but be warned: my C is a little Rusty. :)

[Screenshot: an OBS WebRTC output flowing into Glimesh using our existing video infrastructure. The exact path this is taking is something like OBS WebRTC -> Pion WebRTC -> FTL -> Janus WebRTC -> Viewer.]

Thank you for the work & submitting this PR!

@lminiero

Really cool, thanks for working on this! In a couple of WHIP-related experiments I did, I used the NDI support in OBS to "WebRTC-ify" the OBS output via my GStreamer-based WHIP client, so native WebRTC support in OBS opens the door to very cool opportunities.

I haven't tested this with Janus yet, but I have a couple of questions on how it's all structured now. It does make sense to use a library like webrtc.rs to handle the WebRTC layer, especially if, as you said, you plan to reuse the existing encoding functionality in OBS, but I do have some doubts:

  • Does OBS support VP8/VP9/AV1 too, or is it limited to H.264? H.264 in browsers often causes headaches, because browsers have limited support for it in their WebRTC stacks and are often very picky about what they accept: if it's not baseline, uses some custom feature, or is packetized in ways they don't like, it just won't work. Considering that the H.264 support in OBS is tailored for traditional broadcasting scenarios (where you can have very frequent keyframes, which are harmful to WebRTC), this might cause problems.
  • Does webrtc.rs support bandwidth adaptation via TWCC, and do the encoders in OBS support adapting bandwidth depending on network conditions? For the latter, I guess the answer might be yes: even when sending over, e.g., RTMP/TCP, if the network tells you that you have 500 kbps and you're trying to send at 2 Mbps, I'd expect the encoder to lower the target bitrate as a consequence, or things would just break. In WebRTC, TWCC is the way a sender computes the available bandwidth.
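The feedback loop being asked about amounts to: TWCC reports feed a bandwidth estimator, and the estimate clamps the encoder's target bitrate. A minimal sketch of that hookup, assuming the estimator (GCC or similar) exists elsewhere; the function name and the back-off/ramp-up policy are illustrative, not OBS's actual behavior:

```rust
// Hypothetical hookup between a TWCC-derived bandwidth estimate and the
// encoder's target bitrate: back off fast on congestion, ramp up slowly.
fn next_target_bitrate(current_bps: u32, estimate_bps: u32, floor_bps: u32) -> u32 {
    if estimate_bps < current_bps {
        // Congestion: drop straight to the estimate (never below the floor).
        estimate_bps.max(floor_bps)
    } else {
        // Headroom: ramp up ~10% per feedback interval, capped at the estimate.
        (current_bps + current_bps / 10).min(estimate_bps)
    }
}

fn main() {
    // The network says 500 kbps while we're sending at 2 Mbps: back off.
    assert_eq!(next_target_bitrate(2_000_000, 500_000, 100_000), 500_000);
    // The network has headroom: ramp up by 10%.
    assert_eq!(next_target_bitrate(500_000, 2_000_000, 100_000), 550_000);
    println!("ok");
}
```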

Thanks!

@DDRBoxman (Member, Author)

@clone1018

Making the WHIP defaults better definitely makes sense; they are pretty much testing placeholders for now. Ideally, services that want to do some sort of OAuth flow would be able to set that up and configure the auth token automatically.

@lminiero

OBS has many sources for encoders; we generally use either FFmpeg or the hardware encoders the machine provides. The FFmpeg we ship should already have a VP9 encoder included, and if manufacturers add AV1 output to their hardware encoders we will be able to send AV1. Anyway, H.264 is still on the table: every browser supports it in 2022, and if we need to tinker with the encoder settings, that's fine and expected.

Looks like webrtc.rs does in fact support TWCC.
OBS 24 got support for dynamic bitrate, so it should just be a matter of hooking it up:
https://obsproject.com/wiki/Dropped-Frames-and-General-Connection-Issues#dynamic-bitrate

@ggarber commented Oct 19, 2022

👏 This is great, and I would love to see something like this merged.

Regarding webrtc-rs: even if it supports TWCC, I don't think it has a bandwidth estimation (BWE) implementation in place (webrtc-rs/webrtc#298), nor other quality optimisations like simulcast or FEC. Without BWE, I think it won't be usable in constrained-network cases, but otherwise it should be fine. I understand the benefit of having a smaller and simpler library like webrtc-rs instead of libwebrtc from Google, so it is not an easy choice.

What are the next steps for this prototype?

Congratulations again!

@Sean-Der (Contributor) commented Oct 19, 2022

@ggarber An MIT implementation of GCC exists here. It just hasn't been ported/rewritten in Rust yet.

It sounds like OBS already has a bandwidth estimator, so we should feed the TWCC back into that?

Simulcast I was going to wait to propose until this is merged. I worried that too many features at once could slow things down. Maybe I am wrong though!

@philn commented Oct 20, 2022

> @ggarber An MIT implementation of GCC exists here. It just hasn't been ported/rewritten in Rust yet.

GStreamer also has a GCC implementation now, soon to be upstreamed in gst-plugins-rs: https://github.com/centricular/webrtcsink/tree/main/plugins/src/gcc

@k0nserv commented Oct 22, 2022

Ingress simulcast is in place in webrtc-rs, but egress is not. There is, however, a PR in progress to port this from Pion.

Like @Sean-Der says, the Pion BWE implementation should be relatively straightforward to port too.

Labels: New Feature (New feature or plugin), Request for Comments (More feedback & discussion is requested)