obs-webrtc: prototype / playground #7192
Conversation
Hello DDRBoxman, first off thanks for making this work public. I've got a couple of general thoughts so far, which are more focused on the positives than on concrete feedback :P
Okay, now for feedback on the PR!
With that said, I pulled down this PR locally, made a one-line modification to the WHIP URL, and had a working prototype with Pion & our video infrastructure in about two hours. So for me this prototype is already a massive success and I can't wait to see the day it's implemented into OBS! If I can help in any way from my side, let me know, but be warned my C is a little Rusty. :)
Thank you for the work & for submitting this PR!
Really cool, thanks for working on this! In a couple of WHIP-related experiments I did, I used the NDI support in OBS to "WebRTC-ify" the OBS output via my GStreamer-based WHIP client, so native WebRTC support in OBS opens the door to very cool opportunities. I haven't tested this with Janus yet, but I have a couple of questions on how it's all structured now. It does make sense to use a library like webrtc.rs to handle the WebRTC layer, especially if, as you said, you plan to re-use the existing encoding functionality in OBS, but I do have some doubts:
Thanks!
Making the WHIP defaults better definitely makes sense; they are pretty much testing placeholders for now. Ideally, services that want to do some sort of OAuth flow would be able to set that up and configure the auth token automatically. OBS has many sources for encoders; we generally use either FFmpeg or the hardware encoders the machine provides. The FFmpeg we ship should already include a VP9 encoder. If manufacturers add AV1 output to their hardware encoders, we will be able to send AV1. Anyway, H.264 is still on the table, every browser supports it in 2022, and if we need to tinker with the encoder settings that's fine and expected. Looks like webrtc.rs does in fact support TWCC.
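For reference, here is a minimal sketch of how that would be wired up on the webrtc.rs side, based on the crate's own examples; the default interceptor set registers the standard RTCP feedback machinery, which should include TWCC, though exact coverage may vary by version:

```rust
// Sketch only: requires the `webrtc` and `anyhow` crates.
use webrtc::api::interceptor_registry::register_default_interceptors;
use webrtc::api::media_engine::MediaEngine;
use webrtc::api::{API, APIBuilder};
use webrtc::interceptor::registry::Registry;

fn build_api() -> anyhow::Result<API> {
    let mut media_engine = MediaEngine::default();
    media_engine.register_default_codecs()?;

    // The default interceptor set wires up RTCP feedback (NACK, sender/receiver
    // reports, and transport-wide congestion control), so TWCC reports become
    // available without extra plumbing.
    let registry = register_default_interceptors(Registry::new(), &mut media_engine)?;

    Ok(APIBuilder::new()
        .with_media_engine(media_engine)
        .with_interceptor_registry(registry)
        .build())
}
```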
👏 This is great and I would love to see something like this merged. What are the next steps for this prototype? Congratulations again!
@ggarber An MIT implementation of GCC exists here; it just hasn't been ported/rewritten in Rust yet. It sounds like OBS already has a bandwidth estimator, so we should feed the TWCC feedback back into that? As for simulcast, I was going to wait until this is merged to propose it. I worried that too many features at once could slow things down. Maybe I am wrong though!
GStreamer also has a GCC impl now, soon to be upstream in gst-plugins-rs, https://github.com/centricular/webrtcsink/tree/main/plugins/src/gcc |
Description
This draft PR shows off some work on how we could structure a WebRTC plugin without needing to pull in the large monster that is libwebrtc. It provides the WebRTC output in such a way that a service plugin can handle the signaling on its end and just request an SDP from OBS to send. It also provides a WebRTC input source to receive video feeds from a remote service; ideally this should be just as simple, with a plugin handling signaling, sending SDPs around, and tagging which feeds are which.
We use https://webrtc.rs to manage the WebRTC protocol. We can do this because OBS already has encoders, video and audio capture, and all the extra bits libwebrtc would provide to an app that doesn't already do these things.
This also adds basic WHIP support for the output, so if you have a site that supports WHIP you don't have to worry about writing a plugin to handle signaling.
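For context, WHIP boils the signaling down to a single HTTP exchange: POST the local SDP offer to the endpoint and read the SDP answer out of the response. A rough Rust sketch of that exchange, separate from the actual implementation in this PR (the function name and error handling are illustrative only):

```rust
// Minimal WHIP signaling exchange; requires the `reqwest` and `tokio` crates.
use reqwest::Client;

async fn whip_offer(
    endpoint: &str,
    bearer_token: &str,
    offer_sdp: &str,
) -> Result<(String, Option<String>), reqwest::Error> {
    let resp = Client::new()
        .post(endpoint)
        .header("Content-Type", "application/sdp")
        .header("Authorization", format!("Bearer {bearer_token}"))
        .body(offer_sdp.to_owned())
        .send()
        .await?
        .error_for_status()?;

    // The Location header identifies the session resource; an HTTP DELETE
    // to it tears the session down when the output stops.
    let resource = resp
        .headers()
        .get("Location")
        .and_then(|v| v.to_str().ok())
        .map(str::to_owned);

    let answer_sdp = resp.text().await?;
    Ok((answer_sdp, resource))
}
```

A real client would also wait for ICE gathering to finish before posting the offer, but the point is how little a service plugin needs to do beyond this.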
Here's a little Rust program I was using to test things. The first commit receives data from OBS, and the second sends video feeds to OBS as a source:
https://github.com/DDRBoxman/CoolWHIP
A little diagram I threw together to show how this code would interact with an OBS plugin that handles its own signaling. We would probably want to expose APIs for WebSocket and HTTP calls so that each plugin doesn't have to bring its own WebSocket code / library.
Motivation and Context
We should provide an API that mirrors Chrome's WebRTC JavaScript APIs in the browser so developers can more easily make WebRTC sources and destinations for their services.
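webrtc.rs already keeps its peer connection surface close to the JavaScript one, so a mirrored OBS-facing API would mostly forward the same verbs. A hedged sketch of the offer/answer half (the wrapper function names here are hypothetical, not part of this PR):

```rust
// Sketch only: requires the `webrtc` and `anyhow` crates.
use webrtc::peer_connection::sdp::session_description::RTCSessionDescription;
use webrtc::peer_connection::RTCPeerConnection;

// Maps 1:1 onto the JavaScript createOffer / setLocalDescription calls.
async fn local_offer_sdp(pc: &RTCPeerConnection) -> anyhow::Result<String> {
    let offer = pc.create_offer(None).await?;
    pc.set_local_description(offer.clone()).await?;
    Ok(offer.sdp)
}

// Maps onto setRemoteDescription with an "answer" session description.
async fn apply_remote_answer(pc: &RTCPeerConnection, answer_sdp: String) -> anyhow::Result<()> {
    let answer = RTCSessionDescription::answer(answer_sdp)?;
    pc.set_remote_description(answer).await?;
    Ok(())
}
```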
How Has This Been Tested?
Types of changes
Checklist: