Possible to skip transcoding? #44
I have encoded video and audio streams that I simply want to encrypt, generate a DASH manifest for, and upload to an S3 bucket. It seems like shaka-streamer would simplify this over using shaka-packager directly, but as far as I can tell it isn't possible. Am I missing something, or is this not supported?
Comments
I'm not sure I understand what you want. Are you saying that you have already transcoded your various resolutions/bitrates/languages, and you just want to encrypt & upload? If that's the case, I don't think there's very much work to be saved by Shaka Streamer. You could use Shaka Packager to encrypt & package, and then upload to S3 with gsutil or similar tools. One advantage of Shaka Streamer is that it builds the command lines for Packager and gsutil for you, so you'd lose that by doing it by hand. But Shaka Streamer was meant to do the entire transcoding & packaging pipeline, and I'm not sure the work it would take to make transcoding optional would be worth it in the long term. So, long story short: we don't have this today, but we're open to discussing it.
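For reference, that manual path might look roughly like the sketch below, with Shaka Packager doing the encryption and DASH packaging and gsutil copying the result to cloud storage. All file names, key values, and the bucket are placeholders, and the exact flags will depend on your content:

```python
# Rough sketch only: package and encrypt pre-transcoded inputs with Shaka
# Packager, then copy the output to cloud storage with gsutil. All paths,
# key values, and the bucket name are placeholders.
import subprocess

# Encrypt and package the existing streams into DASH (raw-key encryption).
subprocess.run([
    "packager",
    "in=audio.mp4,stream=audio,output=out/audio.mp4",
    "in=video_720p.mp4,stream=video,output=out/video_720p.mp4",
    "--enable_raw_key_encryption",
    # Dummy key values; replace with your real key ID and key.
    "--keys", "label=:key_id=11111111111111111111111111111111:"
              "key=22222222222222222222222222222222",
    "--mpd_output", "out/manifest.mpd",
], check=True)

# gsutil can also be pointed at s3:// URLs if configured with AWS credentials.
subprocess.run(
    ["gsutil", "-m", "cp", "-r", "out", "gs://my-bucket/my-asset/"],
    check=True,
)
```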
I like the transcoding part of shaka-streamer, but it seems to limit the choice of codecs (for example, it doesn't support HEVC), which is why I want to keep supporting the workflow I was previously using shaka-packager for. As you said, I already have my streams transcoded. Basically, I want to be able to use Shaka Streamer in both cases: the case where Streamer does the transcoding, and the case where it just invokes Shaka Packager and uploads to cloud storage.
Ah, I see. Well, please file a separate feature request for HEVC, and we can add it. We're already working on adding AV1 support (which depends on an upcoming Shaka Packager release, among other things). In the meantime, I don't think we have the time to make transcoding optional right now, but I'll mark this as a feature request, too. If you want to work on it yourself, we can discuss your design here, and then you can submit a pull request for it. We're always happy to have contributions to the project.
Hi @joeyparrish and @hardc0der, bumping up this issue 😀. I would love to have the ability to skip transcoding and simply transmux pre-encoded assets. An example use case would be encoding everything with a VOD manifest and then using Streamer to generate a live manifest pointing to the pre-encoded segments. I am willing to write this feature 😊. I am just wondering how we would like to go about such a change architecturally. I made a POC that simply creates a node called PassthroughNode, which runs FFmpeg and pipes things over to Packager. It's very similar to TranscoderNode but without the encoding, as expected.
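For illustration (and outside Streamer's actual node framework), the shape of such a pass-through pipe might look something like the sketch below: FFmpeg only stream-copies into a named pipe, and Packager consumes it. The input name, FIFO path, and segment layout are placeholders, not the real PassthroughNode code:

```python
# Illustrative only: remux a pre-encoded input with FFmpeg (-c copy, no
# re-encoding) into a named pipe, and let Shaka Packager read from that pipe.
# A real PassthroughNode would do this inside Streamer's node framework.
import os
import subprocess

pipe_path = "/tmp/passthrough_720p.ts"  # placeholder FIFO path (POSIX only)
if not os.path.exists(pipe_path):
    os.mkfifo(pipe_path)

# FFmpeg copies the existing streams into the pipe as MPEG-TS, no transcoding.
ffmpeg = subprocess.Popen([
    "ffmpeg", "-y", "-i", "already_encoded_720p.mp4",
    "-c", "copy", "-f", "mpegts", pipe_path,
])

# Packager transmuxes from the pipe into DASH segments and a manifest.
packager = subprocess.Popen([
    "packager",
    f"in={pipe_path},stream=video,"
    "init_segment=out/720p_init.mp4,"
    "segment_template=out/720p_$Number$.m4s",
    "--mpd_output", "out/manifest.mpd",
])

ffmpeg.wait()
packager.wait()
```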
My first thought is how this would be represented in the configs; to me, the config structure tends to imply a certain architecture. I would imagine you could specify multiple video and audio inputs in the input config, similar to what you can do today for multiple languages. Though I don't think multiple video inputs make sense in today's Streamer, I don't think we prevent them either, and this could make them sensible. If you had multiple video inputs, I could imagine a pipeline config with a boolean that says "transmux only" or "pass through", which skips FFmpeg and pipes inputs directly to Packager. Packager will transmux without needing to involve FFmpeg, and accepts TS, MP4, and WebM as input. How does that sound? The only wrinkle I can foresee is how we should interpret multiple video inputs if the new transmux-only mode were not turned on. Should we match inputs to outputs based on resolution? Ignore the second and subsequent video streams? I haven't reviewed the code or tried it, but I suspect today's Streamer would take 5 video inputs and 5 output resolutions and multiply them, creating 25 outputs, which is awful.
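To make that concrete, here is a rough Python-dict rendering of what those configs might look like with such a flag. The field names, especially "transmux_only", are hypothetical and not the actual Streamer schema:

```python
# Hypothetical sketch of the idea above, written as Python dicts standing in
# for the YAML config files. Field names, especially "transmux_only", are
# illustrative and not part of Streamer's real schema.
input_config = {
    "inputs": [
        {"name": "already_encoded_480p.mp4", "media_type": "video"},
        {"name": "already_encoded_720p.mp4", "media_type": "video"},
        {"name": "already_encoded_audio_en.mp4", "media_type": "audio"},
    ],
}

pipeline_config = {
    "streaming_mode": "vod",
    # If true, skip FFmpeg entirely and pipe the inputs straight to Packager,
    # which can transmux TS, MP4, and WebM on its own.
    "transmux_only": True,
}
```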
Sounds great! I totally agree with the pipeline config boolean leading to transmuxing via Packager. With the pipeline config properly defined, the ControllerNode within Streamer would generate a PackagerNode but not a TranscoderNode. Then, within the PackagerNode, the transmux-only logic would orchestrate the generation of live manifests? The idea of creating 25 outputs is frightening. I think actively supporting multiple videos with transcoding is an issue in its own right. It also brings up the idea of generating a playlist, like in issue #43. One fear I have, if multiple videos are processed on a single shaka-streamer instance, is that the processing time could be incredibly long; if the server fails, the impact would be quite large. Is there any fault tolerance in place to tell the user where the failure left off? For example, if Streamer unexpectedly fails during a VOD transcode, is there a feedback loop to indicate where the process stopped within the transcode? If Streamer is generating fMP4 segments for VOD, there is no need to redo the previously generated segments.
Just bumping this feature and thread back up :)
That sounds reasonable to me. If conditional statements around transmuxing in PackagerNode become too complicated, you could create an alternate PackagerNode for transmuxing only. Hopefully that won't be necessary, though: a single packager node implementation would probably be better maintained than two, since one of the two versions might end up seeing much more use by certain developers than the other.
Well, we could choose to throw an error on multiple video inputs when transmux-only is set to false. This would preclude a future where we support something like multiple viewpoints, but I think neither Shaka Packager nor Shaka Player supports that today, so maybe that's not a big deal at the moment. Maybe the best thing is to avoid overloading the input list for the purpose of transmuxing? Perhaps an explicit input_type, like in #43, to avoid ambiguity in the config? Although that might complicate the implementation. What do you think?
No, not really. There's just the ffmpeg output on the console with a timecode. More importantly, though...
There is no mechanism in Streamer today to resume a failed job where it left off. 😞
Are there any updates on this?
No. PRs are always welcome.