# sdk-rust
b
Hey, do you want to receive RGB frames or the encoded RTP packets in a specific video codec?
e
encoded RTP packets — it would be great if they were H.264, but another codec would be fine as well
b
You can access the raw RTP frames using the EncodedTransform in libwebrtc. This is currently not supported by our Rust SDK. I'm just wondering whether it's possible to use multiple encoded transforms, since one is already being used for E2EE (not released yet). cc @freezing-lifeguard-43174 Should we patch WebRTC for that? EDIT: Not sure if this is the best way to do it yet
f
yes, `RtpReceiverInterface::SetDepacketizerToDecoderFrameTransformer` — it is indeed possible to register a callback here and receive the H.264/VP8/VP9/AV1 frame (the frame assembled by `VideoRtpDepacketizer`) before it is sent to the decoder.
But if by "uncompressed video frames" you mean RGB/YUV420, you may need to register callbacks here: https://github.com/webrtc-sdk/webrtc/blob/cf25e8183e697924c51d8550e39f3e961ab2de21/pc/video_track.h#L45
yeah, I think we should be able to export these interfaces for rust in the future
e
awesome! I will dive into that, as I need it asap.
Do you think it would be hard/feasible to make libwebrtc stop the frame processing at this callback? I would like to prevent it from decoding or doing any further processing of these video frames - it should call this callback w/ the uncompressed frame and just drop it.
apparently if you register a delegate w/ the `SetDepacketizerToDecoderFrameTransformer`, the delegate is expected to call `OnReceivedPayloadData` to continue the pipeline
I wonder how many bad things would happen if I just didn't call it?
AFAICT many bad things