# sdk-rust
e
The two alternatives I see for doing that:
1. libwebrtc (or the LiveKit SDK) outputs the video frame in a surface that is already on the GPU (not copied to RAM)
2. libwebrtc outputs the raw encoded frames and it's my job to do whatever I need with them
I think 1 is much harder than 2
but maybe there's another way I'm not seeing?
a
And I take it your video source is livekit webrtc?
e
yep
a
I’ve been working the other direction with an external video source, but I think the `VideoTrack` can be read as an `I420` frame; I haven’t looked to see whether you could hook up a callback to receive each frame.
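For reference, a minimal sketch of what a per-frame hook could look like with the Rust SDK. The type and method names here (`NativeVideoStream`, `to_i420`) are assumptions about the SDK's webrtc layer, not confirmed API:

```rust
use futures::StreamExt;
use livekit::webrtc::prelude::*;
use livekit::webrtc::video_stream::native::NativeVideoStream;

// Hypothetical sketch: wrap a subscribed RtcVideoTrack in a frame stream
// and pull decoded frames as they arrive.
async fn consume_frames(rtc_track: RtcVideoTrack) {
    let mut frames = NativeVideoStream::new(rtc_track);
    while let Some(frame) = frames.next().await {
        // to_i420() yields planes in system memory; this is exactly the
        // CPU copy being discussed above.
        let i420 = frame.buffer.to_i420();
        let _dims = (i420.width(), i420.height());
        // ... hand the Y/U/V planes to an encoder or a GPU upload here ...
    }
}
```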
e
the problem is that the `VideoFrame` seems to be in RAM
like, after it was decoded it was copied into RAM
and that's what LiveKit is outputting
b
We have plans to support NvCodec in the future (directly as a WebRTC decoder). com.unity.webrtc is already implementing it here, but they are still copying the `VideoFrame` into RAM. Maybe we can still create a custom `VideoBuffer` type with `kNative`.
👍 1
a
Yeah. I’ve been writing new `VideoFrame`s from GStreamer. I think you’ll get a `PeerConnection` from a `RemoteParticipant` and be able to use an `OnTrack` callback to send it to your process.
Oh, I see, but that’ll definitely be off GPU
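For context, with the Rust SDK the remote track typically arrives through room events rather than a raw `OnTrack` hook; a rough sketch under that assumption (event and variant names are not confirmed):

```rust
use livekit::prelude::*;

// Hypothetical sketch: connect to a room and wait for a remote video track.
async fn run(url: &str, token: &str) -> Result<(), Box<dyn std::error::Error>> {
    let (_room, mut events) = Room::connect(url, token, RoomOptions::default()).await?;

    while let Some(event) = events.recv().await {
        if let RoomEvent::TrackSubscribed { track, .. } = event {
            if let RemoteTrack::Video(video_track) = track {
                // This handle would feed a frame stream / callback like the
                // sketch earlier in the thread.
                let _rtc_track = video_track.rtc_track();
            }
        }
    }
    Ok(())
}
```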
e
thank you! yeah... for now I think the easiest would be for me to get the encoded frames and decode them myself
a
This discussion is super interesting. So, @boundless-energy-78552 if I understand right, using the `kNative` buffer, we could create a set of GStreamer plugins for `livekit-webrtc` sources & sinks?
I mean we could do that with the `I420` buffer, but using the `kNative` buffer, that plugin could execute entirely on the GPU
e
if `kNative` is a GPU surface, then GStreamer has support for it as something like `video/x-raw(memory:CUDAMemory),format=I420`
💥 1
👀 1
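For reference, a minimal gstreamer-rs sketch of those caps (purely illustrative; in practice NVDEC decoder elements tend to negotiate NV12 rather than I420 in CUDA memory):

```rust
use std::str::FromStr;
use gstreamer as gst;

fn main() -> Result<(), Box<dyn std::error::Error>> {
    gst::init()?;

    // Caps describing raw frames that live in CUDA device memory rather than
    // system memory; CUDA-aware elements (e.g. an NVDEC decoder) can negotiate
    // these directly, so no CPU copy is involved.
    let caps = gst::Caps::from_str("video/x-raw(memory:CUDAMemory), format=I420")?;
    println!("{caps}");
    Ok(())
}
```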