# sdk-rust
e
AFAIK currently there's no interface in the livekit rust sdk where you can inject encoded video directly. You need to pass to livekit the unencoded video frames and let libwebrtc encode them
(libwebrtc is used internally by the livekit rust sdk)
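To make "inject unencoded frames" concrete, here is a minimal sketch of sizing an I420 frame the way a video source expects it. The commented-out `I420Buffer`/`capture_frame` calls are hypothetical stand-ins for the livekit rust sdk's video-source API as used in the wgpu_room example; check the sdk itself for the exact types.

```rust
// Compute the plane sizes of an I420 (YUV 4:2:0) frame, the
// unencoded format you hand to the sdk's video source so that
// libwebrtc can do the encoding.
fn i420_plane_sizes(width: usize, height: usize) -> (usize, usize, usize) {
    let y = width * height; // full-resolution luma plane
    let chroma = (width / 2) * (height / 2); // 2x2 subsampled chroma planes
    (y, chroma, chroma)
}

fn main() {
    let (y, u, v) = i420_plane_sizes(640, 480);
    println!("{} {} {}", y, u, v); // prints 307200 76800 76800
    // With the sdk (hypothetical names, see the wgpu_room example):
    // let buffer = I420Buffer::new(640, 480);
    // source.capture_frame(&VideoFrame { buffer, timestamp_us, .. });
}
```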
d
ok how should I do that
do you have an example for that ?
e
I don't have a direct example, and I'm not sure what you're trying to do
is the h264 file just a raw stream of NALUs?
AFAIU that's missing some important data for playback, like timestamps
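To illustrate why a bare .h264 file is not enough: an Annex B stream is just NAL units separated by start codes, and nothing in that framing carries a timestamp (timing lives in a container like MP4/MKV or in the RTP transport). A small self-contained scanner shows this:

```rust
// Count NAL units in an Annex B H.264 byte stream by scanning for
// start codes (00 00 01 or 00 00 00 01). Note that the framing
// carries no timestamps at all: a raw .h264 file is only a
// sequence of these units.
fn count_nal_units(data: &[u8]) -> usize {
    let mut count = 0;
    let mut i = 0;
    while i + 3 <= data.len() {
        if data[i] == 0 && data[i + 1] == 0 {
            if data[i + 2] == 1 {
                count += 1;
                i += 3;
                continue;
            }
            if i + 4 <= data.len() && data[i + 2] == 0 && data[i + 3] == 1 {
                count += 1;
                i += 4;
                continue;
            }
        }
        i += 1;
    }
    count
}

fn main() {
    // SPS (0x67) and PPS (0x68) NAL headers behind start codes.
    let sample = [0u8, 0, 0, 1, 0x67, 0, 0, 1, 0x68];
    println!("{}", count_nal_units(&sample)); // prints 2
}
```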
d
Basically I want to receive camera tracks from livekit, composite them with gstreamer, and send gstreamer's output back to livekit
e
you can use the livekitwebrtcsink for that
it's a gstreamer plugin
d
but for now, just as a test, I want to take data from an h264 file and send that to livekit
d
and get video data from livekit and send that to gstreamer, again just for testing
e
gst-launch-1.0 videotestsrc ! videoconvert ! livekitwebrtcsink <your parameters>
d
yeah I saw livekitwebrtcsink before but I don't want to use it
because I want (and need) more control over the pipeline
e
then you need to decode your video frames and inject them into livekit like the wgpu_room example does
(the wgpu_room example doesn't do the decoding part, it just injects unencoded frames into livekit)
d
that's going to use more CPU on the server 😞
e
webrtc kinda works together with the encoder
like, webrtc can request a keyframe at any time
and again, h264 data alone is not enough
you need at least timestamps on top of that
d
another way is to get video from livekit with the Go sdk and forward it to a rust server over something like a websocket, but that adds latency
yeah I have timestamps and don't have a problem with that
e
in your h264 file?
d
No :))
Im so confused 🙂
e
I'm not sure the go sdk supports this use case either
AFAIU it uses libwebrtc the same way as the rust sdk, but I may be wrong
d
why was the rust sdk implemented with libwebrtc instead of webrtc-rs :/
e
they talk about that in the readme:

Did you consider using webrtc.rs instead of libwebrtc?

Yes! As webrtc.rs matures, we'll eventually migrate to a pure Rust stack. For now, we chose libwebrtc for a few reasons:

- Chrome's adoption and usage means libwebrtc is thoroughly battle-tested.
- webrtc.rs is ported from Pion (which our SFU is built on) and a better fit for server-side use.
- libwebrtc currently supports more features like encoding/decoding and includes platform-specific code for dealing with media devices.
d
yeah that answer has been in the readme for about a year now as far as I know :))
e
which go sdk did you mean?
I can't find a go client library for livekit
d
that go sdk is for both server and client
e
I'm not sure about that 🤔
but I never fiddled much with the go codebase
d
so I should decode the video coming from livekit and send raw data to gstreamer, instead of getting encoded video from livekit and using gstreamer elements to decode it, hmmm....
I'm totally sure the go sdk is for both server and client because I've used it
in the description they wrote "Client and server SDK for Golang"
e
nice, didn't know that
d
ok thanks, I think I'll end up rewriting the rust sdk using webrtc-rs myself :)) libwebrtc is totally annoying ...
e
yeah libwebrtc is very high level, it doesn't give you access to much
good luck!
b
Hey, fwiw the webrtc library we use is already abstracted inside the libwebrtc crate
You can see the folders web and native, and maybe you can easily create webrtcrs
Would love to help if you make a PR