# sdk-rust
a
🤔 Following the call flow backwards from the room client, it looks like I can implement `media_stream::VideoFrameSink` and `media_stream::AudioSink` to build up a pair of tracks for each bin, and use a GST `appsink` to emit the audio data and video frames. Though, I'm curious if there's a higher abstraction to add a new `h264` source from a socket through an `RtcSession`?
d
Hey Aaron, currently you can pass raw frames into the Rust SDK for publishing. Here's an example: https://github.com/livekit/client-sdk-rust/blob/main/examples/simple_room/src/logo_track.rs We do not support taking in a raw h.264 track yet, though it may be interesting to offer this eventually. The main reason is that it's advantageous to have webrtc be closely connected to an encoder; this allows us to encode keyframes on demand (for example, when a new client receives the track)
👍 1
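For anyone reading along, the raw-frame path from that example boils down to roughly the following sketch, with names (`NativeVideoSource`, `I420Buffer`, `capture_frame`) as they appeared in the `logo_track` example at the time; exact module paths and signatures vary across SDK versions:
```rust
// A rough sketch, not a drop-in implementation.
use livekit::webrtc::video_frame::{I420Buffer, VideoFrame, VideoRotation};
use livekit::webrtc::video_source::native::NativeVideoSource;

fn capture_one_frame(source: &NativeVideoSource) {
    let mut frame = VideoFrame {
        rotation: VideoRotation::VideoRotation0,
        timestamp_us: 0,
        buffer: I420Buffer::new(1280, 720),
    };

    // Fill the Y/U/V planes here (e.g. from an appsink sample),
    // respecting each plane's stride. Solid black as a placeholder:
    let (y, u, v) = frame.buffer.data_mut();
    y.fill(16);   // black luma
    u.fill(128);  // neutral chroma
    v.fill(128);

    // Pushing the frame into the source feeds the publisher's encoder.
    source.capture_frame(&frame);
}
```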
a
Cool, this is the direction I started down yesterday. Looks like I need to be on `main` for this to work -- I was getting a codec error in `webrtc-sys`, so I backed off to `0.1.1` to see if that was my issue. I'll move back to the `main` branches for both `livekit` and `livekit-webrtc`, and try to see where my frame creation differs from the logo track example. Here's the stack trace I'm getting from `webrtc-sys` in case it makes sense to you. It may be how I'm creating the frame, though -- since GST is already in I420, I copied the YUV planes directly into the target buffer from the appsink sample frame.
NM -- that stack is a red herring. Tracking down some of my gstreamer frame decoding issues
Thanks for the help @dry-elephant-14928 -- unfortunately, I'm still stuck. I'm still getting the same error in the stack trace I sent before:
```
*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: -[RTCVideoCodecInfo nativeSdpVideoFormat]: unrecognized selector sent to instance 0x6000008ae8c0
```
I thought perhaps that my strides weren't getting set (since I didn't see a way to set them outside of `yuv_helper`), so I tried converting my frame to `argb` and converting it back to `i420` before overwriting it in the frame buffer. I think the error is coming from the `VideoEncoderFactory`, but I'm not exactly sure where that gets called from: https://gist.github.com/spiegela/2455f9da234c39602e8f7410efb75258
Switching to the `theo/newbuilds` branch provides some more logging that may help:
d
Thanks Aaron, I was talking to @boundless-energy-78552 earlier about this and he said he'd take a look.
🙏 1
out of curiosity, where are the video frames coming from?
a
Right now I'm just testing from a gstreamer video test source -- generated just from the gstreamer libraries
My real use case will be h264-encoded frames coming from RTSP and file sources
If you want to play back the pipeline that I have in that gist, you can run:
```
gst-launch-1.0 videotestsrc num-buffers=200 ! "video/x-raw, format=I420, width=1280, height=720" ! queue ! autovideosink
```
which is the same source & capabilities. The gist just has a different "sink" so that I push individual frames to the livekit client (or try to 😉)
b
Hey @ambitious-lunch-95812, thanks for the logs. I think I know the issue. Can you try to copy the .cargo folder from the repo to your project root?
I’ll release a new version very soon. If you want to publish a track atm, you need to use the main branch
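Assuming git dependencies, the Cargo.toml side of that looks something like this (crate names as in the client-sdk-rust repo at the time; the exact set depends on what you import):
```toml
[dependencies]
# Track main until the fixed release lands.
livekit = { git = "https://github.com/livekit/client-sdk-rust", branch = "main" }
```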
a
Ok. Cool. Thanks! I'll switch back to main, and bring in that cargo config, and let you know asap
🙏 1
Hmm. Those rustflags segfault for me -- I'm on `aarch64-apple-darwin`, so maybe it's something else. Let me do a clean build
👌 1
Yeah, it's segfaulting on my Mac. Installing deps to test on Linux
👍 1
b
weird, can you show me your Cargo.toml?
a
ohh -- one sec. I think I'm still on your fork
Sry -- tried a bunch of tags/branches today 😉
b
No problem! 🙏
a
Ok, switched back to main on `livekit/client-sdk-rust`. It still segfaults on Mac. Here's my Cargo.toml
Linux gets a different error, which seems like a protobuf version incompatibility? I just installed it from `yum` this AM...
b
Is it possible to access your rtsp-publisher repo so I can debug?
a
Sure -- let me just extract this example into a separate repo
🙏 1
Here you are: https://github.com/spiegela/gstreamer-livekit Thanks again for the help
b
Thanks, I’ll take a look 🙂
🙏 1
@ambitious-lunch-95812 Can you add `theomonnom` to the repo?
I found the issue: there's a bug in the SDK. You need to connect to a room before creating a LocalVideoTrack. I'll fix that soon
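So the working order is roughly the following sketch: connect first, then create and publish the track. API names are taken from the examples of that era and differ in later SDK versions (e.g. `Room::connect` gained a `RoomOptions` argument, and the source constructor takes a resolution):
```rust
use livekit::options::TrackPublishOptions;
use livekit::prelude::*;
use livekit::webrtc::video_source::native::NativeVideoSource;
use livekit::webrtc::video_source::RtcVideoSource;

async fn publish_gst_video(url: &str, token: &str) -> Result<(), Box<dyn std::error::Error>> {
    // 1. Connect to the room *before* creating any local tracks
    //    (the bug above bites if you do it the other way around).
    let (room, _events) = Room::connect(url, token).await?;

    // 2. Now it's safe to create the source and the local track.
    let source = NativeVideoSource::default();
    let track = LocalVideoTrack::create_video_track(
        "gst-video",
        RtcVideoSource::Native(source.clone()),
    );

    // 3. Publish; frames pushed via source.capture_frame() go out encoded.
    room.local_participant()
        .publish_track(
            LocalTrack::Video(track),
            TrackPublishOptions {
                source: TrackSource::Camera,
                ..Default::default()
            },
        )
        .await?;
    Ok(())
}
```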
I still have no frames but I think there is something wrong with gstreamer
Also, you don’t need to call livekit-cli to generate a token
There is a crate in the repo called `livekit-api` (still not published to crates.io)
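Until it hits crates.io it has to come in as a git dependency, but once it's importable, minting a token looks roughly like this (a sketch against the crate's `access_token` module; the key, secret, identity, and room values are placeholders):
```rust
use livekit_api::access_token::{AccessToken, VideoGrants};

fn make_token() -> Result<String, Box<dyn std::error::Error>> {
    // Keys can also come from LIVEKIT_API_KEY / LIVEKIT_API_SECRET;
    // passed explicitly here for clarity.
    let token = AccessToken::with_api_key("api-key", "api-secret")
        .with_identity("rust-publisher")
        .with_grants(VideoGrants {
            room_join: true,
            room: "my-room".to_string(),
            ..Default::default()
        })
        .to_jwt()?;
    Ok(token)
}
```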
a
Oh. Cool. I saw it in the repo, but wasn't able to import it. I knew that was a temporary workaround
👍 1
b
I don’t receive anything with the app_sink, I’m not sure why tho
(my gstreamer installation seems broken)
a
Yeah -- getting it set up can be a pain. You can try with the `gst-launch-1.0` command up in the thread. I can try it again in 30m or so -- I set a breakpoint inside the sample loop and it was getting hit, but it's still possible my frames are screwy
b
Yes it is working for me, image seems cropped tho
a
yeah, that's my broken argb to I420 conversion... I removed that, and uncommented the direct buffer copies, and it works fine now 🙌
🙌 2