# sdk-rust
d
```rust
// imports assumed for this snippet (paths follow the livekit SDK examples;
// H264Reader comes from the separate `webrtc` (webrtc-rs) crate)
use std::{fs::File, io::BufReader, sync::Arc, time::Duration};
use livekit::options::{TrackPublishOptions, TrackSource};
use livekit::track::{LocalTrack, LocalVideoTrack};
use livekit::webrtc::native::yuv_helper;
use livekit::webrtc::prelude::*;
use tokio::sync::Mutex;
use webrtc::media::io::h264_reader::H264Reader;

const FB_WIDTH: usize = 640;
const FB_HEIGHT: usize = 360;
const PIXEL_SIZE: usize = 4; // bytes per pixel for ABGR

#[derive(Clone)]
struct FrameData {
    framebuffer: Arc<Mutex<Vec<u8>>>,
    video_frame: Arc<Mutex<VideoFrame<I420Buffer>>>,
}

// `room` is an already-connected Room
let source = NativeVideoSource::new(VideoResolution {
    width: FB_WIDTH as u32,
    height: FB_HEIGHT as u32,
});
let track = LocalVideoTrack::create_video_track("file", RtcVideoSource::Native(source.clone()));
room.local_participant()
    .publish_track(
        LocalTrack::Video(track),
        TrackPublishOptions {
            source: TrackSource::Camera,
            ..Default::default()
        },
    )
    .await
    .unwrap();

tokio::spawn(async move {
    let file = File::open("test.h264").unwrap();
    let reader = BufReader::new(file);
    let mut h264 = H264Reader::new(reader, 1_048_576);

    // pace frames at ~30 fps
    let mut ticker = tokio::time::interval(Duration::from_millis(33));

    let data = FrameData {
        framebuffer: Arc::new(Mutex::new(vec![0u8; FB_WIDTH * FB_HEIGHT * PIXEL_SIZE])),
        video_frame: Arc::new(Mutex::new(VideoFrame {
            rotation: VideoRotation::VideoRotation0,
            buffer: I420Buffer::new(FB_WIDTH as u32, FB_HEIGHT as u32),
            timestamp_us: 0,
        })),
    };

    loop {
        // next_nal() is blocking I/O; it returns Err at end of file
        let nal = match h264.next_nal() {
            Ok(nal) => nal,
            Err(err) => {
                println!("All video frames parsed and sent: {err}");
                break;
            }
        };

        let data1 = nal.data.freeze();
        println!("{}", data1.len());

        let mut framebuffer = data.framebuffer.lock().await;
        let mut video_frame = data.video_frame.lock().await;
        let i420_buffer = &mut video_frame.buffer;

        let (stride_y, stride_u, stride_v) = i420_buffer.strides();
        let (data_y, data_u, data_v) = i420_buffer.data_mut();

        // NOTE: this copies an *encoded* NAL unit into a buffer that
        // abgr_to_i420 expects to hold raw ABGR pixels; this size/format
        // mismatch is "problem 2" discussed below
        *framebuffer = data1.to_vec();
        println!("{}", framebuffer.len());
        println!("{}", (FB_WIDTH * PIXEL_SIZE) * FB_HEIGHT);

        yuv_helper::abgr_to_i420(
            &framebuffer,
            (FB_WIDTH * PIXEL_SIZE) as u32,
            data_y,
            stride_y,
            data_u,
            stride_u,
            data_v,
            stride_v,
            FB_WIDTH as i32,
            FB_HEIGHT as i32,
        )
        .expect("panic");

        // hand the raw frame to libwebrtc, which encodes and sends it
        source.capture_frame(&*video_frame);

        let _ = ticker.tick().await;
    }

    Result::<(), ()>::Ok(())
})
.await
.unwrap()
.unwrap();
```
e
AFAIK the rust sdk doesn't give you access to the encoded frames
because libwebrtc doesn't give you access to the encoded frames
so the `video_frame` you are getting should be a decoded frame
the decoding is abstracted away, it's done for you by libwebrtc
have a look at the wgpu_room example to see an example of how it's used
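on the receive side it's the same idea in reverse: you subscribe and read already-decoded frames from a stream, roughly like this (untested sketch from memory, so treat the exact module paths as approximate):
```rust
use futures::StreamExt;
use livekit::webrtc::prelude::*;
use livekit::webrtc::video_stream::native::NativeVideoStream;

// `rtc_track` comes from a subscribed RemoteVideoTrack (track.rtc_track())
async fn consume_frames(rtc_track: RtcVideoTrack) {
    let mut stream = NativeVideoStream::new(rtc_track);
    // every frame here was already decoded by libwebrtc --
    // you get raw pixel buffers, never h264 NAL units
    while let Some(frame) = stream.next().await {
        let buffer = frame.buffer.to_i420();
        println!("got a {}x{} frame", buffer.width(), buffer.height());
    }
}
```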
d
at this step it doesn't matter whether the frame is decoded or encoded
and as far as I know the rust sdk supports h264 too
and yes I saw the wgpu example
e
encoded = h264 nal units, decoded = no nal units, not h264
d
the problem is that the code won't go inside the while block and will block on `next().await`
oh you talking about problem 2 I think
e
ah let me read again your code, I misunderstood
d
I was talking about problem 1
e
ok, I was missing some context, sorry
d
So I should publish raw video data, that's ok. Now what about receiving buffers? Should they be raw too?
e
maybe libwebrtc is failing to handle the h264 stream you are sending
you could try enabling logs for libwebrtc to have more visibility on what's going on
```sh
export RUST_LOG=libwebrtc=debug
```
d
hmm... I will try that
e
on problem 2
are you writing the NAL units to the framebuffer?
d
yes
Now I got what you are trying to say
e
yeah the framebuffer is supposed to be the decoded / raw frame
d
the problem is the data is encoded, so its size is small and it gives the
```
dst isn't large enough
```
error
and the sdk gives and takes raw video data
so now I should find some raw video data to ensure that sending and receiving video to/from livekit works correctly
e
hmm the I420 is the chroma / luma frame
that will be encoded later
> so now I should find some raw video data to ensure that sending and receiving video to/from livekit works correctly
ah yep!
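btw you don't even need a file, you can generate a test pattern in code and feed it to capture_frame directly, something like this (untested sketch reusing `data`, `source` and `ticker` from your snippet):
```rust
// fill the ABGR framebuffer with a moving gradient, convert to I420, capture
let mut abgr = vec![0u8; FB_WIDTH * FB_HEIGHT * PIXEL_SIZE];
for frame_idx in 0usize.. {
    for y in 0..FB_HEIGHT {
        for x in 0..FB_WIDTH {
            let i = (y * FB_WIDTH + x) * PIXEL_SIZE;
            abgr[i] = 255;                       // A: opaque
            abgr[i + 1] = (x + frame_idx) as u8; // B: scrolling gradient
            abgr[i + 2] = y as u8;               // G: vertical gradient
            abgr[i + 3] = frame_idx as u8;       // R: pulses over time
        }
    }

    let mut video_frame = data.video_frame.lock().await;
    let i420 = &mut video_frame.buffer;
    let (stride_y, stride_u, stride_v) = i420.strides();
    let (data_y, data_u, data_v) = i420.data_mut();

    // this time the source really is raw ABGR, so the conversion succeeds
    yuv_helper::abgr_to_i420(
        &abgr,
        (FB_WIDTH * PIXEL_SIZE) as u32,
        data_y, stride_y,
        data_u, stride_u,
        data_v, stride_v,
        FB_WIDTH as i32,
        FB_HEIGHT as i32,
    )
    .expect("conversion failed");

    source.capture_frame(&*video_frame);
    ticker.tick().await;
}
```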
d
I don't know much about video frame types such as I420, so since the wgpu example used I420 I will go with that too
e
like, you can see it as a weird bitmap
that's why there are helper functions like `abgr_to_i420`
it will convert from abgr (alpha (?) blue green red pixel format) to i420 (the weird bitmap that separates chroma / luma)
but it's a "decoded" frame, it will be encoded into h264 (or vp8, etc) later
there is no API I know of in the rust SDK that allows you to send encoded data
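to make the size mismatch concrete, for your 640x360 case (plain arithmetic, nothing sdk-specific):
```rust
// raw ABGR frame: 4 bytes per pixel
let abgr_size = 640 * 360 * 4; // 921_600 bytes
// I420: full-res Y plane + quarter-res U and V planes
let i420_size = 640 * 360          // Y
    + (640 / 2) * (360 / 2)        // U
    + (640 / 2) * (360 / 2);       // V -> 345_600 bytes
// a single h264 NAL unit is usually a few hundred bytes to a few KB,
// so the converter's size checks fail when you feed it one
println!("abgr: {abgr_size} bytes, i420: {i420_size} bytes");
```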
d
hmm.... thanks very much
I'm not sure if it's good or bad that it doesn't have this feature. It would be better to have it, so I could encode/decode my own way and, for example, reduce its cpu usage (btw I don't know what its cpu usage is yet 😁)
e
yeah I had the same issue, mostly on the decoding side
I guess their goal was to abstract the encoding/decoding part as it's very complicated and you never know which codecs are supported on each end
but yeah, when you have a specific use case and want to optimize resource usage, it's a big constraint
d
It's interesting because the go sdk gives access to the rtp packets, which include the encoded video, so I can do whatever I want with that
For example with the rust sdk, if I want to just forward the video somewhere else, I have to re-encode the video frames
e
yeah it's a side-effect of their choice of using libwebrtc
d
For problem 1, the `RUST_LOG=libwebrtc=debug` you suggested doesn't have any effect
e
weird, I use it all the time
how did you set it?
d
exactly as you said
```sh
export RUST_LOG=libwebrtc=debug
```
e
are you on windows, linux or mac?
d
then "cargo run"
linux
e
well that should be enough 🤔
d
also tried these:
```sh
RUST_LOG=libwebrtc=debug cargo run
RUST_LOG=debug cargo run
```
e
try adding https://crates.io/crates/env_logger to your project (or pretty_env_logger)
(make sure to run `env_logger::init();` or equivalent)
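e.g. at the top of main:
```rust
fn main() {
    // reads the RUST_LOG env var, e.g. RUST_LOG=libwebrtc=debug
    env_logger::init();
    // ... rest of the app
}
```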
d
yeah, I just had to add the init function, now it's working, thanks :))
e
😄
d
it says this: "FFmpeg H.264 decoder not found" but both ffmpeg and x264 are installed 😞
e
hmm weird, which version of the SDK are you using?
you can recompile libwebrtc with h264 support and use it
but the latest sdk should already have it
d
I'm using the latest version
I also installed openh264 and it still gives that error
e
what you need to check is if the libwebrtc that the sdk is using was compiled with h264 support
you can compile it yourself if not
e
yep, it should be enabled by default
to compile it yourself, clone the sdk repo, go to webrtc-sys/libwebrtc and run the build script
then you write the path to an env var:
```sh
export LK_CUSTOM_WEBRTC=<path_to_repo>/webrtc-sys/libwebrtc/
```
ah wait
d
if it has it enabled by default then I don't need to compile it myself 🤔
e
indeed
the error says something else
it's about your environment
d
maybe there is a problem with my ffmpeg installation ...
e
yep, seems like that
d
what should I install besides openh264?
e
d
so it seems that they fixed this problem in a new version
and I have these in cargo.toml
```toml
livekit = "*"
livekit-api = "*"
livekit-protocol = "*"
livekit-webrtc = "*"
```
and I ran the `cargo update` command but I'm still getting this error ...
So I downloaded the latest webrtc build from the releases page and setting `LK_CUSTOM_WEBRTC` fixed the problem
Now I have to deal with problem 2 and find some raw video ... 🙂