wooden-controller-23004
08/01/2023, 4:25 PM
avformat_open_input calls fail with "Protocol not found". If I unlink LiveKit, they work fine. I don't make any actual calls into the LiveKit SDK (having a call to LiveKit in an unused function, just to ensure it gets linked, is enough to break it).
This is true for ffmpeg calls with both remote and local paths, and with both h264 and non-h264 encoded videos.
My guess is that webrtc statically bundles an ffmpeg compiled with minimal features (and with patches, apparently), and since I have to build the whole thing statically to get LiveKit to work, the ffmpeg-next lib links that ffmpeg version as well. If that is the case, I don't know how to change it.
I'm not sure what can help here, but I wonder if you have any thoughts. Maybe a shared webrtc build would help, since I could then use specific ffmpeg DLLs, but I don't know where to start on that.

enough-zebra-80131
08/04/2023, 8:53 AM

enough-zebra-80131
08/07/2023, 1:12 PM

wooden-controller-23004
08/08/2023, 12:59 PM
I'm running the basic_room example, but it crashes on macOS (x86_64) when connecting to a wss://*.livekit.cloud/ URL. I changed the origin in Cargo.toml to point to main, but otherwise made no changes (and the same error occurs using 0.2.0).
Running this in the original client-sdk-rust repo works fine.
My repo is at https://github.com/robtfm/basic_room, set up to wait for remote login after the build step in case that's useful for you.

ambitious-lunch-95812
08/31/2023, 11:44 AM

many-evening-20972
09/19/2023, 6:51 PM

adamant-winter-76832
09/26/2023, 9:19 PM

brainy-table-6082
10/18/2023, 10:54 PM

blue-grass-88145
10/28/2023, 3:24 PM

blue-grass-88145
11/01/2023, 1:52 PM

enough-zebra-80131
11/03/2023, 8:25 PM
Using livekit_api::signal_client::SignalClient, I seem to get disconnected exactly one minute after connecting.

dry-farmer-50272
11/06/2023, 6:47 AM
tokio::spawn(async move {
    let file = File::open("test.h264").unwrap();
    let reader = BufReader::new(file);
    let mut h264 = H264Reader::new(reader, 1_048_576);
    let mut ticker = tokio::time::interval(Duration::from_millis(33));
    loop {
        let nal = match h264.next_nal() {
            Ok(nal) => nal,
            Err(err) => {
                println!("All video frames parsed and sent: {err}");
                break;
            }
        };
        let frame_data = nal.data.freeze(); // data byte array
        // publish frame_data to livekit
        let _ = ticker.tick().await;
    }
    Result::<(), ()>::Ok(())
}).await.unwrap().unwrap();
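For context on what next_nal yields in the loop above: an H264Reader walks an Annex B byte stream and returns each NAL unit payload found between start codes. A minimal std-only sketch of that splitting, as a hypothetical helper (not the actual H264Reader implementation):

```rust
/// Split an Annex B H.264 byte stream into NAL unit payloads by scanning
/// for 0x000001 / 0x00000001 start codes (start codes are stripped).
fn split_nals(data: &[u8]) -> Vec<&[u8]> {
    // Record where each NAL payload begins (just past its start code).
    let mut starts = Vec::new();
    let mut i = 0;
    while i + 3 <= data.len() {
        if data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 1 {
            starts.push(i + 3);
            i += 3;
        } else if i + 4 <= data.len()
            && data[i] == 0 && data[i + 1] == 0 && data[i + 2] == 0 && data[i + 3] == 1
        {
            starts.push(i + 4);
            i += 4;
        } else {
            i += 1;
        }
    }
    // Each payload runs until the next start code (or end of stream).
    let mut nals = Vec::new();
    for (k, &s) in starts.iter().enumerate() {
        let end = match starts.get(k + 1) {
            // Back up over the next start code: 4 bytes if it was 00 00 00 01.
            Some(&next) if next >= 4 && data[next - 4..next] == [0, 0, 0, 1] => next - 4,
            Some(&next) => next - 3,
            None => data.len(),
        };
        nals.push(&data[s..end]);
    }
    nals
}
```

For a stream like `00 00 00 01 67 AA 00 00 01 68`, this yields the two payloads `[67, AA]` and `[68]`.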
blue-grass-88145
11/06/2023, 1:03 PM

adorable-dentist-43121
11/07/2023, 5:18 PM

blue-grass-88145
11/10/2023, 10:49 AM

blue-grass-88145
11/12/2023, 11:51 AM

adorable-dentist-43121
11/16/2023, 6:42 PM
Passing ws://127.0.0.1:7880 to CreateRoom panics:
thread 'main' panicked at src/main.rs:44:10:
called `Result::unwrap()` on an `Err` value: Twirp(Request(reqwest::Error { kind: Builder, url: Url { scheme: "ws", cannot_be_a_base: false, username: "", password: None, host: Some(Ipv4(127.0.0.1)), port: Some(7880), path: "/twirp/livekit.RoomService/CreateRoom", query: None, fragment: None }, source: BadScheme }))
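The BadScheme error comes from reqwest: the Twirp-based RoomService API is plain HTTP, so the server URL needs an http:// or https:// scheme rather than ws:// or wss://. A std-only sketch of the rewrite, as a hypothetical helper (not part of livekit-api):

```rust
/// Rewrite a ws/wss URL to its http/https equivalent; Twirp endpoints
/// such as /twirp/livekit.RoomService/CreateRoom are served over HTTP.
fn to_http_scheme(url: &str) -> String {
    if let Some(rest) = url.strip_prefix("wss://") {
        format!("https://{rest}")
    } else if let Some(rest) = url.strip_prefix("ws://") {
        format!("http://{rest}")
    } else {
        url.to_string() // already http(s) or something else; leave untouched
    }
}
```

For example, to_http_scheme("ws://127.0.0.1:7880") yields "http://127.0.0.1:7880", which reqwest will accept as a base URL.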
adorable-dentist-43121
11/16/2023, 8:09 PM
status: Internal, message: "twirp error: failed to execute the request: error decoding response body: expected value at line 1 column 1", details: [], metadata: MetadataMap { headers: {"content-type": "application/grpc", "date": "Thu, 16 Nov 2023 20:06:22 GMT", "content-length": "0"} }
And the server has this:
WARN livekit service/auth.go:84 error handling request {"status": 401, "error": "invalid API key: dev"}
Is the error not wrapped and passed downstream, or am I doing something wrong?
Code is as follows:
let create_room_result = client
.create_room(
Uuid::new_v4().to_string().as_str(),
CreateRoomOptions::default(),
)
.await;
let room = match create_room_result {
Ok(room) => room,
Err(err) => return Err(Status::internal(err.to_string())),
};
dry-farmer-50272
11/18/2023, 8:22 AM
RemoteTrack::Video(video_track) => {
    let rtc_track = video_track.rtc_track();
    let mut video_stream = NativeVideoStream::new(rtc_track);
    while let Some(video_frame) = video_stream.next().await {
        println!("reading...");
    }
}
The "reading..." line is never printed, which means it blocks on video_stream.next().await and frames are never received. With VP8 encoding it works and "reading..." is printed; the problem only occurs with h264.
2- When I try to publish or receive tracks, I get a Convert("dst isn't large enough") error.
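A "dst isn't large enough" style error typically means the destination buffer is smaller than the frame conversion requires. As a hedged illustration (not the SDK's actual check), for the common I420 layout a width x height frame needs a full-resolution luma plane plus two half-resolution chroma planes:

```rust
/// Minimum byte length of an I420 buffer for an even width x height frame:
/// one full-size Y plane plus two (width/2) x (height/2) chroma planes.
fn i420_buffer_len(width: usize, height: usize) -> usize {
    let y = width * height;
    let chroma = (width / 2) * (height / 2);
    y + 2 * chroma
}
```

So a 640x480 frame needs 460800 bytes (1.5 bytes per pixel); allocating less than this before a conversion would trigger exactly this kind of error.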
I will send the publish code as a reply.

blue-grass-88145
11/22/2023, 4:14 PM
Is frame.audio.len() the length of the audio in ms, or just the length of the vector holding the audio captured at that instant?

wooden-controller-23004
11/24/2023, 3:49 PM
error C2676: binary '<<': 'rtc::webrtc_checks_impl::LogStreamer<>' does not define this operator or a conversion to a type acceptable to the predefined operator
I saw this on an action here: https://github.com/livekit/rust-sdks/actions/runs/6902465280/job/18779241976. Do you remember what the solution was? Thanks

gifted-scooter-1946
12/01/2023, 12:01 PM
My project is built with libc++ (instead of the default gcc + libstdc++ on Linux).
Now I want to do a proof of concept integrating LiveKit into such a project.
Problem: I need to compile the C++ dependencies of rust-sdks with libc++.
I'm new to Rust, so these questions may be trivial to someone in the know.
Q1: Which Rust dependencies are C++ dependencies, and how can I tell from the source/config code? I know webrtc is a C++ dependency, but is it the only one?
Q2: When I run the cargo build command in my project, is a pre-built webrtc library downloaded under the hood, or is it built/compiled on demand locally?

blue-grass-88145
01/13/2024, 1:04 AM

brief-refrigerator-69901
01/13/2024, 3:31 AM

blue-grass-88145
01/14/2024, 2:16 PM

blue-grass-88145
01/22/2024, 7:42 PM

blue-grass-88145
03/02/2024, 10:20 PM

blue-grass-88145
03/02/2024, 10:21 PM

blue-grass-88145
03/02/2024, 10:21 PM

straight-river-10244
03/08/2024, 5:21 PM