# announcements
m
You mean, different providers? Probably more work than it’s worth. /shrug
The mix-and-match part, specifically
p
i guess i’m just trying to understand webrtc basics… webrtc doesn’t have a standard server interface? like you have wss://[host]:7880 … but another server would be different?
m
Oh, right, so it depends on the needs of your application. But in most cases, you’d probably also have a more traditional web/application server which manages things like your database/persistent storage, auxiliary APIs, user auth, etc.
This could also, of course, be “serverless” — you could use Firebase/Supabase, for example, and your backend would be a collection of cloud functions.
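As a rough sketch of what that app-server piece can look like, here's a token-minting endpoint, assuming Express and LiveKit's livekit-server-sdk; the /token route, the query parameters, and the env var names are placeholders for illustration, not anything prescribed:

```ts
// Sketch of the "traditional app server" piece: it authenticates the user and
// mints a join token; the SFU (LiveKit here) only handles the media.
import express from 'express';
import { AccessToken } from 'livekit-server-sdk';

const app = express();

app.get('/token', async (req, res) => {
  // In a real app you'd check the user's session/auth here first.
  const identity = String(req.query.identity ?? 'anonymous');
  const room = String(req.query.room ?? 'demo');

  const at = new AccessToken(
    process.env.LIVEKIT_API_KEY!,
    process.env.LIVEKIT_API_SECRET!,
    { identity },
  );
  at.addGrant({ roomJoin: true, room });

  res.json({ token: await at.toJwt() });
});

app.listen(3000);
```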
p
as a specific example… the ant media server seems to have more of the recording capabilities i could use. but can i use the livekit flutter client to connect to it?
sorry these are noob questions i’m sure
m
Ah, ok, that’s different then. I haven’t really looked into the kind of interface(s) Ant supports, but you’d most likely have to modify our client to communicate with Ant. What kinds of recording capabilities do you need that aren’t available?
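For context on why, here's a sketch of what connecting with a LiveKit client looks like, using the JS livekit-client package as a stand-in for the Flutter one; the server URL and the /token fetch are placeholders. The client expects a LiveKit signaling URL plus a LiveKit-issued token, and everything after connect() is LiveKit's own protocol, which another SFU wouldn't understand:

```ts
import { Room, RoomEvent } from 'livekit-client';

// Fetch a join token from your own app server (e.g. the /token sketch above).
const { token } = await (await fetch('/token?identity=alice&room=demo')).json();

const room = new Room();
room.on(RoomEvent.TrackSubscribed, (track) => {
  // Attach remote audio/video elements as tracks arrive.
  document.body.appendChild(track.attach());
});

// Connects to a LiveKit-specific signaling endpoint; pointing this at a
// different SFU (e.g. Ant Media) would fail without protocol changes.
await room.connect('wss://your-livekit-host:7880', token);
await room.localParticipant.setCameraEnabled(true);
```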
p
well i’m playing with lots of crazy ideas in my head
will msg you in the private channel
m
Sure, please do!
a
My understanding is that the WebRTC protocols themselves, much like the general browser media APIs, are pretty standard. The part that isn't standard is using an SFU, like LiveKit/Jitsi/etc. So even though the actual way the video is sent across the wire is standardized, the way you tell the browser how to connect to other clients (the signaling) is different. You could probably get some things to mix together, but you'd need to have translation layers for the signaling side of things.
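To make the signaling point concrete, here's a sketch using only the standard browser APIs; the WebSocket endpoint and the JSON message shapes are invented, since that's exactly the part each SFU defines for itself:

```ts
// Plain browser WebRTC: createOffer/setLocalDescription and the SDP/ICE payloads
// are standardized, but the message format used to carry them is not. The JSON
// shapes below are made up; each SFU has its own equivalent.
const pc = new RTCPeerConnection();
pc.addTransceiver('video'); // ask to receive (or send) a video track

const ws = new WebSocket('wss://example-sfu.invalid/signal');

pc.onicecandidate = (e) => {
  if (e.candidate) ws.send(JSON.stringify({ type: 'ice', candidate: e.candidate }));
};

ws.onmessage = async (msg) => {
  const data = JSON.parse(msg.data);
  if (data.type === 'answer') {
    await pc.setRemoteDescription({ type: 'answer', sdp: data.sdp });
  } else if (data.type === 'ice') {
    await pc.addIceCandidate(data.candidate);
  }
};

ws.onopen = async () => {
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  // LiveKit, Jitsi, and Ant each expect something different here; that's the
  // "translation layer" you'd have to write to mix clients and servers.
  ws.send(JSON.stringify({ type: 'offer', sdp: offer.sdp }));
};
```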