# helpdesk
d
Hey Piotr, are you using our Swift SDK for this? Can you tell me a bit about what the custom AVAudioEngine is designed to do?
r
hey David! Yes, I'm using the LiveKit Swift SDK, and essentially what I'm trying to do is reflected in this repository, except I want to configure our own audio engine as a node in AVAudioEngine with a more advanced configuration, since the audio processing is written in Rust. A link to the general concept of how I'm trying to wrap AVAudioEngine into a LiveKit scenario is here: https://github.com/mstyura/RTCAudioDevice
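To illustrate the "custom engine as a node" idea in isolation, here is a minimal sketch using Apple's AVAudioSourceNode; `fillFromRustEngine` is a hypothetical stand-in for the Rust processing, and routing this graph's output into WebRTC is the separate problem the RTCAudioDevice work addresses.

```swift
import AVFoundation

// Hypothetical hook standing in for the Rust audio processing; here it just
// writes silence so the sketch is self-contained.
func fillFromRustEngine(_ out: UnsafeMutablePointer<Float>, _ frames: Int) {
    out.update(repeating: 0, count: frames)
}

let engine = AVAudioEngine()
let format = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)!

// AVAudioSourceNode lets arbitrary code supply PCM to the graph on demand.
let sourceNode = AVAudioSourceNode(format: format) { _, _, frameCount, audioBufferList in
    let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
    for buffer in buffers {
        let samples = buffer.mData!.assumingMemoryBound(to: Float.self)
        fillFromRustEngine(samples, Int(frameCount))
    }
    return noErr
}

engine.attach(sourceNode)
engine.connect(sourceNode, to: engine.mainMixerNode, format: format)
try engine.start()
```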
d
Thanks for sharing this. This seems very cool, but also a bit tricky to do with the current SDK. Do you potentially have another path for passing in those audio buffers? I believe that might be a path we could support more easily than an RTCAudioDevice abstraction.
b
This is pretty cool. I would need to patch WebRTC (ObjC) to expose an option to pass an RTCAudioDevice first.
r
Hi @dry-elephant-14928! I found an SDK for the Rust scenario, specifically client-sdk-rust with the "single_room" example and the file sine_track.rs. Having checked the code, it might be worth trying to keep all my audio processing in Rust and just expose an AudioTrack, AudioSource, or MediaTrack over FFI outside of Rust, so it would be compatible with the Swift SDK (iOS in particular). Do you think that might be an option, or will LiveKit's RTCAudioSessionConfiguration still override the audio coming out of Rust?
Just FYI, here is the example that I found; of course, that FrameData would be swapped for our buffers from the audio engine.
d
Hey @rough-application-44252, eventually we want to move to using the Rust SDK as the "common core" behind all of our SDKs, but in the meantime it's not possible to mix Rust with the Swift SDK today. If you'd like to use the Rust SDK, you could just write a Swift wrapper around it. We're also working on an FFI interface for the Rust SDK, and it should be ready soon.
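For what it's worth, such a wrapper might look something like the sketch below. Every `lk_`-prefixed symbol here is hypothetical (the FFI interface mentioned above wasn't published yet), so treat it as the shape of the idea rather than a real API.

```swift
import Foundation

// Hypothetical C declarations that would come from a bridging header once the
// Rust FFI exists, e.g.:
//   void *lk_audio_source_new(uint32_t sample_rate, uint32_t channels);
//   void  lk_audio_source_capture(void *src, const int16_t *pcm, uint32_t samples);

final class RustAudioSource {
    private let handle: UnsafeMutableRawPointer

    init(sampleRate: UInt32, channels: UInt32) {
        // Assumes the hypothetical bridging header above; no such symbol exists today.
        handle = lk_audio_source_new(sampleRate, channels)
    }

    // Push PCM captured from an AVAudioEngine tap into the Rust-side source.
    func capture(_ pcm: UnsafePointer<Int16>, samples: UInt32) {
        lk_audio_source_capture(handle, pcm, samples)
    }
}
```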
r
Alright. So to sum up: at this point it might be worth creating wrappers between Swift and Rust to "trick" the Local/RemoteAudioTracks in the LiveKit SDK for iOS, but sticking with the audio processing in Rust, if I understand correctly?
@better-house-11852 I found that everything I need is already exposed in WebRTC's RTCPeerConnectionFactory, but for some reason the LiveKit SDK for Swift/ObjC doesn't support it. The source code is here: https://webrtc.googlesource.com/src/+/master/sdk/objc/api/peerconnection/RTCPeerConnectionFactory.h
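Based on that header, the upstream factory can take a custom audio device at construction time, so, assuming LiveKit's bundled WebRTC build exposed that initializer, the Swift side could look roughly like this; `EngineAudioDevice` is a hypothetical conformance to the RTCAudioDevice protocol, with its methods omitted.

```swift
import WebRTC

// EngineAudioDevice would implement the RTCAudioDevice protocol from the header
// linked above, bridging our AVAudioEngine graph; conformance omitted for brevity.
let factory = RTCPeerConnectionFactory(
    encoderFactory: RTCDefaultVideoEncoderFactory(),
    decoderFactory: RTCDefaultVideoDecoderFactory(),
    audioDevice: EngineAudioDevice()
)
```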