# sdk-native
  • i

    incalculable-toddler-50323

    10/15/2025, 2:00 PM
    Hi, we use the LiveKit React Native SDK in our app. We've been experiencing issues with audio on Android phones (specifically Samsung ones). These issues have been of two types:
    • The assistant interrupts itself, and its sound keeps getting fed back into the device (prevalent on Galaxy S23 / S25 Ultra and other Galaxy devices).
    • The voice input is not registered properly, and one has to speak very loudly for the voice to get through (we were able to recreate this on a Galaxy S21 5G).
    We have Krisp disabled on the device and use BVC. In our React Native mobile app, our versions are:
    "@livekit/react-native": "^2.9.1",
    "@livekit/react-native-webrtc": "^125.0.12",
    Our current audio options in JavaScript are configured as:
    AudioSession.configureAudio({ android: { audioTypeOptions: AndroidAudioTypePresets.media } });
    We're trying other audio options now; if someone has a recommended audio config specifically for Android, let us know and we'll try that (a sketch of what we're testing next is below). If not, any other leads on this issue would be helpful!
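    For reference, this is roughly the change we're testing: a minimal sketch assuming @livekit/react-native exposes AndroidAudioTypePresets.communication (check the names against your SDK version).
    // Sketch: switch from the 'media' preset to the 'communication' preset.
    // On Android, the communication preset uses the voice-communication audio mode,
    // which enables the device's hardware echo canceller and AGC; that is often
    // what Samsung devices need to stop the agent's output feeding back into the mic.
    import { AudioSession, AndroidAudioTypePresets } from '@livekit/react-native';

    async function configureAndroidAudio() {
      await AudioSession.configureAudio({
        android: {
          audioTypeOptions: AndroidAudioTypePresets.communication,
        },
      });
      await AudioSession.startAudioSession();
    }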
    👀 1
    • 1
    • 3
  • d

    damp-painting-93551

    10/15/2025, 3:50 PM
    Hi, I am trying to integrate a voice agent using LiveKit in my React Native mobile app, but the app is unable to establish a connection with the agent. The agent is deployed in the backend and I am passing tokens to the frontend. The token generation is correct, as I am able to connect to the agent using the LiveKit starter examples. Can anyone help? Thanks!
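    What I've been using to narrow this down: a minimal debugging sketch (assuming the standard livekit-client Room/RoomEvent API that @livekit/react-native builds on) that separates "my client never connects" from "the agent never gets dispatched to the room".
    import { Room, RoomEvent, RemoteParticipant } from 'livekit-client';

    async function debugAgentJoin(wsUrl: string, token: string) {
      const room = new Room();
      room
        .on(RoomEvent.Connected, () => {
          // If this fires, the URL and token are fine and the client is in the room.
          console.log('local client connected as', room.localParticipant.identity);
        })
        .on(RoomEvent.ParticipantConnected, (p: RemoteParticipant) => {
          // A correctly dispatched agent shows up here as a remote participant.
          console.log('remote participant joined:', p.identity);
        });
      await room.connect(wsUrl, token);
    }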
    • 3
    • 3
  • d

    damp-painting-93551

    10/15/2025, 3:53 PM
    publishing track {"enabled": true, "kind": "audio", "muted": false, "pID": undefined, "participant": undefined, "room": undefined, "roomID": undefined, "source": "microphone", "streamID": "ABCXXXX", "streamTrackID": "ABCDYYYYY", "trackID": undefined}
  • c

    cold-hydrogen-40655

    10/16/2025, 12:03 PM
    SessionAPIUtilities.h:176 AudioSessionGetProperty (kMXSessionProperty_HasEchoCancelledInput) failed with error: -50 (sessionID: 0x0)
    I'm seeing this error while trying to set up LiveKit in my React Native Expo app (Ignite boilerplate). Any suggestions?
  • c

    cold-hydrogen-40655

    10/16/2025, 12:07 PM
    We have "react-native": "0.72.10",
  • c

    cuddly-florist-34090

    10/16/2025, 1:27 PM
    Hi Team, there is an intermittent issue we are seeing. We are not disabling the camera or changing publishing permissions, but the track is still being unpublished as soon as the user publishes it. What could be the reason for this? Is there any ongoing issue with LiveKit? Here are the logs for a session we captured; please check. https://cloud.livekit.io/projects/p_2y5csewzsft/sessions/RM_4UxSJTQo92Uj/participants/PA_BFZaecDo69DB Thank you. cc: @plain-wall-1315
    • 2
    • 1
  • e

    eager-carpet-98546

    10/17/2025, 1:18 PM
    Hey everyone 👋 We're running into a critical issue in our Flutter app after updating livekit_client to 2.5.2 (which depends on flutter_webrtc 1.2.0) — we had to update for Google Play 16 KB page size compliance. Here's what's happening (on both Android & iOS):
    • When the remote participant joins before the local one, both the audio and video tracks of the remote participant are null on the local side, even though the remote participant has already published them.
    • Audio can still be heard, but the track objects remain null.
    • When the remote participant mutes/unmutes the mic or toggles video, no track events are fired — it behaves like the tracks were never published.
    • When the local participant joins first, everything works fine.
    This didn't occur with livekit_client 2.4.8 and flutter_webrtc < 1.0. It's a critical blocker for us — has anyone else run into this or found a workaround? 🙏 I see a very similar issue on GitHub, but the explanation there isn't great, so I wanted to spell it out here, since GitHub issues are often ignored. https://github.com/livekit/client-sdk-flutter/issues/895
    👀 1
    • 3
    • 5
  • s

    square-electrician-69423

    10/17/2025, 4:25 PM
    Hello everyone, I'm building a voice chat app using React Native (Expo) and LiveKit. My app was working perfectly a week ago, but now it's stuck on "connecting..." and never finishes. It's like it's frozen on the loading screen. Here's the confusing part:
    • My LiveKit server is working fine. I tested it with a different example app, and it connected without any problems.
    • This means the problem isn't with my server or my login keys.
    • I haven't changed any of my code since it was last working.
    What I see in my app: the logs show that it starts to connect, but it never succeeds. It doesn't even get to a point where it shows an "error" or "disconnected" message. It's just stuck.
    import { LogLevel, setLogLevel } from 'livekit-client';
    setLogLevel(LogLevel.debug);
    Logs
    room event connectionStateChanged {"args": ["disconnected"], "event": "connectionStateChanged", "pID": "", "participant": "", "room": "", "roomID": undefined}
    I also tried listening manually using room.on(RoomEvent, () => {}) and am getting only these logs:
    [LiveKit] Event: connectionStateChanged connecting
    LOG  room event audioPlaybackChanged {"args": [], "event": "audioPlaybackChanged", "pID": "", "participant": "", "room": "", "roomID": undefined}
    [LiveKit] Event: audioPlaybackChanged false
    The onError in <LiveKitRoom> isn't showing any errors. The main clue: I haven't changed anything in the frontend or backend since it last worked. The example project connects to my server but my app doesn't. Has anyone faced a similar situation where your app gets stuck "connecting" even though your LiveKit server is proven to be working? Any ideas what could have changed or what I should check in my React Native project?
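    To narrow it down, here's the listener setup I'm adding next: a minimal sketch assuming the standard livekit-client event names (note that room.on() needs a specific RoomEvent member, not the RoomEvent enum itself), so I can see which leg of the connection is stalling.
    import { Room, RoomEvent, ConnectionState, DisconnectReason } from 'livekit-client';

    function attachDebugListeners(room: Room) {
      room
        .on(RoomEvent.SignalConnected, () => {
          // Fires once the websocket / signal handshake completes.
          console.log('[debug] signal connected');
        })
        .on(RoomEvent.ConnectionStateChanged, (state: ConnectionState) => {
          console.log('[debug] connection state:', state);
        })
        .on(RoomEvent.Disconnected, (reason?: DisconnectReason) => {
          console.log('[debug] disconnected, reason:', reason);
        });
    }

    // If SignalConnected never fires, the websocket to the server is not completing
    // (URL, token, or TLS issue); if it fires but the state never reaches 'connected',
    // the media/peer connection is likely being blocked (network, firewall, UDP path).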
    • 4
    • 7
  • s

    silly-camera-44273

    10/17/2025, 6:00 PM
    Are we able to register a custom video track in React Native, or are we locked into only the camera or native screen share?
  • b

    broad-leather-49642

    10/20/2025, 10:51 AM
    Hello, everyone. I have been successfully using the LiveKit SDK for Unity WebGL for months now. A few days ago, however, I noticed that when I start a call like before (no code changes to the call mechanics for months), every user hears themselves with an echo. It doesn't matter if they are wearing headphones or just use the laptop mic; each user hears themselves when speaking. Does anyone know anything about this or could suggest some ideas to try to fix it? Again, it used to work fine before and I have made no changes to the code. I am on the latest version of the SDK and always have been; there are no updates available, at least not in the docs. It is version 2.0.1.
  • b

    broad-leather-49642

    10/20/2025, 2:25 PM
    Windows has had some updates so it could be related to that
  • w

    wide-shoe-67933

    10/20/2025, 3:40 PM
    I need some help figuring some stuff out here. I just updated the Flutter SDK from version 3.2.3 to 3.5.1. Everything was working fine with 3.2.3, but now when people join on iOS (iPhone 12 / iOS 18.7.1), they sometimes don't see other participants because room.remoteParticipants is empty (and in turn, our UI doesn't update properly) after I've connected. The funny part is, when I connect on iOS with the microphone muted, everything works. Can someone help me figure out why users connecting with/without the microphone on iOS affects the room.remoteParticipants list of values? Edit: everyone connects and they can still hear each other, but since the UI is built based on what's inside room.remoteParticipants, they can't see each other. These are the values for room.remoteParticipants if I listen to all room changes. Again, why are these values null, why can other participants control them, or am I doing something extremely weird in the way I handle my remote participants?
    Connecting without microphone
    flutter: videoTrackPublications.firstOrNull?.enabled: null
    flutter: videoTrackPublications.firstOrNull?.muted: null
    flutter: videoTrackPublications.firstOrNull?.track?.isActive: null
    
    Connecting with microphone
    flutter: videoTrackPublications.firstOrNull?.enabled: true
    flutter: videoTrackPublications.firstOrNull?.muted: false
    flutter: videoTrackPublications.firstOrNull?.track?.isActive: true
    Edit: saw that this issue was already reported and is being worked on.
    👀 1
    • 2
    • 6
  • g

    gray-father-36165

    10/22/2025, 7:50 AM
    Hello there. I can see a lot of packages that expose Krisp functionality to Web, Swift, Android, etc., but I was wondering if it's possible to enable it using the rust-sdks directly? Searching through the codebase didn't yield much in that regard, but maybe there are plans to add it there? Or, if the effort to add it is small, maybe a brief description of how, and then we could potentially add it ourselves.
  • f

    fresh-megabyte-17370

    10/27/2025, 10:17 AM
    Hi Team, trying to build the sample Swift application from the repo https://github.com/livekit-examples/swift-example and getting these compilation errors. Any idea how to resolve them?
    LiveKitExample-havcspdeirrqylgwxhmyyqjsuptx/SourcePackages/checkouts/components-swift/Sources/LiveKitComponents/Audio/AudioProcessor.swift:519: Task-isolated value of type '() async -> ()' passed as a strongly transferred parameter; later accesses could race
    👀 1
    • 1
    • 1
  • q

    quiet-exabyte-79192

    10/27/2025, 9:22 PM
    With the LiveKit React Native screen share example: https://github.com/livekit/client-sdk-react-native?tab=readme-ov-file#ios-1 https://jitsi.github.io/handbook/docs/dev-guide/dev-guide-ios-sdk/#screen-sharing-integration We got the system more or less working, but we are running into an issue when the user opens the broadcast picker and then closes it without choosing to start the broadcast. When we launch the broadcast picker and then don't select a broadcast, we are unable to kill the screen_share track generated by calling setScreenShareEnabled(true). When we select the broadcast and run it, we have no issue killing the screen share track. Are there any docs showing how to handle this? It doesn't look like there is a callback or anything that lets us know the picker was closed without an action. (A workaround sketch we're considering is below.)
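    In case it's useful, this is the cleanup we're experimenting with — a sketch only (it assumes livekit-client's Track.Source / getTrackPublication / unpublishTrack names and is a guess, not a documented pattern), which we call when we suspect the picker was dismissed without starting the broadcast.
    import { Room, Track } from 'livekit-client';

    async function resetScreenShare(room: Room) {
      // Look up the dangling screen_share publication created by setScreenShareEnabled(true).
      const pub = room.localParticipant.getTrackPublication(Track.Source.ScreenShare);
      if (pub?.track) {
        // Unpublish and stop the track that never received broadcast frames.
        await room.localParticipant.unpublishTrack(pub.track, true);
      }
      // Reset the SDK's internal screen-share state as well.
      await room.localParticipant.setScreenShareEnabled(false);
    }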
    • 1
    • 1
  • f

    fresh-megabyte-17370

    10/28/2025, 4:03 PM
    https://github.com/livekit/components-swift/issues/30 Can someone help with this?
  • r

    rough-bear-42680

    10/29/2025, 10:15 AM
    🆘 Hello LiveKit team! I've opened a PR for our Spitch plugin (TTS/STT). Could someone help review it? https://github.com/livekit/agents/pull/3748
    • 2
    • 2
  • e

    early-twilight-63278

    10/29/2025, 1:27 PM
    🔴 Hi LiveKit Team, I am trying to import LiveKit in a UIKit app with Xcode 26 and I am getting an error saying
    Missing required module 'LKObjCHelpers' error when importing LiveKit
    Can anyone please help me fix this? I am using SPM.
  • f

    fancy-gpu-72499

    10/30/2025, 9:46 PM
    Can anyone confirm a working install for LiveKit with Unity? The default installation instructions are not working for me. I posted an issue on that too: https://github.com/livekit/client-sdk-unity/issues/166
  • f

    fancy-gpu-72499

    10/30/2025, 9:47 PM
    To be clear, I'm also trying version 1.3 currently. The last version that worked for me was 1.2.5 so I'm now trying to troubleshoot across versions and device types. Right now, I can't get 1.3 to work with a Unity build for Android on Mac. Previously on 1.2.5, I was able to do that on Windows. So I'll be investigating a few variants.
  • d

    damp-kite-83575

    11/03/2025, 10:18 AM
    hello, I'm developing an AI voice calling app using LiveKit in React Native (Expo). The app works perfectly when switching to the home screen or other apps like Notes, and even during lock screen — the call continues without any issues. However, on iOS, the call gets disconnected when I open apps such as Snapchat or WhatsApp (which use the microphone). This issue is already fixed on Android, but I’m looking for a solution or guidance on how to keep the call active on iOS even when switching to these apps. Can anyone help with this?
  • f

    freezing-planet-22595

    11/04/2025, 6:02 AM
    Hey all, I had a question about implementing a client SDK in Java for server-to-server communication. I have successfully implemented the Join protocol: I can see a correct JoinResponse, followed by a room Update and then an Offer with various Trickle candidates coming in. If I then use the server SDK to query the room participants, I see my connected client with no problems. I then start sending pings every 3 seconds. About 20 seconds into the entire process, I receive a websocket disconnection (code 1000, reason = null) and my connected client is removed from the LiveKit room. I have been staring at other SDKs for over 5 hours now and am about to crash out; does anyone have any idea why LiveKit is disconnecting me remotely? Edit: We are following everything documented here (https://docs.livekit.io/reference/internals/client-protocol/#joining-a-room).
  • b

    brief-winter-72530

    11/04/2025, 6:52 AM
    Hi everyone, I have a question. I managed to set up a call via LiveKit, but now I need to make it work with other audio sources, like in-game sound or video playback. LiveKit has an AudioManager that handles the audio, and I don't want to disable it—I'd like to extend its functionality. Are there any best practices for this kind of case?
  • a

    average-jackal-43655

    11/04/2025, 9:18 AM
    We're currently facing a critical issue with our iOS app — the outgoing audio call disconnects or gets suspended when the app goes into the background. Our goal is to perform outgoing audio calls (no video) using LiveKit, where we create a room using a server URL and token. The call works perfectly while the app is active, but once it's backgrounded, it stops. We've already:
    • Enabled Background Modes (audio, voip) in both Xcode and app.json / Info.plist
    • Called setupCallService() before registerGlobals in App.tsx
    • Started the native audio session before startCallService()
    • Verified our AVAudioSession configuration
    Despite all these configurations, iOS still suspends the app when it's backgrounded. We couldn't find any proper or complete guide to making LiveKit outgoing audio calls persist in the background on iOS. We urgently need help and detailed guidance or configuration steps to fix this issue; the docs haven't been helpful. (The audio-session setup we're experimenting with is sketched below.)
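    This is the kind of setup we've been testing: a minimal sketch that assumes @livekit/react-native exposes AudioSession.setAppleAudioConfiguration (the category/mode values below are our guess at what keeps iOS treating the app as an active voice session, so verify them against your SDK version).
    import { AudioSession } from '@livekit/react-native';

    // Run this before connecting to the room, so the AVAudioSession is already
    // active as a playAndRecord / voiceChat session when the app is backgrounded.
    async function prepareIOSAudio() {
      await AudioSession.setAppleAudioConfiguration({
        audioCategory: 'playAndRecord',
        audioCategoryOptions: ['allowBluetooth', 'mixWithOthers'],
        audioMode: 'voiceChat',
      });
      await AudioSession.startAudioSession();
    }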
  • q

    quick-ram-26036

    11/05/2025, 9:57 AM
    Hello! Android Compose library user here. I am using the latest Compose LiveKit library, 1.4.0 from July. The core library seems to be getting updates quite frequently, but the 1.4.0 Compose lib is still using 2.18.3 of the core lib; at least that's the version I get as a transitive dependency. Could you update the Compose library to use the latest core one so all available fixes reach Compose users? 🙂
    • 2
    • 2
  • b

    blue-machine-75806

    11/05/2025, 1:17 PM
    Hello, we are trying to use the noise reduction and echo cancellation features of LiveKit (not the AI ones) in our Flutter application. The target platform is Linux. Our configuration looks like this:
    RoomOptions(
      defaultAudioOutputOptions: AudioOutputOptions(speakerOn: true),
      defaultAudioCaptureOptions: AudioCaptureOptions(
        echoCancellation: true,
        noiseSuppression: true,
      ),
    ),
    Both echo cancellation and noise suppression do not seem to have any effect during calls. Could you please confirm whether these features are currently supported in Flutter on Linux, and whether any additional setup or dependencies are required to make them work? Thank you 🙂
    • 2
    • 9
  • a

    adamant-plumber-44781

    11/05/2025, 5:57 PM
    Hey, we're running into a strange issue with LiveKit's iOS SDK and an audio track from a separate source. Repro steps:
    1. Separate-source audio coming through the speaker.
    2. Start a LiveKit conversation (also coming through the speaker).
    3. Go back to the separate audio source; this is now coming out of the in-ear (earpiece) speaker.
    Any ideas on why this would be the case?
    • 2
    • 1
  • d

    damp-kite-83575

    11/05/2025, 7:54 PM
    Hello Community, We're integrating outgoing-only audio calls using the following SDKs in our React Native Expo app: @livekit/react-native-expo-plugin, @config-plugins/react-native-webrtc, @livekit/react-native, @supersami/rn-foreground-service, and react-native-callkeep. Everything works fine in the foreground, but when the app goes into the background, the audio call disconnects after about 1 minute. We have already:
    • Enabled UIBackgroundModes → audio and voip in Info.plist and app.json
    • Set up registerGlobals() and setupService() in App.tsx
    • Given microphone and background permissions
    • Not implemented PushKit (since we only do outgoing calls)
    Our main question: Do we need to add the VoIP entitlement certificate from the Apple Developer Portal even though the app only handles outgoing calls? Or is there any additional iOS configuration, permission, or setup (in Xcode, Info.plist, app.json, or entitlement files) required to keep an outgoing call active in background mode? If anyone can share references, configuration examples, or a working setup for this scenario, it would be really helpful.
    • 2
    • 1
  • g

    gifted-policeman-80044

    11/12/2025, 6:52 PM
    Hello, I couldn't find this in the docs, but here's my question: in the native SDK (iOS specifically) I see that the lib taps directly into the microphone. Is there any way to pass the SDK a stream of audio buffers that will in turn be passed to WebRTC? And if not, is there any way the SDK can feed us the buffers that it records, so we can use them for things like VAD, volume, sound detection, etc.? Thanks 🙂
    • 2
    • 2
  • l

    little-author-63317

    11/13/2025, 6:56 AM
    Hello, I am trying to implement Picture in Picture using the LiveKit Flutter SDK. Android works fine, but on iOS, when I minimize the app, the camera stops working and I see a black screen in PiP. I haven't found a solution. What could be the reason, or does the camera work in PiP on iOS at all?
    • 2
    • 1