# sdk-native
  • g

    glamorous-thailand-53063

    06/30/2025, 6:27 PM
    Has anyone hit this issue before? https://github.com/livekit/client-sdk-swift/issues/731 We are hitting it in the Swift SDK and it's making integration quite difficult
  • i

    incalculable-stone-39027

    07/02/2025, 6:15 AM
    Hi, do you have JSON versions of your Python SDK documentation? Maybe endpoints that serve JSON instead of HTML? Thanks! For example, this and other doc pages: https://docs.livekit.io/reference/python/v1/livekit/agents/index.html
  • r

    rhythmic-plumber-94044

    07/03/2025, 7:12 AM
    Hi guys! Has anyone tried the Unity SDK? I created a screen-share feature, but it stops when the shared screen isn't active (for example, when I'm watching YouTube, the stream stops)
  • s

    stocky-application-19701

    07/03/2025, 9:52 AM
    Hi guys, we are facing this issue in our production app. We'd appreciate it if someone could look into it. https://github.com/livekit/client-sdk-android/issues/700
  • b

    boundless-branch-10227

    07/03/2025, 8:45 PM
    Can someone please respond to this message?
  • f

    flaky-city-98337

    07/04/2025, 12:24 PM
    Hey! It seems like the TextureReceived event listener doesn't trigger for the video track for some unknown reason. Could someone please help me understand why? I'm following the examples shared for Unity. The remote participants are connected via the JS client. https://github.com/livekit/client-sdk-unity/issues/136
  • n

    numerous-cat-27865

    07/09/2025, 2:57 PM
    Hi LiveKit community! We're evaluating LiveKit to replace our existing video conferencing solution and have some questions about native feature support:
    Layout & UI:
    • Multiple video layouts (sidebar + focus view, picture-in-picture, grid view, focus modes)
    • Are these provided out of the box, or do they require custom implementation?
    Collaboration Features:
    • Native whiteboard with real-time collaboration (participants in a sidebar panel)
    • Screen sharing with annotation support
    • In-session chat functionality
    • Breakout room capabilities
    Technical Issues:
    • We're currently experiencing poor screen-sharing quality for remote participants; any recommended settings or troubleshooting steps?
    Documentation:
    • We're having difficulty finding comprehensive docs on these features; any pointers to relevant resources would be appreciated!
    Thanks in advance for any guidance. Really excited about LiveKit's potential for our use case!
  • m

    mysterious-mouse-33403

    07/10/2025, 2:31 AM
    Is there a known issue of LiveKit React components not rendering in iOS simulators?
  • l

    lively-keyboard-94143

    07/10/2025, 6:56 AM
    Is there any way to handle timeouts from React Native on LiveKit?
  • l

    lemon-city-87922

    07/10/2025, 12:12 PM
    It seems like this Swift code is not in fact a delegate function like it should be… a) are these docs out of date? b) what is the new version of this function?
  • l

    lemon-city-87922

    07/10/2025, 12:14 PM
    this function also no longer exists in the newest Swift SDK… wow, the iOS docs are quite out of date
  • l

    lemon-city-87922

    07/10/2025, 12:18 PM
    bro, another Swift doc that’s outdated 😭 is all LiveKit Swift documentation outdated?
  • r

    rhythmic-plumber-379

    07/11/2025, 4:13 AM
    Out of curiosity, has anyone tried building or done research on custom avatars using Unreal Engine or Unity and tried to integrate them with LiveKit? The avatar can look animated and doesn’t need to look like a real person. The commercial options like Tavus and bithuman seem pricey and don’t seem to scale cost-wise. I see there is also a LiveKit Unity SDK: https://github.com/livekit/client-sdk-unity?tab=readme-ov-file https://github.com/livekit/client-sdk-unity-web https://github.com/livekit-examples/unity-webgl-demo But I'm not sure how that would work in the big picture. Any thoughts about this?
  • p

    prehistoric-hospital-42098

    07/11/2025, 6:06 AM
    Hi folks, I’m building a voice assistant and want to display the live transcription stream to users in real time. However, I’m currently only receiving the final sentence after the user stops speaking, likely due to how LiveKit handles audio streams. Has anyone figured out how to get intermediate (streaming) transcription results while the user is still speaking? Any guidance or workarounds would be appreciated!
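    For what it's worth, interim results are delivered client-side via the `RoomEvent.TranscriptionReceived` event in livekit-client (assuming a recent SDK version); each `TranscriptionSegment` carries an `id` and a `final` flag, so interim text can be replaced in place as updates arrive. A minimal sketch of the client-side aggregation, with the LiveKit wiring reduced to a comment (`Segment` and `TranscriptView` are illustrative names, not LiveKit APIs):

```typescript
// Illustrative shape matching livekit-client's TranscriptionSegment fields.
interface Segment {
  id: string;
  text: string;
  final: boolean;
}

// Accumulates transcription segments for display. Interim segments share
// an id with their final version, so keying by id and overwriting gives
// live word-by-word updates that settle once `final` arrives.
class TranscriptView {
  private segments = new Map<string, string>();

  // Call this from: room.on(RoomEvent.TranscriptionReceived, (segs) => ...)
  update(received: Segment[]): void {
    for (const s of received) {
      this.segments.set(s.id, s.text);
    }
  }

  // Join segments in arrival order for rendering.
  render(): string {
    return [...this.segments.values()].join(' ');
  }
}
```

    If only final sentences ever arrive, the STT provider or agent pipeline may not be configured to emit interim results at all, which is worth checking first.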
  • m

    melodic-zoo-29246

    07/11/2025, 6:41 AM
    Hi, I am using the native iOS SDK and I have disabled VAD on the server. Where and how would I need to make changes in the iOS app to manually send events?
  • l

    little-honey-89160

    07/11/2025, 11:50 AM
    I copied and pasted example code
    import * as React from 'react';
    import {
      StyleSheet,
      View,
      Text,
      FlatList,
      ListRenderItem,
    } from 'react-native';
    import { useEffect } from 'react';
    import {
      AudioSession,
      LiveKitRoom,
      useTracks,
      TrackReferenceOrPlaceholder,
      VideoTrack,
      isTrackReference,
      registerGlobals,
    } from '@livekit/react-native';
    import { Track } from 'livekit-client';
    
    
    // NOTE: wsURL and token are not defined in the pasted example;
    // supply your own LiveKit server URL and participant access token.
    const wsURL = 'wss://<your-livekit-server>';
    const token = '<participant-token>';
    
    registerGlobals();
    
    const LiveViewScreen = () => {
      useEffect(() => {
        let start = async () => {
          await AudioSession.startAudioSession();
        };
    
        start();
        return () => {
          AudioSession.stopAudioSession();
        };
      }, []);
    
      return (
        <LiveKitRoom
          serverUrl={wsURL}
          token={token}
          connect={true}
          options={{
            adaptiveStream: { pixelDensity: 'screen' },
          }}
          audio={true}
          video={true}
        >
          <RoomView />
        </LiveKitRoom>
      );
    }
    
    const RoomView = () => {
      // Get all camera tracks.
      // The useTracks hook grabs the tracks from LiveKitRoom component
      // providing the context for the Room object.
      const tracks = useTracks([Track.Source.Camera]);
    
      const renderTrack: ListRenderItem<TrackReferenceOrPlaceholder> = ({ item }) => {
        // Render using the VideoTrack component.
        if (isTrackReference(item)) {
          return (<VideoTrack trackRef={item} style={styles.participantView} />)
        } else {
          return (<View style={styles.participantView} />)
        }
      };
    
      return (
        <View style={styles.container}>
          <FlatList
            data={tracks}
            renderItem={renderTrack}
          />
        </View>
      );
    };
    
    const styles = StyleSheet.create({
      container: {
        flex: 1,
        alignItems: 'stretch',
        justifyContent: 'center',
      },
      participantView: {
        height: 300,
      },
    });
    
    export default LiveViewScreen
    but I get an error
    TypeError: r.addEventListener is not a function (it is undefined)
    and then some warnings
    WARN  Sending `onAnimatedValueUpdate` with no listeners registered.
    I am using React Native with Expo
    "@livekit/react-native": "^2.7.6",
        "@livekit/react-native-expo-plugin": "^1.0.1",
        "@livekit/react-native-webrtc": "^125.0.11",
        "expo": "~52.0.31",
    I installed it through the Expo plugin https://github.com/livekit/client-sdk-react-native-expo-plugin Any tips on what might be wrong here? The plugin doesn't seem to have been updated in a while; should it still be working?
  • w

    wonderful-nightfall-7721

    07/12/2025, 7:19 AM
    Hi, I am trying to set up
    react-native-meet
    on my macOS. For some reason, I am not able to run the app; I am facing a build error. Can someone help here? PFA the error screenshot. I am stuck, and some help/direction would help me make progress.
  • b

    brash-judge-83910

    07/13/2025, 12:12 PM
    Hi guys, we’re facing an issue with LiveKit Cloud for our voice-based chatbot built in React Native. The transcription for the user as well as the agent appears all at once. We need a real-time word-by-word (or at least sentence-by-sentence) transcription for both. Are we missing some config? #transcription #realtime
  • s

    strong-dentist-37201

    07/15/2025, 10:50 AM
    Hi, is there a roadmap for the Unity SDK? We are also interested in Unity WebGL support. We need control over the input and output audio streams; in the Unity SDK, the web version seems to offer only an enableMicrophoneAndCamera(), which is very limiting
  • s

    stocky-portugal-86826

    07/15/2025, 12:53 PM
    Hi there, if we are missing a config option in one of the plugins, in this case Azure STT (https://github.com/livekit/agents/blob/main/livekit-plugins/livekit-plugins-azure/livekit/plugins/azure/stt.py), what's the correct process for contributing this config addition? We are missing:
    SpeechServiceConnection_EndSilenceTimeoutMs
  • i

    important-psychiatrist-73895

    07/15/2025, 7:29 PM
    Hi LiveKit team, is there a way that I'm missing to be able to set / select a specific audio device on iOS? Here's what I see in AudioManager.swift in the client-sdk-swift :
    public var inputDevice: AudioDevice {
        get {
            #if os(macOS)
            AudioDevice(ioDevice: RTC.audioDeviceModule.inputDevice)
            #else
            AudioDevice(ioDevice: LKRTCIODevice.defaultDevice(with: .input))
            #endif
        }
        set {
            #if os(macOS)
            RTC.audioDeviceModule.inputDevice = newValue._ioDevice
            #endif
        }
    }
    And then RTCIODevice.h doesn't seem to have a way to create one from a specified input. Is this intentional because there's some underlying issue with supporting a specific device on iOS (e.g. a plugged-in microphone)? Or just not implemented because it wasn't a priority yet? From what I can tell so far, I'd need to make a branch with some edits to enable this in LiveKit's branch of WebRTC, build the ltwebrtc framework for iOS, and then implement this in client-sdk-swift, BUT if there's an easier way I'd love to know it!
  • m

    many-helmet-9770

    07/15/2025, 10:37 PM
    Hi everyone! We're doing some integrations between telephony systems and LiveKit. We receive audio in Opus format from the telephony systems, but when passing it to LiveKit we have to decode it to raw PCM; and when transferring audio back, we again receive raw PCM from LiveKit and encode it to Opus to send to the telephony systems. The problem is that these encode/decode operations take a lot of resources. Is there any way to pass the audio to LiveKit in Opus format? I see LiveKit uses WebRTC to transfer audio from client to server, and in WebRTC the codec is negotiated via the SDP. So far I believe LiveKit specifies raw PCM in the negotiation SDP; could Opus be used in that negotiation instead, so audio is transferred as Opus over WebRTC? If this is not available yet, we're ready to dedicate time to contributing if any guidance is provided, since this is crucial for us. Also, out of curiosity: why is audio transferred as raw PCM over WebRTC? Wouldn't it be more efficient to transfer it compressed, e.g. as Opus? Thank you!
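    For context on the question above: WebRTC audio tracks in LiveKit are typically already Opus-encoded on the wire; the raw-PCM boundary is the SDK's local capture/playback API, which expects uncompressed frames. A telephony bridge therefore usually decodes Opus once and hands the SDK fixed-duration PCM frames. A sketch of that framing arithmetic (all names hypothetical, not a LiveKit API):

```typescript
// Assumed narrowband telephony parameters (illustrative, not LiveKit values).
const SAMPLE_RATE = 8000; // Hz
const CHANNELS = 1;
const FRAME_MS = 10; // typical audio frame duration

// Samples in one 10 ms frame: 8000 / 1000 * 10 * 1 = 80.
const samplesPerFrame = (SAMPLE_RATE / 1000) * FRAME_MS * CHANNELS;

// Split a decoded PCM buffer into complete 10 ms frames, returning any
// leftover samples so they can be prepended to the next incoming buffer.
function frames(pcm: Int16Array): { frames: Int16Array[]; rest: Int16Array } {
  const out: Int16Array[] = [];
  let i = 0;
  for (; i + samplesPerFrame <= pcm.length; i += samplesPerFrame) {
    out.push(pcm.subarray(i, i + samplesPerFrame));
  }
  return { frames: out, rest: pcm.subarray(i) };
}
```

    Each resulting frame would then be passed to the SDK's audio-source capture call; the SDK re-encodes to Opus internally for transport.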
  • a

    alert-chef-13304

    07/16/2025, 5:14 AM
    I’m building a voice agent with LiveKit and have run into a problem. I see you are a dev at LiveKit and am just wondering if you could help out. I'm getting a Metro bundler error when trying to use @livekit/react-native with the latest stack:
    Current setup:
    • Expo SDK 53
    • React Native 0.79
    • Metro 0.82
    • @livekit/react-native 2.7.6
    Error: Unable to resolve "../../../../../.." from "node_modules/@livekit/react-native/src/audio/AudioManager.ts"
    Question: What's the recommended version combination for Expo + React Native + LiveKit in 2025? We also need compatibility with expo-superwall. Looking to implement real-time voice chat and would prefer LiveKit over WebSocket fallbacks. Thanks for your help!
  • w

    wooden-scientist-55429

    07/16/2025, 12:10 PM
    Hey everyone, I am using the Unity SDK to implement voice chat functionality in my game https://github.com/livekit/client-sdk-unity
  • w

    wooden-scientist-55429

    07/16/2025, 12:11 PM
    but while setting it up in my Unity project, there are lots of errors coming up
  • w

    wooden-scientist-55429

    07/16/2025, 12:11 PM
    can you share a stable version of the Unity SDK for mobile?
  • f

    fierce-knife-30651

    07/16/2025, 2:46 PM
    Hello, I can no longer build for iOS (Android works fine); I get the following error:
    The following build commands failed:
    	CompileC /Users/expo/Library/Developer/Xcode/DerivedData/ShinyLive-hhdpofmyzlqvkjbxuvvmqpvjyccg/Build/Intermediates.noindex/ArchiveIntermediates/ShinyLive/IntermediateBuildFilesPath/Pods.build/Release-iphoneos/livekit-react-native-webrtc.build/Objects-normal/arm64/WebRTCModule.o /Users/expo/workingdir/build/node_modules/@livekit/react-native-webrtc/ios/RCTWebRTC/WebRTCModule.m normal arm64 objective-c com.apple.compilers.llvm.clang.1_0.compiler (in target 'livekit-react-native-webrtc' from project 'Pods')
    • Expo SDK 51.0.38
    • @livekit/react-native-webrtc: 125.0.9
  • w

    worried-oyster-72658

    07/17/2025, 4:50 AM
    Guys, using the React Native template of LiveKit I am facing this error... The voice agent works perfectly well in the console and in the playground, but once I run the voice agent built for Android on Expo, it just speaks the one greeting sentence and then stops. It is not replying to me or conversing with me. Does anyone have any idea how I can fix this?
  • d

    delightful-telephone-54565

    07/17/2025, 6:34 AM
    Hello guys, I have configured the video resolution to 2K, and it’s working as expected. However, the video playback is not smooth: it stutters and gets stuck, especially on devices running Android 14 and 15. Interestingly, it works smoothly on Android 11. How can I resolve this latency issue on newer Android versions? Below is the code I’m currently using.
    defaultCameraCaptureOptions: CameraCaptureOptions(
      cameraPosition: CameraPosition.back,
      params: VideoParameters(
        dimensions: VideoDimensionsPresets.h1440_169,
        encoding: VideoEncoding(
          maxBitrate: 25000000, // 25 Mbps
          maxFramerate: 60,
        ),
      ),
    ),
    defaultVideoPublishOptions: VideoPublishOptions(
      videoCodec: "H.264",
      videoEncoding: VideoEncoding(
        maxBitrate: 25000000, // 25 Mbps
        maxFramerate: 60,
      ),
      simulcast: false,
    )
    Thanks in advance.
  • f

    fast-thailand-41521

    07/17/2025, 11:14 AM
    Hi, I imported the LiveKit SDK Native package into Unity 2021.3.45 on macOS (also tried Unity 2022). The package generates compilation errors related to a library that cannot be found during the build, located at
    Packages/io.livekit.livekit-sdk/Runtime/Plugins/Google.Protobuf.dll
    For example: "the namespace Google could not be found". The same error does not occur on Windows. How can I fix this? Thanks.