proud-spring-42526
07/14/2025, 5:16 PM
tall-airline-16756
07/14/2025, 5:17 PM
Start the audio session before connecting to the room, and stop it when the component unmounts:
import { useEffect } from 'react';
// AudioSession comes from the LiveKit React Native SDK (@livekit/react-native)
import { AudioSession } from '@livekit/react-native';

useEffect(() => {
  const start = async () => {
    await AudioSession.startAudioSession();
  };
  start();
  return () => {
    AudioSession.stopAudioSession();
  };
}, []);
• Audio Session Configuration for Viewers: If your users are primarily consuming audio (not publishing), configure the audio session for media playback before starting it. For Android, use:
import { useEffect } from 'react';
import { AudioSession, AndroidAudioTypePresets } from '@livekit/react-native';

useEffect(() => {
  const connect = async () => {
    // Configure for media playback before starting the session
    await AudioSession.configureAudio({
      android: {
        audioTypeOptions: AndroidAudioTypePresets.media,
      },
    });
    await AudioSession.startAudioSession();
  };
  connect();
}, []);
• iOS-Specific Handling: On iOS, use the useIOSAudioManagement hook to automatically manage AVAudioSession options as audio tracks change in the room; this helps handle cases where the publisher joins after the viewer. For example:
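A minimal sketch, assuming useIOSAudioManagement is exported from @livekit/react-native and takes the Room instance whose tracks it should follow (the component name is illustrative; check your SDK version for the exact signature):
import { Room } from 'livekit-client';
import { useIOSAudioManagement } from '@livekit/react-native';

// Renders nothing; just keeps AVAudioSession options in sync with the
// room's audio tracks while mounted.
function RoomAudioManager({ room }: { room: Room }) {
  useIOSAudioManagement(room);
  return null;
}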
• AudioContext State: On iOS, a newly created AudioContext may be in a 'suspended' state, which can block audio playback. The SDK tries to resume it, but timing issues can occur. Monitoring and ensuring the context is resumed can help.
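If your viewer runs in a browser context, a small sketch of that monitoring using the standard Web Audio API (how you obtain the AudioContext depends on your setup; the helper name is illustrative):
// Resume a suspended AudioContext on the next user gesture, since iOS
// typically only allows resuming audio from within a user interaction.
function ensureAudioContextResumed(ctx: AudioContext) {
  if (ctx.state !== 'suspended') return;
  const resume = () => {
    ctx.resume()
      .then(() => {
        document.removeEventListener('touchend', resume);
        document.removeEventListener('click', resume);
      })
      .catch(() => {
        // Still blocked; the listeners stay attached for the next gesture
      });
  };
  document.addEventListener('touchend', resume);
  document.addEventListener('click', resume);
}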
• Track Reception: Confirm that the audio track is being received from the server by checking the track state when the publisher joins.
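A sketch of that check using livekit-client room events (the logging is illustrative):
import { Room, RoomEvent, Track } from 'livekit-client';

function logAudioTrackState(room: Room) {
  // Fires when a remote participant announces a track
  room.on(RoomEvent.TrackPublished, (publication, participant) => {
    console.log(
      `track published by ${participant.identity}: subscribed=${publication.isSubscribed}`
    );
  });
  // Fires once the track is actually received and can be played
  room.on(RoomEvent.TrackSubscribed, (track, publication, participant) => {
    if (track.kind === Track.Kind.Audio) {
      console.log(`audio track received from ${participant.identity}`);
    }
  });
}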
Most often, this issue is resolved by ensuring the audio session is configured and started before connecting to the room, and by using the appropriate audio session management for your platform and use case.
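Putting the pieces together, a minimal sketch of that ordering (joinRoom/leaveRoom and the url/token parameters are placeholders for your own connection logic):
import { Room } from 'livekit-client';
import { AudioSession, AndroidAudioTypePresets } from '@livekit/react-native';

async function joinRoom(url: string, token: string): Promise<Room> {
  // Configure and start the audio session before connecting
  await AudioSession.configureAudio({
    android: { audioTypeOptions: AndroidAudioTypePresets.media },
  });
  await AudioSession.startAudioSession();

  const room = new Room();
  await room.connect(url, token);
  return room;
}

async function leaveRoom(room: Room) {
  await room.disconnect();
  await AudioSession.stopAudioSession();
}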
If you need more specific guidance, please share which platforms (iOS/Android) are affected and any relevant code snippets for your audio session setup.
Sources: src/room/Room.ts | Screen sharing | LiveKit Docs | docs/index.html