# ask-ai
s
What is the highest resolution offered by the self-hosted server deployed using Helm on a k8s cluster?
t
Thanks for reaching out, Nouman. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
Just so you know, the highest resolution you can get with a self-hosted LiveKit server deployed using Helm on a Kubernetes cluster is 3840x2160 (2160p). Both the server setup and the client SDKs support this resolution, so you’re all set to enjoy that crisp quality! Sources: src/room/track/options.ts | livekit/src/room/options.rs | Runtime/Scripts/Room/Track/Options.cs
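If it helps, here's a rough sketch of how a client could ask for that resolution with the livekit-client JS SDK. The server URL and token below are placeholders, and the camera has to be able to deliver 4K for the capture to actually reach 3840x2160:

```typescript
import { createLocalVideoTrack, Room, VideoPresets } from 'livekit-client';

// Placeholder values: use your own server URL and access token here.
async function publishAt2160p(wsUrl: string, token: string): Promise<void> {
  const room = new Room();
  await room.connect(wsUrl, token);

  // Ask the camera for the 3840x2160 preset; the browser will fall back to a
  // lower capture resolution if the device cannot deliver 4K.
  const track = await createLocalVideoTrack({
    resolution: VideoPresets.h2160.resolution,
  });

  await room.localParticipant.publishTrack(track);
}
```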
s
So how do we add something so that we are using the highest quality resolution when testing with load-test?
In the Helm, for the resolution?
I need an answer.
t
Hey there! Just checking in—are you trying to set the highest resolution (3840x2160) specifically for the load-test tool, or do you want to tweak the Helm chart settings for your LiveKit deployment to handle that resolution? Also, are you focusing on testing the server performance or the client-side video quality? Let me know so I can help you better! Sources: livekit/src/room/options.rs | livekit-android-sdk/src/main/java/io/livekit/android/room/track/LocalVideoT... | Runtime/Scripts/Room/Track/Options.cs | pkg/loadtester/loadtest.go
s
First, the server.
t
Hey there! Just a heads-up, setting the highest resolution (3840x2160) for the LiveKit server isn’t something you can tweak directly through the Helm chart. Usually, video resolution is handled on the client SDK side. If you’d like, I’m here to help with any server-side setup or performance tuning questions you might have—just let me know! Sources: Kubernetes & Helm Charts | livekit/livekit_composite | DeepWiki
s
So when I test, the resolution I select is very high, but the load test shows only 720 resolution, not more than that. Why?
t
Hey there! Just a heads-up—the LiveKit load-test tool is set up to cap video resolution at 720p, even if you pick "very high." This is on purpose to keep things consistent when running different tests. The resolution options you see are fixed, with "high" meaning 1280x720. If you're looking to test out higher resolutions like 4K, no worries! You can use the LiveKit SDKs or CLI to publish custom high-res video streams straight to a room. Just let me know if you'd like a hand getting that set up—I’m happy to help! Sources: README.md | src/room/track/options.ts | Benchmarking | LiveKit Docs
s
So it is not a server thing, like some codecs? I think the server did get the 1080p video, so why does it only test 720?
t
Hey there! Just to clarify, the resolution limit you’re seeing in the load-test tool isn’t because of the server or codec settings. The LiveKit server can actually handle resolutions all the way up to 4K (3840x2160). Plus, features like adaptive streaming and simulcast help adjust the resolution based on network conditions and who’s subscribed. The load-test tool, though, is purposely set to cap at 720p to keep benchmarking consistent, no matter what the server can do. If you’re noticing that the server is getting 1080p video but the load-test tool only shows 720p, that’s just how the tool is designed—it’s not a limitation on the server side. If you want, I’d be happy to help you explore other ways to test higher resolutions. Just let me know! Sources: README.md | Codecs and more | LiveKit Docs | Benchmarking | LiveKit Docs
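To give a rough picture of the simulcast side, here's a sketch using the livekit-client JS SDK; the layer choices below are just an example on my end, not a recommendation:

```typescript
import { createLocalVideoTrack, Room, VideoPresets } from 'livekit-client';

async function publishWithSimulcast(room: Room): Promise<void> {
  const track = await createLocalVideoTrack({
    resolution: VideoPresets.h2160.resolution,
  });

  // Publish the full-resolution track plus two lower simulcast layers; the
  // server then forwards whichever layer each subscriber can handle.
  await room.localParticipant.publishTrack(track, {
    simulcast: true,
    videoSimulcastLayers: [VideoPresets.h360, VideoPresets.h1080],
  });
}
```

With adaptive streaming turned on, the server picks between those layers per subscriber based on bandwidth and how large the video is rendered, which is why the resolution you observe on a subscriber can differ from what the publisher sent.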
s
I need to test 1080p.
t
If you want to test 1080p resolution, you can easily use the LiveKit CLI or SDKs to publish high-res video streams straight into a room. Here's a simple way to do it with the CLI:

1. First, get a 1080p video file ready for testing.
2. Then, run this command to publish the video to your room:

```
lk room join --identity publisher --publish <path_to_1080p_video> <room_name>
```

This will help you see how the server handles 1080p streams. If you want, I'm here to walk you through the setup or answer any questions!
s
I do not know whether the server is getting the 1080p or not.
Is there any codec to add?
t
LiveKit supports 1080p video streams using the codecs it already offers, like H.264, VP8, VP9, and AV1—so no extra codecs are needed to get that 1080p resolution. If you’re running into any issues, it’s a good idea to double-check that your bitrate and frame rate settings are set up right for 1080p. Feel free to reach out if you want a hand with specific settings or want to try some testing methods—I’m here to help! Sources: Video codecs | LiveKit Docs | examples/demo/demo.ts | protobufs/livekit_ingress.proto | Codecs and more | LiveKit Docs
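As a concrete illustration, here's a sketch with the livekit-client JS SDK that captures at 1080p and sets the codec, bitrate, and frame rate explicitly; the numbers are placeholders to tune for your own content, not recommended values:

```typescript
import { createLocalVideoTrack, Room, VideoPresets } from 'livekit-client';

async function publish1080p(room: Room): Promise<void> {
  // Capture at the 1920x1080 preset.
  const track = await createLocalVideoTrack({
    resolution: VideoPresets.h1080.resolution,
  });

  // Pick a codec the server has enabled and give the encoder enough
  // bitrate/framerate headroom for 1080p (these numbers are only examples).
  await room.localParticipant.publishTrack(track, {
    videoCodec: 'h264',
    videoEncoding: {
      maxBitrate: 3_000_000,
      maxFramerate: 30,
    },
  });
}
```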
s
I did not specify the codecs in the Helm values file.
t
If you want to specify codecs in the Helm values file for your LiveKit deployment, you can set them in the `enabled_codecs` section under the `room` configuration. Here's a quick example:

```yaml
room:
  enabled_codecs:
    - mime: audio/opus
    - mime: video/vp8
    - mime: video/h264
```

This setup lets you enable the codecs you want for publishing tracks in the room. Feel free to customize this list with codecs like H.264, VP8, VP9, or AV1 depending on what you need. If you'd like a hand with this or have any questions, just let me know; I'm here to help!
s
If I add the codecs, will it also test 1080p or not?
t
Hey! Just a quick heads-up: adding codecs in the Helm values file makes sure those codecs are ready to go in your LiveKit setup, but it doesn’t automatically switch the load-test tool to 1080p. That tool actually limits the resolution to 720p by design, to keep things consistent for benchmarking. If you’re aiming to test 1080p, you’ll want to use the LiveKit CLI or SDKs to send 1080p video streams straight into a room. I’m happy to help you get that set up—just let me know!
s
yes sure
t
If you want to test 1080p video streams, you can easily use the LiveKit CLI to publish high-res video straight to a room. Here's a quick way to do it:

1. First, get a 1080p video file ready for testing.
2. Then, run this command to publish the video to your room:

```
lk room join --identity publisher --publish <path_to_1080p_video> <room_name>
```

This will help you check that the server is handling 1080p streams just right. If you run into any questions or want a hand with any step, just let me know; I'm here to help!
s
No, not that. Why can't the load-test command be used?
t
Just a heads-up: the load-test command in LiveKit is set up to cap video resolution at 720p, even if the server can handle higher resolutions. This is on purpose to keep benchmarking and performance testing consistent across different setups. The resolution options in the load-test tool are fixed, with "high" meaning 1280x720. If you're looking to test 1080p resolution, the load-test tool won’t do the trick. But no worries—you can use the LiveKit CLI or SDKs to publish 1080p video streams directly to a room for testing. Feel free to reach out if you want a hand with this!
s
Is there any way we can test 1080p using the load-test command?
t
Hey there! Just a quick heads-up—the load-test tool doesn’t support testing at 1080p resolution right now. It’s set up to cap video resolution at 720p to keep benchmarking consistent. Unfortunately, there isn’t a command or setting in the load-test tool that can change this. If you want to test 1080p, the best way is to use the LiveKit CLI or SDKs to publish 1080p video streams directly to a room. I’m happy to walk you through the setup or chat about other testing options whenever you’re ready—just let me know!
s
And how do we check that the server is accepting the 1080p resolution?
t
Hey there! To check if the server is accepting 1080p resolution, here are a few easy ways you can try:

1. Check video stream parameters: You can use a handy tool like `ffprobe` to take a look at the video stream being sent to the server. Just run `ffprobe -v quiet -print_format json -show_streams <input_stream>`, then peek at the `width` and `height` fields in the output to see if it's 1080p (1920x1080).
2. Monitor server logs: Another way is to check the LiveKit server logs. They usually show details about the incoming video tracks, including their resolution.
3. Use debugging tools: If you're using a LiveKit client, try turning on debugging. It'll show you the video track dimensions that are being sent to the server (there's a small sketch after this list).

If you want, I'm here to help you walk through any of these steps or tools. Just let me know!
s
For `ffprobe -v quiet -print_format json -show_streams <input_stream>`, where do we need to run this command, and what is the input stream?
t
Hey there! You can run the `ffprobe` command right on the machine where your video stream is being sent or received. Just use `<input_stream>` as the URL or file path of the video stream you want to check out. Here's how:

If you're working with a video file, just use the file path like this:

```
ffprobe -v quiet -print_format json -show_streams /path/to/video.mp4
```

If you're looking at a live stream, use the stream URL instead (like an RTMP or WebRTC stream):

```
ffprobe -v quiet -print_format json -show_streams <rtmp://your-stream-url>
```

Feel free to reach out if you want a hand figuring out the right input stream or getting `ffprobe` set up; I'm here to help!