# helpdesk
i
Also, in the LiveKit Unity SDK example project on GitHub, in ExampleRoom.cs I see this comment:
```csharp
IEnumerator Start()
{
    // New Room must be called when WebGL assembly is loaded
    m_Room = new Room();
    // ...
}
```
What does this comment actually mean? Does this have something to do with why it isn't working in my project?
d
Hey, yeah, it definitely sounds related to the issue you're seeing. Do you have your room creation in an `IEnumerator Start()` block as well?
i
No, I don't have it in an `IEnumerator Start()` block, but rather in a different coroutine that is executed later on. Is that a problem?
d
Hah, I don't know Unity, but I was just reading the comments and thought it was relevant to your error. I'll defer to @boundless-energy-78552 on this one.
b
Hey @important-megabyte-62876, are you trying to run your scene inside the editor?
i
@boundless-energy-78552 Yes, I am.
b
Yes. This is currently not supported; at the moment, the WebGL SDK needs a browser context to work. We're currently working on a Unity Native SDK, which will allow LiveKit to run in the Editor and on many other platforms.
i
Oh, dang. 😕 That makes implementation difficult. Will the native SDK you mentioned be released soon? In the meantime, what's the recommended way to implement/test/iterate and have things not throw errors when using the app in the Unity Editor?
b
You can use a compiler directive (`UNITY_WEBGL`). I agree that it slows down development; my priority is to release the Unity Native SDK. Some blockers must be fixed first, but it should happen soon.
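For example, you can guard the LiveKit code so it only compiles into WebGL player builds. A minimal sketch (the class name, URL, and token are placeholders; the `Connect` call follows the example project and may differ in your SDK version):
```csharp
using System.Collections;
using UnityEngine;
#if UNITY_WEBGL && !UNITY_EDITOR
using LiveKit;
#endif

public class RoomBootstrap : MonoBehaviour
{
#if UNITY_WEBGL && !UNITY_EDITOR
    private Room m_Room;
#endif

    IEnumerator Start()
    {
#if UNITY_WEBGL && !UNITY_EDITOR
        // As in ExampleRoom.cs: create the Room once the WebGL assembly is loaded.
        m_Room = new Room();
        // Connect shape assumed from the example project; use your own URL/token.
        yield return m_Room.Connect("ws://localhost:7880", "<your-token>");
#else
        // In the Editor (or any non-WebGL target) the WebGL SDK has no browser
        // context, so skip LiveKit entirely instead of throwing.
        Debug.Log("LiveKit WebGL SDK disabled outside WebGL player builds.");
        yield break;
#endif
    }
}
```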
i
When the Native Unity SDK is released, will there be only a single Unity package that includes both the Unity native and Unity web SDKs and uses compiler directives to execute the correct SDK? (Native SDK in the Unity Editor & web SDK in a WebGL build?)
As my eventual goal would be to use LiveKit as a cross-platform voice/video solution for any apps I create with Unity, it would be great to use a single package, write the C# code for LiveKit once, and have it work cross-platform (for example, a Unity WebGL user, a Unity Windows user, and a native Unity Quest VR (Android) user could all communicate with each other in a cross-platform app).
So, just curious about the roadmap there.
d
Correct, the goal is to have a single Unity package with the same APIs that works across native and web. The plan is roughly the following:
1. Release the native Unity SDK (there may be some minor API differences from the current WebGL SDK).
2. Merge the WebGL SDK into it (all with the same API).
i
That sounds great! Could you give a rough estimate for when this will be released?
d
Phase 1 will be done by the end of this month.
Not sure about Phase 2 at the moment.
i
So when Phase 1 is complete, would it be possible to have both packages in the same project and rig up some compiler definitions to allow LiveKit to function both in the Editor and in a WebGL build? (And would that allow WebGL builds to connect with the Unity Editor for voice/video?)
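Roughly, I'm imagining something like this sketch (the native side is a pure placeholder, since that SDK isn't out yet):
```csharp
using System.Collections;
using UnityEngine;

// Speculative sketch only: the native SDK is unreleased, so the native-side
// comments below are placeholders, not real LiveKit APIs.
public class CrossPlatformRoom : MonoBehaviour
{
#if UNITY_WEBGL && !UNITY_EDITOR
    private LiveKit.Room m_Room; // current WebGL SDK
#endif

    IEnumerator Start()
    {
#if UNITY_WEBGL && !UNITY_EDITOR
        // WebGL build: use the existing WebGL SDK.
        m_Room = new LiveKit.Room();
#else
        // Editor / native build: would create the native SDK's room here
        // once that package is released.
#endif
        yield break;
    }
}
```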
d
I believe so!
the two packages would not conflict
i
Ok, thanks @dry-elephant-14928 and @boundless-energy-78552! By the way, I really appreciate that you and your team are working towards a full Unity integration. I expressed some disappointment earlier about LiveKit not working in the Unity Editor, but I understand where you are at and what your goals are, and I am very eager and excited for when the native Unity SDK is released.

This is super important for our team, since we do a lot of multiplayer testing and iteration inside the Unity Editor, and when things don't function the same way they would in a build, it makes things pretty difficult to work with. We're currently using Agora's Unity WebGL plugin for voice chat, which does work both in the Editor and in a WebGL build. However, their plugin is in beta, I've found it to be rather buggy, and updates to it are fairly infrequent. I would really like to move everything over to LiveKit, but I can't do that until LiveKit has a native Unity integration released.

That being said, as long as the native Unity SDK is in progress and releasing fairly soon, I think I am still going to attempt to implement only screen share with LiveKit and see how that goes. In the short term I can try to write some code to handle things "gracefully" when running in the Unity Editor: disabling incoming and outgoing screen-share functionality when using the Editor. It's a bit of a challenge, since I'm not just running LiveKit; it'll be tied into Photon PUN2, which is coordinating most of our multiplayer networking. But once I wrap my head around how LiveKit does things, I'm hopeful I can come up with a good solution to handle things differently inside the Unity Editor than in a build.

Once the native Unity integration is released, I could then update my code to have screen share also work inside the Unity Editor, and then transition voice chat and any other services over from Agora to LiveKit.

Anyways, I just wanted to type this out because it gives you guys a picture of an actual use case! I've enjoyed reading your blog posts and I admire LiveKit's goals and mission. I hope to use LiveKit in all of my Unity projects once the native Unity SDK is released!
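Concretely, the "graceful" fallback I'm picturing looks something like this (every name below is mine, just to show the shape; none of it is a LiveKit API):
```csharp
// Hide LiveKit behind a small interface and swap in a no-op in the Editor.
public interface IScreenShareService
{
    void StartShare();
    void StopShare();
}

#if UNITY_WEBGL && !UNITY_EDITOR
// Real implementation: would publish/unpublish a screen-share track via LiveKit.
public class LiveKitScreenShareService : IScreenShareService
{
    public void StartShare() { /* publish screen track */ }
    public void StopShare()  { /* unpublish screen track */ }
}
#endif

// No-op used in the Editor so Photon PUN2 and the rest of the app run unchanged.
public class NoOpScreenShareService : IScreenShareService
{
    public void StartShare() { }
    public void StopShare()  { }
}

public static class ScreenShareServiceFactory
{
    public static IScreenShareService Create()
    {
#if UNITY_WEBGL && !UNITY_EDITOR
        return new LiveKitScreenShareService();
#else
        return new NoOpScreenShareService();
#endif
    }
}
```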
❤️ 2
d
Thank you, Matt. Your support means a lot! We are eager to release full Unity support and hope we won't disappoint!
❤️ 2