tall-airline-16756
09/11/2025, 10:01 AM
1. Add Few-Shot Examples to the Instructions: Include sample exchanges directly in the agent's instructions so the LLM can mirror their tone and structure, for example (see the first code sketch after this list):
Role: You are a helpful customer support agent.
Example 1:
User: I want to book an appointment.
Assistant: Sure! When would you like to come in? We have openings tomorrow morning or Thursday evening.
Example 2:
User: Can I get a callback later?
Assistant: Of course. What time works best for you?
2. Inject Context Dynamically: You can use LiveKit’s dynamic context injection features to add these examples to the LLM’s context during a session. This typically means overriding the agent’s llm_node (or a similar hook) to update the instructions on the fly based on what the user says (see the second sketch after this list).
3. Test and Iterate: It’s always a good idea to run some tests to see how the LLM responds. Feel free to tweak the examples or instructions to make the agent’s behavior just right.
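For step 1, here’s a rough sketch of what that could look like with the livekit-agents Python SDK. This assumes the 1.x API where an Agent accepts an instructions string; the SupportAgent class name is just for illustration:

```python
from livekit.agents import Agent

# Few-shot examples embedded directly in the system instructions so the
# LLM can imitate the tone and structure of the sample replies.
FEW_SHOT_INSTRUCTIONS = """\
Role: You are a helpful customer support agent.

Example 1:
User: I want to book an appointment.
Assistant: Sure! When would you like to come in? We have openings tomorrow morning or Thursday evening.

Example 2:
User: Can I get a callback later?
Assistant: Of course. What time works best for you?
"""


class SupportAgent(Agent):
    def __init__(self) -> None:
        # The few-shot examples ride along with every LLM call made for this agent.
        super().__init__(instructions=FEW_SHOT_INSTRUCTIONS)
```

You’d then start this agent with your AgentSession the same way you start any other agent.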
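And for step 2, a rough sketch of dynamic injection by overriding llm_node. The exact signature, the chat_ctx.add_message call, and the Agent.default.llm_node delegation are assumptions based on the 1.x pipeline-node API, so please double-check them against your installed version:

```python
from typing import AsyncIterable

from livekit.agents import Agent, ModelSettings, llm


class SupportAgent(Agent):
    def __init__(self) -> None:
        super().__init__(instructions="You are a helpful customer support agent.")

    async def llm_node(
        self,
        chat_ctx: llm.ChatContext,
        tools: list[llm.FunctionTool],
        model_settings: ModelSettings,
    ) -> AsyncIterable[llm.ChatChunk]:
        # Add extra guidance to the chat context right before the LLM runs,
        # e.g. an example reply relevant to what the user just asked about.
        chat_ctx.add_message(
            role="system",
            content=(
                "When the user asks about scheduling, offer concrete options, "
                "e.g. 'We have openings tomorrow morning or Thursday evening.'"
            ),
        )
        # Hand the updated context to the default LLM step and stream its output.
        async for chunk in Agent.default.llm_node(self, chat_ctx, tools, model_settings):
            yield chunk
```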
If you’d like, I’m here to help with any specific steps, like using llm_node for context injection or crafting prompts tailored to your needs. Just let me know, and I’ll gladly guide you through it!