Which chat messages does prompt_tokens count?
# ask-ai
Which chat messages does prompt_tokens count?
Thanks for reaching out, Harshita. I'm an AI assistant and I'm here to help. Give me a moment while I research this topic.
It seems that the available documentation does not explicitly detail which chat messages are included in the `prompt_tokens` count. The term `prompt_tokens` generally refers to the number of tokens in the input prompt sent to a language model, but the exact implementation may vary depending on the system or context. If you have a specific scenario or system in mind, could you provide more details? That will help narrow down the information or escalate the query if needed.
Sources: livekit.agents.metrics API documentation
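As an illustration of the general behavior described above: in most chat completion APIs, `prompt_tokens` counts the tokens of every message sent in the request (system prompt, prior history, and the latest user message), not just the newest one. The sketch below is hypothetical and uses a naive whitespace tokenizer purely for illustration; real services use model-specific tokenizers, so actual counts will differ.

```python
# Illustrative sketch only: shows that every message in the request
# contributes to the prompt_tokens total. The whitespace tokenizer is a
# stand-in assumption, not any provider's real tokenizer.

def count_tokens(text: str) -> int:
    # Hypothetical tokenizer: splits on whitespace.
    return len(text.split())

def prompt_tokens(messages: list[dict]) -> int:
    # Sum the token counts of ALL messages in the request payload.
    return sum(count_tokens(m["content"]) for m in messages)

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is prompt_tokens?"},
    {"role": "assistant", "content": "It counts input tokens."},
    {"role": "user", "content": "Which messages are counted?"},
]
print(prompt_tokens(messages))
```

Under this assumption, removing older messages from the history before the next request would directly reduce the reported `prompt_tokens` on that request.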