Konstantin
06/27/2025, 11:53 AM
John O'Reilly
06/28/2025, 9:06 AM
John O'Reilly
06/28/2025, 1:20 PM
With MultiLLMPromptExecutor, what determines when each client gets used? Also, which models get used in this case?
val multiExecutor = MultiLLMPromptExecutor(
    LLMProvider.OpenAI to openAIClient,
    LLMProvider.Google to googleClient,
)
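For context, this is roughly how the two clients are built; I'm assuming the OpenAILLMClient and GoogleLLMClient constructors from the getting-started samples, so treat the exact names as a sketch rather than verified API:

// Sketch of the assumed client setup behind openAIClient / googleClient above
val openAIClient = OpenAILLMClient(System.getenv("OPENAI_API_KEY"))
val googleClient = GoogleLLMClient(System.getenv("GEMINI_API_KEY"))

// My working assumption is that the executor routes each call to the client whose
// LLMProvider matches the provider of the model passed at call time (e.g. a
// GoogleModels.* model would go to googleClient), but that is exactly what I'd like to confirm.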
Ofek Teken
06/28/2025, 10:28 PM
I'm wondering about the json_schema response format that OpenAI and other providers support. Libraries like LangChain4j let you pass the response_format to these providers directly. As far as I can see, that isn't the current approach in Koog, which seems more error-prone to me (hence the need for the fixingModel technique in PromptExecutor.executeStructured).
Is there any specific reason for not sending the response_format for models that support it, e.g. via Koog's OpenAIRequest (OpenAI response_schema reference)? I can only assume it would remove the need for fixingModel and similar techniques, allowing a reliably type-safe response for models that do support it.
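To make it concrete, this is the kind of request payload I mean; it's the raw OpenAI structured-output response_format built with kotlinx.serialization, nothing Koog-specific, and the schema itself is a made-up example:

import kotlinx.serialization.json.*

// Raw OpenAI "structured outputs" response_format block (illustration only, not Koog API).
// The schema below is a made-up example.
val responseFormat = buildJsonObject {
    put("type", "json_schema")
    putJsonObject("json_schema") {
        put("name", "weather_report")   // hypothetical schema name
        put("strict", true)
        putJsonObject("schema") {
            put("type", "object")
            putJsonObject("properties") {
                putJsonObject("city") { put("type", "string") }
                putJsonObject("temperature_celsius") { put("type", "number") }
            }
            putJsonArray("required") {
                add("city")
                add("temperature_celsius")
            }
            put("additionalProperties", false)
        }
    }
}

fun main() = println(responseFormat)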
One more question, out of pure interest and because I didn't catch it in the awesome KotlinConf talk: was this project open-sourced after being used internally at JetBrains? If not, are there any future plans to use Koog in JetBrains products?
Finn Jensen
06/29/2025, 10:45 PM
I'm not getting inputTokensCount and outputTokensCount when using the OpenAIModels.CostOptimized model family. Is there any flag I need to toggle to have the LLMClient return the tokens?
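For what it's worth, the numbers I'm after are the ones OpenAI reports in the usage block of every chat completion; sketching its shape here with plain kotlinx.serialization, nothing Koog-specific:

import kotlinx.serialization.SerialName
import kotlinx.serialization.Serializable

// Raw OpenAI chat-completion usage block (illustration only, not Koog's types).
@Serializable
data class OpenAIUsage(
    @SerialName("prompt_tokens") val promptTokens: Int,
    @SerialName("completion_tokens") val completionTokens: Int,
    @SerialName("total_tokens") val totalTokens: Int,
)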
Felix
07/02/2025, 11:59 AM
Dirk
07/03/2025, 11:44 AM
andrew
07/04/2025, 9:11 PM
Ricardo Belchior
07/06/2025, 1:52 PM
I'm having trouble using ToolRegistry { tool(SayToUser) } with the Gemini models. For example, Gemini Flash throws an exception saying tools are not supported for that model, and Gemini Pro does not have a free tier.
mdepies
07/07/2025, 1:28 PM
Sergio Casero
07/10/2025, 4:24 PM
Is it possible to expose an AIAgent through an API with Ktor? The idea is to provide an API to the devs so the mobile team can integrate it in the apps.
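Roughly what I had in mind, as a sketch; the route name is made up and respond just wraps whatever the agent call ends up looking like (e.g. { input -> agent.run(input) }):

import io.ktor.server.application.*
import io.ktor.server.engine.embeddedServer
import io.ktor.server.netty.Netty
import io.ktor.server.request.receiveText
import io.ktor.server.response.respondText
import io.ktor.server.routing.*

// Sketch: expose one POST endpoint that forwards the request body to the agent.
fun startAgentApi(respond: suspend (String) -> String) {
    embeddedServer(Netty, port = 8080) {
        routing {
            post("/agent/chat") {
                val userMessage = call.receiveText()
                call.respondText(respond(userMessage))
            }
        }
    }.start(wait = true)
}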
Filip Zalewski
07/11/2025, 1:00 PM
I'm currently attaching a document like this:
attachments {
    document(Path(documentPath).toString())
}
However, I'm wondering if it's possible to include a ByteArray? I am processing a file received via upload and would like to avoid writing it to disk if possible.
Follow-up: is there a way to include documents in subsequent prompts?
visakha
07/14/2025, 7:40 PM
val agentGemini = AIAgent(
    executor = simpleGoogleAIExecutor(System.getenv("GEMINI_API_KEY")),
    systemPrompt = "You are a helpful assistant. Answer user questions concisely.",
    llmModel = GoogleModels.Gemini2_0Flash,
    temperature = 0.7,
    toolRegistry = ToolRegistry {
        tool(SayToUser)
    },
    maxIterations = 100
)

fun main() = runBlocking {
    // val result = agentOpenAI.run("Hello! How can you help me?")
    val result2 = agentGemini.run("Hello! How can you help me?")
}
Eduardo Ruesta
07/15/2025, 1:47 PM
Daniela
07/16/2025, 8:00 AM
Eduardo Ruesta
07/16/2025, 5:15 PM
Ofek Teken
07/16/2025, 9:52 PM
I'm trying to build a node that returns a structured output sealed class and has its own system prompt.
At first I thought about using subgraphWithTask, because you can very easily write the prompt task there and supply my own finishTool to achieve this. I looked at ProvideStringSubgraphResult, but I didn't find a way to express sealed hierarchies when writing the ToolDescriptor. I might have missed something, though.
I'd be happy to hear if this even sounds like the correct direction for this use-case.
I've responded inside the thread with what I currently did to achieve it (see the sketch below for the kind of hierarchy I mean). Thanks!
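For concreteness, here's the shape of result I'm after, written with plain kotlinx.serialization (the types are a made-up example; how to map this onto a ToolDescriptor or Koog's structured output API is exactly what I'm asking):

import kotlinx.serialization.SerialName
import kotlinx.serialization.Serializable
import kotlinx.serialization.json.Json

// Made-up sealed hierarchy standing in for the node's structured output.
@Serializable
sealed class ReviewOutcome {
    @Serializable
    @SerialName("approved")
    data class Approved(val summary: String) : ReviewOutcome()

    @Serializable
    @SerialName("rejected")
    data class Rejected(val reason: String, val severity: Int) : ReviewOutcome()
}

fun main() {
    // kotlinx.serialization encodes the concrete subtype via a "type" discriminator by default.
    val json = Json { prettyPrint = true }
    println(json.encodeToString(ReviewOutcome.serializer(), ReviewOutcome.Rejected("missing tests", 2)))
}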
Ofek Teken
07/20/2025, 9:45 PM
I wanted to ask whether Koog supports context-caching, and if not, to suggest it as an upcoming feature 🙂
Filip Zalewski
07/21/2025, 12:42 PM
Eduardo Ruesta
07/21/2025, 3:57 PM
Victor
07/21/2025, 6:22 PM
Rhony
07/22/2025, 1:41 PM
Eduardo Ruesta
07/22/2025, 1:51 PM
Eduardo Ruesta
07/24/2025, 9:05 PM
Eduardo Ruesta
07/26/2025, 2:31 AM
I added these dependencies:
implementation(libs.koog.agents)
implementation(libs.mcp)
with these version catalog entries:
mcp = { module = "io.modelcontextprotocol:kotlin-sdk", version.ref = "mcp" }
and
koog-agents = { module = "ai.koog:koog-agents", version.ref = "koog" }
with these versions:
koog = "0.3.0"
mcp = "0.6.0"
but when I tried to add the MCP tool to my agent like this:
// Start the MCP server (e.g., as a process)
val process = ProcessBuilder("path/to/mcp/server").start()

// Create a ToolRegistry with tools from the MCP server
val toolRegistry = McpToolRegistryProvider.fromTransport(
    transport = McpToolRegistryProvider.defaultStdioTransport(process)
)

// Use the tools in an AI agent
val agent = AIAgent(
    promptExecutor = executor,
    strategy = strategy,
    agentConfig = agentConfig,
    toolRegistry = toolRegistry
)

// Run the agent
agent.run("Your task here")
my Android Studio can't find these classes: McpToolRegistryProvider and McpTool.
Daniela
07/29/2025, 1:53 PM
John O'Reilly
07/31/2025, 8:26 AM
Eduardo Ruesta
07/31/2025, 1:20 PM
Eduardo Ruesta
07/31/2025, 6:32 PM
caelum19
08/01/2025, 1:35 PM
I want to run an agent in a continuous loop (something like while (true) { agent.step() }). I want the chat history to persist between steps (windowing or self-redacting is fine). What's the best way to do this?
For context, I want the agent to be more autonomous and run continuously, using tools to wait for periods of inactivity.
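The shape I'm imagining, as a plain-Kotlin sketch (agent.step() above is pseudo-code, and respond here just stands in for one agent turn; the point is that the history lives outside the per-step call):

// Plain Kotlin sketch of the loop I mean; nothing here is Koog API.
suspend fun chatLoop(respond: suspend (history: List<String>, input: String) -> String) {
    val history = mutableListOf<String>()
    while (true) {
        // Stand-in for "wait for the next event / period of inactivity"
        val input = readlnOrNull() ?: break
        val reply = respond(history, input)
        history += "user: $input"
        history += "assistant: $reply"
        // Naive windowing: keep only the most recent 50 entries
        if (history.size > 50) {
            history.subList(0, history.size - 50).clear()
        }
    }
}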