
@ Gzuuus
2025-06-14 11:30:38
A week or so ago, nostr:nprofile1qqsqgc0uhmxycvm5gwvn944c7yfxnnxm0nyh8tt62zhrvtd3xkj8fhgprdmhxue69uhkwmr9v9ek7mnpw3hhytnyv4mz7un9d3shjqghwaehxw309aex2mrp0yh8qunfd4skctnwv46z7qgewaehxw309aex2mrp0yh8xmn0wf6zuum0vd5kzmp033tada presented me with a challenge: how to request user input from an MCP server, so that a chat client like Goose or any other could dynamically ask the user for that information in order to proceed with the tool call and produce its result. Something like a 'Question MCP' that lets the agent ask questions back to the user. This is a novel feature that few clients implement, since there was no clear way to do it without a custom client, which is undesirable due to vendor lock-in.
https://video.nostr.build/f2ed015d3c9e80a8705196528b69189d251f4818c2ab4951aeadaaae7002d2cd.webm
RooCode and Cline have this feature, but it is specific to their clients. It works by giving the LLM some built-in tools, along with prompt engineering in the system prompt that instructs the LLM to ask the user for input when needed. While this works, it is not an ideal solution because it is client-specific. To replicate the same behavior in a different client, you would need some sort of 'Feedback MCP' with tools like ask_user, ask_user_confirmation, etc., and you would need to tweak the system prompts to make it work, which is not desirable either.
The solution lies in a protocol-specific feature from MCP, a standardized way for servers to request additional information from users through the client during interactions. Fortunately, this is something that has already been thought about and proposed in different pull requests in the MCP spec repo. This feature is called 'Elicitation', and it is currently a draft in the MCP protocol spec. It was introduced as a draft just a few days before Alex and I started our conversation about this, making it a bleeding-edge feature with no current support and only a pull request in the MCP TypeScript SDK that has not been merged yet.
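At the protocol level the idea is simple. Going by the draft spec (linked at the end), the server sends an elicitation/create request containing a human-readable message and a restricted JSON schema describing the data it needs, and the client answers with the user's input, or with a decline/cancel. Roughly, with the shapes sketched as TypeScript objects (field names may still change while this is a draft):

// Server -> client: ask the user for structured input.
const elicitationRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "elicitation/create",
  params: {
    message: "What name should the booking be under?",
    requestedSchema: {
      type: "object",
      properties: { name: { type: "string" } },
      required: ["name"],
    },
  },
};

// Client -> server: the user's answer (or a refusal).
const elicitationResponse = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    action: "accept", // or "decline" / "cancel"
    content: { name: "Alice" },
  },
};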
The challenge was now clear: implement Elicitation in an MCP server and in a client that handles it, to demonstrate how it works. I did it, and as a result I created mcp-chat, a TUI program that, to my knowledge, is the only MCP chat client implementing this feature. If you are interested in how I built it, feel free to ask; I'll happily comment on the challenges and obstacles I ran into and how I resolved them.
However, I want to reflect more on the Elicitation feature and what it enables. This is very interesting: MCP servers can request information while executing tools, without the client needing to do anything other than handle the request in its own UI. This opens up a plethora of possibilities for dynamic tools that need data from their users. For example, tools that handle uncertainty, like a 'Book a Table' tool: if it cannot find a suitable table, it can ask the user what to do, whether to take any available table or cancel the process. This has various implications: it makes the process more intuitive, robust, dynamic, and, importantly, more private.
The LLM doesn't need to be aware of any of the details the user provides. In the previous example, the 'Book a Table' tool could use Elicitation to request details like the user's name, email, phone, or whatever else it needs to complete the process, in a direct exchange between the user and the MCP server. Once the Elicitation is complete, the tool returns its result to the LLM, omitting the personal details and simply saying, 'The table was successfully booked for 1 PM at The Awesome Restaurant.' The LLM and the user can then continue their chat if desired. At no point does the LLM need to know the user's details to book the table; those stay between the user and the MCP server. The LLM is only involved when the user asks in natural language to call the 'Book a Table' tool and when the tool returns its result.
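To make this concrete, here is a rough TypeScript sketch of what such a tool could look like. Treat it as illustrative only: the tool name, the restaurant, and the elicitInput helper are placeholders for whatever the TypeScript SDK ends up exposing once the Elicitation PR lands; only the request and response shapes follow the draft spec.

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "restaurant-booker", version: "0.1.0" });

server.tool(
  "book_table",
  { time: z.string(), party_size: z.number() },
  async ({ time, party_size }) => {
    // Ask the user directly for contact details; the LLM never sees this exchange.
    // elicitInput is the helper proposed in the SDK PR; treat it as illustrative.
    const reply = await server.server.elicitInput({
      message: `Booking for ${party_size} at ${time}. Please provide your contact details.`,
      requestedSchema: {
        type: "object",
        properties: {
          name: { type: "string" },
          email: { type: "string" },
          phone: { type: "string" },
        },
        required: ["name", "email"],
      },
    });

    if (reply.action !== "accept") {
      return { content: [{ type: "text" as const, text: "Booking cancelled by the user." }] };
    }

    // ...call the real booking backend with reply.content here...

    // Only a sanitized summary goes back to the LLM.
    return {
      content: [
        {
          type: "text" as const,
          text: `The table was successfully booked for ${time} at The Awesome Restaurant.`,
        },
      ],
    };
  },
);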
This is very cool. I could keep talking about this, but I'll stop here, as this is not supposed to be a long-form content piece. But if you are interested in any of this stuff, feel free to AMA!
I'm going to drop some related links:
MCP-chat: https://github.com/gzuuus/mcp-chat
Elicitation spec: https://modelcontextprotocol.io/specification/draft/client/elicitation
Introduce Elicitation capability PR: https://github.com/modelcontextprotocol/typescript-sdk/pull/520
#mcp #llm #ai