You ask your AI assistant to refactor a function. It confidently rewrites half the file, breaking tests and introducing bugs. If only it had asked which optimization you wanted! That simple question could have saved an hour of debugging. We've all been there: the powerful LLM just... guesses.

Why Can't It Just Ask?

Standard LLM interactions are one-way: request in, response out. This model gives the AI no channel to pause mid-task and ask you, the user on your local machine, for real-time clarification.

To bridge this, tools like Cursor use the Model Context Protocol (MCP). MCP allows richer communication between dev tools and AI models, including requests for user interaction.

Introducing interactive-mcp: The Bridge

This frustration led me to build interactive-mcp: a small, open-source Node.js/TypeScript server that acts as an MCP endpoint for interaction.

When an MCP-compatible AI assistant needs your input, it requests it via interactive-mcp. My server then presents the prompt or notification directly to you on your machine.

How It Works: Giving the AI a Voice

interactive-mcp exposes several MCP tools:

1. request_user_input

Asks you simple questions directly in a terminal window. The LLM sends a message (and, optionally, predefined answers), interactive-mcp shows a command-line prompt, and your typed response goes back to the LLM.
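To make the flow concrete, here is a rough sketch of that exchange. The field names (message, predefinedOptions) and the formatting helper are assumptions for illustration, not the project's actual schema:

```typescript
// Hypothetical shape of a request_user_input call; the real parameter
// names are defined by interactive-mcp's tool schema.
interface UserInputRequest {
  message: string;              // the question the LLM wants answered
  predefinedOptions?: string[]; // optional quick-pick answers
}

// Render the prompt roughly the way the terminal window might show it.
function formatPrompt(req: UserInputRequest): string {
  const options = req.predefinedOptions?.length
    ? ` [${req.predefinedOptions.join("/")}]`
    : "";
  return `${req.message}${options}`;
}

const prompt = formatPrompt({
  message: "Which optimization do you want?",
  predefinedOptions: ["speed", "memory"],
});
// → "Which optimization do you want? [speed/memory]"
```

Whatever you type at that prompt is returned to the LLM as the tool result.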

2. start/ask/stop_intensive_chat

For multi-step interactions (like configurations):

  • start_intensive_chat: Opens a persistent terminal chat session.
  • ask_intensive_chat: Asks follow-up questions in the same window.
  • stop_intensive_chat: Closes the session.
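As a sketch, an assistant might sequence these three tools like this. The tool names match the list above, but the argument shapes and the planning helper are illustrative stand-ins for real MCP calls:

```typescript
// Illustrative sequencing of the intensive-chat tools. The tool names are
// real; the argument shapes ("title", "question") are assumptions.
type ToolCall = { tool: string; args: Record<string, unknown> };

function planConfigDialog(questions: string[]): ToolCall[] {
  const calls: ToolCall[] = [
    { tool: "start_intensive_chat", args: { title: "Project setup" } },
  ];
  for (const q of questions) {
    calls.push({ tool: "ask_intensive_chat", args: { question: q } });
  }
  calls.push({ tool: "stop_intensive_chat", args: {} });
  return calls;
}

const plan = planConfigDialog(["Package name?", "Use TypeScript?"]);
// → start, two asks in the same session, then stop (4 calls total)
```

The point of the session is that all the follow-up questions land in one persistent terminal window instead of spawning a new prompt each time.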

3. message_complete_notification

Lets the AI send a simple OS notification, useful for confirming when tasks finish.

(Powered by node-notifier for cross-platform support: Windows/macOS/Linux.)
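A rough sketch of what the server might hand to node-notifier when this tool fires. The wrapper function is hypothetical; title and message are standard node-notifier notify() options:

```typescript
// Hypothetical payload builder; interactive-mcp's real implementation may
// differ. The title/message fields follow node-notifier's notify() options.
interface NotifyOptions {
  title: string;
  message: string;
}

function buildNotification(task: string): NotifyOptions {
  return { title: "interactive-mcp", message: `${task} finished` };
}

const note = buildNotification("Refactor");
// → { title: "interactive-mcp", message: "Refactor finished" }
// The server would then pass this along: notifier.notify(note)
```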

The Benefits

Why use interactive-mcp?

  • 💰 Reduced Premium API Calls: Avoid wasting expensive API calls (e.g., on o3/Claude 3.7 Sonnet via Cursor) generating code based on guesswork.
  • ✅ Fewer Errors: Clarification before action means less incorrect code and wasted time.
  • ⏱️ Faster Cycles: Quick confirmations beat debugging wrong guesses.
  • 🎮 Better Collaboration: Turns one-way instructions into a dialogue, keeping you in control.

Getting Started

Getting started is simple via npx (no global install needed). Just configure your MCP client:

Cursor (mcp.json):

{
  "mcpServers": {
    "interactive": {
      "command": "npx",
      "args": ["-y", "interactive-mcp"]
    }
  }
}

Custom Timeout (Optional):
Prompts time out after 30 seconds by default. Change this with the -t or --timeout flag, passed after a -- separator. For example, for a 60-second timeout:

"args": ["-y", "interactive-mcp", "--", "-t", "60"]
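In context, the full Cursor entry with the timeout looks like:

```json
{
  "mcpServers": {
    "interactive": {
      "command": "npx",
      "args": ["-y", "interactive-mcp", "--", "-t", "60"]
    }
  }
}
```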

Your client can now use the interactive tools!

Try It Out & Get Involved

Ready to make your AI assistant less of a guesser?

Give interactive-mcp a try!

This project is young (just a proof of concept), and feedback is crucial:

  • Does it solve a pain point for you?
  • Any issues or missing features?

Please open an issue or start a discussion on GitHub. Contributions welcome!

Let's make our AI tools work better with us. Cheers!