Set up Vilix in your AI tool.
Free for 3 days. Works with Claude, ChatGPT, Cursor, Perplexity — anything that speaks MCP.
No tokens to generate or paste: just add the MCP URL to your AI tool and approve the connection. The same three-step flow works everywhere.
Copy the Vilix MCP URL
The same URL works for every AI tool that supports MCP.
https://api.getvilix.com/mcp/sse
Add Vilix to your AI tool
Each tool follows the same flow: paste the URL, approve OAuth, done. No tokens, no config files for most setups.
Claude
- Settings → Connectors → Add custom connector
- Paste the MCP URL above
- Approve the OAuth prompt — done
ChatGPT
- Settings → Connectors → Add MCP server
- Paste the MCP URL above
- Approve OAuth, then enable Vilix inside the chat — done
Cursor
- Open ~/.cursor/mcp.json (or the project-level .cursor/mcp.json) and add the snippet below
- Restart Cursor; it will trigger the OAuth flow automatically
- Sign in and approve — Cursor stores the credentials internally, no manual token
{
  "mcpServers": {
    "vilix": {
      "url": "https://api.getvilix.com/mcp/sse"
    }
  }
}
Perplexity
- Settings → Connectors → Add MCP server
- Paste the MCP URL above
- Approve OAuth — done
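If you edited Cursor's mcp.json by hand, a quick JSON syntax check before restarting saves a failed-to-load round trip. The sketch below validates a temp copy; point the same command at your real ~/.cursor/mcp.json:

```shell
# Sketch: validate an mcp.json before restarting Cursor.
# (Demonstrated on a temp copy; use ~/.cursor/mcp.json for the real file.)
cat > /tmp/mcp.json <<'EOF'
{
  "mcpServers": {
    "vilix": { "url": "https://api.getvilix.com/mcp/sse" }
  }
}
EOF
# json.tool exits non-zero on any syntax error (missing comma, stray brace).
python3 -m json.tool /tmp/mcp.json > /dev/null && echo "config OK"
```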
Paste this system prompt
Add this to your AI's system prompt / custom instructions so it actually uses Vilix instead of just listing the tools.
I want you to use Vilix MCP as the mandatory memory layer for every message I send unless I explicitly ask for stateless mode.
For EVERY message I send, always follow this exact order:
1. Call:
get_context(user_prompt=<my latest message>)
2. Use the returned context:
- recent_messages
- user_memories
- related_conversations
- instruction
3. Generate your final response internally.
Do NOT send it yet.
4. Before replying to me, call:
save_turn(
user_message=<my latest message>,
assistant_message=<exact final response>,
source=<agent/app name>,
chat_id=<existing chat_id if available>
)
5. Return the exact same saved response to me.
STRICT RULES
- Always follow: get_context → generate_answer → save_turn → return_answer
- Apply this to EVERY message I send.
- Never skip memory steps based on importance.
- Never answer before get_context.
- Never return an answer before save_turn.
- The saved response MUST exactly match the final returned response.
- Never fake or claim a successful save if it failed.
- Reuse chat_id for the same conversation.
- Do not use lower-level memory tools if get_context/save_turn exist.
If Vilix/MCP tools are unavailable or broken:
- Do NOT silently continue.
- Tell me the memory tools are unavailable.
- Ask whether I want to continue without memory.
- Be explicit that continuity and persistence may not work.
You’re done
Start a new conversation in your AI tool. It should call get_context at the start of each turn and save_turn before replying. Switch tools mid-thought — your context comes with you.
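The per-turn order the prompt enforces can be sketched in pseudo-client form. This is an illustration only: `get_context` and `save_turn` below are hypothetical stand-ins for the real Vilix MCP tool calls, with field names mirroring the prompt above.

```python
# Sketch of the turn order the system prompt enforces:
# get_context -> generate -> save_turn -> return the saved answer.
# These two functions stand in for the real Vilix MCP tool calls.

def get_context(user_prompt):
    # The real call returns recent_messages, user_memories,
    # related_conversations, and an instruction string.
    return {"recent_messages": [], "user_memories": [],
            "related_conversations": [], "instruction": ""}

def save_turn(user_message, assistant_message, source, chat_id=None):
    # The real call persists the turn; reuse the returned chat_id
    # for later turns in the same conversation.
    return chat_id or "chat-123"

def handle_turn(user_message, chat_id=None):
    ctx = get_context(user_prompt=user_message)       # 1. memory first
    answer = f"(answer using {len(ctx['user_memories'])} memories)"  # 2-3. generate, hold
    chat_id = save_turn(user_message, answer,         # 4. save before replying
                        source="my-agent", chat_id=chat_id)
    return answer, chat_id                            # 5. return the saved text
```

Note that steps 4 and 5 hand back the exact string that was saved, which is what the "saved response MUST exactly match" rule requires.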
Want a dashboard view of what's been saved? Sign in at app.getvilix.com.
Stuck? Check the docs or email support@getvilix.com for setup, ChatGPT / Claude / Cursor connection issues, or billing and account help.