AI Summary
Nereid enables real-time collaborative diagram editing in Mermaid through an MCP-powered TUI interface, with attention-coordination features for sequence diagrams and flowcharts. Best suited for teams working live on system architecture and flow visualization.
Install
Copy this and paste it into Claude Code, Cursor, or any AI assistant:
I want to install the "nereid" skill in my project. Please run this command in my terminal:

# Install skill into the correct directory (3 files)
mkdir -p .claude/skills/nereid && \
  curl --retry 3 --retry-delay 2 --retry-all-errors -o .claude/skills/nereid/SKILL.md "https://raw.githubusercontent.com/bnomei/nereid/main/skills/nereid/SKILL.md" && \
  mkdir -p .claude/skills/nereid/agents && \
  curl --retry 3 --retry-delay 2 --retry-all-errors -o .claude/skills/nereid/agents/openai.yaml "https://raw.githubusercontent.com/bnomei/nereid/main/skills/nereid/agents/openai.yaml" && \
  mkdir -p .claude/skills/nereid/references && \
  curl --retry 3 --retry-delay 2 --retry-all-errors -o .claude/skills/nereid/references/mcp-playbooks.md "https://raw.githubusercontent.com/bnomei/nereid/main/skills/nereid/references/mcp-playbooks.md"

Then restart Claude Code (or reload the window in Cursor) so the skill is picked up.
Description
Collaborate in Nereid Mermaid sessions via MCP using AST-first, probe-refine workflows for sequence diagrams, flowcharts, xrefs, routes, and walkthroughs. Use when exploring or editing diagrams with a human watching live in TUI, and when coordinating attention through `attention.*`, `follow_ai.*`, and `selection.*`.
Nereid MCP Collaboration
Collaborate on Mermaid-backed diagrams in a live, shared TUI/MCP session. Keep context small, edits structured, and attention explicit.
Collaboration Contract
Assume co-presence by default:
• The user can see diagram updates in real time.
• The user can see where the agent is focused.
• The agent should steer attention visually, then speak briefly.

Drive collaboration with this state model:
• attention.human.read: read the human cursor/attention in the TUI.
• attention.agent.read: read the agent spotlight object.
• attention.agent.set: move the agent spotlight to one object.
• attention.agent.clear: clear the agent spotlight.
• follow_ai.read / follow_ai.set: read or toggle whether the TUI follows the agent spotlight.
• selection.read / selection.update: shared working-set selection (multi-object).

Treat these as separate concerns:
• Human attention: what the person is looking at.
• Agent attention: what the agent wants to emphasize now.
• Selection: short-term working set for batch reasoning/edits.
• Follow-AI: whether the TUI camera/cursor tracks the agent spotlight.
Core Principles
• Treat the AST as the source of truth; rendered text and Mermaid text are projections.
• Session files (nereid-session.meta.json, diagrams/*.mmd, walkthroughs/*.wt.json) are app-managed snapshots and can be rewritten frequently while Nereid runs.
• Use the canonical ObjectRef everywhere: d:<diagram_id>/<seq|flow>/<participant|message|node|edge>/<object_id>.
• Prefer small reads first (diagram.stat, diagram.get_slice, diagram.diff, walkthrough.diff).
• Use typed query tools (seq.*, flow.*, xref.*, route.find) before large snapshots.
• Gate edits with base_rev and keep ops minimal.
• Record evidence as refs (xrefs and walkthrough nodes) so reasoning is resumable.
• Keep dangling xrefs visible as TODO artifacts unless asked to clean them.
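The canonical ObjectRef format quoted above can be validated with a small parser. This is a hedged sketch: the regex is inferred from the template shown, and the real grammar may permit more characters in ids than assumed here.

```python
import re

# Pattern derived from the documented template:
# d:<diagram_id>/<seq|flow>/<participant|message|node|edge>/<object_id>
# Assumption: ids contain no "/" characters.
OBJECT_REF = re.compile(
    r"^d:(?P<diagram_id>[^/]+)"
    r"/(?P<kind>seq|flow)"
    r"/(?P<type>participant|message|node|edge)"
    r"/(?P<object_id>[^/]+)$"
)

def parse_ref(ref: str) -> dict[str, str]:
    """Split a canonical ObjectRef into its four components."""
    m = OBJECT_REF.match(ref)
    if not m:
        raise ValueError(f"not a canonical ObjectRef: {ref!r}")
    return m.groupdict()

print(parse_ref("d:auth/seq/message/m42"))
```

Validating refs before passing them to typed query tools keeps a malformed ref from surfacing later as a confusing tool-side validation error.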
Execution Discipline
• Use MCP tools as the only source of truth.
• Do not inspect src/, tests/, docs/, or data/ to answer runtime collaboration questions.
• Make the first relevant MCP call immediately after reading the user prompt.
• Prefer direct MCP execution over schema/code exploration.
• If the payload shape is unclear, call the tool once and adapt from the validation errors.
• Use shell/file probing only when the user explicitly asks for file-level inspection or storage debugging.
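"Gate edits with base_rev" amounts to an optimistic-concurrency check: an edit carries the revision it was computed against, and the server rejects it if the diagram has moved on. The sketch below is illustrative only; apply_ops, the store, and the error type are invented stand-ins for Nereid's real edit tools.

```python
class StaleRevision(Exception):
    """Raised when an edit's base_rev no longer matches the live diagram."""

class DiagramStore:
    # Toy stand-in for the server-side diagram state.
    def __init__(self) -> None:
        self.rev = 3
        self.nodes = {"login": "Login"}

    def apply_ops(self, base_rev: int, ops: list[dict]) -> int:
        if base_rev != self.rev:
            # Caller's view is stale: re-read (e.g. via a diff tool)
            # and recompute the ops before retrying.
            raise StaleRevision(f"base_rev {base_rev} != current rev {self.rev}")
        for op in ops:
            if op["op"] == "rename_node":
                self.nodes[op["id"]] = op["label"]
        self.rev += 1
        return self.rev

store = DiagramStore()
new_rev = store.apply_ops(
    base_rev=3,
    ops=[{"op": "rename_node", "id": "login", "label": "Sign in"}],
)
print(new_rev)
```

Keeping ops minimal makes a rejected batch cheap to recompute: after a StaleRevision, the agent re-reads only the delta and replays the small op list against the new revision.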