One layer.
Every tool.
Halyard has two kinds of integrations. Sources — the places your team produces knowledge. Agents — the AI tools that need to read it. We ship native connectors for both.
Read Halyard from every AI tool.
Every agent-side integration uses Model Context Protocol (MCP). Install once per tool, and Halyard exposes the same search_knowledge / ask_expert / explore_knowledge surface to all of them.
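As a sketch, a typical MCP client configuration for Halyard might look like the following (the `halyard` server name, the `@halyard/mcp` package, and the env variable are illustrative assumptions; check each tool's integration page for the exact file and location):

```json
{
  "mcpServers": {
    "halyard": {
      "command": "npx",
      "args": ["-y", "@halyard/mcp"],
      "env": { "HALYARD_API_KEY": "<your-key>" }
    }
  }
}
```

The same entry shape works across MCP-aware tools, which is why one install per tool gets you the full search_knowledge / ask_expert / explore_knowledge surface.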
Claude Code
Plug Halyard into Claude Code via MCP. Your agent gets grounded context + human escalation.
Read integration →
Slack
Capture team knowledge from threads. Route agent questions to the right human.
Read integration →
OpenAI Codex
Configure Codex with Halyard as a remote MCP server in your codex.json.
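A hedged sketch of what that remote-server entry could look like in codex.json (the field names and server URL here are assumptions for illustration, not a confirmed schema):

```json
{
  "mcp_servers": {
    "halyard": {
      "url": "https://mcp.halyard.example/sse"
    }
  }
}
```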
Coming soon
Cursor
Add Halyard as an MCP server in Cursor's settings. Your agents retrieve your team's context.
Coming soon
Windsurf
Run Halyard as a background MCP server in Windsurf. Agents call search_knowledge inline.
Coming soon
Claude Desktop
Add Halyard to Claude Desktop for grounded conversations without opening a terminal.
Read integration →
Capture from where work happens.
Sources feed the knowledge graph. Slack is the flagship — it's both where your team captures tacit knowledge and where Halyard routes ask_expert calls.
Slack
Capture team knowledge from threads. Route agent questions to the right human.
Read integration →
Notion
Index Notion pages and databases as grounded context for your agents.
Coming soon
GitHub
Surface PR decisions, ADRs in-repo, and architectural context to every agent.
Read integration →
Google Drive
Index Drive docs and sheets so your agents can cite from spec docs and meeting notes.
Coming soon
Linear
Connect Linear for issue context, status, and product-decision trails.
Coming soon
Granola
Meeting notes as a first-class knowledge source. Halyard reads Granola via its MCP server.
Coming soon
Missing something?
We ship new integrations based on real customer demand. If you want Halyard to read from or plug into a tool we don't list yet, tell us.