MCP Server
The Model Context Protocol (MCP) server lets AI applications search and query the TechWolf Developer Portal in real time. This is the recommended approach for interactive AI workflows. URL: https://developers.techwolf.ai/mcp
Setup
Add the following to your MCP client configuration (Claude Code, Cursor, Windsurf, etc.):
Available tools
| Tool | Description |
|---|---|
| SearchTechWolfDeveloperPortal | Search across the knowledge base for documentation, code examples, API references, and guides. |
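For clients that accept a JSON configuration, the setup step above might look like the sketch below. The exact schema varies by client, and the server label `techwolf` is an arbitrary name chosen here; consult your client's documentation for the precise format.

```json
{
  "mcpServers": {
    "techwolf": {
      "url": "https://developers.techwolf.ai/mcp"
    }
  }
}
```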
When to use
- Building AI-powered integrations that need to look up API documentation on the fly.
- AI coding assistants (Cursor, Windsurf, Claude Code) that need context about the Skill Engine API while writing integration code.
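Under the hood, an MCP client invokes a tool with a JSON-RPC 2.0 `tools/call` request. A minimal sketch of the payload follows; the `query` argument name is an assumption here — a client should read the actual parameter schema from the server's `tools/list` response.

```python
import json

def make_search_call(query: str, request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 `tools/call` request an MCP client would send.

    The "query" argument name is an assumption; check the tool schema
    advertised by the server for the real parameter names.
    """
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {
            "name": "SearchTechWolfDeveloperPortal",
            "arguments": {"query": query},
        },
    }
    return json.dumps(payload)
```

In practice your MCP client library handles this framing for you; the sketch only illustrates what travels over the wire.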
llms-full.txt
A single file containing the full content of the TechWolf Developer Portal. Load this into an AI application's context window for comprehensive, one-shot access to all documentation. URL: https://developers.techwolf.ai/llms-full.txt
When to use
- You need the AI to have complete knowledge of the documentation upfront.
- Your AI tool does not support MCP.
Note: This file is very large. Ensure your AI application's context window can accommodate it, or use subagents.
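A quick way to sanity-check whether the file fits is to estimate its token count. The sketch below uses a rough heuristic of ~4 characters per token for English text (an approximation — real tokenizers vary), with the fetch itself left as an illustrative comment.

```python
import urllib.request

CHARS_PER_TOKEN = 4  # rough heuristic for English prose; actual tokenizers vary

def estimated_tokens(text: str) -> int:
    """Approximate the token count of a document."""
    return len(text) // CHARS_PER_TOKEN

def fits_in_context(text: str, context_window_tokens: int) -> bool:
    """Check whether `text` would fit inside a model's context window."""
    return estimated_tokens(text) <= context_window_tokens

# Illustrative fetch (requires network access):
# with urllib.request.urlopen("https://developers.techwolf.ai/llms-full.txt") as r:
#     doc = r.read().decode("utf-8")
#     print(fits_in_context(doc, 200_000))
```

If the file does not fit, fall back to llms.txt (below) and fetch pages selectively.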
llms.txt
An AI-friendly sitemap that lists all documentation pages with their URLs. This helps AI systems discover and index the TechWolf Developer Portal content. URL: https://developers.techwolf.ai/llms.txt
When to use
- AI crawlers or indexing systems that need to discover available documentation.
- You want the AI to selectively fetch only the pages relevant to a specific question, rather than loading everything at once.
- Lighter alternative when full context loading is not practical.
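The selective-fetch pattern above can be sketched in a few lines. llms.txt files conventionally list pages as markdown links (`- [Title](url): description`); the exact layout of TechWolf's file is an assumption here, and the sample input below is invented for illustration.

```python
import re

# Matches markdown links such as "[Quickstart](https://example.com/quickstart)".
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def extract_pages(llms_txt: str) -> list[tuple[str, str]]:
    """Return (title, url) pairs for every page linked in an llms.txt file."""
    return LINK_RE.findall(llms_txt)

# Hypothetical sample in the conventional llms.txt shape:
sample = """# TechWolf Developer Portal
- [Quickstart](https://developers.techwolf.ai/quickstart): getting started
- [API Reference](https://developers.techwolf.ai/api): endpoints
"""
pages = extract_pages(sample)
# An AI system can then fetch only the URLs relevant to its question.
```

This keeps context usage proportional to the question being answered rather than to the size of the whole portal.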