The generation pipeline

MCP Blacksmith converts an OpenAPI specification into a fully functional MCP server through a multi-stage pipeline:
  1. Parse & Validate — Verify the specification's structural and syntactic correctness
  2. Filter & Enhance (optional) — AI-curate operations and parameters
  3. Generate Code — Produce a complete Python project with dependencies
  4. Package Server — Bundle into a downloadable ZIP archive
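The four stages can be pictured as a simple function chain. This is an illustrative sketch only — every function name here is hypothetical, and the real pipeline is internal to MCP Blacksmith:

```python
import io
import json
import zipfile

def parse_and_validate(spec_text: str) -> dict:
    """Stage 1: parse the spec and reject structurally broken input."""
    spec = json.loads(spec_text)
    if "paths" not in spec:
        raise ValueError("spec must define 'paths'")
    return spec

def extract_operations(spec: dict) -> list:
    """Stage 2: flatten each path/method pair into one operation record."""
    ops = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            ops.append({"method": method.upper(), "path": path, **op})
    return ops

def generate_code(ops: list) -> dict:
    """Stage 3: emit project files keyed by filename (stubbed content)."""
    server = "\n".join(f"# tool for {o['method']} {o['path']}" for o in ops)
    return {"server.py": server, "requirements.txt": "mcp\n"}

def package_zip(files: dict) -> bytes:
    """Stage 4: bundle the generated files into a ZIP archive in memory."""
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as z:
        for name, content in files.items():
            z.writestr(name, content)
    return buf.getvalue()
```

Composing the stages — `package_zip(generate_code(extract_operations(parse_and_validate(spec_text))))` — yields a downloadable archive, which mirrors the flow described below.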

1. Parse and validate

Your OpenAPI specification is parsed and validated for structural correctness. MCP Blacksmith supports:
  • OpenAPI 2.0 (Swagger)
  • OpenAPI 3.0.x
  • OpenAPI 3.1.x
  • OpenAPI 3.2.x
Specifications are validated for schema correctness, reference integrity, and required fields before generation begins. Issues that would produce broken code stop generation and are reported with specific locations and suggested fixes. You can also run a dedicated validation pass separately to inspect your specification in detail; this pass is optional and non-blocking, and its results are shown in the Viewer tab of the dashboard.
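To give a feel for what "structural correctness" means at this stage, here is a deliberately minimal sketch of the kind of checks involved, using only the standard library. The real validator goes much further (full schema validation, `$ref` integrity); the function name and messages below are illustrative:

```python
import json

def basic_structural_check(spec_text: str) -> list:
    """Return a list of human-readable problems (empty list = structurally sound)."""
    try:
        spec = json.loads(spec_text)
    except json.JSONDecodeError as exc:
        return [f"not valid JSON: {exc}"]
    problems = []
    # Every OpenAPI/Swagger document must declare its version...
    if "openapi" not in spec and "swagger" not in spec:
        problems.append("missing version field ('openapi' or 'swagger')")
    # ...describe itself in an 'info' object...
    if "info" not in spec:
        problems.append("missing required 'info' object")
    # ...and define at least an (possibly empty) 'paths' object.
    if "paths" not in spec:
        problems.append("missing 'paths' object")
    return problems
```

For example, `basic_structural_check('{"openapi": "3.1.0", "info": {"title": "t", "version": "1"}, "paths": {}}')` returns an empty list, while a spec lacking `paths` gets a specific message pointing at the omission.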

2. Extract and enrich operations

Operations and their request/response schemas are extracted from the specification. Each API operation (e.g., GET /users, POST /orders) is mapped to an MCP tool with its parameters, request body, response schemas, and authentication requirements. Optionally, request schemas can be refined through AI enhancement passes: read-only and server-generated fields are removed, while authentication-related fields are hidden from AI agents. Parameter descriptions, examples, and constraints are optimized for LLM readability, and schemas are rewritten for token efficiency. This curation step is what separates a raw API wrapper from a production-quality MCP server — it ensures AI agents see only the parameters they should control, with descriptions, examples, and constraints they can understand.
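One of these passes — removing read-only fields — can be sketched in a few lines. This is a simplified illustration of the idea, not MCP Blacksmith's actual implementation; the function name and heuristics are hypothetical:

```python
def strip_readonly_fields(schema: dict) -> dict:
    """Return a copy of a JSON Schema object with readOnly properties removed.

    Fields marked readOnly (e.g. server-generated IDs or timestamps) should
    never appear in a request body, so agents should never see them as inputs.
    """
    if schema.get("type") != "object" or "properties" not in schema:
        return schema
    cleaned = dict(schema)
    cleaned["properties"] = {
        name: strip_readonly_fields(prop)   # recurse into nested objects
        for name, prop in schema["properties"].items()
        if not prop.get("readOnly", False)
    }
    # Keep 'required' consistent with the surviving properties
    if "required" in cleaned:
        cleaned["required"] = [r for r in cleaned["required"] if r in cleaned["properties"]]
    return cleaned
```

Given a POST /orders request schema where `id` is marked `readOnly: true`, the cleaned schema exposes only the fields the agent is actually allowed to set.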

3. Generate code

The generator produces a complete Python project:
| File | Purpose |
| --- | --- |
| server.py | MCP server with tool definitions for every operation |
| _models.py | Pydantic models for request/response validation |
| _validators.py | 50+ format validators (full OAS Format Registry coverage) |
| _auth.py | Authentication handlers for each security scheme |
| .env | Environment variables for credentials and configuration |
| requirements.txt | Python dependencies |
| LICENSE | MIT license for generated code |
| Dockerfile | Production Docker build |
| .mcp.json | MCP client configuration template |
| README.md | Setup and usage instructions |
Each API operation becomes an MCP tool — an async function that handles parameter validation, authentication injection, HTTP request execution, and response processing.
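The shape of such a tool function can be sketched as follows. Every name here (`get_user`, `_send`, the URL) is illustrative — the real generated code uses Pydantic models and an async HTTP client, while this offline sketch stubs the transport:

```python
import asyncio
import os

async def get_user(user_id: int, _send=None) -> dict:
    """Illustrative MCP tool for a hypothetical GET /users/{user_id} operation."""
    # 1. Validate parameters before any network traffic
    if not isinstance(user_id, int) or user_id <= 0:
        raise ValueError("user_id: expected a positive integer")
    # 2. Inject authentication (the real server reads credentials from .env)
    headers = {"Authorization": f"Bearer {os.environ.get('API_TOKEN', '')}"}
    # 3. Execute the request; _send is injectable so this sketch stays offline
    send = _send or (lambda method, url, hdrs: {"status": 200, "body": {}})
    response = send("GET", f"https://api.example.com/users/{user_id}", headers)
    # 4./5. Normalize upstream errors into one consistent shape, else return data
    if response["status"] >= 400:
        return {"error": {"status": response["status"], "detail": response["body"]}}
    return response["body"]
```

Running it with a fake transport, e.g. `asyncio.run(get_user(7, _send=lambda m, u, h: {"status": 200, "body": {"id": 7}}))`, shows the validate → authenticate → execute → return flow without touching the network.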

4. Package and download

The complete server is packaged as a ZIP file. Extract it, install dependencies, configure credentials, and run — no additional code generation or compilation needed.
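The bundled .mcp.json gives MCP clients a ready-made entry point. A typical template has roughly this shape — the server name, command, and env keys below are placeholders, not the exact generated contents:

```json
{
  "mcpServers": {
    "my-api": {
      "command": "python",
      "args": ["server.py"],
      "env": {
        "API_KEY": "your-credential-here"
      }
    }
  }
}
```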

What the generated server does at runtime

When an MCP client (like Claude) calls a tool:
  1. Validate — Parameters are validated against Pydantic models. If types, formats, or constraints don’t match, the agent receives immediate, context-rich feedback describing exactly what failed — before any request is sent to the API
  2. Authenticate — Correct credentials are injected based on the operation’s authentication requirements
  3. Execute — HTTP request is sent to the target API with retry logic and circuit breaking
  4. Validate response — Response is optionally validated against the API’s response schema
  5. Return — Structured data is returned to the MCP client. API errors are normalized into a consistent, LLM-readable format regardless of the upstream API’s error structure
All of this happens transparently. The AI agent sees simple tools with typed parameters and receives actionable feedback on every outcome.
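The error-normalization idea in step 5 can be illustrated with a small sketch. Whatever shape the upstream API returns, the agent sees one consistent structure; the function name, probed keys, and output fields below are illustrative, not the generated server's exact format:

```python
import json

def normalize_error(status: int, raw_body: str) -> dict:
    """Fold an arbitrary upstream error response into one consistent shape."""
    try:
        body = json.loads(raw_body)
    except (json.JSONDecodeError, TypeError):
        body = {"raw": raw_body}   # non-JSON bodies (HTML error pages, plain text)
    # Different APIs hide the message under different keys; probe common ones
    message = None
    if isinstance(body, dict):
        for key in ("message", "error", "detail", "error_description"):
            if isinstance(body.get(key), str):
                message = body[key]
                break
    return {
        "ok": False,
        "status": status,
        "message": message or f"HTTP {status} from upstream API",
        "upstream_body": body,
    }
```

For instance, a `404` with body `{"detail": "User not found"}` and a `500` returning an HTML error page both come out as the same `{ok, status, message, upstream_body}` record.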

Deployment considerations

Generated servers are designed for self-hosted, single-tenant deployment. Each server instance serves one user or application with its own credentials.
  • Do not commit .env to version control — it may contain sensitive API credentials
  • The server requires its dependencies installed (via requirements.txt, either in a virtual environment or globally)
  • For multi-tenant deployment with shared infrastructure, visit MCP Armory