Built a tool that converts any REST API spec into an MCP server
Posted by rubalps@reddit | Python | 19 comments
I have been experimenting with Anthropic’s Model Context Protocol (MCP) and hit a wall — converting large REST API specs into tool definitions takes forever. Writing them manually is repetitive, error-prone and honestly pretty boring.
So I wrote a Python library that automates the whole thing.
The tool is called rest-to-mcp-adapter. You give it an OpenAPI/Swagger spec and it generates:
- a full MCP Tool Registry
- auth handling (API keys, headers, parameters, etc.)
- runtime execution for requests
- an MCP server you can plug directly into Claude Desktop
- all tool functions mapped from the spec automatically
I tested it with the full Binance API. Claude Desktop can generate buy signals, fetch prices, build dashboards, etc., entirely through the generated tools — no manual definitions.
If you are working with agents or playing with MCP this might save you a lot of time. Feedback, issues and PRs are welcome.
GitHub:
Adapter Library: https://github.com/pawneetdev/rest-to-mcp-adapter
Binance Example: https://github.com/pawneetdev/binance-mcp
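For anyone curious what the conversion amounts to, here is a minimal sketch of the core idea — walking an OpenAPI spec and emitting one MCP-style tool definition per operation. This is not the adapter's actual API; the function name and structure are my own, but the tool shape follows the MCP tool schema (`name`, `description`, `inputSchema`).

```python
def spec_to_tools(spec: dict) -> list[dict]:
    """Walk an OpenAPI dict and emit one MCP-style tool per operation."""
    tools = []
    for path, ops in spec.get("paths", {}).items():
        for method, op in ops.items():
            # Skip path-level keys like "parameters" that aren't HTTP methods
            if method.lower() not in {"get", "post", "put", "delete", "patch"}:
                continue
            # Prefer operationId for the tool name, else derive one from the path
            name = op.get("operationId") or f"{method}_{path.strip('/').replace('/', '_')}"
            properties = {
                p["name"]: p.get("schema", {"type": "string"})
                for p in op.get("parameters", [])
            }
            required = [p["name"] for p in op.get("parameters", []) if p.get("required")]
            tools.append({
                "name": name,
                "description": op.get("summary", ""),
                "inputSchema": {
                    "type": "object",
                    "properties": properties,
                    "required": required,
                },
            })
    return tools

# Example: a single Binance-style endpoint
spec = {"paths": {"/api/v3/ticker/price": {"get": {
    "operationId": "tickerPrice",
    "summary": "Latest price for a symbol",
    "parameters": [{"name": "symbol", "in": "query", "required": True,
                    "schema": {"type": "string"}}],
}}}}
tools = spec_to_tools(spec)
```

A real converter also has to handle request bodies, `$ref` resolution, and auth schemes, which is where most of the actual work lives.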
Scared_Sail5523@reddit
It's genuinely impressive that you took the initiative to build a library specifically to solve the drudgery of converting those enormous API specs for Anthropic’s MCP. Manually defining hundreds of functions is incredibly tedious and always invites mistakes, so automating that entire tool registry generation process is a huge boost to efficiency. The fact that the adapter fully handles authorization and execution, even for something as large as the Binance API, shows how robust your solution is. This tool is clearly going to save significant development time for anyone currently building agents or experimenting with the Model Context Protocol.
rm-rf-rm@reddit
Even Anthropic is admitting the problem with MCPs and why they're not the right solution. Utils like this will only exacerbate what's bad and unscalable about MCPs: context bloat. This indiscriminately throws an entire API spec into MCP tools.
Maybe useful for some one-off use case or in some isolated env. For most real use cases, you're much better off 1) just writing a traditional API call and feeding the output to an LLM (if you're writing a program), or 2) making the aforementioned API call into an MCP tool with FastMCP (if you're using a chatbot).
rubalps@reddit (OP)
Fair point. Dumping an entire API into MCP is not a great long-term pattern and that is not what this tool is meant to promote. The adapter is mainly for the early phase where you want to explore an API quickly, iterate fast, prototype ideas without hand-writing JSON schemas, and then refine or trim down what you actually want to keep. It also supports filtering during generation, so you do not need to expose the whole spec if you do not want to. The docs cover this.
For production setups, curated MCP tools or direct API calls are still the better approach. This is simply a faster way to get started, not a push to overload MCP.
smarkman19@reddit
This adapter is great for discovery, but production needs curation, tight schemas, and a gateway in front. Start by generating only the tags you’ll use, then collapse endpoints into task-first tools (one per job) with Pydantic-typed inputs/outputs and machine-readable error codes.
Put a streamable-HTTP MCP server behind a gateway (mcpjungle or Kong), allowlist tools, issue short-lived JWTs with scopes, and add per-tenant rate limits and audit logs. Keep calls dependable with timeouts, retries with backoff, and idempotency keys; cache safe GETs; keep context small by pinning a brief schema summary and passing only diffs. For multi-step flows, expose a few orchestrator tools so the model makes fewer calls. I've used Kong for rate limiting and Auth0 for tenant JWTs; DreamFactory helped publish quick REST endpoints over legacy databases so the MCP side stayed clean.
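The "retries with backoff" part of this advice is easy to sketch with the stdlib alone. The helper below is my own illustration (not part of the adapter): it retries a callable with exponential backoff plus jitter, which is the usual pattern for keeping generated tool calls dependable against transient failures.

```python
import random
import time

def with_retries(fn, attempts: int = 3, base_delay: float = 0.01):
    """Call fn(); on exception, retry with exponential backoff plus jitter."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the last error
            # Delay doubles each attempt, with random jitter to avoid thundering herds
            time.sleep(base_delay * 2 ** attempt + random.random() * base_delay)

# Demo: a flaky callable that fails twice, then succeeds on the third try
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

result = with_retries(flaky)
```

In a real deployment you would retry only on transient errors (timeouts, 5xx, 429) and pair this with an idempotency key so retried writes are safe to replay.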
Revolutionary_Sir140@reddit
We don't need adapters, we need UTCP.
https://github.com/universal-tool-calling-protocol
muneriver@reddit
reminds me of these:
https://kylestratis.com/posts/stop-generating-mcp-servers-from-rest-apis/
https://medium.com/@jairus-m/intention-is-all-you-need-74a7bc2a8012
rubalps@reddit (OP)
I get the point of those articles. Turning a whole REST API into MCP tools is kind of like giving an LLM a thousand-piece Lego set and expecting a spaceship on the first try. This adapter is meant to speed up experimentation, not something you drop into production without thought and testing.
FiredFox@reddit
Looks like a pretty nice example of a vibe-coded project. I'll check it out.
rubalps@reddit (OP)
Thanks, appreciate that! Honestly, to make a 'vibe-coded' approach actually work, I found I needed more planning, not less. Having clear phases was the only thing that let me move fast without the code turning into a mess. It definitely required thorough testing to stabilize the vibes, though. Feel free to open an issue if you spot anything!
nuno6Varnish@reddit
Cool project! Talking about those large and messy APIs, how can you limit the context window? Did you think about manually selecting the endpoints to have more specialized MCP servers?
rubalps@reddit (OP)
Thanks! And yes, context window is a real concern with huge specs.
The adapter already supports filtering, so you can include only the endpoints you want (by path or method). That way you do not expose the entire API to the model.
Doc link: https://github.com/pawneetdev/rest-to-mcp-adapter/blob/master/LIBRARY_USAGE.md#filtering-tools-during-generation
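The adapter's real filter config is described in the doc link above; as a generic illustration of the idea (names here are invented, not the library's API), filtering amounts to trimming the spec's `paths` before generation:

```python
import fnmatch

def filter_spec(spec: dict, include_paths: list[str], methods: set[str]) -> dict:
    """Keep only operations whose path matches a glob and whose method is allowed."""
    kept = {}
    for path, ops in spec.get("paths", {}).items():
        if not any(fnmatch.fnmatch(path, pat) for pat in include_paths):
            continue
        selected = {m: op for m, op in ops.items() if m.lower() in methods}
        if selected:
            kept[path] = selected
    return {**spec, "paths": kept}

# Demo: keep only read-only ticker endpoints, drop order placement entirely
spec = {"paths": {
    "/api/v3/ticker/price": {"get": {"operationId": "tickerPrice"}},
    "/api/v3/order": {"post": {"operationId": "createOrder"}},
}}
trimmed = filter_spec(spec, include_paths=["/api/v3/ticker/*"], methods={"get"})
```

Trimming before generation keeps both the tool registry and the model's context small, which is exactly the concern raised here.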
InnovationLeader@reddit
Can I cherry-pick the APIs I want added as tools? If not, that would be a very helpful feature.
rubalps@reddit (OP)
You don’t need to delete anything from the Swagger/OpenAPI file 🙂
The adapter already supports endpoint filtering.
You can pass a filter config during generation to include only the paths or methods you want. The docs for it are here:
https://github.com/pawneetdev/rest-to-mcp-adapter/blob/master/LIBRARY_USAGE.md#filtering-tools-during-generation
Smok3dSalmon@reddit
Just delete the endpoints from the swagger doc
Disastrous_Bet7414@reddit
I haven't found MCP or tool calling to be reliable enough so far. Maybe more training data could help.
But in the end, I think well structured, custom data pipelines are the best to get reliable results. That's my opinion.
InnovationLeader@reddit
Could be the model you've been using. MCP has been great for integration, and current models do well at calling the right tools.
Any_Peace_4161@reddit
REST and SOAP (and SWIFT — the messaging protocol, not the language) still rule most of the world. There's WAY more SOAP out there than people are willing to accept. XML rocks.
vaaaannnn@reddit
And what about FastMCP? :)
rubalps@reddit (OP)
I built this mostly for learning and exploration. I know FastMCP also supports OpenAPI conversion, but I wanted to understand the internals and build something tailored for large, messy, real-world APIs like Binance. Should've mentioned it in the post.