How Arazzo Could Help MCP Servers Orchestrate APIs for AI Consumers

Janet Wagner
November 4, 2025

We recently published an article about why developers should use an OpenAPI specification as a starting point for an MCP server. The piece led to an interesting discussion on LinkedIn about what makes an MCP server performant and useful. One critical piece is building tools that can orchestrate multiple API calls together for AI consumers. But how would you enable that API orchestration? Well, Arazzo could help with that.

What is the Arazzo Specification?

The Arazzo Specification is an open specification that provides a mechanism for defining sequences of API calls and their dependencies to create deterministic API workflows. It expresses the functional use cases offered by an API or group of APIs, making it easier for humans and machines to understand and consume them. Arazzo is one of three specifications in development under the OpenAPI Initiative, the others being the OpenAPI Specification and the Overlay Specification.

Why MCP Server Tools Need to Orchestrate Multiple APIs

MCP servers are built primarily for AI agents, which rely on large language models (LLMs) to perform multiple tasks at once or within a very short period. While LLMs can complete a wide range of tasks, they often rely on external tools that use APIs to extend their capabilities. MCP servers offer LLMs tools they can choose from, and those tools need the ability to orchestrate APIs for two main reasons.

Real-World Tasks Are Complex

Most real-world tasks can't be completed with a single API call. Let's say you're building an MCP server that powers an AI agent for an online travel agency. If the AI agent needed to find and book a flight for a customer, it would need a tool that uses multiple APIs and makes calls in a specific order:

API #1: Search for flights
API #2: Get prices for flights
API #3: Check seat availability
API #4: Add traveler details and book the seat
API #5: Process payment for the flight
API #6: Send confirmation email

Without an efficient method of API orchestration in place, the AI agent would have to act as the orchestrator, managing these steps individually. If the agent must orchestrate the API calls itself, it could introduce errors, add latency, or break the AI app.

Simplicity Helps AI Agents Perform More Reliably

Orchestration abstracts the complexity of the underlying APIs away from the AI agent. The AI model doesn't need to know the specific details of each API and endpoint. It doesn't have to figure out how to assemble API calls or in what sequence (which increases LLM token usage). The AI agent also doesn't have to handle API errors, retries, and partial failures. It only needs to call a single, high-level "book a flight" tool from the MCP server. Moving this complexity to the server tool simplifies the AI agent's reasoning, making it more reliable.
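To make that concrete, here is a minimal sketch of what such a high-level tool could look like. It assumes the official MCP Python SDK's FastMCP helper; the travel API itself is hypothetical, so the base URL, endpoint paths, and payload fields below are illustrative only.

```python
# A minimal sketch, not a production implementation: it assumes the official
# MCP Python SDK (FastMCP) and a hypothetical travel API; the base URL,
# endpoint paths, and payload fields below are illustrative only.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("travel-agency")
BASE_URL = "https://api.example-travel.com"  # hypothetical API


@mcp.tool()
def book_flight(origin: str, destination: str, date: str,
                traveler_name: str, payment_token: str) -> dict:
    """Search for, price, reserve, pay for, and confirm a flight in one call."""
    with httpx.Client(base_url=BASE_URL, timeout=30.0) as client:
        # 1. Search for flights
        flights = client.get("/flights", params={
            "origin": origin, "destination": destination, "date": date,
        }).json()
        flight_id = flights["results"][0]["id"]

        # 2. Get the price for the selected flight
        price = client.get(f"/flights/{flight_id}/price").json()

        # 3. Check seat availability
        seats = client.get(f"/flights/{flight_id}/seats").json()
        seat_id = seats["available"][0]["id"]

        # 4. Add traveler details and book the seat
        booking = client.post(f"/flights/{flight_id}/bookings", json={
            "traveler": traveler_name, "seat": seat_id,
        }).json()

        # 5. Process payment for the flight
        client.post(f"/bookings/{booking['id']}/payment", json={
            "token": payment_token, "amount": price["total"],
        }).raise_for_status()

        # 6. Send the confirmation email
        client.post(f"/bookings/{booking['id']}/confirmation").raise_for_status()

    return {"booking_id": booking["id"], "total_paid": price["total"]}


if __name__ == "__main__":
    mcp.run()
```

From the model's perspective there is only one tool, book_flight, with a handful of typed parameters; the call sequence, along with whatever error handling and retry logic you add, stays inside the server.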
Options for Enabling API Orchestration in MCP Servers

Now that we know why MCP tools need to orchestrate multiple APIs, let's look at how you can enable this capability in your server. Arazzo is one of several options developers could use to incorporate API orchestration into MCP servers:

Declarative orchestration tools: Tools that let you define or build declarative workflows or patterns. Examples: OData (protocol), Arazzo (specification), GraphQL (query language).
API gateways: Some API gateways offer API call sequencing, usually via orchestration plugins. Examples: Kong, WSO2, Tyk.
Workflow automation platforms: These platforms are used in different contexts, such as executing application logic, business processes, or AI integrations. Some of them can be used for API orchestration. Examples: Temporal, Camunda, n8n.
Integration platforms: Most integration platforms include pre-built connectors to many different APIs. Some of them also allow orchestration of multiple APIs. Examples: MuleSoft Anypoint, SnapLogic, Workato.

If you're already using one of these tools, you could try using it for API orchestration or use Arazzo alongside it.

Why Use Arazzo to Enable API Orchestration in Your MCP Server?

Arazzo was not designed specifically for API consumption by AI agents or LLMs. However, it offers a standardized and deterministic way of enabling API orchestration. The benefits of using Arazzo to help MCP server tools orchestrate APIs include:

Improves the efficiency of agentic API consumption: You can define and apply consumption semantics based on specific use cases and across multiple operations, using either a single API description or multiple independent API descriptions. Arazzo enables the AI to execute API calls and consume APIs in the same way every time, and this deterministic approach improves the efficiency of agentic API consumption.

Abstracts complexity away from AI agents: Arazzo is declarative, which means you define API workflows in structured documents rather than code. The MCP server can then expose these Arazzo-defined workflows as tools to the AI agent. If an AI agent needs to "book a flight," the workflow could cover all the steps and APIs outlined earlier and present them to the agent as a single, simplified tool.

Greater control over API usage: Arazzo's extensibility makes it possible to include usage-based or SLA-based metadata that developers can use to ensure AI agents do not exceed API usage limits or the intended use of APIs. Developers can enforce these policies at the observation or processing layer to manage the MCP server's operational costs and scale.

Helps reduce LLM token usage: OpenAPI gives developers a way to explain to LLMs the purpose and intent of each endpoint with examples, and Arazzo provides a way to describe the correct sequences of API calls. If the AI model immediately understands what each tool is, when and why to use it, and the correct order of API calls, it won't have to generate tokens figuring all of that out itself. Since LLM pricing is typically based on token usage, this could decrease overall costs.

Mark Boyd, Director at Platformable, explained to Nordic APIs that Arazzo can provide more than just a description of a workflow process. It can describe use case intent, which is better understood by AI than individual endpoint functionality. It can also include metadata that describes usage, so that AI agent consumption can be monitored and controlled to prevent cost blowouts from excessive use.

"Starting with Arazzo is a business decision," Boyd said. "Workflow use cases for AI need to be prioritized by business value, and by mapping the use case, developer teams can make better FinOps decisions based on whether the cost-benefit ratio of building MCP servers and refactoring APIs for agent consumption (or at least detailing them more explicitly with error messages and examples) is worth pursuing on a case-by-case basis."

To enable API orchestration, you would need to code the MCP server to reference a separate, external Arazzo document. The Arazzo document references operations from your OpenAPI document and describes how to combine them into step-by-step sequences.
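As an illustration, here is an abridged sketch of what such an Arazzo document might look like for the flight-booking workflow, embedded in a small Python script so it can be parsed and inspected. The OpenAPI URL and operationIds are hypothetical, and only two of the six steps are shown; the field names follow the Arazzo 1.0 structure (sourceDescriptions, workflows, steps, successCriteria, outputs).

```python
# An abridged, illustrative Arazzo document for the flight-booking workflow,
# embedded as a YAML string to keep the example self-contained. The OpenAPI
# URL and operationIds are hypothetical; only two of the six steps are shown.
import yaml  # PyYAML

ARAZZO_DOC = """
arazzo: 1.0.0
info:
  title: Flight booking workflows
  version: 1.0.0
sourceDescriptions:
  - name: travelApi
    url: https://api.example-travel.com/openapi.yaml   # hypothetical
    type: openapi
workflows:
  - workflowId: bookFlight
    summary: Search for, price, book, pay for, and confirm a flight.
    inputs:
      type: object
      properties:
        origin: {type: string}
        destination: {type: string}
        date: {type: string}
    steps:
      - stepId: searchFlights
        operationId: searchFlights        # an operation in the OpenAPI document
        parameters:
          - name: origin
            in: query
            value: $inputs.origin
          - name: destination
            in: query
            value: $inputs.destination
        successCriteria:
          - condition: $statusCode == 200
        outputs:
          flightId: $response.body#/results/0/id
      - stepId: bookSeat
        operationId: createBooking        # another OpenAPI operation
        parameters:
          - name: flightId
            in: path
            value: $steps.searchFlights.outputs.flightId
        successCriteria:
          - condition: $statusCode == 201
        outputs:
          bookingId: $response.body#/id
    outputs:
      bookingId: $steps.bookSeat.outputs.bookingId
"""

# An MCP server could parse a document like this and expose each workflow
# as a single tool, executing the steps in order on the agent's behalf.
doc = yaml.safe_load(ARAZZO_DOC)
for workflow in doc["workflows"]:
    steps = [step["stepId"] for step in workflow["steps"]]
    print(f"{workflow['workflowId']}: {' -> '.join(steps)}")
```

In practice, the server still needs an execution layer, either hand-written or provided by tooling, that resolves runtime expressions such as $inputs.origin and $steps.searchFlights.outputs.flightId and performs the underlying HTTP calls.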
While using Arazzo for API orchestration in MCP servers offers several benefits, it also has its challenges.

Arazzo Has Multiple Challenges

Arazzo has great potential as a tool for enabling API orchestration for AI consumers. However, it faces some barriers to gaining industry acceptance. The specification is relatively new, and many experienced API practitioners have never heard of it, so adoption has been minimal so far. Arazzo is also very detailed, with a steep learning curve that may discourage some developers from using it. It wasn't designed for AI consumption use cases in the first place, and, lastly, it's primarily associated with a single vendor (SmartBear).

Nolan Di Mare Sullivan, Founding DevRel at Speakeasy, said on LinkedIn that he thinks Arazzo is a possible solution for MCP tools orchestrating multiple APIs. However, he told Nordic APIs that he's not convinced Arazzo is the right solution for MCP specifically, and he pointed out that there are significant adoption barriers.

"Arazzo includes elaborate constructs for workflow management, runtime expressions, success criteria, and failure handling — all overkill for the straightforward tool definitions that MCP requires," Di Mare Sullivan explained to Nordic APIs. "This creates unnecessary cognitive overhead for developers who just want to expose a function to an AI model."

The specification also has a marketing problem and a UX challenge. "It's just daunting to look at an Arazzo spec — it is dense and very detailed, and I think developers will balk at having to learn it," he commented.

However, if tooling and a user experience can be created to make it easier to define Arazzo workflows, Di Mare Sullivan thinks it's certainly possible that Arazzo could be used under the hood for some API orchestration tasks. The bottom line: Arazzo needs more community support to successfully enable AI usage of APIs.

Guiding Multi-Step Workflows With Arazzo

Having an MCP server reference an Arazzo document that deterministically guides the AI agent through multi-step, API-driven workflows is far better than letting the agent guess at those workflows on its own. Without this guidance, the agent will spend more tokens trying to figure out which APIs to call and in what sequence, increasing server latency, potential errors, and LLM costs. One senior engineer discovered firsthand how token usage climbs when the LLM context window receives too much information from an API and the model also has to figure something out on its own. Imagine the LLM dealing with multiple APIs; token usage would skyrocket.

Arazzo is still in its early stages. With some time and effort from the OpenAPI Initiative and the developer community, the specification could become easier to use and be extended specifically for AI use cases. It could become a crucial part of the API orchestration solution developers need to build performant and useful MCP servers for AI agents and LLMs.