Connect any AI Agents with OneStock MCP
BETA version
The feature is still being adjusted, and we welcome your feedback to help us improve it.
What is an MCP Server?
The Model Context Protocol (MCP), introduced by Anthropic in late 2024, has been described as the “USB-C of AI integrations.” It establishes a common interface that allows AI agents to access external services through a unified schema. With MCP, every server presents its features in the same standardized format, meaning agents can seamlessly interact with different systems without the need for bespoke API work.
Since its release, the protocol has gained strong traction in the AI ecosystem. Several players in sectors like e-commerce and payments already provide MCP servers, enabling developers to directly embed shopping or transaction capabilities into AI assistants and chatbots.
What’s the difference compared to APIs?
Unlike traditional APIs, which each come with their own rules, formats, and integration requirements, MCP abstracts those differences away. Instead of building and maintaining custom connectors for every service, developers can rely on a single, consistent protocol. This shift reduces integration overhead and makes it far easier to plug new capabilities into AI agents.
API connection
🔀 High heterogeneity: each partner has its own specs, data formats, authentication methods, and versions.
🔧 Heavy maintenance: constant updates required for every change or backward-incompatible release.
💸 High development cost: every new partner requires significant build and testing effort.
🚫 Limited scalability: maintaining dozens of connectors quickly becomes unmanageable.
MCP connection
🧠 High answer quality: the context carried by each tool enables higher-quality interactions than manually hard-coding use cases via APIs.
📏 Standardization: a single, unified way to describe and expose capabilities, regardless of partner.
⚡ Reduced time-to-market: connecting a new compatible partner doesn’t require custom development.
🤝 Native interoperability: enables smooth collaboration across multiple agents and partners in the same flow.
🚀 Scalability: rapidly add new partners without accumulating technical debt.
How does it work?
The MCP acts as the bridge between a Large Language Model (LLM) and the OneStock platform. Instead of hard-coding specific integrations, we expose a set of tools, each with its own context: what the tool does, when it should be used, and what type of information it requires.
When a customer asks a question in natural language like “Can you give me delivery information on my order?” the LLM doesn’t need to guess how to fetch the answer. Thanks to the contextual descriptions provided by MCP, it immediately knows which tool is appropriate (for example, searching an order or retrieving its details), under which conditions to use it, and which data to provide.
The LLM then calls the right tool via MCP, receives structured information back from the OneStock platform, and reformulates it into a clear, human-friendly response. This approach ensures that any complex business logic hidden behind APIs is made accessible in a simple, standardized way, while preserving flexibility for developers and fluidity for end users.
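As a simplified illustration, this flow can be sketched in Python. The tool names, descriptions, and selection logic below are hypothetical stand-ins (in practice the LLM itself reasons over the tool descriptions; the keyword matching here is only a crude simulation of that step):

```python
# Hypothetical MCP-style tool descriptors: each tool carries its own
# context -- what it does and what input it expects.
TOOLS = [
    {
        "name": "search_orders",  # hypothetical tool name
        "description": "Find an order by ID, email, or customer reference.",
        "input_schema": {"type": "object",
                         "properties": {"order_id": {"type": "string"}}},
    },
    {
        "name": "track_order_status",  # hypothetical tool name
        "description": "Return real-time delivery status and tracking link.",
        "input_schema": {"type": "object",
                         "properties": {"order_id": {"type": "string"}}},
    },
]

def pick_tool(question: str) -> dict:
    """Crude stand-in for the LLM's reasoning: match the question
    against each tool's contextual description."""
    q = question.lower()
    for tool in TOOLS:
        if ("delivery" in q or "track" in q) and \
                "delivery" in tool["description"].lower():
            return tool
    return TOOLS[0]  # fall back to the generic search tool

tool = pick_tool("Can you give me delivery information on my order?")
print(tool["name"])  # -> track_order_status
```

The key point the sketch makes: because every tool ships with its usage context, the agent selects and calls the right one without any hard-coded routing logic on the integrator's side.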
What tools are currently available?
The first beta release of our MCP integration focuses on two groups of tools designed to support end customers directly through AI agents embedded in e-commerce websites.
See the full list of tools here: Supported tools
🛍️ Conversion-oriented tools
These features help guide shoppers during the buying journey by making product availability and delivery promises transparent. Customers can instantly check stock by SKU, view accurate delivery options and fees, locate nearby stores with opening hours, or even reserve items in-store before picking them up. By removing friction, these tools are aimed at increasing trust and driving sales conversion.
Check stock availability: expose detailed inventory by SKU across the store network.
Expose delivery promise: provide accurate delivery methods, ETAs, and shipping fees.
Find nearby stores: list stores around a location, with their opening hours.
In-store reservation: allow customers to reserve items in store before pickup.
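To make the first tool concrete, here is a minimal sketch of how an agent could turn a structured stock-availability result into a customer-facing answer. The field names (`stores`, `quantity`, `distance_km`) are illustrative assumptions, not OneStock’s actual response schema:

```python
# Hypothetical shape of a check-stock-availability tool result; the
# formatting step mirrors what the LLM does after an MCP tool call.
def format_stock_answer(sku: str, result: dict) -> str:
    """Turn a structured tool result into a human-friendly reply."""
    in_stock = [s for s in result["stores"] if s["quantity"] > 0]
    if not in_stock:
        return f"SKU {sku} is currently out of stock nearby."
    nearest = min(in_stock, key=lambda s: s["distance_km"])
    return (f"SKU {sku} is available in {len(in_stock)} store(s); "
            f"the nearest is {nearest['name']} "
            f"({nearest['distance_km']} km away).")

mock_result = {
    "stores": [
        {"name": "Paris Rivoli", "quantity": 3, "distance_km": 1.2},
        {"name": "Paris Opera", "quantity": 0, "distance_km": 0.8},
    ]
}
print(format_stock_answer("SKU-123", mock_result))
```

The structured payload stays machine-readable end to end; only the last step reformulates it into natural language, which is what keeps the answer quality high.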
📦 After-sales tools
Once an order is placed, customers often need quick access to order management. This set of tools empowers them to track delivery status in real time, search for orders by various identifiers, cancel an item, update shipping or pickup details, and even initiate returns or exchanges autonomously. These capabilities ensure a smooth and reassuring post-purchase experience, reducing the need for customer service intervention.
Search orders: retrieve an order by ID, email, customer reference, or status.
Track order status: expose real-time delivery status, carrier info, and tracking link.
Cancel order: allow item cancellation.
Update order: modify the shipping address or pickup store, with address validation.
Initiate a return: declare a return or request an exchange in full autonomy.
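A tool like cancel order implies an eligibility check behind the scenes: an item can typically only be cancelled before it ships. The sketch below shows that guard logic with hypothetical statuses and field names, not OneStock’s actual order model:

```python
# Illustrative guard logic behind a cancel-order tool: cancellation is
# only allowed while the item has not yet shipped. Statuses are
# hypothetical, not OneStock's actual lifecycle.
CANCELLABLE_STATUSES = {"created", "confirmed", "being_prepared"}

def cancel_item(order: dict, item_id: str) -> dict:
    item = next(i for i in order["items"] if i["id"] == item_id)
    if item["status"] not in CANCELLABLE_STATUSES:
        return {"ok": False, "reason": f"item already {item['status']}"}
    item["status"] = "cancelled"
    return {"ok": True, "reason": None}

order = {"items": [{"id": "A1", "status": "confirmed"},
                   {"id": "B2", "status": "shipped"}]}
print(cancel_item(order, "A1"))  # -> {'ok': True, 'reason': None}
print(cancel_item(order, "B2"))  # refused: item already shipped
```

Returning a structured refusal (rather than raising an error) lets the agent explain to the customer why the cancellation is not possible and suggest a return instead.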
Together, these two groups provide a complete journey: from building confidence at checkout to simplifying post-purchase interactions, making AI agents true companions across the entire e-commerce lifecycle.
Which systems are compatible with OneStock’s MCP?
Any AI agent that supports the Model Context Protocol can connect to OneStock’s MCP server, provided that the necessary access rights have been granted. This means the integration is not limited to a single partner: as long as the agent speaks MCP, it can interact with OneStock’s tools in a secure and standardized way.
For testing purposes, the easiest option today is to use Claude from Anthropic, which natively implements MCP and allows you to quickly explore how the integration works in practice.
To configure the OneStock MCP on Claude, see Access OneStock MCP from Claude