

EXCLUSIVE, EARLY RELEASE
Build Agentic Apps With an LLM Mesh

Build Reliable Agentic Applications Across the Enterprise
In this technical guide from O’Reilly and Dataiku, discover how the LLM Mesh addresses key challenges around using agentic applications at scale, making it easier and safer for IT teams to harness the power of GenAI.
IT Teams: Get Exclusive Access to Chapters 1-4

88% of Organizations Are Increasing Their GenAI Investment for 2025
This finding, from a 2024 Dataiku survey of 400 data, analytics, and IT leaders, highlights that GenAI is a top priority at the highest level: 66% of respondents have invested over $1 million in GenAI in the last 12 months, and 88% are increasing their GenAI investment in 2025 compared with 2024.
IT leaders: With an LLM Mesh approach, make sure you can execute fast, at the right cost, and with an eye to the future.
Why an LLM Mesh?
The Dataiku LLM Mesh is the most comprehensive and agnostic LLM gateway offering on the market, partnering with the top generative AI players for secure access to thousands of LLMs (both as a service and self-managed).

The Importance of an LLM Mesh Architecture
The Key to High-Performing Agentic Applications
What Exactly Is an LLM Mesh?
An LLM Mesh is an architecture paradigm for building agentic applications in the enterprise.
An LLM Mesh should enable you to do three things (illustrated in the sketch after this list):
1. Access various LLM-related services through an abstraction layer.
2. Provide federated services for control and analysis.
3. Provide central discovery and documentation for LLM-related objects via a catalog.
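To make these three capabilities concrete, here is a minimal, hypothetical sketch in Python. The class and provider names (LLMConnection, HostedModelConnection, LLMCatalog, MeshGateway, "provider-a", "model-x") are illustrative assumptions for this outline, not the Dataiku LLM Mesh API.

```python
# A minimal, illustrative sketch of the three LLM Mesh capabilities above.
# All names here are hypothetical; this is a conceptual outline, not a real gateway.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field


class LLMConnection(ABC):
    """Abstraction layer: every model is called through the same interface."""

    @abstractmethod
    def complete(self, prompt: str) -> str:
        ...


class HostedModelConnection(LLMConnection):
    """Stand-in for a connection to a hosted or self-managed model."""

    def __init__(self, provider: str, model: str):
        self.provider, self.model = provider, model

    def complete(self, prompt: str) -> str:
        # In a real gateway this would call the provider's API.
        return f"[{self.provider}/{self.model}] response to: {prompt}"


@dataclass
class LLMCatalog:
    """Central discovery and documentation for LLM-related objects."""
    entries: dict = field(default_factory=dict)

    def register(self, name: str, connection: LLMConnection, doc: str) -> None:
        self.entries[name] = {"connection": connection, "doc": doc}

    def get(self, name: str) -> LLMConnection:
        return self.entries[name]["connection"]


class MeshGateway:
    """Federated services: a central point for control and analysis."""

    def __init__(self, catalog: LLMCatalog):
        self.catalog = catalog
        self.audit_log = []  # one record per call, for control and analysis

    def complete(self, model_name: str, prompt: str) -> str:
        connection = self.catalog.get(model_name)
        response = connection.complete(prompt)
        self.audit_log.append({"model": model_name, "prompt": prompt})
        return response


# Usage: applications ask the mesh for a model by name, never a vendor SDK.
catalog = LLMCatalog()
catalog.register("summarizer", HostedModelConnection("provider-a", "model-x"),
                 doc="General-purpose summarization model")
gateway = MeshGateway(catalog)
print(gateway.complete("summarizer", "Summarize this quarter's results."))
```

Because applications depend only on the catalog name and the gateway interface, swapping the underlying model or provider does not require changes to the applications themselves, which is what keeps the cost of adding an incremental model low.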
When Should You Use an LLM Mesh?
An LLM Mesh becomes valuable when the number of agentic applications being developed in an organization begins to grow, more teams start building and using those applications, and the applications themselves become more complex.
In this context, an LLM Mesh accelerates application development, simplifies maintenance, and helps ensure that applications meet enterprise standards for safety, security, and performance.
Measure & Monitor the Performance of Agentic Applications
Measuring the performance of agentic applications requires a structured approach to monitoring quality, speed, and cost efficiency.
In the just-released Chapter 4, discover how an LLM Mesh provides scalable, consistent performance monitoring for agentic applications across the enterprise.
While agentic applications introduce unique challenges, a well-designed performance framework empowers IT leaders to optimize accuracy, responsiveness, and cost-effectiveness.
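As a rough illustration of that quality, speed, and cost bookkeeping, the sketch below wraps any model call and aggregates latency, estimated token counts, and estimated cost per model. The price table, the 4-characters-per-token heuristic, and the function names are assumptions for demonstration only; they are not figures or APIs from the guide.

```python
# A minimal, illustrative sketch of per-call performance monitoring.
# Prices, token estimates, and names are hypothetical.
import time
from collections import defaultdict

# Hypothetical per-1K-token prices, used only to show the bookkeeping.
PRICE_PER_1K_TOKENS = {"model-x": 0.002, "model-y": 0.010}

metrics = defaultdict(lambda: {"calls": 0, "latency_s": 0.0, "tokens": 0, "cost_usd": 0.0})


def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token. A real gateway would use the tokenizer.
    return max(1, len(text) // 4)


def monitored_call(model: str, prompt: str, call_fn) -> str:
    """Wrap any LLM call to record speed, usage, and estimated cost."""
    start = time.perf_counter()
    response = call_fn(prompt)
    elapsed = time.perf_counter() - start

    tokens = estimate_tokens(prompt) + estimate_tokens(response)
    stats = metrics[model]
    stats["calls"] += 1
    stats["latency_s"] += elapsed
    stats["tokens"] += tokens
    stats["cost_usd"] += tokens / 1000 * PRICE_PER_1K_TOKENS.get(model, 0.0)
    return response


# Usage with a stand-in model function:
monitored_call("model-x", "Summarize Q3 results.", lambda p: "Q3 summary...")
for model, stats in metrics.items():
    avg_latency = stats["latency_s"] / stats["calls"]
    print(model, f"avg latency {avg_latency:.3f}s", f"est. cost ${stats['cost_usd']:.4f}")
```

Speed and cost can be tracked this way at the gateway; measuring quality typically adds evaluation datasets or human and model-based review on top of these operational metrics, which a centralized mesh can standardize across teams.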
“A wise LLM strategy, once you have moved beyond initial prototypes, is to use the right model for the right application. This is easier said than done. LLMs are easy to use out of the box. You don’t even need to know how to code! However, using them in production is a different challenge, and it requires careful consideration of security, safety, performance, and cost.”
— Kurt Muehmel and the Dataiku team
“The power of an LLM Mesh architecture is in reducing the cost of introducing an incremental model into production use in an organization. Once set up properly, an LLM Mesh architecture can drive the cost of using an additional LLM in the enterprise to nearly zero. In such a context, an enterprise is free to develop an optimal AI strategy without compromising on performance, cost, safety, or security.”
— Kurt Muehmel and the Dataiku team
“The LLM Mesh represents a pivotal step in AI. At Dataiku, we’re bridging the gap between the promise and reality of using generative AI in the enterprise. We believe the LLM Mesh provides the structure and control many have sought, paving the way for safer, faster GenAI deployments that deliver real value.”