ISG Software Research Analyst Perspectives

Semantic Layers Provide Business Context for BI and AI Agents

Written by Matt Aslett | Mar 17, 2026 10:00:00 AM

As I previously described, context is everything for generative AI (GenAI) and agentic AI. Enterprises have moved in large numbers to adopt foundation model-based GenAI since its emergence into mainstream consciousness in 2023. As they did so, it quickly became clear that establishing trust in GenAI output would require enterprises to augment the realistic content generated by foundation models with real-world context from enterprise information and data. Analytics and data software providers have responded by adding capabilities such as vector search and retrieval-augmented generation (RAG) to enable enterprises to combine foundation models with enterprise information, as well as support for the Model Context Protocol (MCP) to connect AI agents and models with enterprise data sources. The incorporation of agreed business definitions is another critical element in connecting and combining generated content with trusted context, highlighting the importance of semantic data modeling and semantic layer software.

Semantic models are by no means new to enterprise IT. They have been an important part of the computing landscape since the early days of data processing and analytics software, adding meaning to data and providing the conceptual context for its use. Data modeling defines how information related to entities, facts and relationships is structured, stored and organized in a persistent data store, such as a database. Semantic modeling provides additional conceptual information that reflects the meaning of those entities, facts and relationships, as well as agreed definitions related to business logic. This is critical for providing business context to stored data. For example, the data model can represent the fact that a customer is headquartered in Canada, while the semantic model represents the concept that Canada is a target market for business expansion. It is this context that elevates this specific customer, and its data, to a higher priority.
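To make that distinction concrete, here is a minimal Python sketch; the customer record, the target-market set and the priority rule are invented for this illustration rather than drawn from any particular product. The data model holds the stored fact, while the semantic model carries the business meaning and logic applied to it.

```python
from dataclasses import dataclass

# Data model: how the fact is stored in a persistent data store.
@dataclass
class Customer:
    customer_id: int
    name: str
    hq_country: str  # stored fact, e.g., "Canada"

# Semantic model: the agreed business meaning layered on top.
# "Target market" and the priority rule are business concepts,
# not facts stored in the customer table itself.
TARGET_MARKETS = {"Canada"}

def priority(customer: Customer) -> str:
    """Business logic: customers headquartered in a target market
    are treated as higher priority."""
    return "high" if customer.hq_country in TARGET_MARKETS else "standard"

acme = Customer(customer_id=42, name="Acme Corp", hq_country="Canada")
print(priority(acme))  # -> "high"
```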

Agreed definitions are critical in enabling enterprises to achieve business insight. Even ostensibly simple questions can be fiendishly difficult to answer without agreed definitions. How many Canadian customers does the business have? The answer depends on how the business defines a customer. Without agreed definitions, the answer could vary depending on which region or department is asking the question. Semantic modeling is designed to ensure that everyone across the business understands and agrees on the definitions included in the model, both for raw data generated by enterprise applications and metrics derived from that raw data.
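A simple sketch shows how this plays out in practice, using invented records and rules: two departments applying their own definitions of a Canadian customer arrive at different counts from the same data, while a single governed definition gives everyone the same answer.

```python
# Illustrative records only; the fields and rules are invented.
customers = [
    {"name": "Acme",  "hq_country": "Canada", "active": True,  "trial": False},
    {"name": "Brio",  "hq_country": "Canada", "active": False, "trial": False},
    {"name": "Cardo", "hq_country": "Canada", "active": True,  "trial": True},
]

# Sales counts anyone with a Canadian headquarters...
sales_count = sum(1 for c in customers if c["hq_country"] == "Canada")

# ...while finance counts only active, paying accounts.
finance_count = sum(
    1 for c in customers
    if c["hq_country"] == "Canada" and c["active"] and not c["trial"]
)

print(sales_count, finance_count)  # 3 vs. 1: same data, different answers

# The agreed definition every tool and team reuses; the point is the
# governance, not the code: the definition lives in one place.
def canadian_customers(rows):
    """Agreed definition: active, non-trial accounts with a Canadian HQ."""
    return [r for r in rows
            if r["hq_country"] == "Canada" and r["active"] and not r["trial"]]

print(len(canadian_customers(customers)))  # -> 1
```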

Semantic modeling has always been important but was for many years a niche activity. Only 9% of participants in ISG’s Data and AI Market Lens Study named semantic modeling as one of their five largest funded data activities in 2025. The market was long served by a select group of software providers, including business intelligence (BI) software providers that used semantic modeling to facilitate the generation of BI dashboards, as well as a handful of dedicated providers of semantic layer software designed to work with multiple BI tools. Semantic modeling has become the focus of increased attention in recent years as a critical element of several key trends that are shaping the future of enterprise computing.

I have previously mentioned that generating agreed definitions through semantic modeling is also essential to the creation of knowledge graphs, which facilitate search-based data discovery by identifying, classifying and maintaining a map of entities, attributes and relationships. Clear data definitions are also important in the context of headless BI and data products. As more business users begin to interact with and analyze enterprise data through natural language interfaces, there is a greater need for agreement on data definitions. The ability to answer an informal natural language question, such as “Has the Canada campaign been successful?”, relies on semantic modeling to identify not only the number of Canadian customers, but also the concepts, metrics and key performance indicators involved in determining growth and success.
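As a loose illustration, assuming a hypothetical semantic model with invented metric names and thresholds, a natural language interface might resolve that question against agreed KPI definitions rather than improvising its own notion of success:

```python
# Hypothetical semantic model: informal terms map to governed metrics
# and KPIs, so "successful" has one agreed meaning across tools.
SEMANTIC_MODEL = {
    "metrics": {
        "canadian_customers": {
            "definition": "active, non-trial accounts with a Canadian HQ"},
        "canada_customer_growth_pct": {
            "definition": "quarter-over-quarter customer growth"},
    },
    # Agreed business logic: what "successful" means for this campaign.
    "kpis": {
        "canada_campaign_success": {
            "metric": "canada_customer_growth_pct",
            "threshold": 10.0,  # success = more than 10% QoQ growth
        }
    },
}

def answer(question: str, metric_values: dict) -> str:
    """Resolve an informal question against the agreed KPI definition."""
    kpi = SEMANTIC_MODEL["kpis"]["canada_campaign_success"]
    value = metric_values[kpi["metric"]]
    verdict = "successful" if value > kpi["threshold"] else "not yet successful"
    return f"Growth is {value:.1f}% vs. a {kpi['threshold']:.0f}% target: {verdict}."

print(answer("Has the Canada campaign been successful?",
             {"canada_customer_growth_pct": 14.2}))
```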

This example also highlights the importance of semantic models in facilitating the use of data for AI initiatives, as well as enabling agent-based automation. AI agents are autonomous software entities designed to perceive their environment, make context-aware decisions and take actions based on perception and reasoning. Access to semantic models is required to provide a foundation of business context that can be understood and acted upon by an autonomous agent. I assert that through 2027, analytics and data providers will prioritize the development of semantic modeling capabilities to facilitate the execution of autonomous AI agents.

The importance of semantic data modeling to each of these trends—as well as their confluence—is driving a rapid expansion in the number of data and analytics software providers adding semantic modeling and semantic layer functionality to their products. While this would appear to be a boon for adoption of semantic modeling in general, it also poses a dilemma for many enterprises, which now have multiple options for where, when and how to create semantic models—including multiple BI tools, data platforms, data catalogs, data transformation tools, AI platforms and specialist semantic layers.

The availability of so many different options for semantic modeling increases the risk that an enterprise may end up with multiple groups across the organization using multiple tools to create multiple semantic models, thereby undermining the goal of creating agreed definitions. The disparate nature of the market, combined with the complexity of semantic modeling, perhaps explains why more than one-third (36%) of participants in ISG’s Data and AI Market Lens Study rated their semantic modeling initiatives as performing below expectations. In an attempt to overcome this problem, multiple providers, led by Snowflake, announced the creation of the Open Semantic Interchange (OSI) in September 2025. The OSI is an attempt to create a vendor-neutral semantic model specification that standardizes semantic metadata to facilitate sharing between multiple products and providers.
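As a purely hypothetical sketch of the interchange idea, and emphatically not the actual OSI specification format, the following Python fragment serializes one metric definition to a neutral form that any consuming tool could rehydrate:

```python
import json

# Invented, tool-agnostic payload for illustration only; the real
# vendor-neutral format is defined by the OSI specification itself.
def export_metric(name: str, expression: str, description: str) -> str:
    """Serialize one agreed metric definition to a neutral document."""
    return json.dumps({
        "kind": "metric",
        "name": name,
        "expression": expression,
        "description": description,
    })

def import_metric(payload: str) -> dict:
    """A consuming tool rehydrates the same agreed definition."""
    return json.loads(payload)

doc = export_metric(
    "canadian_customers",
    "COUNT(*) FILTER (WHERE hq_country = 'Canada' AND active)",
    "Active accounts headquartered in Canada",
)
print(import_metric(doc)["name"])  # -> canadian_customers
```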

Although the OSI is in the early stages of development, it has a worthy goal, and in January 2026, it announced the availability of the first version of its specification, as well as several new participants. The full group of providers participating in the OSI now includes Alation, Atlan, AtScale, BlackRock, Blue Yonder, Coalesce, Collate, Collibra, Credible, Cube, Databricks, DataHub, dbt Labs, Domo, Elementum AI, Firebolt, Hex, Honeydew, Informatica, JetBrains, Lightdash, Mistral AI, Omni, Preset, Qlik, RelationalAI, Salesforce, Select Star, Sigma, Snowflake, Starburst Data, Strategy and ThoughtSpot.

We are aware of several other providers that are watching the progress of the OSI with interest. As such, it appears to have the potential to become a true industry standard. I recommend that enterprises investing in analytics and data initiatives and AI agents continue to monitor the progress of the OSI, understand the benefits of semantic modeling and evaluate providers in relation to their support for semantic modeling and semantic layer capabilities.

Regards,

Matt Aslett