ISG Research is happy to share insights gleaned from our latest Buyers Guide, an assessment of how well software providers’ offerings meet buyers’ requirements. The DataOps: ISG Research Buyers Guide is the distillation of a year of market and product research by ISG Research.
As enterprises embracing artificial intelligence move from initial pilots and trial projects through deployment and into production at scale, many are realizing the critical importance of agile and responsive data processes. These processes are often combined with tools and platforms that facilitate data management to improve trust in the data used for AI and analytics.
This has led to increased attention on the role of data operations, which ISG Research defines as the application of agile development, DevOps and lean manufacturing by data engineering professionals in support of data production. It encompasses the development, testing, deployment and orchestration of data integration and processing pipelines, along with improved data quality and validity via data monitoring and observability.
DataOps has been part of the lexicon of the data market for almost a decade. It takes inspiration from DevOps, which describes a set of tools, practices and a philosophy used to support the continuous delivery of software applications in the face of constant change.
Interest in DataOps is growing. ISG Research asserts that through 2026, more than one-half of enterprises will adopt agile and collaborative DataOps practices to facilitate responsiveness, avoid repetitive tasks and deliver measurable data reliability improvements. A variety of products, practices and processes enable DataOps, including products that support agile and continuous delivery of data analytics and AI, as well as continuous, measurable improvement.
An emphasis on agility, collaboration and automation separates DataOps from traditional approaches to data management, which typically included batch-based, manual and rigid tools and practices. However, this distinction between DataOps and traditional data management tools is clearer in theory than in practice. Providers of traditional data management have, in recent years, incorporated capabilities that make products more automated, collaborative and agile. There is no industry-wide consensus on the level of agility, collaboration and automation that must be provided for products to be considered part of the DataOps category.
This has led to some traditional data management providers adopting a broader definition of DataOps that describes the combination of people, processes and technology needed to automate the delivery of data to users in an enterprise and enable collaboration to facilitate data-driven decisions. This definition is broad enough that it could be interpreted to encompass all products and services that address data management and data governance, including many traditional batch-based, manual products that do not support agile and continuous delivery and continuous, measurable improvement.
A narrower definition of DataOps focuses on the practical application of agile development, DevOps and lean manufacturing to the tasks and skills employed by data engineering professionals in support of data analytics development and operations. This definition emphasizes specific capabilities such as continuous delivery of analytic insight, process simplification, code generation, automation to avoid repeated errors and reduce repetitive tasks, the incorporation of stakeholder feedback, and measurable improvement in the efficient generation of insight from data. As such, the narrow definition of DataOps provides a set of criteria for agile and collaborative practices against which products and services can be measured.
ISG Research’s perspective, based on our interaction with the software provider and user communities, aligns with the narrow definition. While traditional data management and data governance are complementary, our DataOps coverage focuses specifically on the delivery of agile business intelligence and AI through the automation and orchestration of data processing pipelines, incorporating improved data reliability and integrity via data monitoring and observability.
To be more specific, we believe that DataOps products and services provide the following functionality: agile and collaborative data operations; the development, testing and deployment of data and analytics pipelines; data orchestration; data observability; and the delivery of data products. These are the key criteria we used to assess DataOps products and services as part of this Buyers Guide.
This research comprises parallel evaluations of products addressing each of the four core areas of functionality: data pipelines, data orchestration, data observability and data products. Additionally, we evaluated all products in all categories in relation to their support for agile and collaborative practices.
The development, testing and deployment of data pipelines is a fundamental accelerator of data-driven strategies, enabling enterprises to extract data generated by operational applications used to run the business and transport it into the analytic data platforms used to analyze operations. ISG Research defines data pipelines as the systems used to transport, process and deliver data produced by operational data platforms and applications into analytic data platforms and applications for consumption. Healthy data pipelines are necessary to ensure data is ingested, processed and loaded in the sequence required to generate BI and AI.
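The extract-process-load flow described above can be sketched in a few lines. This is a minimal, illustrative example only; the function names, record shapes and in-memory stores are assumptions for the sketch, not any provider's API.

```python
# Minimal sketch of a data pipeline: extract records from an operational
# source, transform them for analytics, and load them into an analytic store.

def extract(source):
    """Pull raw records from an operational system (here, an in-memory list)."""
    return list(source)

def transform(records):
    """Normalize records for analytics: rename keys and drop invalid rows."""
    return [
        {"customer_id": r["id"], "amount_usd": round(r["amount"], 2)}
        for r in records
        if r.get("amount") is not None
    ]

def load(records, target):
    """Append processed records to the analytic data platform (a list stand-in)."""
    target.extend(records)
    return len(records)

# Run the pipeline end to end.
operational_rows = [
    {"id": 1, "amount": 19.994},
    {"id": 2, "amount": None},   # invalid row, dropped in transform
    {"id": 3, "amount": 5.5},
]
analytic_store = []
loaded = load(transform(extract(operational_rows)), analytic_store)
```

A "healthy" pipeline in the sense used above is one in which each of these stages runs in the required sequence and invalid records are handled deliberately rather than silently corrupting downstream BI and AI.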
Given the increasing complexity of evolving data sources and requirements, it is essential to automate and coordinate the creation, scheduling and monitoring of data pipelines as part of a DataOps approach to data management. This is the realm of data orchestration, which ISG Research defines as the capabilities to automate and accelerate the flow of data in support of operational and analytics initiatives, driving business value via the monitoring and management of data pipelines and associated workflows. ISG Research asserts that by 2027, more than one-half of enterprises will adopt data orchestration technologies to automate and coordinate data workflows, increasing efficiency and agility in data and analytics projects.
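At its core, orchestration means running pipeline steps in dependency order. The sketch below shows the idea using the Python standard library's topological sort; the step names and dependency graph are hypothetical, and real orchestration tools add scheduling, retries and monitoring on top of this ordering.

```python
from graphlib import TopologicalSorter

# Declare pipeline steps and their dependencies: each step runs only after
# every step it depends on has completed. Names are illustrative.
dependencies = {
    "extract_orders": set(),
    "extract_customers": set(),
    "join_datasets": {"extract_orders", "extract_customers"},
    "load_warehouse": {"join_datasets"},
    "refresh_dashboard": {"load_warehouse"},
}

# Compute a valid execution order for the workflow.
run_order = list(TopologicalSorter(dependencies).static_order())
```

An orchestrator would walk `run_order` (or run independent steps in parallel), record each step's outcome, and alert on failures, which is the "monitoring and management" half of the definition above.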
Maintaining data quality and trust is a perennial data management challenge, often preventing enterprises from operating at the speed of business. In addition to automating and coordinating the creation, scheduling and monitoring of data pipelines via data orchestration, it is also critical to monitor the quality and reliability of the data flowing through those data pipelines. This is achieved using data observability, which ISG Research defines as providing the capabilities for monitoring the quality and reliability of data used for analytics and governance projects as well as the reliability and health of the overall data environment. The metrics generated by data observability also form a critical component of the development and sharing of data products, providing the information by which data consumers can gauge if a data product meets their requirements in terms of a variety of attributes including validity, uniqueness, timeliness, consistency, completeness and accuracy.
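A few of the quality attributes listed above can be computed directly over a batch of records. The following is a simplified sketch; the field names, sample data and metrics are assumptions for illustration, and production observability tools compute such metrics continuously across pipelines.

```python
# Illustrative data observability checks: measure completeness, uniqueness
# and validity for a small batch of records flowing through a pipeline.

records = [
    {"order_id": "A1", "email": "a@example.com"},
    {"order_id": "A2", "email": None},             # incomplete record
    {"order_id": "A2", "email": "c@example.com"},  # duplicate key
]

total = len(records)

# Completeness: share of records with a non-null email.
completeness = sum(1 for r in records if r["email"] is not None) / total

# Uniqueness: share of distinct order_id values.
uniqueness = len({r["order_id"] for r in records}) / total

# Validity: share of records whose email passes a basic format check.
validity = sum(1 for r in records if r["email"] and "@" in r["email"]) / total

metrics = {
    "completeness": round(completeness, 2),
    "uniqueness": round(uniqueness, 2),
    "validity": round(validity, 2),
}
```

Published alongside a data product, metrics like these are what let consumers gauge whether the product meets their requirements before relying on it.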
ISG Research defines data products as the outcome of data initiatives developed with product thinking and delivered as reusable assets that can be discovered and consumed by others on a self-service basis, along with associated data contracts and feedback options. Key capabilities for platforms that enable the development of data products include a dedicated interface for the development and classification of data products and data contracts as well as a dedicated interface for the self-service discovery and consumption of data products and data contracts. Data product platforms should also include the ability for consumers of data products to provide feedback, comments and ratings as well as request improvements or new products, and the ability for data owners to monitor data product usage and performance metrics and view and manage requests for data product modifications and the development of new data products.
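The capabilities above (a data contract, self-service consumption, consumer feedback and owner-facing usage metrics) can be modeled minimally as follows. All class and field names here are assumptions for the sketch, not a standard or any platform's data model.

```python
from dataclasses import dataclass, field

@dataclass
class DataContract:
    schema: dict            # promised fields and their types
    freshness_hours: int    # maximum age before the data is considered stale

@dataclass
class DataProduct:
    name: str
    owner: str
    contract: DataContract
    ratings: list = field(default_factory=list)
    access_count: int = 0

    def consume(self):
        """Self-service consumption; usage is tracked for the data owner."""
        self.access_count += 1
        return self.contract.schema

    def rate(self, stars):
        """Consumers provide feedback the data owner can monitor."""
        self.ratings.append(stars)

    def average_rating(self):
        return sum(self.ratings) / len(self.ratings) if self.ratings else None

product = DataProduct(
    name="daily_orders",
    owner="sales-data-team",
    contract=DataContract(schema={"order_id": "str", "amount": "float"},
                          freshness_hours=24),
)
product.consume()
product.rate(4)
product.rate(5)
```

A data product platform wraps this kind of model in the dedicated development and discovery interfaces described above, plus request workflows for modifications and new products.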
As always, however, software products are only one aspect of delivering on the promise of DataOps. New approaches to people, processes and information are also required to deliver agile and collaborative development, testing and deployment of data and analytics workloads, as well as data operations. To improve the value generated from analytics and data initiatives, enterprises need to adopt processes and methodologies that support rapid innovation and experimentation, automation, collaboration, measurement and monitoring, and high data quality.
The ISG Buyers Guide™ for DataOps evaluates software providers and products to address data pipeline development, testing and deployment, data pipeline orchestration, data pipeline observability, and data products. Providers with products that address at least two elements—data pipelines, data orchestration or data observability—were deemed to provide a superset of functionality to address DataOps overall. This approach is designed to maintain consistency with the 2023 DataOps Buyers Guide and reflects the relative immaturity of data product platforms.
This research evaluates the following software providers that offer products to address key elements of DataOps as we define it: Alteryx, AWS, Astronomer, BMC, Dagster Labs, Databricks, DataKitchen, DataOps.live, dbt Labs, Google, Hitachi, IBM, Informatica, Infoworks, K2View, Keboola, Matillion, Microsoft, Nexla, Prefect, Qlik, Rivery, SAP, Y42 and Zoho.
This research-based index evaluates the full business and information technology value of DataOps software offerings. We encourage you to learn more about our Buyers Guide and its effectiveness as a provider selection and RFI/RFP tool.
We urge organizations to do a thorough job of evaluating DataOps offerings, using this Buyers Guide both as the results of our in-depth analysis of these software providers and as an evaluation methodology. The Buyers Guide can be used to evaluate existing suppliers and provides evaluation criteria for new projects. Using it can shorten the cycle time for an RFP and the definition of an RFI.
The Buyers Guide for DataOps in 2024 finds Informatica first on the list, followed by Microsoft and IBM.
Software providers that rated in the top three of any category, including the product and customer experience dimensions, earn the designation of Leader.
The Leaders in Product Experience are:
- Informatica.
- Microsoft.
- IBM.
The Leaders in Customer Experience are:
- Databricks.
- Microsoft.
- SAP.
The Leaders across any of the seven categories are:
- Informatica, which has achieved this rating in five of the seven categories.
- Databricks and Microsoft in three categories.
- Google and SAP in two categories.
- Alteryx, AWS, DataOps.live, IBM, Keboola and Qlik in one category.

The overall performance chart provides a visual representation of how providers rate across product and customer experience. Software providers with products scoring higher in a weighted rating of the five product experience categories place farther to the right. The combination of ratings for the two customer experience categories determines their placement on the vertical axis. As a result, providers that place closer to the upper-right are “exemplary” and rated higher than those closer to the lower-left and identified as providers of “merit.” Software providers that excelled at customer experience over product experience have an “assurance” rating, and those excelling instead in product experience have an “innovative” rating.
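The four placement labels described above amount to a quadrant classification on the two axes. The sketch below illustrates the logic; the midpoint, scale and example scores are hypothetical, and the guide's actual axis weightings are not reproduced here.

```python
# Illustrative quadrant placement: providers above a midpoint on both the
# weighted product-experience and customer-experience axes are "Exemplary",
# below both are "Merit", and the off-diagonal cases are "Assurance"
# (customer experience leads) or "Innovative" (product experience leads).

MIDPOINT = 50.0  # assumed cutoff on an assumed 0-100 scale

def classify(product_score, customer_score, midpoint=MIDPOINT):
    if product_score >= midpoint and customer_score >= midpoint:
        return "Exemplary"
    if product_score < midpoint and customer_score < midpoint:
        return "Merit"
    if customer_score >= midpoint:
        return "Assurance"   # stronger customer experience
    return "Innovative"      # stronger product experience

label = classify(72.0, 81.0)
```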
Note that close provider scores should not be taken to imply that the packages evaluated are functionally identical or equally well-suited for use by every enterprise or process. Although there is a high degree of commonality in how organizations handle DataOps, there are many idiosyncrasies and differences that can make one provider’s offering a better fit than another.
ISG Research has made every effort to encompass in this Buyers Guide the overall product and customer experience from our DataOps blueprint, which we believe reflects what a well-crafted RFP should contain. Even so, there may be additional areas that affect which software provider and products best fit an enterprise’s particular requirements. Therefore, while this research is complete as it stands, utilizing it in your own organizational context is critical to ensure that products deliver the highest level of support for your projects.
You can find more details on our community as well as on our expertise in the research for this Buyers Guide.