Executive Summary
Data Integration
Data integration is a fundamental enabler of a data management strategy. Analysis of individual data sources—such as customer or product data—can provide insights to improve operational efficiency. However, the combination of data from multiple sources enables enterprises to innovate, improving customer experience and revenue generation, for example, by targeting the most lucrative customers with offers to adopt the latest product.
ISG Research defines data integration as a set of processes and technologies that enable enterprises to extract, combine, transform and process data from multiple internal and external data platforms and applications to maximize the value of analytic and operational use. Without data integration, business data would be trapped in the applications and systems in which it was generated.
Traditional approaches to data management are rooted in point-to-point batch data processing, whereby data is extracted from its source, transformed for a specific purpose and loaded into a target environment for analysis. The transformation could include the normalization, cleansing and aggregation of data. More than two-thirds (69%) of enterprises cite preparing data for analysis as the most time-consuming aspect of the analytics process. Reducing the time and effort spent on data integration and preparation can significantly accelerate time to business insight.
Although point-to-point data integration continues to serve tactical data integration use cases, it is unsuitable for more strategic enterprise-wide data integration initiatives. These require the orchestration of a complex mesh of agile data pipelines that traverse multiple data-processing locations and can evolve in response to changing data sources and business requirements. We have seen increased focus in recent years on the concept of data fabric, which represents a technology-driven approach to managing and governing data across distributed environments. We assert that through 2027, three-quarters of enterprises will adopt data fabric technologies to facilitate the management and processing of data across multiple data platforms and cloud environments.
Combining data from multiple data sources has clear potential benefits but is not without its risks and challenges. Although 38% of participants in ISG’s 2025 Market Lens Data and AI Program Study agreed that the cost of harmonizing data management across the entire organization outweighs the likely benefits, a similar proportion (37%) disagreed. Success with data integration initiatives is therefore not guaranteed and will depend on the approach and tools adopted.
Traditional batch extract, transform and load (ETL) integration products extract data from a source and transform it in a dedicated staging area before loading it into a target environment (typically a data warehouse or data lake) for analysis. The dedicated ETL staging layers were important to avoid placing an undue transformation processing burden on the target data platform, ensuring that sufficient processing power was available to perform the necessary analytic queries.
Since they are designed for a specific data transformation task, ETL pipelines are often highly efficient. However, they are also rigid, difficult to adapt and ill-suited to continuous and agile processes. As data and business requirements change, ETL pipelines must be rewritten accordingly. The need for greater agility and flexibility to meet the demands of real-time data processing is one reason we have seen increased interest in extract, load and transform data pipelines.
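To make the traditional flow concrete, the ETL pattern described above can be sketched in a few lines of Python. The table names and values here are hypothetical, and in-memory SQLite stands in for the source system and the target warehouse; a real pipeline would connect to separate platforms.

```python
import sqlite3

# Hypothetical source system containing raw, unnormalized order data.
source = sqlite3.connect(":memory:")
source.execute("CREATE TABLE orders (id INTEGER, amount REAL, region TEXT)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 100.0, "emea"), (2, 50.0, " EMEA "), (3, 75.0, None)])

# Hypothetical target data warehouse.
target = sqlite3.connect(":memory:")
target.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")

# Extract: pull raw rows from the source.
rows = source.execute("SELECT id, amount, region FROM orders").fetchall()

# Transform (in a dedicated staging step): normalize, cleanse and aggregate.
totals = {}
for _id, amount, region in rows:
    key = (region or "UNKNOWN").strip().upper()  # cleansing and normalization
    totals[key] = totals.get(key, 0.0) + amount  # aggregation

# Load: only the prepared aggregates reach the target platform.
target.executemany("INSERT INTO sales_by_region VALUES (?, ?)", totals.items())
```

Note how the transformation logic is hard-coded for one specific output table: if the source schema or the analytic requirement changes, the transform step must be rewritten, which illustrates the rigidity discussed above.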
Extract, load and transform (ELT) pipelines use a more lightweight staging tier that simply extracts data from the source and loads it into the target data platform. Rather than performing a separate transformation stage prior to loading, ELT pipelines make use of pushdown optimization, leveraging the data-processing functionality and processing power of the target data platform to transform the data. Pushing transformation execution to the target data platform results in a more agile extraction and loading phase that is more adaptable to changing data sources. This approach is well aligned with the schema-on-read model applied in data lakehouse environments, as opposed to the schema-on-write approach, in which a schema is applied to data as it is loaded into a data warehouse.
Since the data is not transformed before being loaded into the target data platform, data sources can change and evolve without delaying data loading. This potentially enables data analysts to transform data to meet their requirements rather than have dedicated data integration professionals perform the task. As such, many ELT offerings are positioned for use by data analysts and developers rather than IT professionals. This can also reduce delays in deploying business intelligence projects by avoiding the need to wait for data transformation specialists to (re)configure pipelines in response to evolving business intelligence requirements and new data sources.
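A minimal ELT sketch makes the contrast with ETL clear: raw records are loaded first, and the transformation is pushed down to the target platform as SQL. The data is hypothetical, and in-memory SQLite stands in for a cloud data warehouse or lakehouse.

```python
import sqlite3

# SQLite stands in for the target data platform (hypothetical data).
target = sqlite3.connect(":memory:")

# Extract and load: raw records land in the target unchanged, so new or
# changed source fields do not delay loading.
target.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, region TEXT)")
target.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                   [(1, 100.0, "emea"), (2, 50.0, " EMEA "), (3, 75.0, None)])

# Transform: pushed down to the target platform as SQL. An analyst can add or
# change views like this without reconfiguring the extraction and loading step.
target.execute("""
    CREATE VIEW sales_by_region AS
    SELECT UPPER(TRIM(COALESCE(region, 'UNKNOWN'))) AS region,
           SUM(amount) AS total
    FROM raw_orders
    GROUP BY UPPER(TRIM(COALESCE(region, 'UNKNOWN')))
""")
```

Because the transformation is just a query over the raw data, it can be revised on demand by the analysts consuming it, which is the agility benefit described above.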
Whereas once there was considerable debate between software providers as to the relative merits of ETL versus ELT, today many providers offer both approaches and recognize that multiple factors influence which approach is more suitable for any individual integration scenario. Like ETL pipelines, ELT pipelines may also be batch processes. Both can be accelerated using change data capture techniques. Change data capture (CDC) is not new but has come into greater focus given the increasing need for real-time data processing. As the name suggests, CDC is the process of capturing data changes. Specifically, in the context of data pipelines, CDC identifies and tracks changes to tables in the source database as data is inserted, updated or deleted. CDC reduces complexity and increases agility by synchronizing only the changed data rather than the entire dataset. The data changes can be synchronized incrementally or in a continuous stream.
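The mechanics can be illustrated with a small sketch in Python: SQLite triggers play the role of the transaction-log reader a real CDC tool would use, and only the captured changes are applied to the replica. Table and column names are hypothetical.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, email TEXT);
    -- Change log standing in for the transaction log a CDC tool would read.
    CREATE TABLE change_log (op TEXT, id INTEGER, email TEXT);
    CREATE TRIGGER cdc_ins AFTER INSERT ON customers BEGIN
        INSERT INTO change_log VALUES ('I', NEW.id, NEW.email);
    END;
    CREATE TRIGGER cdc_upd AFTER UPDATE ON customers BEGIN
        INSERT INTO change_log VALUES ('U', NEW.id, NEW.email);
    END;
    CREATE TRIGGER cdc_del AFTER DELETE ON customers BEGIN
        INSERT INTO change_log VALUES ('D', OLD.id, OLD.email);
    END;
""")

# Source-side activity: an insert followed by an update.
db.execute("INSERT INTO customers VALUES (1, 'a@example.com')")
db.execute("UPDATE customers SET email = 'b@example.com' WHERE id = 1")

# The replica applies only the captured changes, not a full table copy.
replica = {}
for op, cid, email in db.execute("SELECT op, id, email FROM change_log"):
    if op == "D":
        replica.pop(cid, None)
    else:
        replica[cid] = email
```

The replication loop here runs once for simplicity; a streaming CDC pipeline would consume the change log continuously.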
More recently, some providers have adopted the term zero-ETL for offerings that automatically replicate data from the source application and make it immediately available for analysis in the target analytic database. The term, along with some of the marketing around it, implies that users can do away with the extraction, transformation and loading of data entirely. That might sound too good to be true, and in many cases it will be.
The promise of removing data transformation entirely can only be met if all the data required for an analytics project is generated by a single source. Many analytics projects rely on combining data from multiple applications, in which case the data must still be transformed after loading to integrate and prepare it for analysis. Even if all the data is generated by a single application, the theory that data does not need to be transformed relies on the assumption that the schema is strictly enforced when the data is generated. If not, enterprises are likely to need declarative transformations to cleanse and normalize the data to meet longer-term analytics or data governance requirements. As such, zero-ETL can arguably be seen as a form of ELT that automates extraction and loading and has the potential to remove the need for transformation in some use cases.
Our Data Integration Buyers Guide provides a holistic view of a software provider’s ability to deliver the combination of functionality to provide comprehensive data integration with either a single product or a suite of products. As such, the Data Integration Buyers Guide includes the full breadth of data integration functionality, including connectivity, integration development and integration management. Our assessment also considered whether the functionality in question was available from a software provider in a single offering or as a suite of products or cloud services.
This Data Integration Buyers Guide evaluates products based on whether the data integration platform enables the integration of real-time data in motion in addition to data at rest, as well as the use of artificial intelligence to automate and enhance data integration, and the availability and depth of functionality to enable enterprises to integrate data with business partners and other external entities. To be included in this Buyers Guide, products must offer data pipeline development, deployment and management.
This research evaluates the following software providers offering products to address key elements of data integration as we define it: Alibaba Cloud, Alteryx, AWS, Boomi, CData Software, Cloud Software Group, Confluent, Databricks, Denodo, Domo, Fivetran, Google Cloud, Huawei Cloud, IBM, Informatica, Jitterbit, Microsoft, Oracle, Pentaho, Precisely, Qlik, Reltio, Rocket Software, Salesforce, SAP, SAS Institute, Snowflake, Software AG, Solace, Syniti, Tencent Cloud and Workato.
Buyers Guide Overview
ISG Research has designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of business requirements in any enterprise.
For over two decades, ISG Research has conducted market research across a spectrum of business applications, tools and technologies. We have designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of the business requirements in any enterprise. Our research methodology and decades of experience make the Buyers Guide an effective method for assessing and selecting software providers and products. The findings of this research contribute to our comprehensive approach to rating software providers in a manner based on the assessments an enterprise would complete.
The ISG Buyers Guide™ for Data Integration is the distillation of over a year of market and product research efforts. It is an assessment of how well software providers’ offerings address enterprises’ requirements for data integration software. The index is structured to support a request for information (RFI) that could be used in the request for proposal (RFP) process by incorporating all criteria needed to evaluate, select, utilize and maintain relationships with software providers. An effective product and customer experience with a provider can ensure the best long-term relationship and value achieved from a resource and financial investment.
In this Buyers Guide, ISG Research evaluates the software in seven key categories that are weighted to reflect buyers’ needs based on our expertise and research. Five are product-experience related: Adaptability, Capability, Manageability, Reliability and Usability. In addition, we consider two customer-experience categories: Validation and Total Cost of Ownership/Return on Investment (TCO/ROI). To assess functionality, one of the components of Capability, we applied the ISG Research Value Index methodology and blueprint, which links the personas and processes for data integration to an enterprise’s requirements.
The structure of the research reflects our understanding that the effective evaluation of software providers and products involves far more than just examining product features, potential revenue or customers generated from a provider’s marketing and sales efforts. We believe it is important to take a comprehensive, research-based approach, since making the wrong choice of data integration technology can raise the total cost of ownership, lower the return on investment and hamper an enterprise’s ability to reach its full performance potential. In addition, this approach can reduce the project’s development and deployment time and eliminate the risk of relying on a short list of software providers that does not represent a best fit for your enterprise.
ISG Research believes that an objective review of software providers and products is a critical business strategy for the adoption and implementation of data integration software and applications. An enterprise’s review should include a thorough analysis of both what is possible and what is relevant. We urge enterprises to do a thorough job of evaluating data integration systems and tools and offer this Buyers Guide as both the results of our in-depth analysis of these providers and as an evaluation methodology.
Key Takeaways
Data integration is becoming a critical enabler of modern data strategies, moving beyond batch ETL to support agile, real-time pipelines across diverse platforms and cloud environments. Enterprises must address the complexity of harmonizing fragmented data sources while ensuring flexibility through ELT, CDC and emerging zero-ETL approaches. Leading platforms provide end-to-end connectivity, orchestration and management, unifying data in motion and at rest to accelerate analytics and AI initiatives. Data integration is central to delivering trusted, timely and accessible data for innovation and operational efficiency.
Software Provider Summary
The research identifies Oracle, Informatica and Boomi as the market leaders, with strengths across multiple categories, while providers such as CData Software, Domo and Pentaho demonstrated targeted capabilities. Classification placed Oracle, Informatica and Boomi in the Exemplary quadrant alongside providers including AWS, IBM, Microsoft, Salesforce, Google Cloud and SAP. Providers such as CData Software, Qlik and Rocket Software were categorized as Innovative; Denodo, Precisely and Solace as Assurance; and Alibaba Cloud, Cloud Software Group, Confluent, Fivetran, Huawei Cloud, Jitterbit, Reltio, SAS Institute, Snowflake, Software AG, Syniti, Tencent Cloud and Workato in the Merit quadrant.
Product Experience Insights
Product Experience accounted for 80% of the overall rating, with emphasis on capability, usability, reliability, adaptability and manageability. Informatica, Oracle and Microsoft led in delivering breadth and depth across integration, governance and adaptability, while Boomi, Domo and Pentaho demonstrated strong capability but less overall balance. Leaders distinguished themselves with platforms that support enterprise-wide integration.
Customer Experience Value
Customer Experience represented 20% of the evaluation, focused on validation and TCO/ROI. Databricks, Oracle and Informatica led in this category by demonstrating strong customer commitment, transparent ROI frameworks and consistent lifecycle support. Alteryx and Domo also performed well, though short of leadership. Lower-performing providers often lacked sufficient clarity in CX, making it harder for enterprises to justify long-term investments.
Strategic Recommendations
Enterprises should treat data integration platform selection as a strategic decision that balances foundational functions such as adaptability, capability and manageability with expanded AI-driven features for real-time integration and automation. Buyers should prioritize platforms that ensure interoperability, simplify administration and deliver measurable ROI through transparent TCO frameworks. Using the ISG Buyers Guide as a structured framework enables enterprises to evaluate providers against both product and customer experience, ensuring investments improve data integration outcomes and align with evolving enterprise data strategies.
How To Use This Buyers Guide
Evaluating Software Providers: The Process
We recommend using the Buyers Guide to assess and evaluate new or existing software providers for your enterprise. The market research can be used as an evaluation framework to establish a formal request for information from providers on products and customer experience and will shorten the cycle time when creating an RFI. The steps listed below provide a process that can facilitate the best possible outcomes.
- Define the business case and goals. Define the mission and business case for investment and the expected outcomes from your organizational and technology efforts.
- Specify the business needs. Defining the business requirements helps identify what specific capabilities are required with respect to people, processes, information and technology.
- Assess the required roles and responsibilities. Identify the individuals required for success at every level of the organization, from executives to front-line workers, and determine the needs of each.
- Outline the project’s critical path. What needs to be done, in what order and who will do it? This outline should make clear the prior dependencies at each step of the project plan.
- Ascertain the technology approach. Determine the business and technology approach that most closely aligns to your organization’s requirements.
- Establish technology vendor evaluation criteria. Utilize the product experience categories (Adaptability, Capability, Manageability, Reliability and Usability) and the customer experience categories (TCO/ROI and Validation).
- Evaluate and select the technology properly. Weight the categories in the technology evaluation criteria to reflect your organization’s priorities to determine the short list of vendors and products.
- Establish the business initiative team to start the project. Identify who will lead the project and the members of the team needed to plan and execute it with timelines, priorities and resources.
The Findings
All of the products we evaluated are feature-rich, but not all the capabilities offered by a software provider are equally valuable to all types of workers or support everything needed to manage products on a continuous basis. Moreover, the existence of too many capabilities may be a negative factor for an enterprise if it introduces unnecessary complexity. Nonetheless, you may decide that a larger number of features in the product is a plus, especially if some of them match your enterprise’s established practices or support an initiative that is driving the purchase of new software.
Factors beyond features and functions or software provider assessments may become deciding factors. For example, an enterprise may face budget constraints such that the TCO evaluation tips the balance to one provider or another. This is where the Value Index methodology and the appropriate category weighting can be applied to determine the best fit of software providers and products to your specific needs.
Overall Scoring of Software Providers Across Categories
The research finds Oracle atop the list, followed by Informatica and Boomi. Providers that place in the top three of a category earn the designation of Leader. Oracle has done so in six categories; Databricks and Informatica in five; Google Cloud in two; and Boomi, CData Software, Domo and Pentaho in one category.
The overall representation of the research below places the ratings for Product Experience and Customer Experience on the x and y axes, respectively, to provide a visual representation and classification of the software providers. Providers with higher aggregate weighted performance across the five Product Experience categories place farther to the right, while weighted performance in the two Customer Experience categories determines placement on the vertical axis. In short, software providers that place closer to the upper right of this chart performed better than those closer to the lower left.
The research places software providers into one of four overall categories: Assurance, Exemplary, Merit or Innovative. This representation classifies providers’ overall weighted performance.

Exemplary: The categorization and placement of software providers in Exemplary (upper right) represent those that performed the best in meeting the overall Product and Customer Experience requirements. The providers rated Exemplary are: Alteryx, AWS, Boomi, Databricks, Domo, Google Cloud, IBM, Informatica, Microsoft, Oracle, Pentaho, Salesforce and SAP.
Innovative: The categorization and placement of software providers in Innovative (lower right) represent those that performed the best in meeting the overall Product Experience requirements but did not achieve the highest levels of requirements in Customer Experience. The providers rated Innovative are: CData Software, Qlik and Rocket Software.
Assurance: The categorization and placement of software providers in Assurance (upper left) represent those that achieved the highest levels in the overall Customer Experience requirements but did not achieve the highest levels of Product Experience. The providers rated Assurance are: Denodo, Precisely and Solace.
Merit: The categorization of software providers in Merit (lower left) represents those that did not surpass the thresholds for the Assurance, Exemplary or Innovative categories in Customer or Product Experience. The providers rated Merit are: Alibaba Cloud, Cloud Software Group, Confluent, Fivetran, Huawei Cloud, Jitterbit, Reltio, SAS Institute, Snowflake, Software AG, Syniti, Tencent Cloud and Workato.
We caution that close placement on the chart should not be taken to imply that the products evaluated are functionally identical or equally well suited for use by every enterprise or for a specific process. Although there is a high degree of commonality in how enterprises handle data integration, there are many idiosyncrasies and differences in how they perform these functions that can make one software provider’s offering a better fit than another’s for a particular enterprise’s needs.
We advise enterprises to assess and evaluate software providers based on organizational requirements and use this research as a supplement to internal evaluation of a provider and products.
Product Experience
The process of researching products to address an enterprise’s needs should be comprehensive. Our Value Index methodology examines Product Experience and how it aligns with an enterprise’s lifecycle of onboarding, configuration, operations, usage and maintenance. Too often, software providers are evaluated not on the entirety of the product but on market execution and vision of the future, an approach that is flawed because it reflects how the provider operates rather than an enterprise’s requirements. As more software providers orient to a complete product experience, evaluations will become more robust.
The Product Experience results account for 80%, or four-fifths, of the overall rating, using the specific underlying weighted category performance. Importance was placed on the categories as follows: Usability (5%), Capability (25%), Reliability (15%), Adaptability (25%) and Manageability (10%). This weighting affected the resulting overall ratings in this research. Informatica, Oracle and Microsoft were designated Product Experience Leaders.
Customer Experience
The importance of a customer relationship with a software provider is essential to the actual success of the products and technology. The advancement of the Customer Experience and the entire lifecycle an enterprise has with its software provider is critical for ensuring satisfaction in working with that provider. Technology providers that have chief customer officers are more likely to have greater investments in the customer relationship and focus more on their success. These leaders also need to take responsibility for ensuring this commitment is made abundantly clear on the website and in the buying process and customer journey.
The Customer Experience results account for 20%, or one-fifth, of the overall rating, using the specific underlying weighted category performance as it relates to the framework of commitment and value in the software provider-customer relationship. The two evaluation categories are Validation (10%) and TCO/ROI (10%), weighted to represent their importance to the overall research.
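For illustration, the overall rating combines the category results arithmetically using these weights. The weights below are those stated in this guide; the individual category scores are hypothetical and serve only to show the calculation.

```python
# Category weights as stated in this guide (Product Experience totals 80%,
# Customer Experience totals 20%; together they sum to 100%).
weights = {
    "Usability": 0.05, "Capability": 0.25, "Reliability": 0.15,
    "Adaptability": 0.25, "Manageability": 0.10,   # Product Experience: 80%
    "Validation": 0.10, "TCO/ROI": 0.10,           # Customer Experience: 20%
}

# Hypothetical category scores (0-100) for a single provider.
scores = {
    "Usability": 82, "Capability": 90, "Reliability": 78,
    "Adaptability": 85, "Manageability": 80,
    "Validation": 75, "TCO/ROI": 70,
}

# Overall rating is the weighted sum of the category scores.
overall = sum(weights[c] * scores[c] for c in weights)
```

Weighting the categories differently, as recommended in the evaluation process above, simply means substituting an enterprise's own priorities for these weights.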
The software providers that rated highest overall in the aggregated and weighted Customer Experience categories are Databricks, Oracle and Informatica. These category Leaders best communicate commitment and dedication to customer needs. Software providers that did not perform well in this category were unable to provide sufficient customer case studies to demonstrate success or articulate their commitment to customer experience and an enterprise’s journey. The selection of a software provider entails a continuous investment by the enterprise, so a holistic evaluation must include examination of how providers support the customer experience.
Appendix: Software Provider Inclusion
For inclusion in the ISG Buyers Guide™ for Data Integration in 2025, a software provider must be in good standing financially and ethically, have at least $100 million in annual or projected revenue verified using independent sources, sell products and provide support on at least two continents, and have at least 100 employees. The principal source of the relevant business unit’s revenue must be software-related, and there must have been at least one major software release in the last 12 months.
Data integration is a set of processes and technologies that enable enterprises to extract, combine, transform and process data from multiple internal and external data platforms and applications to maximize the value of analytic and operational use. Without data integration, business data would be trapped in the applications and systems in which it was generated.
To be included in the Data Integration Buyers Guide requires functionality that addresses the following sections of the capabilities document:
- Connectivity
- Integration development
- Integration management
- AI
The research is designed to be independent of the specifics of software provider packaging and pricing. To represent the real-world environment in which businesses operate, we include providers that offer suites or packages of products that may include relevant individual modules or applications. If a software provider is actively marketing, selling and developing a product for the general market and it is reflected on the provider’s website that the product is within the scope of the research, that provider is automatically evaluated for inclusion.
All software providers that offer relevant data integration products and meet the inclusion requirements were invited to participate in the evaluation process at no cost to them.
Software providers that meet our inclusion criteria but did not completely participate in our Buyers Guide were assessed solely on publicly available information. As this could have a significant impact on classification and ratings, we recommend additional scrutiny when evaluating those providers.
Products Evaluated
Provider | Product Names | Version | Release
Alibaba Cloud | Alibaba Cloud DataWorks | May 2025 | May 2025
Alteryx | Alteryx One | July 2025 | July 2025
AWS | AWS Glue, AWS B2B Data Interchange | January 2025, July 2025 | January 2025, July 2025
Boomi | Boomi Enterprise Platform | June 2025 | June 2025
CData Software | CData Connect Cloud, CData Sync, CData Virtuality, CData Arc | July 2025, 25.2.9330, v. 25, v. 25 | July 2025, June 2025, April 2025, July 2025
Cloud Software Group | TIBCO Cloud Integration, TIBCO Data Virtualization, TIBCO BusinessConnect Container Edition | 3.10.6.4, 8.8.1, 1.6.0 | April 2025, April 2025, April 2025
Confluent | Confluent Cloud | July 2025 | July 2025
Databricks | Databricks Data Intelligence Platform | July 2025 | July 2025
Denodo | Denodo Platform | 9.2 | April 2025
Domo | Domo | 2025 Release 3 | May 2025
Fivetran | Fivetran | July 2025 | July 2025
Google Cloud | Google Cloud Data Fusion, Google Cloud Dataflow | June 2025, June 2025 | June 2025, June 2025
Huawei Cloud | Huawei Cloud ROMA Connect | June 2025 | June 2025
IBM | IBM watsonx.data integration, IBM Sterling B2B Integrator | July 2025, 6.2.1.0 | July 2025, May 2025
Informatica | Informatica Intelligent Data Management Cloud | May 2025 | May 2025
Jitterbit | Jitterbit Harmony | 11.46 | July 2025
Microsoft | Microsoft Fabric, Azure Logic Apps | July 2025, May 2025 | July 2025, May 2025
Oracle | Oracle Cloud Infrastructure (OCI) Integration, Oracle Cloud Infrastructure (OCI) GoldenGate, Oracle Cloud Infrastructure (OCI) Data Integration | 25.06, June 2025, February 2025 | June 2025, June 2025, February 2025
Pentaho | Pentaho Data Integration | 10.2 | July 2025
Precisely | Precisely Data Integrity Suite | July 2025 | July 2025
Qlik | Qlik Talend Cloud | R2025-07 | July 2025
Reltio | Reltio Data Cloud | 2025.1.20.0 | July 2025
Rocket Software | Rocket DataEdge - Data Virtualization, Rocket DataEdge - Data Replicate and Sync | 2.1, 7.0 | September 2024, November 2024
Salesforce | Mulesoft Anypoint Platform | July 2025 | July 2025
SAP | SAP Datasphere, SAP Integration Suite | 2025.14, July 2025 | July 2025, July 2025
SAS Institute | SAS Studio | 2025.07 | July 2025
Snowflake | Snowflake Platform | 9.17 | June 2025
Software AG | Software AG CONNX | 14.8 | October 2024
Solace | Solace Platform | June 2025 | June 2025
Syniti | Syniti Knowledge Platform | July 2025 | July 2025
Tencent Cloud | Tencent Cloud WeData | April 2025 | April 2025
Workato | Workato | June 2025 | June 2025
Providers of Promise
We did not include software providers that, as a result of our research and analysis, did not satisfy the criteria for inclusion in this Buyers Guide. These are listed below as “Providers of Promise.”
Provider | Product | At Least $100 Million Annual Revenue | Operates Across 2 Continents | At Least 100 Employees
Ab Initio | Ab Initio | No | Yes | Yes
Actian | Actian DataConnect, Actian DataFlow | No | Yes | Yes
Astera Software | Astera Data Stack | No | Yes | Yes
Cinchy | Cinchy Data Collaboration Platform | No | Yes | No
Coalesce | CastorDoc | No | Yes | Yes
Congruity360 | Classify360 | No | Yes | No
Datameer | Datameer Cloud | No | Yes | Yes
Great Expectations | GX Cloud | No | Yes | No
Innovative Systems | Enlighten | No | Yes | Yes
Irion | Irion EDM | No | Yes | Yes
K2view | K2view Data Product Platform | No | Yes | Yes
Matillion | Matillion Data Productivity Cloud | No | Yes | Yes
Nexla | Nexla | No | Yes | No
Profisee | Profisee | No | Yes | Yes
Safe Software | FME Platform | No | Yes | Yes
Semarchy | Semarchy Data Platform | No | Yes | Yes
SnapLogic | SnapLogic Platform | No | Yes | Yes
Stratio Big Data | Stratio Generative AI Data Fabric | No | Yes | Yes
Striim | Striim Cloud | No | Yes | Yes
TimeXtender | TimeXtender | No | Yes | No
Tray.ai | Tray.ai Universal Automation Cloud | No | Yes | Yes
Tresata | Tresata | No | Yes | No
Executive Summary
Data Integration
Data integration is a fundamental enabler of a data management strategy. Analysis of individual data sources—such as customer or product data—can provide insights to improve operational efficiency. However, the combination of data from multiple sources enables enterprises to innovate, improving customer experience and revenue generation, for example, by targeting the most lucrative customers with offers to adopt the latest product.
ISG Research defines data integration as set of processes and technologies that enable enterprises to extract, combine, transform and process data from multiple internal and external data platforms and applications to maximize the value of analytic and operational use. Without data integration, business data would be trapped in the applications and systems in which it was generated.
Traditional approaches to data management are rooted in point-to-point batch data processing, whereby data is extracted from its source, transformed for a specific purpose and loaded into a target environment for analysis. The transformation could include the normalization, cleansing and aggregation of data. More than two-thirds (69%) of enterprises cite preparing data for analysis as the most time-consuming aspect of the analytics process. Reducing the time and effort spent on data integration and preparation can significantly accelerate time to business insight.
Although point-to-point data integration continues to serve tactical data integration use cases, it is unsuitable for more strategic enterprise-wide data integration initiatives. These require the orchestration of a complex mesh of agile data pipelines that traverse multiple data-processing locations and can evolve in response to changing data sources and business requirements. We have seen increased focus in recent years in the concept of data fabric, which represents a technology-driven approach to managing and governing data across distributed environments. We assert that through 2027, three-quarters of enterprises will adopt data fabric technologies to facilitate the management and processing of data across multiple data platforms and cloud environments.
Combining data from multiple data sources has clear potential benefits but is not without its risks and challenges. Although 38% of participants in ISG’s 2025 Market Lens Data and AI Program Study agreed that the cost of harmonizing data management across the entire organization outweighs the likely benefits, a similar proportion (37%) disagreed. Success with data integration initiatives is therefore not guaranteed and will depend on the approach and tools adopted.
Traditional batch extract, transform and load (ETL) integration products extract data from a source and transform it in a dedicated staging area before loading it into a target environment (typically a data warehouse or data lake) for analysis. The dedicated ETL staging layers were important to avoid placing an undue transformation processing burden on the target data platform, ensuring that sufficient processing power was available to perform the necessary analytic queries.
Since they are designed for a specific data transformation task, ETL pipelines are often highly efficient. However, they are also rigid, difficult to adapt and ill-suited to continuous and agile processes. As data and business requirements change, ETL pipelines must be rewritten accordingly. The need for greater agility and flexibility to meet the demands of real-time data processing is one reason we have seen increased interest in extract, load and transform (ELT) data pipelines.
Extract, load and transform pipelines use a more lightweight staging tier, which is required simply to extract data from the source and load it into the target data platform. Rather than a separate transformation stage prior to loading, ELT pipelines make use of pushdown optimization, leveraging the data-processing functionality and processing power of the target data platform to transform the data. Pushing data transformation execution to the target data platform results in a more agile data extraction and loading phase, which is more adaptable to changing data sources. This approach is well aligned with the schema-on-read model used in data lakehouse environments, as opposed to the schema-on-write approach in which a schema is applied as data is loaded into a data warehouse.
Since the data is not transformed before being loaded into the target data platform, data sources can change and evolve without delaying data loading. This potentially enables data analysts to transform data to meet their requirements rather than have dedicated data integration professionals perform the task. As such, many ELT offerings are positioned for use by data analysts and developers rather than IT professionals. This can also reduce delays in deploying business intelligence projects by avoiding the need to wait for data transformation specialists to (re)configure pipelines in response to evolving business intelligence requirements and new data sources.
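A minimal sketch of ELT with pushdown optimization follows, assuming an invented schema, with SQLite standing in for a cloud data warehouse or lakehouse engine. Raw data lands untransformed, and the transformation runs as declarative SQL on the target's own engine, so an analyst could modify the SQL without touching the extraction and loading steps.

```python
import sqlite3

target = sqlite3.connect(":memory:")  # stands in for the target data platform

# Extract + Load: lightweight staging. Raw rows land untransformed, so
# source schema changes do not delay loading.
target.execute("CREATE TABLE raw_orders (customer TEXT, amount REAL, currency TEXT)")
target.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                   [("acme", 100.0, "USD"), ("acme", 50.0, "eur"), ("globex", None, "USD")])

# Transform: declarative SQL executed by the target's own engine (pushdown).
# Cleansing (WHERE), normalization (CASE) and aggregation (GROUP BY) all run
# on the platform rather than in a separate staging tier.
pushdown_sql = """
CREATE VIEW customer_totals AS
SELECT customer,
       ROUND(SUM(amount * CASE WHEN UPPER(currency) = 'EUR' THEN 1.1 ELSE 1.0 END), 2) AS total_usd
FROM raw_orders
WHERE amount IS NOT NULL
GROUP BY customer
"""
target.execute(pushdown_sql)
print(target.execute("SELECT * FROM customer_totals ORDER BY customer").fetchall())
```

Because the transformation is just a view over the raw table, changing business requirements means editing one SQL statement rather than rebuilding the pipeline.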
Whereas once there was considerable debate between software providers as to the relative merits of ETL versus ELT, today many providers offer both approaches and recognize that multiple factors influence whether one approach is more suitable than the other for any individual integration scenario. Like ETL pipelines, ELT pipelines may also be batch processes. Both can be accelerated by using change data capture (CDC) techniques. CDC is not new, but has come into greater focus given the increasing need for real-time data processing. As the name suggests, CDC is the process of capturing data changes. Specifically, in the context of data pipelines, CDC identifies and tracks changes to tables in the source database as data is inserted, updated or deleted. CDC reduces complexity and increases agility by synchronizing only changed data rather than the entire dataset. The data changes can be synchronized incrementally or in a continuous stream.
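The CDC apply step can be sketched as follows, assuming a hypothetical change-event format. Real CDC tools read such events from the database transaction log; here they are hard-coded for illustration. Only the inserted, updated and deleted rows are synchronized, not the full table.

```python
# A target replica keyed by primary key, seeded with an earlier full load.
replica: dict[int, dict] = {1: {"name": "acme", "tier": "gold"},
                            2: {"name": "globex", "tier": "silver"}}

# Hypothetical change events captured from the source database log.
change_events = [
    {"op": "update", "key": 2, "row": {"name": "globex", "tier": "gold"}},
    {"op": "insert", "key": 3, "row": {"name": "initech", "tier": "bronze"}},
    {"op": "delete", "key": 1},
]

def apply_changes(table: dict[int, dict], events: list[dict]) -> dict[int, dict]:
    # Synchronize only the changed rows, incrementally or from a continuous
    # stream, rather than reloading the entire dataset.
    for e in events:
        if e["op"] == "delete":
            table.pop(e["key"], None)
        else:  # insert and update both upsert the new row image
            table[e["key"]] = e["row"]
    return table

apply_changes(replica, change_events)
print(replica)
```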
More recently, we have seen some providers adopt the term zero-ETL to describe automated replication of data from the source application, with immediate availability for analysis in the target analytic database. The term zero-ETL, along with some of the marketing around it, implies that users can do away with extraction, transformation and loading of data entirely. That might sound too good to be true, and in many cases it will be.
The promise of removing the need for data transformation can only be fulfilled if all the data required for an analytics project is generated by a single source. Many analytics projects rely on combining data from multiple applications. If this is the case, then transformation of the data will be required after loading to integrate and prepare it for analysis. Even if all the data is generated by a single application, the theory that data does not need to be transformed relies on the assumption that the schema is strictly enforced when the data is generated. If not, enterprises are likely to need declarative transformations to cleanse and normalize the data to meet longer-term analytics or data governance requirements. As such, zero-ETL could arguably be seen as a form of ELT that automates extraction and loading and has the potential to remove the need for transformation in some use cases.
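The caveat above can be made concrete. In this sketch, two application tables are assumed to arrive in the analytic database via automated replication (simulated here with plain inserts; all table and column names are invented), yet combining them for analysis still requires a post-load transformation.

```python
import sqlite3

db = sqlite3.connect(":memory:")  # stands in for the target analytic database

# Tables as they might arrive via automated ("zero-ETL") replication from
# two hypothetical applications: a CRM and a billing system.
db.execute("CREATE TABLE crm_customers (id INTEGER, name TEXT)")
db.execute("CREATE TABLE billing_invoices (customer_id INTEGER, amount REAL)")
db.executemany("INSERT INTO crm_customers VALUES (?, ?)", [(1, "acme"), (2, "globex")])
db.executemany("INSERT INTO billing_invoices VALUES (?, ?)",
               [(1, 100.0), (1, 55.0), (2, 40.0)])

# The integration step that zero-ETL does not remove: a declarative
# transformation joining the two replicated sources for analysis.
rows = db.execute("""
    SELECT c.name, SUM(i.amount)
    FROM crm_customers c JOIN billing_invoices i ON i.customer_id = c.id
    GROUP BY c.name ORDER BY c.name
""").fetchall()
print(rows)
```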
Our Data Integration Buyers Guide provides a holistic view of a software provider’s ability to deliver the combination of functionality to provide comprehensive data integration with either a single product or a suite of products. As such, the Data Integration Buyers Guide includes the full breadth of data integration functionality, including connectivity, integration development and integration management. Our assessment also considered whether the functionality in question was available from a software provider in a single offering or as a suite of products or cloud services.
This Data Integration Buyers Guide evaluates products based on whether the data integration platform enables the integration of real-time data in motion in addition to data at rest, as well as the use of artificial intelligence to automate and enhance data integration, and the availability and depth of functionality to enable enterprises to integrate data with business partners and other external entities. To be included in this Buyers Guide, products must offer data pipeline development, deployment and management.
This research evaluates the following software providers offering products to address key elements of data integration as we define it: Alibaba Cloud, Alteryx, AWS, Boomi, CData Software, Cloud Software Group, Confluent, Databricks, Denodo, Domo, Fivetran, Google Cloud, Huawei Cloud, IBM, Informatica, Jitterbit, Microsoft, Oracle, Pentaho, Precisely, Qlik, Reltio, Rocket Software, Salesforce, SAP, SAS Institute, Snowflake, Software AG, Solace, Syniti, Tencent Cloud and Workato.
Buyers Guide Overview
ISG Research has designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of business requirements in any enterprise.
For over two decades, ISG Research has conducted market research in a spectrum of areas across business applications, tools and technologies. We have designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of the business requirements in any enterprise. Utilization of our research methodology and decades of experience enables our Buyers Guide to be an effective method to assess and select software providers and products. The findings of this research undertaking contribute to our comprehensive approach to rating software providers in a manner that is based on the assessments completed by an enterprise.
The ISG Buyers Guide™ for Data Integration is the distillation of over a year of market and product research efforts. It is an assessment of how well software providers’ offerings address enterprises’ requirements for data integration software. The index is structured to support a request for information (RFI) that could be used in the request for proposal (RFP) process by incorporating all criteria needed to evaluate, select, utilize and maintain relationships with software providers. An effective product and customer experience with a provider can ensure the best long-term relationship and value achieved from a resource and financial investment.
In this Buyers Guide, ISG Research evaluates the software in seven key categories that are weighted to reflect buyers’ needs based on our expertise and research. Five are product-experience related: Adaptability, Capability, Manageability, Reliability and Usability. In addition, we consider two customer-experience categories: Validation and Total Cost of Ownership/Return on Investment (TCO/ROI). To assess functionality, one of the components of Capability, we applied the ISG Research Value Index methodology and blueprint, which links the personas and processes for data integration to an enterprise’s requirements.
The structure of the research reflects our understanding that the effective evaluation of software providers and products involves far more than just examining product features, potential revenue or customers generated from a provider’s marketing and sales efforts. We believe it is important to take a comprehensive, research-based approach, since making the wrong choice of data integration technology can raise the total cost of ownership, lower the return on investment and hamper an enterprise’s ability to reach its full performance potential. In addition, this approach can reduce the project’s development and deployment time and eliminate the risk of relying on a short list of software providers that does not represent a best fit for your enterprise.
ISG Research believes that an objective review of software providers and products is a critical business strategy for the adoption and implementation of data integration software and applications. An enterprise’s review should include a thorough analysis of both what is possible and what is relevant. We urge enterprises to do a thorough job of evaluating data integration systems and tools and offer this Buyers Guide as both the results of our in-depth analysis of these providers and as an evaluation methodology.
Key Takeaways
Data integration is becoming a critical enabler of modern data strategies, moving beyond batch ETL to support agile, real-time pipelines across diverse platforms and cloud environments. Enterprises must address the complexity of harmonizing fragmented data sources while ensuring flexibility through ELT, CDC and emerging zero-ETL approaches. Good platforms provide end-to-end connectivity, orchestration and management, unifying data in motion and at rest to accelerate analytics and AI initiatives. Data integration is central to delivering trusted, timely and accessible data for innovation and operational efficiency.
Software Provider Summary
The research identifies Oracle, Informatica and Boomi as the market leaders, with strengths across multiple categories, while providers such as CData Software, Domo and Pentaho demonstrated targeted capabilities. Classification placed Oracle, Informatica and Boomi in the Exemplary quadrant alongside providers including AWS, IBM, Microsoft, Salesforce, Google Cloud and SAP. Providers such as CData Software, Qlik and Rocket Software were categorized as Innovative; Denodo, Precisely and Solace as Assurance; and Alibaba Cloud, Cloud Software Group, Confluent, Fivetran, Huawei Cloud, Jitterbit, Reltio, SAS Institute, Snowflake, Software AG, Syniti, Tencent Cloud and Workato in the Merit quadrant.
Product Experience Insights
Product Experience accounted for 80% of the overall rating, with emphasis on capability, usability, reliability, adaptability and manageability. Informatica, Oracle and Microsoft led in delivering breadth and depth across integration, governance and adaptability, while Boomi, Domo and Pentaho demonstrated strong capability but less overall balance. Leaders distinguished themselves with platforms that support enterprise-wide integration.
Customer Experience Value
Customer Experience represented 20% of the evaluation, focused on validation and TCO/ROI. Databricks, Oracle and Informatica led in this category by demonstrating strong customer commitment, transparent ROI frameworks and consistent lifecycle support. Alteryx and Domo also performed well, though short of leadership. Lower-performing providers often lacked sufficient clarity in CX, making it harder for enterprises to justify long-term investments.
Strategic Recommendations
Enterprises should treat data integration platform selection as a strategic decision that balances foundational functions such as adaptability, capability and manageability with expanded AI-driven features for real-time integration and automation. Buyers should prioritize platforms that ensure interoperability, simplify administration and deliver measurable ROI through transparent TCO frameworks. Using the ISG Buyers Guide as a structured framework enables enterprises to evaluate providers against both product and customer experience, ensuring investments improve data integration outcomes and align with evolving enterprise data strategies.
How To Use This Buyers Guide
Evaluating Software Providers: The Process
We recommend using the Buyers Guide to assess and evaluate new or existing software providers for your enterprise. The market research can be used as an evaluation framework to establish a formal request for information from providers on products and customer experience and will shorten the cycle time when creating an RFI. The steps listed below provide a process that can facilitate the best possible outcomes.
- Define the business case and goals. Define the mission and business case for investment and the expected outcomes from your organizational and technology efforts.
- Specify the business needs. Defining the business requirements helps identify what specific capabilities are required with respect to people, processes, information and technology.
- Assess the required roles and responsibilities. Identify the individuals required for success at every level of the organization from executives to front line workers and determine the needs of each.
- Outline the project’s critical path. What needs to be done, in what order and who will do it? This outline should make clear the prior dependencies at each step of the project plan.
- Ascertain the technology approach. Determine the business and technology approach that most closely aligns to your organization’s requirements.
- Establish technology vendor evaluation criteria. Utilize the product experience: Adaptability, Capability, Manageability, Reliability and Usability, and the customer experience in TCO/ROI and Validation.
- Evaluate and select the technology properly. Weight the categories in the technology evaluation criteria to reflect your organization’s priorities to determine the short list of vendors and products.
- Establish the business initiative team to start the project. Identify who will lead the project and the members of the team needed to plan and execute it with timelines, priorities and resources.
The Findings
All of the products we evaluated are feature-rich, but not all the capabilities offered by a software provider are equally valuable to all types of workers or support everything needed to manage products on a continuous basis. Moreover, the existence of too many capabilities may be a negative factor for an enterprise if it introduces unnecessary complexity. Nonetheless, you may decide that a larger number of features in the product is a plus, especially if some of them match your enterprise’s established practices or support an initiative that is driving the purchase of new software.
Factors beyond features and functions or software provider assessments may become a deciding factor. For example, an enterprise may face budget constraints such that the TCO evaluation can tip the balance to one provider or another. This is where the Value Index methodology and the appropriate category weighting can be applied to determine the best fit of software providers and products to your specific needs.
Overall Scoring of Software Providers Across Categories
The research finds Oracle atop the list, followed by Informatica and Boomi. Providers that place in the top three of a category earn the designation of Leader. Oracle has done so in six categories; Databricks and Informatica in five; Google Cloud in two; and Boomi, CData Software, Domo and Pentaho in one category.
The overall representation of the research below places the rating of the Product Experience and Customer Experience on the x and y axes, respectively, to provide a visual representation and classification of the software providers. Providers with a higher weighted Product Experience performance, aggregated across the five product categories, place farther to the right, while performance in the two Customer Experience categories and their weighting determine placement on the vertical axis. In short, software providers that place closer to the upper right on this chart performed better than those closer to the lower left.
The research places software providers into one of four overall categories: Assurance, Exemplary, Merit or Innovative. This representation classifies providers’ overall weighted performance.

Exemplary: The categorization and placement of software providers in Exemplary (upper right) represent those that performed the best in meeting the overall Product and Customer Experience requirements. The providers rated Exemplary are: Alteryx, AWS, Boomi, Databricks, Domo, Google Cloud, IBM, Informatica, Microsoft, Oracle, Pentaho, Salesforce and SAP.
Innovative: The categorization and placement of software providers in Innovative (lower right) represent those that performed the best in meeting the overall Product Experience requirements but did not achieve the highest levels of requirements in Customer Experience. The providers rated Innovative are: CData Software, Qlik and Rocket Software.
Assurance: The categorization and placement of software providers in Assurance (upper left) represent those that achieved the highest levels in the overall Customer Experience requirements but did not achieve the highest levels of Product Experience. The providers rated Assurance are: Denodo, Precisely and Solace.
Merit: The categorization of software providers in Merit (lower left) represents those that did not surpass the thresholds for the Assurance, Exemplary or Innovative categories in Customer or Product Experience. The providers rated Merit are: Alibaba Cloud, Cloud Software Group, Confluent, Fivetran, Huawei Cloud, Jitterbit, Reltio, SAS Institute, Snowflake, Software AG, Syniti, Tencent Cloud and Workato.
We warn that close provider placement proximity should not be taken to imply that the packages evaluated are functionally identical or equally well suited for use by every enterprise or for a specific process. Although there is a high degree of commonality in how enterprises handle data integration, there are many idiosyncrasies and differences in how they perform these functions that can make one software provider’s offering a better fit than another’s for a particular enterprise’s needs.
We advise enterprises to assess and evaluate software providers based on organizational requirements and use this research as a supplement to internal evaluation of a provider and products.
Product Experience
The process of researching products to address an enterprise’s needs should be comprehensive. Our Value Index methodology examines Product Experience and how it aligns with an enterprise’s lifecycle of onboarding, configuration, operations, usage and maintenance. Too often, software providers are not evaluated on the entirety of the product; instead, they are evaluated on market execution and vision of the future, a flawed approach because it reflects how the provider operates rather than an enterprise’s requirements. As more software providers orient to a complete product experience, evaluations will be more robust.
The research results in Product Experience are ranked at 80%, or four-fifths, of the overall rating using the specific underlying weighted category performance. Importance was placed on the categories as follows: Usability (5%), Capability (25%), Reliability (15%), Adaptability (25%) and Manageability (10%). This weighting impacted the resulting overall ratings in this research. Informatica, Oracle and Microsoft were designated Product Experience Leaders.
Customer Experience
The importance of a customer relationship with a software provider is essential to the actual success of the products and technology. The advancement of the Customer Experience and the entire lifecycle an enterprise has with its software provider is critical for ensuring satisfaction in working with that provider. Technology providers that have chief customer officers are more likely to have greater investments in the customer relationship and focus more on their success. These leaders also need to take responsibility for ensuring this commitment is made abundantly clear on the website and in the buying process and customer journey.
The research results in Customer Experience are ranked at 20%, or one-fifth, using the specific underlying weighted category performance as it relates to the framework of commitment and value to the software provider-customer relationship. The two evaluation categories are Validation (10%) and TCO/ROI (10%), which are weighted to represent their importance to the overall research.
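As a worked example of the weighting arithmetic used across the Product Experience (80%) and Customer Experience (20%) evaluations: the category weights below are those stated in this guide, while the per-category scores are invented for illustration and do not correspond to any evaluated provider.

```python
# Category weights as stated in this Buyers Guide (Product Experience sums
# to 80%, Customer Experience to 20%).
WEIGHTS = {
    "Capability": 0.25, "Adaptability": 0.25, "Reliability": 0.15,
    "Manageability": 0.10, "Usability": 0.05,   # Product Experience: 80%
    "Validation": 0.10, "TCO/ROI": 0.10,        # Customer Experience: 20%
}

def overall_rating(scores: dict[str, float]) -> float:
    # The overall rating blends the seven category scores by their weights.
    return round(sum(WEIGHTS[c] * s for c, s in scores.items()), 2)

# Invented scores (0-100 scale) for a hypothetical provider.
example_scores = {"Capability": 90, "Adaptability": 85, "Reliability": 80,
                  "Manageability": 75, "Usability": 88,
                  "Validation": 70, "TCO/ROI": 72}
print(overall_rating(example_scores))  # -> 81.85
```

The example shows why Capability and Adaptability dominate the outcome: together they carry half the total weight, so a provider strong in those two categories can outrank one with better but lightly weighted Usability scores.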
The software providers that evaluated the highest overall in the aggregated and weighted Customer Experience categories are Databricks, Oracle and Informatica. These category Leaders best communicate commitment and dedication to customer needs. Software providers that did not perform well in this category were unable to provide sufficient customer case studies to demonstrate success or articulate their commitment to customer experience and an enterprise’s journey. The selection of a software provider means a continuous investment by the enterprise, so a holistic evaluation must include examination of how they support their customer experience.
Appendix: Software Provider Inclusion
For inclusion in the ISG Buyers Guide™ for Data Integration in 2025, a software provider must be in good standing financially and ethically, have at least $100 million in annual or projected revenue verified using independent sources, sell products and provide support on at least two continents, and have at least 100 employees. The principal source of the relevant business unit’s revenue must be software-related, and there must have been at least one major software release in the last 12 months.
Data integration is a set of processes and technologies that enable enterprises to extract, combine, transform and process data from multiple internal and external data platforms and applications to maximize the value of analytic and operational use. Without data integration, business data would be trapped in the applications and systems in which it was generated.
To be included in the Data Integration Buyers Guide requires functionality that addresses the following sections of the capabilities document:
- Connectivity
- Integration development
- Integration management
- AI
The research is designed to be independent of the specifics of software provider packaging and pricing. To represent the real-world environment in which businesses operate, we include providers that offer suites or packages of products that may include relevant individual modules or applications. If a software provider is actively marketing, selling and developing a product for the general market and it is reflected on the provider’s website that the product is within the scope of the research, that provider is automatically evaluated for inclusion.
All software providers that offer relevant data integration products and meet the inclusion requirements were invited to participate in the evaluation process at no cost to them.
Software providers that meet our inclusion criteria but did not completely participate in our Buyers Guide were assessed solely on publicly available information. As this could have a significant impact on classification and ratings, we recommend additional scrutiny when evaluating those providers.
Products Evaluated
Provider |
Product Names |
Version |
Release |
Alibaba Cloud |
Alibaba Cloud DataWorks |
May 2025 |
May 2025 |
Alteryx |
Alteryx One |
July 2025 |
July 2025 |
AWS |
AWS Glue, AWS B2B Data Interchange |
January 2025, July 2025 |
January 2025, July 2025 |
Boomi |
Boomi Enterprise Platform |
June 2025 |
June 2025 |
CData Software |
CData Connect Cloud, CData Sync, CData Virtuality, CData Arc |
July 2025, 25.2.9330, v. 25, v. 25 |
July 2025, June 2025, April 2025, July 2025 |
Cloud Software Group |
TIBCO Cloud Integration, TIBCO Data Virtualization, TIBCO BusinessConnect Container Edition |
3.10.6.4, 8.8.1, 1.6.0 |
April 2025, April 2025, April 2025 |
Confluent |
Confluent Cloud |
July 2025 |
July 2025 |
Databricks |
Databricks Data Intelligence Platform |
July 2025 |
July 2025 |
Denodo |
Denodo Platform |
9.2 |
April 2025 |
Domo |
Domo |
2025 Release 3 |
May 2025 |
Fivetran |
Fivetran |
July 2025 |
July 2025 |
Google Cloud |
Google Cloud Data Fusion, Google Cloud Dataflow |
June 2025, June 2025 |
June 2025, June 2025 |
Huawei Cloud |
Huawei Cloud ROMA Connect |
June 2025 |
June 2025 |
IBM |
IBM watsonx.data integration, IBM Sterling B2B Integrator |
July 2025, 6.2.1.0 |
July 2025, May 2025 |
Informatica |
Informatica Intelligent Data Management Cloud |
May 2025 |
May 2025 |
Jitterbit |
Jitterbit Harmony |
11.46 |
July 2025 |
Microsoft |
Microsoft Fabric, Azure Logic Apps |
July 2025, May 2025 |
July 2025, May 2025 |
Oracle |
Oracle Cloud Infrastructure (OCI) Integration, Oracle Cloud Infrastructure (OCI) GoldenGate, Oracle Cloud Infrastructure (OCI) Data Integration |
25.06, June 2025, February 2025 |
June 2025, June 2025, February 2025 |
Pentaho |
Pentaho Data Integration |
10.2 |
July 2025 |
Precisely |
Precisely Data Integrity Suite |
July 2025 |
July 2025 |
Qlik |
Qlik Talend Cloud |
R2025-07 |
July 2025 |
Reltio |
Reltio Data Cloud |
2025.1.20.0 |
July 2025 |
Rocket Software |
Rocket DataEdge - Data Virtualization, Rocket DataEdge - Data Replicate and Sync |
2.1, 7.0 |
September 2024, November 2024 |
Salesforce |
Mulesoft Anypoint Platform |
July 2025 |
July 2025 |
SAP |
SAP Datasphere, SAP Integration Suite |
2025.14, July 2025 |
July 2025, July 2025 |
SAS Institute |
SAS Studio |
2025.07 |
July 2025 |
Snowflake |
Snowflake Platform |
9.17 |
June 2025 |
Software AG |
Software AG CONNX |
14.8 |
October 2024 |
Solace |
Solace Platform |
June 2025 |
June 2025 |
Syniti |
Syniti Knowledge Platform |
July 2025 |
July 2025 |
Tencent Cloud |
Tencent Cloud WeData |
April 2025 |
April 2025 |
Workato |
Workato |
June 2025 |
June 2025 |
Providers of Promise
We did not include software providers that, as a result of our research and analysis, did not satisfy the criteria for inclusion in this Buyers Guide. These are listed below as “Providers of Promise.”
Provider |
Product |
At least $100M annual revenue |
Operates Across 2 Continents |
At least 100 employees |
Ab Initio |
Ab Initio |
No |
Yes |
Yes |
Actian |
Actian DataConnect, Actian DataFlow |
No |
Yes |
Yes |
Astera Software |
Astera Data Stack |
No |
Yes |
Yes |
Cinchy |
Cinchy Data Collaboration Platform |
No |
Yes |
No |
Coalesce |
CastorDoc |
No |
Yes |
Yes |
Congruity360 |
Classify360 |
No |
Yes |
No |
Datameer |
Datameer Cloud |
No |
Yes |
Yes |
Great Expectations |
GX Cloud |
No |
Yes |
No |
Innovative Systems |
Enlighten |
No |
Yes |
Yes |
Irion |
Irion EDM |
No |
Yes |
Yes |
K2view |
K2view Data Product Platform |
No |
Yes |
Yes |
Matillion |
Matillion Data Productivity Cloud |
No |
Yes |
Yes |
Nexla |
Nexla |
No |
Yes |
No |
Profisee |
Profisee |
No |
Yes |
Yes |
Safe Software |
FME Platform |
No |
Yes |
Yes |
Semarchy |
Semarchy Data Platform |
No |
Yes |
Yes |
SnapLogic |
SnapLogic Platform |
No |
Yes |
Yes |
Stratio Big Data |
Stratio Generative AI Data Fabric |
No |
Yes |
Yes |
Striim |
Striim Cloud |
No |
Yes |
Yes |
TimeXtender |
TimeXtender |
No |
Yes |
No |
Tray.ai |
Tray.ai Universal Automation Cloud |
No |
Yes |
Yes |
Tresata |
Tresata |
No |
Yes |
No |