Executive Summary
Real-Time Data
As enterprises strive to be data-driven, making a higher proportion of decisions based on data becomes the norm. To differentiate from the pack, enterprises need to process and analyze more data and increase the frequency with which data-driven decisions are made by acting upon data in real time.
ISG Research defines real-time data as the sharing, processing and analysis of data communicated in messages and streams of messages generated by enterprise systems, devices and applications as business events occur. Real-time data processing enables enterprises to operate at the speed of business by acting on events as they happen.
Although enterprises rely overwhelmingly on batch data processing, batch is an artificial construct driven by the historical inability of computing systems to generate and process data at the same time without impacting performance. Real-time data processing has existed for years in industry segments with the most extreme high-performance requirements, such as financial services and telecommunications.
In other industries, however, the reliance on batch data processing is so entrenched that processing data in real time has primarily been seen as a niche requirement, separate from the default focus on batch processing of data at rest. Less than one-quarter (22%) of enterprises participating in the ISG Research Analytics and Data Benchmark Research analyze data in real time.
Attitudes towards real-time data processing are changing as an increasing number of enterprises recognize that failing to process and analyze data in real time risks failing to operate at the pace of the real world. The pressure on enterprises to improve their ability to process and analyze data in real time is heightened by increased demand for intelligent operational applications infused with the results of analytic processes, such as personalization and artificial intelligence (AI)-driven recommendations. AI-driven intelligent applications require a new approach to data processing that enables real-time performance of machine learning (ML) on operational data to deliver instant, relevant information for accelerated decision-making.
Enterprises differentiate user experiences with real-time, AI-driven functionality. Doing so requires AI models to have access to up-to-date data via streams of events as they are generated in real time, as well as the ability to incorporate model inferencing into streaming analytics pipelines. Enterprises with an over-reliance on batch data processing will not match those able to harness real-time data as it is generated. As demand for real-time interactive applications becomes more pervasive, processing real-time data becomes a more mainstream pursuit. This is aided by the proliferation of products capable of real-time data processing and analytics, which have lowered the cost and technical barriers to developing new applications that take advantage of data in motion.
Reliance on batch data processing is so pervasive that for many data practitioners and business executives, real-time data may be an alien concept that appears to have a language of its own. Terms such as messaging, event processing, stream processing and streaming analytics are often used interchangeably, and the nuances are not necessarily clear to the uninitiated. However, the core concepts of real-time data processing are proven and well-defined, and the technologies that implement them are mature and readily available.
The starting point for any real-time data strategy is the concept of an event. Simply put, an event is a thing that happens. In the context of an application, an event is a change of state, such as a sensor identifying a new temperature reading. Messaging is the sharing of information between applications, devices and systems about events. As an event occurs, the application, device or system generates a message about the event that is shared with other applications, devices and systems across the enterprise.
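To make these concepts concrete, the following is a minimal, illustrative sketch of an event being shared as a message, written in Python and not tied to any specific messaging product. The in-memory broker, topic name and event fields are hypothetical; real deployments would use a managed queue or event broker.

```python
import json
from collections import defaultdict
from datetime import datetime, timezone

# Minimal publish/subscribe sketch: an event (a change of state, here a new
# temperature reading) is serialized as a message and delivered to any
# subscribed consumers. Broker, topic and field names are hypothetical.
class InMemoryBroker:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self.subscribers[topic].append(handler)

    def publish(self, topic, message):
        for handler in self.subscribers[topic]:
            handler(message)

broker = InMemoryBroker()
broker.subscribe("sensor.temperature", lambda msg: print("received:", msg))

# The event occurs: a sensor observes a new reading, and a message about it
# is generated and shared with interested consumers.
event = {
    "sensor_id": "sensor-42",
    "temperature_c": 21.7,
    "observed_at": datetime.now(timezone.utc).isoformat(),
}
broker.publish("sensor.temperature", json.dumps(event))
```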
Direct communication between applications is enabled and managed by application integration, which supports the fulfillment of business processes and workflows that rely on multiple applications operating in concert. While application integration has traditionally relied on point-to-point integration between individual applications, today’s application integration increasingly depends on application programming interfaces and API management.
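As a simple illustration of point-to-point, API-based application integration, the sketch below forwards an order event from one application to another over a REST endpoint. The URL, payload fields and authorization header are hypothetical; production integrations would typically be mediated by an integration platform or API management layer.

```python
import requests  # widely used third-party HTTP client

# Hypothetical endpoint of a downstream application that should be notified
# when an order is created in the source application.
ORDER_SERVICE_URL = "https://erp.example.com/api/v1/orders"

def forward_order_event(order: dict) -> None:
    """Point-to-point integration: push the event to the other application's API."""
    response = requests.post(
        ORDER_SERVICE_URL,
        json=order,
        headers={"Authorization": "Bearer <api-key>"},  # placeholder credential
        timeout=5,
    )
    response.raise_for_status()  # surface integration failures to the caller

forward_order_event({"order_id": "A-1001", "status": "created", "amount": 99.95})
```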
In addition to communicating messages about events between applications, processing a continuous stream of event messages enables enterprises to act on the event data as it is generated and communicated. Stream processing encompasses the ingestion, filtering, integration, transformation, aggregation and enrichment of stream data, and the associated management and governance of stream processing.
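The generator-based sketch below illustrates those stream processing stages (ingestion, filtering, enrichment and aggregation) in plain Python. It is a conceptual example rather than an implementation of any particular stream processing engine, and the sensor fields, thresholds and window size are hypothetical.

```python
import random
import time
from itertools import islice

def ingest():
    """Ingestion: an unbounded source of temperature readings."""
    while True:
        yield {"sensor_id": "sensor-42", "temperature_c": random.uniform(15, 35)}

def filter_valid(events, low=-40.0, high=85.0):
    """Filtering: drop readings outside the sensor's plausible range."""
    for e in events:
        if low <= e["temperature_c"] <= high:
            yield e

def enrich(events, site="plant-1"):
    """Enrichment: attach reference data and an ingestion timestamp."""
    for e in events:
        yield {**e, "site": site, "ingested_at": time.time()}

def windowed_average(events, size=10):
    """Aggregation: average temperature over tumbling windows of `size` events."""
    while True:
        window = list(islice(events, size))
        if not window:
            return
        yield sum(e["temperature_c"] for e in window) / len(window)

pipeline = windowed_average(enrich(filter_valid(ingest())))
for avg in islice(pipeline, 3):
    print(f"window average: {avg:.1f} C")
```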
Stream processing also forms the basis of streaming analytics, which uses streaming compute engines to analyze streams of event data using SQL queries and real-time materialized views, including chart-based visualization of streaming data and, more recently, support for machine learning and generative AI model inferencing.
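A real-time materialized view can be thought of as the continuously maintained result of a standing query, updated incrementally as each event arrives rather than recomputed in batch. The sketch below maintains a per-sensor average in plain Python as an illustration; streaming analytics engines express the same logic declaratively, typically in SQL, and the sensor fields here are hypothetical.

```python
from collections import defaultdict

# Incrementally maintained view, conceptually equivalent to a standing query
# such as: SELECT sensor_id, AVG(temperature_c) FROM readings GROUP BY sensor_id
class AvgBySensorView:
    def __init__(self):
        self.totals = defaultdict(float)
        self.counts = defaultdict(int)

    def apply(self, event: dict) -> None:
        """Fold one arriving event into the view."""
        key = event["sensor_id"]
        self.totals[key] += event["temperature_c"]
        self.counts[key] += 1

    def read(self) -> dict:
        """Query the up-to-date view at any point in time."""
        return {k: self.totals[k] / self.counts[k] for k in self.counts}

view = AvgBySensorView()
for event in ({"sensor_id": "s1", "temperature_c": 20.0},
              {"sensor_id": "s1", "temperature_c": 22.0},
              {"sensor_id": "s2", "temperature_c": 30.0}):
    view.apply(event)

print(view.read())  # {'s1': 21.0, 's2': 30.0}
```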
Processing and analyzing event data in isolation is valuable. However, success with streaming data relies on the holistic management and governance of data in motion and data at rest. In recent years, traditional data processing providers have added support for continually processing streams of event data, while streaming and event specialist providers have improved capabilities for the persistence of event data in a data warehouse, data lake or cloud storage for batch processing of historical event data. ISG asserts that by 2027, more than three-quarters of enterprises’ standard information architectures will include streaming data and event processing, allowing enterprises to be more responsive and provide better customer experiences.
As more enterprises adopt event-driven architecture, capabilities for the persistence and processing of historical event data increase the potential for streaming and event specialists to stake a claim and be considered an enterprise’s primary data platform provider rather than being utilized for a limited set of use cases.
The ability to execute at the speed of business by processing and acting on events as they occur will be the difference between competing and winning with analytics and data. Enterprises evaluating current and future data architecture requirements should consider real-time data technologies alongside more traditional batch-oriented data platforms to provide a holistic view of all data in motion and at rest.
The ISG Buyers Guide™ for Real-Time Data evaluates products based on core capabilities such as messaging and event processing, application integration, streaming data and streaming analytics. While products addressing these functional areas are valuable, an enterprise-wide real-time data strategy requires an event-driven architecture that delivers the full breadth of streaming and event functionality. To be included in this Buyers Guide, products must address at least two of the following capabilities: messaging and event processing, application integration, streaming data and streaming analytics. Our assessment also considered whether the functionality was available from a software provider in a single offering or as a suite of products or cloud services.
This research evaluates the following software providers that offer products that address key elements of real-time data as we define it: Actian, Aiven, Alibaba Cloud, Altair, AWS, Cloud Software Group, Cloudera, Confluent, Cumulocity, Databricks, Google Cloud, GridGain, Hazelcast, Huawei Cloud, IBM, Informatica, Materialize, Microsoft, Oracle, Qubole, Redpanda, SAS, Solace, Striim and Tencent Cloud.
Buyers Guide Overview
For over two decades, ISG Research has conducted market research in a spectrum of areas across business applications, tools and technologies. We have designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of the business requirements in any enterprise. Utilization of our research methodology and decades of experience enables our Buyers Guide to be an effective method to assess and select software providers and products. The findings of this research contribute to our comprehensive approach to rating software providers, one modeled on the assessments an enterprise itself would complete.
The ISG Buyers Guide™ for Real-Time Data is the distillation of over a year of market and product research efforts. It is an assessment of how well software providers’ offerings address enterprises’ requirements for real-time data software. The index is structured to support a request for information (RFI) that could be used in the request for proposal (RFP) process by incorporating all criteria needed to evaluate, select, utilize and maintain relationships with software providers. An effective product and customer experience with a provider can ensure the best long-term relationship and value achieved from a resource and financial investment.
In this Buyers Guide, ISG Research evaluates the software in seven key categories that are weighted to reflect buyers’ needs based on our expertise and research. Five are product-experience related: Adaptability, Capability, Manageability, Reliability and Usability. In addition, we consider two customer-experience categories: Validation and Total Cost of Ownership/Return on Investment (TCO/ROI). To assess functionality, one of the components of Capability, we applied the ISG Research Value Index methodology and blueprint, which links the personas and processes for real-time data to an enterprise’s requirements.
The structure of the research reflects our understanding that the effective evaluation of software providers and products involves far more than just examining product features, potential revenue or customers generated from a provider’s marketing and sales efforts. We believe it is important to take a comprehensive, research-based approach, since making the wrong choice of real-time data technology can raise the total cost of ownership, lower the return on investment and hamper an enterprise’s ability to reach its full performance potential. In addition, this approach can reduce the project’s development and deployment time and eliminate the risk of relying on a short list of software providers that does not represent a best fit for your enterprise.
ISG Research believes that an objective review of software providers and products is a critical business strategy for the adoption and implementation of real-time data software and applications. An enterprise’s review should include a thorough analysis of both what is possible and what is relevant. We urge enterprises to do a thorough job of evaluating real-time data systems and tools and offer this Buyers Guide as both the results of our in-depth analysis of these providers and as an evaluation methodology.
How To Use This Buyers Guide
Evaluating Software Providers: The Process
We recommend using the Buyers Guide to assess and evaluate new or existing software providers for your enterprise. The market research can be used as an evaluation framework to establish a formal request for information from providers on products and customer experience and will shorten the cycle time when creating an RFI. The steps listed below provide a process that can facilitate the best possible outcomes.
- Define the business case and goals. Define the mission and business case for investment and the expected outcomes from your organizational and technology efforts.
- Specify the business needs. Defining the business requirements helps identify what specific capabilities are required with respect to people, processes, information and technology.
- Assess the required roles and responsibilities. Identify the individuals required for success at every level of the organization, from executives to front-line workers, and determine the needs of each.
- Outline the project’s critical path. What needs to be done, in what order and who will do it? This outline should make clear the prior dependencies at each step of the project plan.
- Ascertain the technology approach. Determine the business and technology approach that most closely aligns to your organization’s requirements.
- Establish technology vendor evaluation criteria. Utilize the product experience categories (Adaptability, Capability, Manageability, Reliability and Usability) and the customer experience categories (TCO/ROI and Validation).
- Evaluate and select the technology properly. Weight the categories in the technology evaluation criteria to reflect your organization’s priorities to determine the short list of vendors and products.
- Establish the business initiative team to start the project. Identify who will lead the project and the members of the team needed to plan and execute it with timelines, priorities and resources.
The Findings
All of the products we evaluated are feature-rich, but not all the capabilities offered by a software provider are equally valuable to all types of workers or support everything needed to manage products on a continuous basis. Moreover, the existence of too many capabilities may be a negative factor for an enterprise if it introduces unnecessary complexity. Nonetheless, you may decide that a larger number of features in the product is a plus, especially if some of them match your enterprise’s established practices or support an initiative that is driving the purchase of new software.
Factors beyond features, functions or software provider assessments may become deciding factors. For example, an enterprise may face budget constraints such that the TCO evaluation can tip the balance to one provider or another. This is where the Value Index methodology and the appropriate category weighting can be applied to determine the best fit of software providers and products to your specific needs.
Overall Scoring of Software Providers Across Categories
The research finds AWS atop the list, followed by Google Cloud and Microsoft. Software providers that place in the top three of a category earn the designation of Leader. Oracle has done so in six categories, Informatica in five, Databricks and Google Cloud in three, Microsoft in two and AWS and Solace in one category.
The overall representation of the research below places the rating of the Product Experience and Customer Experience on the x and y axes, respectively, to provide a visual representation and classification of the software providers. Providers whose Product Experience has a higher weighted performance in aggregate across the five product categories place farther to the right, while the performance and weighting of the two Customer Experience categories determine placement on the vertical axis. In short, software providers that place closer to the upper right on this chart performed better than those closer to the lower left.
The research places software providers into one of four overall categories: Assurance, Exemplary, Merit or Innovative. This representation classifies providers’ overall weighted performance.
Exemplary: The categorization and placement of software providers in Exemplary (upper right) represent those that performed the best in meeting the overall Product and Customer Experience requirements. The providers rated Exemplary are: AWS, Cloudera, Confluent, Databricks, Google Cloud, IBM, Informatica, Microsoft, Oracle and Solace.
Innovative: The categorization and placement of software providers in Innovative (lower right) represent those that performed the best in meeting the overall Product Experience requirements but did not achieve the highest levels of requirements in Customer Experience. The providers rated Innovative are: Alibaba Cloud, Cloud Software Group, Huawei Cloud and Redpanda.
Assurance: The categorization and placement of software providers in Assurance (upper left) represent those that achieved the highest levels in the overall Customer Experience requirements but did not achieve the highest levels of Product Experience. The provider rated Assurance is: Actian.
Merit: The categorization of software providers in Merit (lower left) represents those that did not exceed the median of performance in Customer or Product Experience or surpass the threshold for the other three categories. The providers rated Merit are: Aiven, Altair, Cumulocity, GridGain, Hazelcast, Materialize, Qubole, SAS, Striim and Tencent Cloud.
We caution that close proximity in placement should not be taken to imply that the packages evaluated are functionally identical or equally well suited for use by every enterprise or for a specific process. Although there is a high degree of commonality in how enterprises handle real-time data, there are many idiosyncrasies and differences in how they perform these functions that can make one software provider’s offering a better fit than another’s for a particular enterprise’s needs.
We advise enterprises to assess and evaluate software providers based on organizational requirements and use this research as a supplement to internal evaluation of a provider and products.
Product Experience
The process of researching products to address an enterprise’s needs should be comprehensive. Our Value Index methodology examines Product Experience and how it aligns with an enterprise’s life cycle of onboarding, configuration, operations, usage and maintenance. Too often, software providers are not evaluated on the entirety of the product; instead, they are evaluated on market execution and vision of the future, an approach that is flawed because it reflects how the provider operates rather than an enterprise’s requirements. As more software providers orient to a complete product experience, evaluations will be more robust.
The research results in Product Experience are weighted at 80%, or four-fifths, of the overall rating using the specific underlying weighted category performance. Importance was placed on the categories as follows: Usability (12.5%), Capability (30%), Reliability (12.5%), Adaptability (12.5%) and Manageability (12.5%). This weighting impacted the resulting overall ratings in this research. AWS, Google Cloud and Microsoft were designated Product Experience Leaders.
Customer Experience
A strong customer relationship with a software provider is essential to the success of the products and technology. The advancement of the Customer Experience and the entire life cycle an enterprise has with its software provider is critical for ensuring satisfaction in working with that provider. Technology providers that have chief customer officers are more likely to have greater investments in the customer relationship and focus more on customers’ success. These leaders also need to take responsibility for ensuring this commitment is made abundantly clear on the website and in the buying process and customer journey.
The research results in Customer Experience are weighted at 20%, or one-fifth, of the overall rating using the specific underlying weighted category performance as it relates to the framework of commitment and value in the software provider-customer relationship. The two evaluation categories are Validation (10%) and TCO/ROI (10%), which are weighted to represent their importance to the overall research.
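To illustrate how these category weights combine, the short sketch below computes an overall rating from the seven weighted categories described above. The weights match those stated in this guide; the per-category scores for the example provider are hypothetical.

```python
# Category weights as stated in this guide: Product Experience contributes 80%
# and Customer Experience 20% of the overall rating.
WEIGHTS = {
    "Capability": 0.30,
    "Usability": 0.125,
    "Reliability": 0.125,
    "Adaptability": 0.125,
    "Manageability": 0.125,
    "Validation": 0.10,
    "TCO/ROI": 0.10,
}
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9

# Hypothetical per-category scores for a single provider on a 0-100 scale.
scores = {
    "Capability": 82, "Usability": 75, "Reliability": 88, "Adaptability": 70,
    "Manageability": 79, "Validation": 85, "TCO/ROI": 90,
}

overall = sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)
print(f"overall weighted rating: {overall:.1f}")  # 81.1 with these example scores
```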
The software providers that rated highest overall in the aggregated and weighted Customer Experience categories are Databricks, Informatica and Oracle. These category leaders best communicate commitment and dedication to customer needs.
Software providers that did not perform well in this category were unable to provide sufficient customer references to demonstrate success or articulate their commitment to customer experience and an enterprise’s journey. The selection of a software provider means a continuous investment by the enterprise, so a holistic evaluation must include an examination of how the provider supports its customer experience.
Appendix: Software Provider Inclusion
For inclusion in the ISG Buyers Guide™ for Real-Time Data in 2025, a software provider must be in good standing financially and ethically, have at least $20 million in annual or projected revenue verified using independent sources, sell products and provide support on at least two continents and have at least 50 workers. The principal source of the relevant business unit’s revenue must be software-related, and there must have been at least one major software release in the past 12 months.
Real-time data focuses on the sharing, processing and analysis of data communicated in messages and streams of messages generated by enterprise systems, devices and applications as business events occur. Real-time data processing enables enterprises to operate at the speed of business by acting on events as they happen.
To be included in the Real-Time Data Buyers Guide, the product must enable the sharing, processing and analysis of messages and streams of data generated by enterprise systems, devices and applications. It must be actively marketed as addressing at least two of the following functional areas, which are mapped into the Buyers Guide capability model:
- Messaging and event processing
- Streaming data
- Streaming analytics
- Application integration
The research is designed to be independent of the specifics of software provider packaging and pricing. To represent the real-world environment in which businesses operate, we include providers that offer suites or packages of products that may include relevant individual modules or applications. If a software provider is actively marketing, selling and developing a product for the general market and it is reflected on the provider’s website that the product is within the scope of the research, that provider is automatically evaluated for inclusion.
All software providers that offer relevant real-time data products and meet the inclusion requirements were invited to participate in the evaluation process at no cost to them.
Software providers that meet our inclusion criteria but did not completely participate in our Buyers Guide were assessed solely on publicly available information. As this could have a significant impact on classification and ratings, we recommend additional scrutiny when evaluating those providers.
Products Evaluated
| Provider | Product Name | Version | Release |
|---|---|---|---|
| Actian | Actian DataFlow | 8.1 | January 2025 |
| Aiven | Aiven for Apache Kafka | March 2025 | March 2025 |
| | Aiven for Apache Flink | June 2024 | June 2024 |
| Alibaba Cloud | ApsaraMQ for Kafka | February 2025 | February 2025 |
| | Realtime Compute for Apache Flink | 8.0.11 | February 2025 |
| Altair | Altair Panopticon | 2025.1 | 2025 |
| | Altair AI Studio | 2025.0.1 | February 2025 |
| AWS | Amazon MQ | February 2025 | February 2025 |
| | Amazon Managed Service for Apache Flink | 1.20 | September 2024 |
| | Amazon Managed Streaming for Apache Kafka | November 2024 | November 2024 |
| | Amazon Data Firehose | November 2024 | November 2024 |
| | Amazon AppFlow | May 2024 | May 2024 |
| | Amazon API Gateway | March 2025 | March 2025 |
| | Amazon EventBridge | August 2024 | August 2024 |
| Cloud Software Group | TIBCO Enterprise Message Service | 10.4.0 | February 2025 |
| | TIBCO Spotfire Data Streams | 11.1.1 | October 2024 |
| | TIBCO Spotfire | 14.4 | June 2024 |
| | TIBCO Cloud Integration | 3.10.6.4 | April 2025 |
| Cloudera | Cloudera DataFlow | 2.9.0-h5-b2 | February 2025 |
| | Cloudera DataFlow for Data Hub | 7.3.1 | December 2024 |
| Confluent | Confluent Cloud | February 2025 | February 2025 |
| Cumulocity | Cumulocity Apama | 10.15.5 | June 2024 |
| | Cumulocity Streaming Analytics | February 2025 | February 2025 |
| Databricks | Databricks Data Intelligence Platform | April 2025 | April 2025 |
| Google Cloud | Google Cloud Managed Service for Apache Kafka | December 2024 | December 2024 |
| | Google Cloud Pub/Sub | March 2025 | March 2025 |
| | Google Cloud Dataflow | March 2025 | March 2025 |
| | Google Cloud Apigee | April 2025 | April 2025 |
| | Google Cloud Application Integration | April 2025 | April 2025 |
| GridGain | GridGain Platform | 9.0.17 | April 2025 |
| Hazelcast | Hazelcast Platform | 5.5.0 | July 2024 |
| Huawei Cloud | Huawei Distributed Message Service (DMS) for Kafka | September 2024 | September 2024 |
| | Huawei Data Lake Insight (DLI) | March 2025 | March 2025 |
| | Huawei Cloud ROMA Connect | April 2025 | April 2025 |
| IBM | IBM Event Streams | 11.6.0 | January 2025 |
| | IBM Event Processing | 1.3.0 | January 2025 |
| | IBM Cloud Pak for Integration | 16.1.1 | February 2025 |
| Informatica | Informatica Data Engineering Streaming | 10.5.8 | February 2025 |
| | Informatica Cloud Application Integration | April 2025 | April 2025 |
| Materialize | Materialize | 0.137 | March 2025 |
| Microsoft | Azure Event Hubs | December 2024 | December 2024 |
| | Azure Event Grid | February 2025 | February 2025 |
| | Microsoft Fabric Real-Time Intelligence | January 2025 | January 2025 |
| | Azure Logic Apps | January 2025 | January 2025 |
| | Azure API Management | February 2025 | February 2025 |
| | Azure Stream Analytics | January 2025 | January 2025 |
| Oracle | Oracle Cloud Infrastructure Queue | February 2024 | February 2024 |
| | Oracle Integration | February 2025 | February 2025 |
| | Oracle Cloud Infrastructure (OCI) API Gateway | December 2023 | December 2023 |
| | Oracle GoldenGate Stream Analytics | 19.1 | November 2024 |
| Qubole | Open Data Lake Platform | R64 | March 2025 |
| Redpanda | Redpanda | March 2025 | March 2025 |
| SAS | SAS Event Stream Processing | 2025.03 | March 2025 |
| Solace | Solace Platform | March 2025 | March 2025 |
| Striim | Striim Cloud | 5.0.6 | February 2025 |
| Tencent Cloud | TDMQ for CKafka | January 2025 | January 2025 |
Providers of Promise
We did not include software providers that, as a result of our research and analysis, did not satisfy the criteria for inclusion in this Buyers Guide. These are listed below as “Providers of Promise.”
| Provider | Product | Annual Revenue >$20m | Operates on 2 Continents | At Least 50 Employees | GA or Current Product |
|---|---|---|---|---|---|
| DataStax | Astra Streaming | Yes | Yes | Yes | No |
| DeltaStream | DeltaStream | No | Yes | No | Yes |
| Redpanda | Redpanda Cloud | Yes | Yes | Yes | No |
| RisingWave | RisingWave Cloud | No | Yes | No | Yes |
| StreamNative | StreamNative Cloud | No | Yes | Yes | Yes |
| Timeplus | Timeplus Enterprise | No | Yes | No | Yes |
| Ververica | Ververica Unified Streaming Data Platform | No | Yes | No | Yes |
Research Director

Matt Aslett
Director of Research, Analytics and Data
Matt Aslett leads the software research and advisory for Analytics and Data at ISG Software Research, covering software that improves the utilization and value of information. His focus areas of expertise and market coverage include analytics, data intelligence, data operations, data platforms, and streaming and events.
About ISG Software Research
ISG Software Research provides expert market insights on vertical industries, business, AI and IT through comprehensive consulting, advisory and research services with world-class industry analysts and client experience. Our ISG Buyers Guides offer comprehensive ratings and insights into technology providers and products. Explore our research at research.isg-one.com.
About ISG Research
ISG Research provides subscription research, advisory consulting and executive event services focused on market trends and disruptive technologies driving change in business computing. ISG Research delivers guidance that helps businesses accelerate growth and create more value. For more information about ISG Research subscriptions, please email contact@isg-one.com.
About ISG
ISG (Information Services Group) (Nasdaq: III) is a leading global technology research and advisory firm. A trusted business partner to more than 900 clients, including more than 75 of the world’s top 100 enterprises, ISG is committed to helping corporations, public sector organizations, and service and technology providers achieve operational excellence and faster growth. The firm specializes in digital transformation services, including AI and automation, cloud and data analytics; sourcing advisory; managed governance and risk services; network carrier services; strategy and operations design; change management; market intelligence and technology research and analysis. Founded in 2006 and based in Stamford, Conn., ISG employs 1,600 digital-ready professionals operating in more than 20 countries—a global team known for its innovative thinking, market influence, deep industry and technology expertise, and world-class research and analytical capabilities based on the industry’s most comprehensive marketplace data.
For more information, visit isg-one.com.