Executive Summary
Streaming Data
Messaging and event processing enables enterprises to capture information related to business events and communicate it to any dependent application, device or system. Although messaging and event processing can trigger specific actions in response to events, the ability of an enterprise to make real-time business decisions is enhanced through the processing of streaming data as it is communicated.
Messages generated by business events can be published sequentially in message queues or as a continuous flow of event messages (an event stream). ISG Research defines streaming data as the processing and management of continuously generated streams of event-based messages. By processing streaming data in real time, enterprises can refine and enhance streaming data and combine data from multiple event streams. Processing streaming data enables enterprises to act on the event data as it is communicated and forms the basis of streaming analytics.
Almost one-half (48%) of enterprises participating in the ISG Research Analytics and Data Benchmark Research currently use streaming data in operational processes, slightly ahead of the 44% that use streaming data in analytic processes.
Processing streaming data has been part of the data and analytics landscape for decades. Until recently, it has typically been adopted in industry segments with the most extreme high-performance requirements, such as financial services and telecommunications. In other industries, the processing of streaming data is primarily seen as a niche requirement, separate from the default focus on batch processing of data at rest.
The fundamental basis of processing streaming data is the ability to ingest a stream of events into a data processing engine. A variety of processing approaches are applied to the streaming data. Many of these processing approaches are the same as those applied to batch data processing, including data enrichment, data transformation and data filtering. The latter is particularly important for processing streaming data as it is used to sift the flow of events so that processing power is applied only to data outside of expected boundaries and, therefore, worthy of processing and analysis. Windowing can also be applied to the continuous flow of event data to divide the stream into time-based chunks to assist in identifying patterns and anomalies.
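To make these concepts concrete, the following is a minimal, engine-agnostic Python sketch of filtering and tumbling windowing over a stream of events; the event fields, threshold and window size are illustrative assumptions rather than any particular product's API.

```python
# Minimal sketch: filter a stream of sensor events and group the survivors
# into 60-second tumbling windows. Field names and thresholds are illustrative.
from collections import defaultdict

WINDOW_SECONDS = 60
EXPECTED_MAX = 75.0  # events at or below this value are considered "normal"

def process(events):
    """events: iterable of dicts like {"ts": 1714060812, "sensor": "s1", "value": 82.3}"""
    windows = defaultdict(list)
    for event in events:
        # Filtering: only keep events outside expected boundaries.
        if event["value"] <= EXPECTED_MAX:
            continue
        # Windowing: assign the event to a time-based chunk.
        window_start = event["ts"] - (event["ts"] % WINDOW_SECONDS)
        windows[(window_start, event["sensor"])].append(event["value"])
    # Summarize each window to help spot patterns and anomalies.
    return {key: (len(vals), max(vals)) for key, vals in windows.items()}

sample = [
    {"ts": 1714060812, "sensor": "s1", "value": 82.3},
    {"ts": 1714060815, "sensor": "s1", "value": 70.1},
    {"ts": 1714060881, "sensor": "s2", "value": 90.0},
]
print(process(sample))  # {(1714060800, 's1'): (1, 82.3), (1714060860, 's2'): (1, 90.0)}
```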
In addition to processing data from individual sources, streaming data may also involve unifying streams from multiple data sources. In its simplest form, this unification can result in data from multiple streams summarized in unison. More advanced cases involve data from various sources being joined and integrated into a combined stream.
Simple processing approaches such as filtering and basic transformations can be applied in a stateless stream processing environment where each event is processed independently of the others. More complex approaches, such as aggregations and joins, require stateful stream processing that retains the context of previously processed events, and stream processing guarantees that ensure messages are processed even if there are system or performance failures.
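As an illustration of the distinction, the sketch below contrasts a stateless filter with a stateful join of two hypothetical streams (orders and payments) in plain Python; the event shapes are assumptions for illustration. In a production engine, the per-key state would be checkpointed so that processing guarantees hold across system or performance failures.

```python
# Minimal sketch of stateless vs. stateful processing. A stateless filter needs
# no memory of prior events; a stateful join of orders and payments must retain
# per-key state until both sides arrive. Event shapes are illustrative.

def stateless_filter(event):
    # Each event is handled independently of all others.
    return event.get("amount", 0) > 100

class OrderPaymentJoin:
    """Stateful operator: buffers one side until the matching event arrives."""
    def __init__(self):
        self.pending_orders = {}    # order_id -> order event (retained state)
        self.pending_payments = {}  # order_id -> payment event

    def on_order(self, order):
        payment = self.pending_payments.pop(order["order_id"], None)
        if payment is not None:
            return {"order_id": order["order_id"], "ordered": order["amount"], "paid": payment["amount"]}
        self.pending_orders[order["order_id"]] = order  # wait for the payment
        return None

    def on_payment(self, payment):
        order = self.pending_orders.pop(payment["order_id"], None)
        if order is not None:
            return {"order_id": order["order_id"], "ordered": order["amount"], "paid": payment["amount"]}
        self.pending_payments[payment["order_id"]] = payment
        return None

join = OrderPaymentJoin()
print(join.on_order({"order_id": "o1", "amount": 120.0}))    # None: waiting for the payment
print(join.on_payment({"order_id": "o1", "amount": 120.0}))  # joined record
```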
While processing and analyzing event data in isolation is valuable, success with streaming data relies on the holistic management and governance of data in motion and data at rest. Integration with more traditional batch data processing technologies is, therefore, important to streaming data. This includes stream-table duality to maintain compatibility with database tables, the ability to materialize streaming data into an external database or data store for long-term persistence, and the analysis of real-time data streams alongside batches of historical event data.
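One hedged way to picture stream-table duality is the sketch below, which folds a changelog stream into the latest value per key and materializes the result into SQLite as a stand-in for any external database; the table schema and field names are illustrative assumptions.

```python
# Minimal sketch of stream-table duality: a changelog stream of key/value updates
# is folded into a table holding the latest value per key, then materialized to
# SQLite (standing in for any external database). Schema and fields are illustrative.
import sqlite3

updates = [  # a changelog stream: later events supersede earlier ones per key
    ("cust-1", "bronze"),
    ("cust-2", "silver"),
    ("cust-1", "gold"),
]

# Stream -> table: retain only the latest state per key.
table = {}
for key, value in updates:
    table[key] = value

# Materialize the table for long-term persistence and batch analysis.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_tier (customer_id TEXT PRIMARY KEY, tier TEXT)")
conn.executemany(
    "INSERT INTO customer_tier VALUES (?, ?) "
    "ON CONFLICT(customer_id) DO UPDATE SET tier = excluded.tier",
    table.items(),
)
print(conn.execute("SELECT * FROM customer_tier ORDER BY customer_id").fetchall())
# [('cust-1', 'gold'), ('cust-2', 'silver')]
```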
As with batch data processing, creating streaming data pipelines requires both code-based development tools and low- or no-code visual development environments. Streaming data pipelines also need to be managed and monitored to ensure they are performing as intended.
Stream management involves monitoring stream processing performance metrics via reports, dashboards and alerts, as well as capabilities to manage the scalability and fault tolerance of streaming processor technologies. Integration with external monitoring and observability tools is also important to ensure that stream processing infrastructure is not monitored and managed in isolation.
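A minimal sketch of this kind of monitoring is shown below: a few assumed stream processing metrics (consumer lag, latency, failure rate) are checked against alert thresholds. The metric names and thresholds are illustrative, and in practice such checks would typically feed external observability tools.

```python
# Minimal sketch of stream management monitoring: evaluate processing metrics
# against alert thresholds. Metric names and thresholds are illustrative.
THRESHOLDS = {
    "consumer_lag_messages": 10_000,   # backlog the consumers have not yet processed
    "processing_latency_ms_p99": 500,  # 99th-percentile end-to-end latency
    "failed_events_per_min": 10,
}

def check_metrics(metrics):
    alerts = []
    for name, limit in THRESHOLDS.items():
        value = metrics.get(name)
        if value is not None and value > limit:
            alerts.append(f"ALERT: {name}={value} exceeds threshold {limit}")
    return alerts

sample = {"consumer_lag_messages": 42_000, "processing_latency_ms_p99": 180, "failed_events_per_min": 3}
for alert in check_metrics(sample):
    print(alert)  # ALERT: consumer_lag_messages=42000 exceeds threshold 10000
```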
Governance of streaming data is an important capability that becomes critical as enterprises grow increasingly reliant on streaming data. As is the case with batch data, functionality for monitoring and managing data quality and data lineage is required to maintain trust in streaming data.
Data quality capabilities include the ability to define, monitor and enforce data quality rules. Data lineage functionality enables enterprises to keep track of where event data originated from, as well as when and how it has been transformed or changed and by whom.
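The sketch below illustrates, under assumed field names and rules, how data quality rules might be defined and enforced on event records while attaching a simple lineage record of origin, transformation, actor and time; it is not any specific provider's governance API.

```python
# Minimal sketch of defining and enforcing data quality rules on event records,
# while recording simple lineage (origin, transformation, actor and time).
# Rule definitions and field names are illustrative.
from datetime import datetime, timezone

RULES = {
    "order_id": lambda v: isinstance(v, str) and len(v) > 0,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
    "currency": lambda v: v in {"USD", "EUR", "GBP"},
}

def validate(event):
    # Return the fields that violate a quality rule; empty list means the event passes.
    return [field for field, rule in RULES.items() if not rule(event.get(field))]

def with_lineage(event, source, transformation, actor):
    event = dict(event)
    event["_lineage"] = {
        "source": source,
        "transformation": transformation,
        "changed_by": actor,
        "changed_at": datetime.now(timezone.utc).isoformat(),
    }
    return event

record = {"order_id": "o-17", "amount": 25.0, "currency": "USD"}
assert validate(record) == []  # passes all quality rules
enriched = with_lineage(record, "orders-topic", "currency_normalization", "pipeline-v3")
print(enriched["_lineage"]["source"])  # orders-topic
```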
The ability for users to discover data streams on a self-service basis encourages the use of event data and the development of event-driven applications. However, it depends on a catalog of event streams, as well as access management capabilities and the tagging of both technical and business metadata. Additionally, schema management and a registry of streaming data schemas are essential to verifying and keeping track of the rules that define the structure of streaming data.
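As a simplified illustration of schema management, the following toy in-memory registry versions schemas per subject and validates events against the registered structure before they are published; real schema registries (for example, those supporting Avro or JSON Schema) are considerably richer.

```python
# Minimal sketch of a schema registry for event streams: schemas are versioned
# per subject, and producers validate events against the registered structure
# before publishing. This is a toy in-memory registry, not any specific product.
REGISTRY = {}  # subject -> list of schema versions

def register_schema(subject, schema):
    REGISTRY.setdefault(subject, []).append(schema)
    return len(REGISTRY[subject])  # version number

def validate_event(subject, event, version=-1):
    schema = REGISTRY[subject][version]
    # Report fields that are missing or of the wrong type; empty list means conformant.
    return [f for f, ftype in schema.items() if not isinstance(event.get(f), ftype)]

version = register_schema("orders", {"order_id": str, "amount": float, "currency": str})
good = {"order_id": "o-1", "amount": 19.99, "currency": "EUR"}
bad = {"order_id": "o-2", "amount": "19.99"}  # wrong type, missing field
print(version, validate_event("orders", good), validate_event("orders", bad))
# 1 [] ['amount', 'currency']
```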
Emerging requirements for processing streaming data include the ability to incorporate machine learning and AI model inference into streaming data pipelines. This includes the ability to make calls to external artificial intelligence services to coordinate data processing and AI workflows and to ensure that AI models have access to streaming data as it is updated in real time. ISG asserts that through 2027, streaming and event software providers will accelerate integration with AI and GenAI services to facilitate the development of interactive real-time applications.
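A hedged sketch of this pattern is shown below: each event in a streaming pipeline is enriched by calling an external inference service. The endpoint URL, payload shape and response fields are hypothetical placeholders rather than any particular provider's API.

```python
# Minimal sketch of calling an external AI inference service for each event in a
# streaming pipeline. The endpoint URL, payload shape and response fields are
# hypothetical placeholders, not any particular provider's API.
import json
import urllib.request

INFERENCE_URL = "https://example.com/v1/score"  # hypothetical model endpoint

def score_event(event):
    payload = json.dumps({"features": event}).encode("utf-8")
    request = urllib.request.Request(
        INFERENCE_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(request, timeout=2) as response:
        result = json.loads(response.read())
    return {**event, "fraud_score": result.get("score")}

def scoring_stage(events):
    for event in events:
        try:
            yield score_event(event)              # enrich the event with the model's output
        except Exception:
            yield {**event, "fraud_score": None}  # degrade gracefully if the service is unavailable
```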
Processing streaming data also forms the basis of streaming analytics, which uses streaming compute engines to analyze streams of event data using SQL queries and real-time materialized views. This includes chart-based visualization of streaming data and additional support for native ML and GenAI model inferencing, as well as more advanced capabilities such as retrieval augmented generation.
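The sketch below illustrates the idea of a real-time materialized view: the standing result of a SQL-style "SUM(amount) GROUP BY region" aggregation is updated incrementally as each event arrives, rather than recomputed from scratch. The event fields are illustrative assumptions.

```python
# Minimal sketch of a real-time materialized view: the running result of a
# "SELECT region, SUM(amount) ... GROUP BY region" style query is maintained
# incrementally as each event arrives.
from collections import defaultdict

class RevenueByRegionView:
    def __init__(self):
        self.totals = defaultdict(float)  # the materialized view state

    def apply(self, event):
        # Incremental maintenance: fold one event into the standing result.
        self.totals[event["region"]] += event["amount"]

    def query(self):
        return dict(self.totals)

view = RevenueByRegionView()
for event in [{"region": "EMEA", "amount": 120.0},
              {"region": "APAC", "amount": 75.5},
              {"region": "EMEA", "amount": 30.0}]:
    view.apply(event)
print(view.query())  # {'EMEA': 150.0, 'APAC': 75.5}
```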
The ISG Buyers Guide™ for Streaming Data evaluates products based on core capabilities for event streaming, stream processing, stream management and stream governance. To be included in this Buyers Guide, products must offer functionality in each of these areas. Our assessment also considered whether the functionality in question was available from a software provider in a single offering or as a suite of products or cloud services.
This research evaluates the following software providers that offer products that address key elements of streaming data as we define it: Actian, Aiven, Alibaba Cloud, Altair, AWS, Cloud Software Group, Cloudera, Confluent, Databricks, Google Cloud, Cumulocity, GridGain, Hazelcast, Huawei Cloud, IBM, Informatica, Kurrent, Materialize, Microsoft, MongoDB, Palantir, Qubole, Redpanda, SAS, Solace, Striim and Tencent Cloud.
Buyers Guide Overview
For over two decades, ISG Research has conducted market research in a spectrum of areas across business applications, tools and technologies. We have designed the Buyers Guide to provide a balanced perspective of software providers and products that is rooted in an understanding of the business requirements in any enterprise. Utilization of our research methodology and decades of experience enables our Buyers Guide to be an effective method to assess and select software providers and products. The findings of this research undertaking contribute to our comprehensive approach to rating software providers in a manner that is based on the assessments completed by an enterprise.
The ISG Buyers Guide™ for Streaming Data is the distillation of over a year of market and product research efforts. It is an assessment of how well software providers’ offerings address enterprises’ requirements for streaming data software. The index is structured to support a request for information (RFI) that could be used in the request for proposal (RFP) process by incorporating all criteria needed to evaluate, select, utilize and maintain relationships with software providers. An effective product and customer experience with a provider can ensure the best long-term relationship and value achieved from a resource and financial investment.
In this Buyers Guide, ISG Research evaluates the software in seven key categories that are weighted to reflect buyers’ needs based on our expertise and research. Five are product-experience related: Adaptability, Capability, Manageability, Reliability, and Usability. In addition, we consider two customer-experience categories: Validation, and Total Cost of Ownership/Return on Investment (TCO/ROI). To assess functionality, one of the components of Capability, we applied the ISG Research Value Index methodology and blueprint, which links the personas and processes for streaming data to an enterprise’s requirements.
The structure of the research reflects our understanding that the effective evaluation of software providers and products involves far more than just examining product features, potential revenue or customers generated from a provider’s marketing and sales efforts. We believe it is important to take a comprehensive, research-based approach, since making the wrong choice of streaming data technology can raise the total cost of ownership, lower the return on investment and hamper an enterprise’s ability to reach its full performance potential. In addition, this approach can reduce the project’s development and deployment time and eliminate the risk of relying on a short list of software providers that does not represent a best fit for your enterprise.
ISG Research believes that an objective review of software providers and products is a critical business strategy for the adoption and implementation of streaming data software and applications. An enterprise’s review should include a thorough analysis of both what is possible and what is relevant. We urge enterprises to do a thorough job of evaluating streaming data systems and tools and offer this Buyers Guide as both the results of our in-depth analysis of these providers and as an evaluation methodology.
How To Use This Buyers Guide
Evaluating Software Providers: The Process
We recommend using the Buyers Guide to assess and evaluate new or existing software providers for your enterprise. The market research can be used as an evaluation framework to establish a formal request for information from providers on products and customer experience and will shorten the cycle time when creating an RFI. The steps listed below provide a process that can facilitate best possible outcomes.
- Define the business case and goals. Define the mission and business case for investment and the expected outcomes from your organizational and technology efforts.
- Specify the business needs. Defining the business requirements helps identify what specific capabilities are required with respect to people, processes, information and technology.
- Assess the required roles and responsibilities. Identify the individuals required for success at every level of the organization, from executives to front-line workers, and determine the needs of each.
- Outline the project’s critical path. What needs to be done, in what order and who will do it? This outline should make clear the prior dependencies at each step of the project plan.
- Ascertain the technology approach. Determine the business and technology approach that most closely aligns to your organization’s requirements.
- Establish technology vendor evaluation criteria. Utilize the product experience categories of Adaptability, Capability, Manageability, Reliability and Usability, and the customer experience categories of TCO/ROI and Validation.
- Evaluate and select the technology properly. Weight the categories in the technology evaluation criteria to reflect your organization’s priorities to determine the short list of vendors and products.
- Establish the business initiative team to start the project. Identify who will lead the project and the members of the team needed to plan and execute it with timelines, priorities and resources.
The Findings
All of the products we evaluated are feature-rich, but not all the capabilities offered by a software provider are equally valuable to every type of worker or support everything needed to manage products on a continuous basis. Moreover, the existence of too many capabilities may be a negative factor for an enterprise if it introduces unnecessary complexity. Nonetheless, you may decide that a larger number of features in the product is a plus, especially if some of them match your enterprise’s established practices or support an initiative that is driving the purchase of new software.
Factors beyond features, functions and software provider assessments may also become deciding factors. For example, an enterprise may face budget constraints such that the TCO evaluation can tip the balance to one provider or another. This is where the Value Index methodology and the appropriate category weighting can be applied to determine the best fit of software providers and products to your specific needs.
Overall Scoring of Software Providers Across Categories
The research finds Databricks atop the list, followed by AWS and Microsoft. Providers that place in the top three of a category earn the designation of Leader. Informatica has done so in six categories; Databricks in four; Google Cloud in three; Microsoft and Actian in two; and Cloudera, Confluent, Cumulocity, Solace and Striim in one category.
The overall representation of the research below places the ratings of the Product Experience and Customer Experience on the x and y axes, respectively, to provide a visual representation and classification of the software providers. Providers whose Product Experience has a higher weighted performance in aggregate across the five product categories place farther to the right, while the performance and weighting of the two Customer Experience categories determine placement on the vertical axis. In short, software providers that place closer to the upper right of this chart performed better than those closer to the lower left.
The research places software providers into one of four overall categories: Assurance, Exemplary, Merit or Innovative. This representation classifies providers’ overall weighted performance.
Exemplary: The categorization and placement of software providers in Exemplary (upper right) represent those that performed the best in meeting the overall Product and Customer Experience requirements. The providers rated Exemplary are: AWS, Cloudera, Confluent, Databricks, Google Cloud, IBM, Informatica, Microsoft and MongoDB.
Innovative: The categorization and placement of software providers in Innovative (lower right) represent those that performed the best in meeting the overall Product Experience requirements but did not achieve the highest levels of requirements in Customer Experience. The providers rated Innovative are: Alibaba Cloud, Cloud Software Group, Cumulocity, Huawei Cloud, Palantir, Redpanda, SAS and Striim.
Assurance: The categorization and placement of software providers in Assurance (upper left) represent those that achieved the highest levels in the overall Customer Experience requirements but did not achieve the highest levels of Product Experience. The providers rated Assurance are: Actian and Solace.
Merit: The categorization of software providers in Merit (lower left) represents those that did not exceed the median of performance in Customer or Product Experience or surpass the threshold for the other three categories. The providers rated Merit are: Aiven, Altair, GridGain, Hazelcast, Kurrent, Materialize, Qubole and Tencent Cloud.
We caution that close placement of providers on the chart should not be taken to imply that the packages evaluated are functionally identical or equally well suited for use by every enterprise or for a specific process. Although there is a high degree of commonality in how enterprises handle streaming data, there are many idiosyncrasies and differences in how they perform these functions that can make one software provider’s offering a better fit than another’s for a particular enterprise’s needs.
We advise enterprises to assess and evaluate software providers based on organizational requirements and use this research as a supplement to internal evaluation of a provider and products.
Product Experience
The process of researching products to address an enterprise’s needs should be comprehensive. Our Value Index methodology examines Product Experience and how it aligns with an enterprise’s life cycle of onboarding, configuration, operations, usage and maintenance. Too often, software providers are not evaluated for the entirety of the product; instead, they are evaluated on market execution and vision of the future, which are flawed measures because they reflect how the provider operates rather than an enterprise’s requirements. As more software providers orient to a complete product experience, evaluations will be more robust.
The Product Experience results account for 80%, or four-fifths, of the overall rating, based on the specific underlying weighted category performance. Importance was placed on the categories as follows: Usability (10%), Capability (40%), Reliability (10%), Adaptability (10%) and Manageability (10%). This weighting impacted the resulting overall ratings in this research. Databricks, Microsoft and AWS were designated Product Experience Leaders. While not a Leader, Google Cloud was also found to meet a broad range of enterprise product experience requirements.
Customer Experience
A customer relationship with a software provider is essential to the success of the products and technology. The advancement of the Customer Experience and the entire life cycle an enterprise has with its software provider is critical for ensuring satisfaction in working with that provider. Technology providers that have chief customer officers are more likely to have greater investments in the customer relationship and focus more on customers’ success. These leaders also need to take responsibility for ensuring this commitment is made abundantly clear on the website and in the buying process and customer journey.
The Customer Experience results account for 20%, or one-fifth, of the overall rating, based on the specific underlying weighted category performance as it relates to the framework of commitment and value in the software provider-customer relationship. The two evaluation categories are Validation (10%) and TCO/ROI (10%), which are weighted to represent their importance to the overall research.
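Purely as an illustration of how the stated category weights could combine, the sketch below computes a weighted overall rating from hypothetical category scores; the example scores are invented and the actual ISG Value Index aggregation may differ.

```python
# Illustrative arithmetic only: how the stated category weights (Product
# Experience 80%, Customer Experience 20%) could combine into an overall
# rating. The example category scores are invented; the actual ISG Value
# Index aggregation may differ.
WEIGHTS = {
    "Capability": 0.40, "Usability": 0.10, "Reliability": 0.10,
    "Adaptability": 0.10, "Manageability": 0.10,   # Product Experience = 80%
    "Validation": 0.10, "TCO/ROI": 0.10,           # Customer Experience = 20%
}

example_scores = {  # hypothetical 0-100 category scores for one provider
    "Capability": 85, "Usability": 78, "Reliability": 90, "Adaptability": 70,
    "Manageability": 75, "Validation": 80, "TCO/ROI": 72,
}

overall = sum(WEIGHTS[c] * example_scores[c] for c in WEIGHTS)
print(round(overall, 1))  # 80.5
```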
The software providers that were rated the highest overall in the aggregated and weighted Customer Experience categories are Databricks, Informatica and Solace. These category leaders best communicate commitment and dedication to customer needs.
Software providers that did not perform well in this category were unable to provide sufficient customer references to demonstrate success or articulate their commitment to customer experience and an enterprise’s journey. The selection of a software provider means a continuous investment by the enterprise, so a holistic evaluation must include examination of how the provider supports the customer experience.
Appendix: Software Provider Inclusion
For inclusion in the ISG Buyers Guide™ for Streaming Data in 2025, a software provider must be in good standing financially and ethically, have at least $20 million in annual or projected revenue verified using independent sources, sell products and provide support on at least two continents and have at least 50 employees. The principal source of the relevant business unit’s revenue must be software-related, and there must have been at least one major software release in the last 12 months.
The product must enable the processing and management of continuously generated streams of event-based messages. To be included in the Streaming Data Buyers Guide requires functionality that addresses the following sections of the capabilities model:
- Event streaming
- Stream processing
- Stream management
- Stream governance
The research is designed to be independent of the specifics of software provider packaging and pricing. To represent the real-world environment in which businesses operate, we include providers that offer suites or packages of products that may include relevant individual modules or applications. If a software provider is actively marketing, selling and developing a product for the general market and it is reflected on the provider’s website that the product is within the scope of the research, that provider is automatically evaluated for inclusion.
All software providers that offer relevant streaming data products and meet the inclusion requirements were invited to participate in the evaluation process at no cost to them.
Software providers that meet our inclusion criteria but did not completely participate in our Buyers Guide were assessed solely on publicly available information. As this could have a significant impact on classification and ratings, we recommend additional scrutiny when evaluating those providers.
Products Evaluated
| Provider | Product Names | Version | Release |
|---|---|---|---|
| Actian | Actian DataFlow | 8.1 | January 2025 |
| Aiven | Aiven for Apache Kafka; Aiven for Apache Flink | March 2025; June 2024 | March 2025; June 2024 |
| Alibaba Cloud | ApsaraMQ for Kafka; Realtime Compute for Apache Flink | February 2025; 8.0.11 | February 2025; January 2025 |
| Altair | Altair Panopticon | 2025.1 | 2025 |
| AWS | Amazon Managed Service for Apache Flink; Amazon Managed Streaming for Apache Kafka; Amazon Data Firehose | 1.20; November 2024; November 2024 | September 2024; November 2024; November 2024 |
| Cloud Software Group | TIBCO Spotfire Data Streams | 11.1.1 | October 2024 |
| Cloudera | Cloudera DataFlow; Cloudera Data Flow for Data Hub | 2.9.0-h5-b2; 7.3.1 | February 2025; December 2024 |
| Confluent | Confluent Cloud | February 2025 | February 2025 |
| Cumulocity | Cumulocity Apama | 10.15.5 | June 2024 |
| Databricks | Databricks Data Intelligence Platform | April 2025 | April 2025 |
| Google Cloud | Google Cloud Managed Service for Apache Kafka; Google Cloud Dataflow | December 2024; April 2025 | December 2024; April 2025 |
| GridGain | GridGain Platform | 9.0.17 | April 2025 |
| Hazelcast | Hazelcast Platform | 5.5.0 | July 2024 |
| Huawei Cloud | Huawei Distributed Message Service (DMS) for Kafka | March 2025 | March 2025 |
| IBM | IBM Event Streams; IBM Event Processing | 11.6.0; 1.3.0 | January 2025; January 2025 |
| Informatica | Informatica Data Engineering Streaming | 10.5.8 | February 2025 |
| Kurrent | Kurrent | 25.0.0 | March 2025 |
| Materialize | Materialize | 0.137 | March 2025 |
| Microsoft | Azure Event Hubs; Microsoft Fabric Real-Time Intelligence; Azure Stream Analytics | December 2024; January 2025; January 2025 | December 2024; January 2025; January 2025 |
| MongoDB | Atlas Stream Processing | January 2025 | January 2025 |
| Palantir | Foundry | February 2025 | February 2025 |
| Qubole | Open Data Lake Platform | R64 | March 2025 |
| Redpanda | Redpanda Cloud | March 2025 | March 2025 |
| SAS | SAS Event Stream Processing | 2025.03 | March 2025 |
| Solace | Solace Platform | March 2025 | March 2025 |
| Striim | Striim Cloud | 5.0.6 | February 2025 |
| Tencent Cloud | TDMQ for CKafka | January 2025 | January 2025 |
Providers of Promise
We did not include software providers that, as a result of our research and analysis, did not satisfy the criteria for inclusion in this Buyers Guide. These are listed below as “Providers of Promise.”
| Provider | Product | Annual Revenue >$20m | Operates on 2 Continents | At Least 50 Employees | GA or Current Product |
|---|---|---|---|---|---|
| DataStax | Astra Streaming | Yes | Yes | Yes | No |
| DeltaStream | DeltaStream | No | Yes | No | Yes |
| Oracle | OCI Streaming with Apache Kafka | Yes | Yes | Yes | No |
| RisingWave | RisingWave Cloud | No | Yes | No | Yes |
| StreamNative | StreamNative Cloud | No | Yes | Yes | Yes |
| Timeplus | Timeplus Enterprise | No | Yes | No | Yes |
| Ververica | Ververica Unified Streaming Data Platform | No | Yes | No | Yes |
Research Director

Matt Aslett
Director of Research, Analytics and Data
Matt Aslett leads the software research and advisory for Analytics and Data at ISG Software Research, covering software that improves the utilization and value of information. His focus areas of expertise and market coverage include analytics, data intelligence, data operations, data platforms, and streaming and events.
About ISG Software Research
ISG Software Research provides expert market insights on vertical industries, business, AI and IT through comprehensive consulting, advisory and research services with world-class industry analysts and client experience. Our ISG Buyers Guides offer comprehensive ratings and insights into technology providers and products. Explore our research at research.isg-one.com.
About ISG Research
ISG Research provides subscription research, advisory consulting and executive event services focused on market trends and disruptive technologies driving change in business computing. ISG Research delivers guidance that helps businesses accelerate growth and create more value. For more information about ISG Research subscriptions, please email contact@isg-one.com.
About ISG
ISG (Information Services Group) (Nasdaq: III) is a leading global technology research and advisory firm. A trusted business partner to more than 900 clients, including more than 75 of the world’s top 100 enterprises, ISG is committed to helping corporations, public sector organizations, and service and technology providers achieve operational excellence and faster growth. The firm specializes in digital transformation services, including AI and automation, cloud and data analytics; sourcing advisory; managed governance and risk services; network carrier services; strategy and operations design; change management; market intelligence and technology research and analysis. Founded in 2006 and based in Stamford, Conn., ISG employs 1,600 digital-ready professionals operating in more than 20 countries—a global team known for its innovative thinking, market influence, deep industry and technology expertise, and world-class research and analytical capabilities based on the industry’s most comprehensive marketplace data.
For more information, visit isg-one.com.