ISG Software Research Analyst Perspectives

The Benefits and Risks of OLTP-On-Lakehouse Data Platforms

Written by Matt Aslett | May 5, 2026 10:00:00 AM

The emergence of cloud computing has had an enormous impact on all segments of the IT industry, including data platforms. All data platform providers have enabled their products to be deployed in the cloud and/or consumed as cloud-hosted managed services. To date, the cloud has arguably had the largest impact on analytic data platforms, where cloud infrastructure led to the emergence of a new analytic data platform architecture: the data lakehouse. Now ubiquitous, the data lakehouse decoupled compute and storage and enabled enterprises to take advantage of data lakes based on cloud object storage combined with open file formats, open table formats and analytic data processing engines. Many data platform providers are now looking to bring some of the same advantages of the data lakehouse to operational data platforms.

ISG Research defines operational data platforms as data platforms used to manage the storage and processing of data generated by enterprise applications that run the business, including finance, operations, sales and customer experience systems. These have traditionally been known as Online Transaction Processing (OLTP) workloads. Running operational data platforms on the cloud is by no means new. All providers assessed in ISG’s recent 2026 Operational Data Platforms Buyers Guide offer products that can be deployed on public cloud infrastructure, and 93% offer products that can be consumed as a cloud-based managed service. Most of these cloud-based operational data platform products are based on an architecture in which the data processing compute layer is loosely coupled with dedicated storage. A new breed of products has emerged in recent months that combines the operational data processing compute layer with the data lakehouse. One of the primary proponents of this approach, Databricks, has described this architecture as a lakebase. It is unclear whether this name will become more widely adopted to describe the concept, not least because Databricks is using it as the branding for its own Databricks Lakebase product.

Databricks was also one of the prominent early users of the term data lakehouse, and for some time it marketed what is now its Databricks Data Intelligence Platform as the Databricks Data Lakehouse Platform. However, the term lakehouse was already in use by several other providers at the time. In contrast, the only other provider that I am aware of using the term lakebase is Onehouse, which in February announced Onehouse LakeBase to provide a low-latency serving layer for open lakehouse tables. Despite the name, Onehouse LakeBase does not meet Databricks’ definition of a lakebase. Rather than a fully fledged operational database on the lakehouse, Onehouse LakeBase is best described as a Postgres-compatible endpoint providing index and serving extensions that enable low-latency processing of lakehouse data.

Several other providers are delivering products that are more aligned with the lakebase concept as Databricks has defined it without using the term. Examples include Microsoft with Fabric Databases and Snowflake with Snowflake Postgres. I would also add Oracle with Oracle Autonomous AI Lakehouse. Although it is primarily being positioned for analytics workloads, it enables interoperability between lakehouse data in Apache Iceberg tables and all Oracle Database workloads—both analytic and operational. I anticipate multiple other providers will deliver OLTP-on-lakehouse offerings, whether that is through providers combining existing operational database and lakehouse products, lakehouse specialists adding operational databases, or operational database specialists providing integration with partner lakehouse offerings. I assert that through 2028, data platform providers will prioritize the development of hybrid operational and analytic processing functionality to meet the requirements of intelligent applications driven by agentic artificial intelligence (AI) and generative AI (GenAI). OLTP-on-lakehouse is one method of addressing those requirements.

The primary claimed advantages of OLTP-on-lakehouse are rooted in the decoupling of the storage layer from the compute layer. This enables the use of relatively low-cost and highly durable cloud object storage, as well as independent scaling of compute and storage to meet demand and avoid unnecessary overprovisioning costs. That means scaling down as well as up, including the ability to scale the data processing layer down to zero to better align costs with usage. Using cloud object storage as a shared storage layer for both analytic and operational workloads also reduces data replication requirements by enabling operational and analytic workloads to read and write the same data. In combination, these claimed advantages also facilitate support for AI agents, including the automated provisioning and management of databases by those agents. This is among the evolving requirements for operational data platforms I identified late last year. Many OLTP-on-lakehouse products also deliver other capabilities discussed in that analyst perspective, including database cloning and branching to enable integration with continuous integration and continuous delivery (CI/CD) processes.
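To make the overprovisioning argument concrete, the sketch below compares a coupled architecture, where a database instance and its attached storage must run continuously, with a decoupled one, where compute is billed only while active and object storage is billed separately. All prices and usage figures are hypothetical, chosen purely for illustration; they do not reflect any provider's actual pricing.

```python
# Illustrative cost comparison: coupled vs. decoupled compute/storage.
# All rates and workload figures below are hypothetical assumptions.

HOURS_PER_MONTH = 730

# Hypothetical workload: compute is busy 8 hours/day for 30 days;
# the storage footprint is 2 TB throughout the month.
busy_hours = 8 * 30   # 240 compute-hours of actual demand
storage_tb = 2

# Coupled architecture: compute and storage are provisioned together,
# so the instance (with its bundled storage) is billed all month.
coupled_instance_per_hour = 1.50  # hypothetical $/hour, compute + storage
coupled_cost = coupled_instance_per_hour * HOURS_PER_MONTH

# Decoupled architecture: compute scales to zero when idle and is billed
# only for busy hours; cloud object storage is billed per TB-month.
compute_per_hour = 1.20             # hypothetical $/hour while running
object_storage_per_tb_month = 23.0  # hypothetical $/TB-month
decoupled_cost = (compute_per_hour * busy_hours
                  + object_storage_per_tb_month * storage_tb)

print(f"coupled:   ${coupled_cost:,.2f}/month")
print(f"decoupled: ${decoupled_cost:,.2f}/month")
```

Under these assumed figures the decoupled model costs roughly a third as much per month, and the gap widens as the workload becomes burstier, which is the core of the scale-to-zero argument.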

One of the other claimed advantages of OLTP-on-lakehouse is freedom from lock-in. The validity of that claim is open to question, however. It is correct that the separation of compute and storage, along with support for open standards and formats, has some theoretical benefits in relation to freedom and choice. However, the benefits of OLTP-on-lakehouse are clearly dependent on having the same provider for analytic and operational data platform workloads, which by default increases the theoretical risk of lock-in to a specific provider. I recommend that enterprises evaluating operational data platform providers include OLTP-on-lakehouse products in their assessments and evaluate them in relation to the claimed benefits as well as potential risks and challenges. Additionally, while emerging requirements for operational data platforms are being incorporated into OLTP-on-lakehouse products, they should also be evaluated in relation to core traditional database functionality to address data persistence, data processing, data analytics, data administration and data development.

Regards,

Matt Aslett