SAP to acquire data lakehouse vendor Dremio

May 4, 2026, 23:56

SAP on Monday announced plans to acquire Dremio, which bills itself as an agentic lakehouse company, for an unspecified price. The move is complicated by similar offerings from existing SAP partners Snowflake and Databricks, but analysts point to key differences with Dremio, especially in its ability to work with data while it sits in the enterprise’s environment, rather than having to live externally.

One of SAP’s justifications for the acquisition is that it will theoretically make it easier for IT executives to combine SAP data with non-SAP data. But its strongest rationale involves Dremio’s ability to make complex data more AI-friendly, so that it can more quickly and cost-effectively be made usable. 

“Most enterprise AI projects fail to deliver value not because of the AI itself, but because the underlying data is fragmented, locked in proprietary formats and stripped of the business context that makes it meaningful,” the SAP announcement said. “The result is a familiar and costly pattern: pilots that cannot scale, slow integration of new data sources, duplicated engineering work and compliance risk when organizations cannot explain how an AI-driven decision was reached. Dremio helps eliminate that data fragmentation and integration friction.”

While SAP cites the data quality argument, many elements of enterprise data quality are not addressed by Dremio, including data that is outdated, comes from unreliable sources, or lacks meaningful context.

However, SAP said, “With Dremio, SAP Business Data Cloud will become an Apache Iceberg-native enterprise lakehouse that unifies SAP and non-SAP data to power agentic AI at enterprise scale. Apache Iceberg is the industry-standard open table format, and SAP Business Data Cloud will natively support it as its foundation.” This means that there need be no data movement or format conversion; SAP and non-SAP data “can coexist on the same open foundation, with federated analytical reach across every enterprise data source.”

Complicated comparison

Analysts and consultants said that any comparison of Dremio to existing SAP partners Snowflake and Databricks is complicated. For example, Dremio is younger and less established than either Snowflake or Databricks, which suggests that it is a less ideal match for enterprises. 

SAP strategy specialist Harikishore Sreenivasalu, CEO of Aarini Consulting in the Netherlands, said that both Snowflake and Databricks would have been ideal acquisition targets many years ago, but they would be far too expensive today. 

“Databricks and Snowflake are better [for enterprise IT] for sure because they have a mature platform, they do multi cloud” whereas Dremio “is the new entrant in the market and they have to mature more to be enterprise ready. Their security aspects need to mature,” Sreenivasalu told CIO.

But Sreenivasalu added that the situation could easily change after SAP invests and works with the Dremio team. He advised CIOs to “stick with where you are today but watch how technologies get integrated. Listen to the SAP roadmap.”

In a LinkedIn post, Sreenivasalu said the move still is very positive for SAP: “This is the missing piece. SAP has Joule. SAP has BTP. SAP has the business processes. Now it has the open data fabric to feed AI agents the context they need to act, not just answer. For those of us building on SAP BTP + Databricks + SAP BDC, this is a signal: the lakehouse and the ERP world are converging, fast. The future of enterprise AI just got a whole lot clearer.” 

Addresses LLM limitations

During a news conference Monday morning, SAP executives focused on how this move potentially addresses some of the key large language model (LLM) limitations with enterprise data, especially with predictive analytics.

Philipp Herzig, SAP’s chief technology officer, said that LLMs have various limitations, noting, “LLMs don’t deal really well with numbers” and that they struggle with structured data “where we have a lot of differentiation.” 

The practical difference is when systems try to predict the future as opposed to analyzing the past, such as when asking how well a retailer’s product will sell over the next 10 months, or predicting likely payment delays and their impacts on projected cashflow. “This is where LLMs struggle a lot,” Herzig said. He also stressed that Dremio’s ability to work with enterprise data while it still resides in that organization’s on-prem systems is critical for highly-regulated enterprises. 

Local data difference

Flavio Villanustre, CISO for the LexisNexis Risk Solutions Group, also sees the ability to handle data locally as the big draw.

Databricks and Snowflake both offer strong functionality, he pointed out, but users must move the data to their platform and reformat it. After this is complete, the result is a central data lake to address data access needs. “Dremio, on the other hand, provides easy decentralized data access, allowing users to access their data in place,” he said. “Of course, this could be at the expense of data processing performance, but the ease of use and flexibility could outweigh the performance loss.” Implementation speed in days versus weeks or months is another plus, he added. “There is a significant benefit to that.”

Sanchit Vir Gogia, chief analyst at Greyhound Research, agreed with Villanustre, but only to a limited extent. 

“The distinction is not as clean as ‘Dremio lets data stay in place, while Snowflake and Databricks require everything to move,’” he noted. “Snowflake and Databricks have both invested significantly in external data access, sharing, open formats, governance layers, and interoperability. So it would be unfair to describe either as old-style ‘move everything first’ platforms.” But, he added, the broader argument is correct. “[Dremio] starts from the assumption that enterprise data is already distributed and that the first problem is often access, context, federation, and governance, not wholesale relocation. For SAP customers, that matters a great deal,” he said.

That’s because of the nature of many SAP enterprise customers’ datasets. 

“Most large SAP estates are not clean, centralized data environments,” he pointed out. “They are brownfield landscapes: SAP data, non-SAP data, legacy warehouses, departmental lakes, regional repositories, acquired systems, partner data, and industry-specific platforms.” While telling these customers that AI-readiness begins with moving everything into one central platform may be good for the vendor, it’s a lot of work for the buyer.

Dremio gives SAP “a more pragmatic story,” Gogia said. “It allows SAP to say: keep more of your data where it is, access it faster, apply more consistent catalogue and semantic controls, and bring it into Business Data Cloud and AI workflows without forcing a major migration program upfront.”

Aman Mahapatra, chief strategy officer for Tribeca Softtech, a New York City-based technology consulting firm, noted that an acquisition of either Snowflake or Databricks would obliterate SAP’s marketing message and sales pitch.

“SAP did not buy a data warehouse. They bought a position in the open table format wars, and the timing tells you exactly why Snowflake and Databricks were never realistic targets,” he said. “Acquiring either would have collapsed SAP Business Data Cloud’s neutrality story overnight and alienated half the customer base in either direction. SAP’s strategic position depends on sitting above the warehouse layer rather than inside it, and Dremio is the federated layer that talks to both Snowflake and Databricks without requiring SAP to pick a side.”

Assume things will change

Mahapatra urges enterprise CIOs to be extra cautious. 

“For IT executives with active Snowflake and Databricks contracts this morning, nothing changes in the next two quarters, but by the first half of 2027, expect SAP to steer net-new AI workloads toward Business Data Cloud regardless of what the partnership press releases say today. The CIOs who plan for that trajectory now will negotiate from strength,” Mahapatra said.

Compute and storage that data warehouse vendors provide is rapidly becoming a commodity, he said, and the “defensible value” in enterprise AI is migrating up the stack to the semantic layer, the catalog, the lineage graph, and the business context that lets an agent know what ‘active customer’ means within an organization.

“SAP just bought the toolkit to own that layer for any company running SAP at the core,” he said. “If you are an SAP-heavy shop running analytics on Snowflake or Databricks, your warehouse vendors are about to feel less strategic and more like high-performance compute backends.”

Corrects a strategic error

Jason Andersen, principal analyst for Moor Insights & Strategy, noted that for quite some time, SAP has been relentlessly encouraging enterprises to host all of their data within SAP systems. SAP can’t reverse that position even if it wanted to. 

What the Dremio deal does, Andersen opines, is to instead address the pockets of data that many enterprise CIOs, especially in manufacturing and highly-regulated verticals, have refused to turn over to SAP. The Dremio deal gives SAP a face-saving way to get an even higher percentage of its customers’ data, he said. 

“Manufacturing is loath to put things in the cloud and [manufacturing CIOs] put up a violent protest [against] going into the cloud,” Andersen said. “This [acquisition] lets SAP access a lot of data that hasn’t yet moved to SAP.”

Shashi Bellamkonda, principal research director at Info-Tech Research Group, said he sees the SAP Dremio move as fixing a strategic error that SAP made years ago, when it did not develop its own Apache Iceberg capabilities. 

“Apache Iceberg is an open-source table format designed for large-scale analytical datasets stored in data lakes, a kind of bridge between raw data files and analytical tools,” Bellamkonda said. “[SAP] should have done this earlier rather than waiting till 2026.”


Aleph Alpha, Germany’s sovereign AI standard-bearer, opts for a global alliance with Cohere

April 29, 2026, 06:40

As Europe seeks to strengthen its technological sovereignty and reduce its dependence on US technology companies, Aleph Alpha, once regarded as Germany’s ‘sovereign AI’ hopeful, has become the target of a foreign acquisition.

Aleph Alpha is set to merge with Canadian AI company Cohere. The deal focuses on combining Cohere’s global AI competitiveness with Aleph Alpha’s research foundation. The two companies plan to grow into a strong AI company built on the Canadian and German industrial ecosystems.

Cohere CEO Aidan Gomez said, “Organizations around the world are demanding uncompromising control over their entire AI stack,” adding, “This partnership will secure the massive scalability, robust infrastructure, and world-class R&D talent required to meet that demand.”

The official press release published on the 24th describes the deal as a ‘merger of equals,’ but according to a footnote, the transaction requires only the approval of the German company’s shareholders, suggesting it is partly an acquisition in character.

After the merger, the two companies plan to offer customized AI services centered on heavily regulated industries such as finance, defense, and healthcare. The strategy is to combine their technologies and products to deliver AI solutions that fit each region’s laws, cultural contexts, and institutional requirements.

The move comes as companies around the world look for alternatives outside the US. Uncertainty stemming from the Trump administration’s tariff policy and the conflict with Iran has been cited as the background for this shift.

Within Europe, various initiatives are under way to counter US-centric technological dominance. The European Union (EU) has pursued the ‘Eurostack’ strategy, which allows European companies to be selected for major projects, and Aleph Alpha has been mentioned as one of the key companies in that plan. The EU has also launched the ‘Open Euro LLM’ project to counter the AI leadership of the US and China.
dl-ciokorea@foundryco.com


Germany’s sovereign AI hope changes hands

April 24, 2026, 15:31

As Europe seeks to assert its technological independence from US vendors, Aleph Alpha, once seen as Germany’s sovereign AI hope, is the target of a transatlantic takeover.

Aleph Alpha is set to merge with Canada’s Cohere in a deal that will bring together Cohere’s global AI clout and Aleph Alpha’s background in research. The two companies hope they will be able to develop an AI powerhouse, with backing from their Canadian and German ecosystems.

“Organizations globally are demanding uncompromising control over their AI stack. This transatlantic partnership unlocks the massive scale, robust infrastructure, and world-class R&D talent required to meet that demand,” said Cohere CEO Aidan Gomez in a news release that artfully presents the deal as a merger of equals but that, according to a footnote, only requires the approval of the German company’s shareholders, a sure sign of a one-sided takeover.

The combined companies will be looking to offer customized AI in highly-regulated sectors including finance, defense and healthcare. By pooling their talents and offerings, they hope to offer AI solutions tailored to local laws, cultural contexts, and institutional requirements.

The move comes at a time when businesses across the world are looking at non-US options as a reaction to the Trump administration’s policy on tariffs and the uncertainty caused by the war with Iran.

There have been several initiatives within Europe to counteract the US dominance. The EU’s Eurostack plan looked to make sure that major projects had a European option. Aleph Alpha was one of the companies highlighted within the scheme. The EU also launched Open Euro LLM, an attempt to counter the US and China’s lead in AI.


Astro is joining Cloudflare

January 16, 2026, 11:00

The Astro Technology Company, creators of the Astro web framework, is joining Cloudflare.

Astro is the web framework for building fast, content-driven websites. Over the past few years, we’ve seen an incredibly diverse range of developers and companies use Astro to build for the web. This ranges from established brands like Porsche and IKEA, to fast-growing AI companies like Opencode and OpenAI. Platforms that are built on Cloudflare, like Webflow Cloud and Wix Vibe, have chosen Astro to power the websites their customers build and deploy to their own platforms. At Cloudflare, we use Astro, too — for our developer docs, website, landing pages, blog, and more. Astro is used almost everywhere there is content on the Internet.

By joining forces with the Astro team, we are doubling down on making Astro the best framework for content-driven websites for many years to come. The best version of Astro — Astro 6 — is just around the corner, bringing a redesigned development server powered by Vite. The first public beta release of Astro 6 is now available, with GA coming in the weeks ahead.

We are excited to share this news and even more thrilled for what it means for developers building with Astro. If you haven’t yet tried Astro — give it a spin and run npm create astro@latest.

What this means for Astro

Astro will remain open source, MIT-licensed, and open to contributions, with a public roadmap and open governance. All full-time employees of The Astro Technology Company are now employees of Cloudflare, and will continue to work on Astro. We’re committed to Astro’s long-term success and eager to keep building.

Astro wouldn’t be what it is today without an incredibly strong community of open-source contributors. Cloudflare is also committed to continuing to support open-source contributions, via the Astro Ecosystem Fund, alongside industry partners including Webflow, Netlify, Wix, Sentry, Stainless and many more.

From day one, Astro has been a bet on the web and portability: Astro is built to run anywhere, across clouds and platforms. Nothing changes about that. You can deploy Astro to any platform or cloud, and we’re committed to supporting Astro developers everywhere.

There are many web frameworks out there — so why are developers choosing Astro?

Astro has been growing rapidly.

Why? Many web frameworks have come and gone trying to be everything to everyone, aiming to serve the needs of both content-driven websites and web applications.

The key to Astro’s success: Instead of trying to serve every use case, Astro has stayed focused on five design principles. Astro is…

  • Content-driven: Astro was designed to showcase your content.

  • Server-first: Websites run faster when they render HTML on the server.

  • Fast by default: It should be impossible to build a slow website in Astro.

  • Easy to use: You don’t need to be an expert to build something with Astro.

  • Developer-focused: You should have the resources you need to be successful.

Astro’s Islands Architecture is a core part of what makes all of this possible. The majority of each page can be fast, static HTML — fast and simple to build by default, oriented around rendering content. And when you need it, you can render a specific part of a page as a client island, using any client UI framework. You can even mix and match multiple frameworks on the same page, whether that’s React.js, Vue, Svelte, Solid, or anything else.
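As a rough sketch of what an island looks like in practice (the `Counter` component and file paths here are illustrative; only the `client:load` directive is Astro’s documented hydration syntax):

```astro
---
// src/pages/index.astro: the page itself is rendered to static HTML on the server.
// Counter is a hypothetical React component, used only to illustrate a client island.
import Counter from '../components/Counter.jsx';
---
<html>
  <body>
    <h1>Mostly static, content-driven HTML</h1>
    <p>This markup ships with zero JavaScript by default.</p>
    <!-- client:load hydrates only this one component in the browser -->
    <Counter client:load />
  </body>
</html>
```

Everything outside the `<Counter />` tag stays inert HTML; only the island pays the cost of client-side JavaScript.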

Bringing back the joy in building websites

The more Astro and Cloudflare started talking, the clearer it became how much we have in common. Cloudflare’s mission is to help build a better Internet — and part of that is to help build a faster Internet. Almost all of us grew up building websites, and we want a world where people have fun building things on the Internet, where anyone can publish to a site that is truly their own.

When Astro first launched in 2021, it had become painful to build great websites — it felt like a fight with build tools and frameworks. It sounds strange to say it, with the coding agents and powerful LLMs of 2026, but in 2021 it was very hard to build an excellent and fast website without being a domain expert in JavaScript build tooling. So much has gotten better, both because of Astro and in the broader frontend ecosystem, that we take this almost for granted today.

The Astro project has spent the past five years working to simplify web development. So as LLMs, then vibe coding, and now true coding agents have come along and made it possible for truly anyone to build — Astro provided a foundation that was simple and fast by default. We’ve all seen how much better and faster agents get when building off the right foundation, in a well-structured codebase. More and more, we’ve seen both builders and platforms choose Astro as that foundation.

We’ve seen this most clearly through the platforms that both Cloudflare and Astro serve, that extend Cloudflare to their own customers in creative ways using Cloudflare for Platforms, and have chosen Astro as the framework that their customers build on. 

When you deploy to Webflow Cloud, your Astro site just works and is deployed across Cloudflare’s network. When you start a new project with Wix Vibe, behind the scenes you’re creating an Astro site, running on Cloudflare. And when you generate a developer docs site using Stainless, that generates an Astro project, running on Cloudflare, powered by Starlight — a framework built on Astro.

Each of these platforms is built for a different audience. But what they have in common — beyond their use of Cloudflare and Astro — is they make it fun to create and publish content to the Internet. In a world where everyone can be both a builder and content creator, we think there are still so many more platforms to build and people to reach.

Astro 6 — new local dev server, powered by Vite

Astro 6 is coming, and the first open beta release is now available. To be one of the first to try it out, run:

npm create astro@latest -- --ref next

Or to upgrade your existing Astro app, run:

npx @astrojs/upgrade beta

Astro 6 brings a brand new development server, built on the Vite Environments API, that runs your code locally using the same runtime that you deploy to. This means that when you run astro dev with the Cloudflare Vite plugin, your code runs in workerd, the open-source Cloudflare Workers runtime, and can use Durable Objects, D1, KV, Agents and more. This isn’t just a Cloudflare feature: Any JavaScript runtime with a plugin that uses the Vite Environments API can benefit from this new support, and ensure local dev runs in the same environment, with the same runtime APIs as production.
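As a minimal configuration sketch of the setup described above (assuming the `@astrojs/cloudflare` adapter package; exact options may differ in your Astro 6 project):

```javascript
// astro.config.mjs: a minimal sketch, assuming the @astrojs/cloudflare adapter.
// With this in place, `astro dev` runs your server code in workerd, the same
// open-source runtime that executes Cloudflare Workers in production.
import { defineConfig } from 'astro/config';
import cloudflare from '@astrojs/cloudflare';

export default defineConfig({
  adapter: cloudflare(),
});
```

The point of the Vite Environments API is that this is adapter-driven: any runtime that ships a compatible Vite plugin can make local dev match production the same way.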

Live Content Collections in Astro are also stable in Astro 6 and out of beta. These content collections let you update data in real time, without requiring a rebuild of your site. This makes it easy to bring in content that changes often, such as the current inventory in a storefront, while still benefitting from the built-in validation and caching that come with Astro’s existing support for content collections.

There’s more to Astro 6, including Astro’s most upvoted feature request — first-class support for Content Security Policy (CSP) — as well as simpler APIs, an upgrade to Zod 4, and more.

Doubling down on Astro

We're thrilled to welcome the Astro team to Cloudflare. We’re excited to keep building, keep shipping, and keep making Astro the best way to build content-driven sites. We’re already thinking about what comes next beyond V6, and we’d love to hear from you.

To keep up with the latest, follow the Astro blog and join the Astro Discord. Tell us what you’re building!
