How AI is changing open source

Open source has become less of a “thing” in the last few years. Oh, sure, you’ll find the usual suspects waving their “open source is always better” flag, even as the AI community keeps releasing ambitious (and very closed) models and other tools, and as the very nature of open source evolves, as I’ve argued time and time again. This doesn’t mean open source is fading in importance. It’s not. As CNCF’s contribution tables, GitHub’s Octoverse data, and the Apache Software Foundation’s latest annual report indicate, open source engagement is shifting to the layers that matter most: Kubernetes (yes, really), observability, platform engineering, networking, and the infrastructure required to make AI work in production.

Open source grew up and became dull. We’re all better for that.

Control through code

While we can’t help but be inundated by news of this or that latest model, open source keeps quietly chugging away in the background. CNCF now hosts more than 230 projects with more than 300,000 contributors worldwide. Its 2025 survey found that 98% of organizations have adopted cloud-native techniques, and 82% of container users now run Kubernetes in production. GitHub’s 2025 Octoverse report tells the same story but from an even wider angle: 1.12 billion contributions, more than 180 million developers, and a record 518.7 million merged pull requests. Apache is a bit less flashy but isn’t exactly withering, either. The ASF says it had 9,905 committers working across 295 projects and issued 1,310 software releases in fiscal year 2025.

Who employs all the developers contributing this code? In 2025, as CNCF Devstats show, Red Hat led all CNCF contribution activity with 194,699 contributions. Second place? Microsoft with 107,645. And third? Google at 91,158. Independent contributors still mattered, landing fourth at 52,404, which is a useful reminder that open source hasn’t become purely corporate. But the center of gravity is unmistakable. Serious companies now spend serious money for engineers to shape the plumbing their products depend on. The top contributors have remained constant over the past decade, indicating their willingness to invest in the long game. But during that same time we’ve seen an influx of new contributors, too. 

That shift matters because it changes how we should read open source contributions. Too many people still talk about them as if they were mostly philanthropy. Too many open source program offices still try to convince their engineering teams to contribute because “it’s the right thing to do,” hoping their developers’ efforts will ingratiate the company with some nebulous community. Nope. Open source is increasingly where vendors try to set defaults, normalize interfaces, and shape the operational assumptions everyone else has to live with.

In other words, open source has become less about openness for its own sake and more about control. Not proprietary control, exactly, but control over the layers where ecosystems harden into standards. The companies investing upstream aren’t doing it because they’ve discovered civic virtue. They’re doing it because whoever shapes the substrate usually gets leverage over everything built on top of it.

Who gives, and why?

Take Red Hat. It’s still the heavyweight in CNCF, which isn’t hard to explain. Red Hat’s OpenShift is a Kubernetes-centric application platform, so of course Red Hat continues to pour effort into the Kubernetes-centered world. That’s not community service; it’s product strategy, and it fits the way Red Hat has long exercised influence (and control). Fortunately for Kubernetes, Red Hat isn’t alone in contributing; the stats point to a growing, increasingly diverse contributor base across thousands of organizations.

Kubernetes won because it became too important for any serious infrastructure company to ignore, and Red Hat contributes heavily because its business depends on that remaining true.

Microsoft’s position is even more revealing. Once the company most associated with hostility to open source, it now sits second in overall CNCF contributions in 2025. But the more interesting signal is where companies like Microsoft are investing. OpenTelemetry has become one of the fastest-rising CNCF projects, with a 39% rise in commits in 2025 and a contributor base that grew from 1,301 to 1,756 in a single year. Again, this isn’t about charity—more like a land grab around observability standards. Microsoft, Splunk, and other top OpenTelemetry contributors are all helping in order to help themselves. That’s the way open source has always worked.

Then there’s Cilium, which is what happens when boring infrastructure stops being boring, as I recently noted. Cilium’s journey report says the number of contributing companies rose 90% after it joined CNCF, from 533 to 1,011, while individual contributors jumped from 1,269 to 4,464. Google, Datadog, and Cloudflare all expanded their contributions as the project matured. That’s not random. Cilium sits at the intersection of networking, observability, and security, which are precisely the categories that become mission-critical once workloads become distributed, latency-sensitive, and expensive. AI may be driving headlines, but a lot of the real strategic work is happening in projects like Cilium, where the infrastructure determines whether those AI workloads are governable, visible, and efficient.

And how about Nvidia, a company with so much cash it could buy a few countries and set all their developers to work building for Nvidia? But that isn’t how Nvidia has chosen to spend its riches: It ranked 14th in Kubernetes contributions over the past two years, with 5,892 contributions. It has also open-sourced KAI Scheduler, a Kubernetes-native GPU scheduler that came out of Run:ai, and Nvidia has described itself as a key contributor to Kubeflow. In other words, Nvidia isn’t just selling chips; it’s investing in the scheduling, orchestration, and workflow layers that determine how effectively those chips get used in real-world AI systems. And it’s doing so through developer communities rather than lump-sum cash payouts.

The Nvidia work is a tell for where open source is going in AI. CNCF says 66% of organizations hosting generative AI models now use Kubernetes for some or all inference workloads, and it explicitly calls Kubernetes the de facto operating system for AI. Of course it would say that, given the foundation’s dependence on Kubernetes as a tentpole project, but that doesn’t diminish the reality that Kubernetes and Kubeflow are increasingly central to training and inference systems. In sum, AI is making open infrastructure more important because few organizations really want to build their future on opaque, inescapable infrastructure they can’t inspect or influence.

An essential supporting actor

So is open source increasing in importance? Absolutely, but not in the warm, nostalgic way some people still imagine. It’s becoming less romantic and more essential. The old story about open source as a fringe alternative or a developer-led morality play was never true, but it’s not even remotely credible now. Open source is where the cloud-native stack gets standardized, where observability gets normalized, where platform engineering gets productized, and where AI infrastructure is increasingly being built.
