The Dance Nobody Else Was Doing
In 2017, Rodrigo Liang left Oracle's air-conditioned campus in Redwood Shores and did something engineers rarely do: he declared the dominant paradigm wrong. Not slightly miscalibrated - wrong at the architectural level, the kind of wrong that only reveals itself when you spend two decades staring at silicon from the inside.
Liang had shipped 12 major SPARC processors and ASICs at Oracle and Sun Microsystems. He understood, bone-deep, why the instruction-centric model that had powered computing for fifty years was quietly strangling AI. The problem wasn't compute. It was the constant, murderous traffic of moving data to compute. Data movement kills you. That wasn't a slogan. It was the diagnosis he'd been building toward for twenty years.
We're going to flip the paradigm on its head - not worry as much about the instructions, but worry about the data.
- Rodrigo Liang, SambaNova CEO

The company he co-founded with Stanford professors Kunle Olukotun and Chris Re is named SambaNova - "New Dance" in Portuguese, a nod to Brazil, where Liang grew up after being born in Taipei, Taiwan. The name is either whimsical branding or a literal description of what they built: a processor that moves differently, thinks differently, computes differently.
That processor - the Reconfigurable Dataflow Unit, or RDU - replaces the add/subtract/multiply fundamentals of classical chips with map, reduce, and filter. It's not a tweak. It's a different definition of what a chip does. When a former Oracle colleague described Liang as someone who "understood better than most that for AI workloads, data movement kills you," they were describing a man who had spent his career watching that problem grow, and spent the next chapter solving it.
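The shift Liang describes can be sketched in a few lines. This is an illustrative Python analogy only - the RDU's primitives are hardware dataflow operators, not Python built-ins - but it shows the programming-model flip from stepping through instructions to streaming data through a pipeline of operators:

```python
from functools import reduce

def instruction_style(xs):
    # Instruction-centric view: the program steps through data,
    # one operation at a time (multiply, branch, accumulate).
    total = 0
    for x in xs:
        y = x * 2
        if y > 4:
            total += y
    return total

def dataflow_style(xs):
    # Dataflow view: the computation is a pipeline of operators;
    # data streams through map -> filter -> reduce with no explicit loop.
    doubled = map(lambda x: x * 2, xs)
    kept = filter(lambda y: y > 4, doubled)
    return reduce(lambda a, b: a + b, kept, 0)

assert instruction_style([1, 2, 3, 4]) == dataflow_style([1, 2, 3, 4]) == 14
```

Both functions compute the same result; the difference is what the "program" fundamentally is - a sequence of instructions to execute, or a graph of operators for data to flow through. The latter is the shape a reconfigurable dataflow chip can lay out spatially in silicon.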
Twenty Years of Other People's Chips
Before SambaNova, Liang's resume reads like a tour of the machines that built Silicon Valley's reputation. Hewlett-Packard in the early 1990s. InSilicon. Afara Websystems, where he focused on multi-core processor design until Sun Microsystems acquired the company in 2002. Then a decade at Sun leading the Niagara line of multi-core chips - the processors that made Sun's enterprise servers faster without turning data centers into furnaces.
When Oracle absorbed Sun in 2010, Liang became Senior Vice President of SPARC Processor and ASIC Development. By the time he left in 2017, he had overseen 12 major chips and ASICs, accumulated an encyclopedic knowledge of where the instruction-centric architecture struggled, and assembled connections to a network of chip architects, software engineers, and AI researchers that would become SambaNova's founding talent pool.
[Chart: SambaNova SN50 vs. Competing AI Chips (2026). Source: SambaNova February 2026 announcement. SN50 claims 5x speed advantage and 3x lower TCO vs. Nvidia B200.]
The co-founders brought different superpowers. Olukotun, a Cadence Design Professor at Stanford, had pioneered chip multiprocessor design and founded Afara Websystems - the same company Liang had worked at before Sun's acquisition. They weren't strangers; they were veterans of the same architectural battles. Chris Re, a Stanford CS professor directing the InfoLab, brought the machine learning research depth. Liang brought two decades of knowing exactly what happens when you try to run modern AI on hardware designed for a different era.
Built for Large, on Purpose
SambaNova's strategy has never been broad. Liang's clearest public articulation of the company's positioning is three words: built for large. Massive neural networks. Enormous data sets. The AI workloads that make conventional GPU clusters sweat.
Rather than competing with NVIDIA at volume, SambaNova positioned itself as the infrastructure layer for organizations that need AI in production, at enterprise scale, without the specialized in-house teams most companies don't have. The business model matches the architecture: instead of selling hardware, SambaNova deploys models as managed services. If the hardware isn't working, they know immediately. No lag, no finger-pointing across vendor boundaries.
AI is no longer a contest to build the biggest model. The real race is about who can light up entire data centers with AI agents that answer instantly, never stall, and do it at a cost that turns AI from an experiment into the most profitable engine in the cloud.
- Rodrigo Liang, February 2026

The 2024 pivot from training to inference wasn't a retreat - it was a bet that the real volume play in AI was always going to be inference at scale, not the training runs that get the press coverage. SambaNova Cloud launched. SambaManaged products followed. The company had been building toward the agentic AI moment that arrived in 2025 and 2026, when enterprises stopped asking "can we run an LLM?" and started asking "can we run a thousand AI agents simultaneously, in real-time, at a cost we can justify to finance?"
[Chart: SambaNova Funding Journey]
The SN50 Moment
In February 2026, Liang stood behind a product announcement that felt like the culmination of everything SambaNova had been building toward. The SN50 chip - targeting 10-trillion-parameter models for agentic AI workloads - arrived alongside $350M in Series E funding led by Vista Equity Partners and Cambium Capital, with Intel Capital, GV, Battery Ventures, T. Rowe Price, and others joining the round.
The Intel collaboration was the strategic piece that signaled a shift. SambaNova and Intel would work together on a heterogeneous inference design: SN50 for the heavy AI lifting, Intel Xeon 6 CPUs handling the surrounding workloads. SoftBank signed on as the first SN50 customer, deploying the chips in Japanese data centers. A company that had spent years fighting for enterprise attention was now partnering with the processor company that built the PC era.
The sovereign AI angle has been equally deliberate. In 2025, SambaNova announced partnerships with SCX in Australia, Infercom in Germany, and Argyll in the United Kingdom - three "sovereign AI cloud" deals positioning SambaNova as the infrastructure backbone for governments and enterprises that want AI that runs in their jurisdiction, on their terms, without sending data to American hyperscalers.
Where the Dance Comes From
Rodrigo Liang's origin story is genuinely intercontinental. Born in Taipei. Raised in Brazil. Studied in Germany. Stanford-educated. Palo Alto-headquartered. Most tech executives have backgrounds that read as a straight line; his reads as a map.
The company name is not incidental. "Samba" is the rhythm of Brazilian street culture - improvisational, high-energy, rooted in community. "Nova" is new. The name reflects a founder who chose to name his AI chip company after a dance from the country where he grew up, which is either a deeply personal statement or the most confident act of branding in the semiconductor industry.
Former colleagues describe Liang as technically brilliant and operationally methodical - the kind of engineer who mastered the existing system completely before deciding it needed to be replaced. His 20 years at HP, Sun, and Oracle weren't detours; they were the education that made his argument against traditional chip architecture credible. He wasn't theorizing from the outside. He was diagnosing from within.
The agentic AI revolution demands 10X to 100X more inference compute. The infrastructure has to be built for that - not patched onto hardware that was designed for a different problem.
- Rodrigo Liang, 2025

At the 2025 RAISE Conference in Paris, Liang shared a panel with Hugging Face's Thomas Wolf, Tony Kim, and CNBC's Arjun Kharpal on the subject of open source, fast inference, and the agentic revolution. He has contributed to the World Economic Forum's AI agenda and appeared at TEDAI San Francisco in 2025. The circuit isn't just for PR - Liang's message is consistent across every venue: AI infrastructure isn't a background concern. It's the entire game.
Whether SambaNova's dataflow architecture becomes the defining compute paradigm of the AI era - or a distinguished chapter in a longer story - is still being written. But Liang has spent eight years building toward a specific future, and the February 2026 announcements suggest he's still moving in the same direction he chose when he walked out of Oracle: forward, fast, and unconcerned with the instruction that told him this was impossible.