When ChatGPT launched in November 2022, most of Silicon Valley raced to fund the picks and shovels - compute, chips, foundation models, the infrastructure layer. Sonya Huang ran the other direction. She had spent four years watching enterprise software companies struggle to turn AI features into durable value, and she had a theory: the infrastructure would commoditize, and the money would move up the stack. So while the rest of the market fought over GPU allocation, she bet Sequoia's capital on the applications. OpenAI. Hugging Face. LangChain. Glean. Mercury. Harvey. Gong.
The thesis has a name now - "the application layer" - and it sounds obvious in 2026. It was not obvious in 2022. It required believing that foundation models would get good enough, fast enough, that applications riding on top of them would compound faster than the models themselves. It required betting on founders who were navigating a moving target: the platform could make them obsolete overnight if OpenAI added a feature. Huang's answer to that: proprietary data, workflow integration, vertical expertise, network effects. Four moats. The ones that survive model improvements.
"If you're building something that only exists because of a deficiency in OpenAI today, we try not to back that."
- Sonya Huang, Sequoia Capital

Before she became the face of Sequoia's AI thesis, Huang was a Princeton economics student who spent a semester training computer vision neural networks on brain scans and astrophysics data - for fun, essentially, as part of her senior thesis. This was 2013 or 2014. The compute wasn't there. The data wasn't there. The algorithms weren't sophisticated enough. She filed it away and went to Goldman Sachs.
At Goldman, her colleagues gave her a nickname: Slothya, after the spirit animal she'd claimed as her own. She loved the irony of it: the slowest mammal on earth, moving through the financial world's fastest machine. From Goldman she moved to TPG Capital in private equity, where she learned to read business transformations at scale - how technology adoption drives enterprise value, how competitive moats form and dissolve. Then Sequoia called in 2018, looking to build out a growth investing practice focused on enterprise software and data infrastructure. She had trained neural nets in college. She understood financial models. She could see the whole board. She said yes.
What Sequoia got was unusual: a partner who had spent time at the White House (as a Princeton freshman, she interned for economist Alan Krueger on the Council of Economic Advisers after Obama appointed him - she once bumped into the President in the hallway), in investment banking, in private equity, and who had done actual machine learning research. Not one of those things. All of them. The 21mm lens, she calls it - the wide-angle view that takes in more than the scene most investors frame for themselves.
"Just because something hasn't worked before doesn't mean it won't this time around."
- Sonya Huang

Her public work runs alongside her deal work. In September 2023, she co-published "Generative AI's Act Two" - a framework arguing that applications, not models, would capture the bulk of generative AI's economic value. She also launched AI Ascent, Sequoia's annual gathering of AI leaders that has become one of the industry's most closely watched conferences. In 2024 came the Training Data podcast, where she interviews the builders: OpenAI's team, Anthropic's researchers, the founders of companies reshaping every vertical. These are not vanity projects. They are intelligence operations - a systematic way of staying closer to the frontier than any investor has a right to be.
The data point she keeps returning to: ChatGPT's daily-active-to-monthly-active user ratio. In early 2023 it was below 20%. By May 2025 it had climbed to nearly 50% - approaching Reddit and Instagram levels of habitual engagement. That number, more than any funding round or valuation, tells the story of where AI is going. "AI applications aren't experimental anymore," she said at AI Ascent 2025. "They're habit-forming."
At AI Ascent 2026 she went further. Standing alongside fellow Sequoia partners Pat Grady and Konstantine Buhler, she laid out a framework projecting that 99.9% of cognitive work will eventually be performed by machines - and identified roughly $10 trillion in services revenue that software has never been able to touch. The number sounds like a stretch. So did deploying $1.5 billion into AI application companies when Sequoia started. The applications portfolio keeps appreciating.
Off the stage, she is the person who takes a 30-minute nap every day after work - "it's made me so much happier and more productive," she has said without irony. She golfs. She reads Murakami and Joan Didion and Econometrica in the same sitting. She thinks in pictures more than words. The visual thinking shows up in her work: her frameworks are always diagrams before they are arguments, stages before they are theses. Act One. Act Two. Act Three. Each one a panel in a longer story she's been drafting since she first trained a neural net on a brain scan and wondered what would happen when the compute finally caught up.