Before "LLM stack" was a phrase you'd hear at a dinner party, before every software company became an AI company, Matt Bornstein was drawing the diagram. Not metaphorically - literally drawing the architectural layers of how applications would eventually be built on large language models, and doing it at a moment when most people in venture capital were still trying to understand what a transformer was.

That is the thing about Bornstein that doesn't show up in a press release. He got here not by chasing the wave but by mapping the ocean floor while the surface was still calm. When he published "Emerging Architectures for LLM Applications" in 2023, co-authored with Rajko Radovanovic, it became one of the most widely read technical frameworks in AI - not because it had a great headline, but because engineers found it useful enough to print out and tape to their walls.

Scene 01
Brown University, Providence
Math major by training. Delivery startup founder by impulse. Co-founded DormSnacks, a snack delivery service for Brown students, as CTO - before anyone had heard of Uber Eats.
Scene 02
The VC Circuit
Greycroft. Correlation Ventures. Monitor Deloitte. Blumberg Capital. Six years building pattern recognition on what makes infrastructure bets work before a16z came calling in 2019.
Scene 03
The AI Thesis
Bet on AI/ML infrastructure when it was "almost working" and niche. That timing - unglamorous, quiet, ahead by years - turned out to be the whole game.

The Diagram That Changed How Engineers Think

In 2020, Bornstein and co-authors Martin Casado and Jennifer Li published "Emerging Architectures for Modern Data Infrastructure." The piece did something unusual for a VC blog post: it became canonical. Not because it was promoted particularly hard, but because the "Modern Data Stack" diagram it contained - a coherent map of data ingestion, transformation, storage, and analysis tooling - gave an industry that had been working in silos a shared vocabulary. Architects in enterprise software still reference it.

Infrastructure vendors are capturing the majority of value in generative AI - application companies are struggling with retention and margins.

Matt Bornstein, "Who Owns the Generative AI Platform?" (2023)

That 2023 paper - co-authored with Guido Appenzeller and Martin Casado - was a cold-eyed look at where the money was actually going in generative AI. The conclusion was not particularly comforting for application-layer startups: the infrastructure players were taking the majority of value, while companies building on top of foundation models were finding it hard to maintain margins or user retention when the underlying models kept improving. The paper was cited in boardrooms and pitch decks for months after publication.

LLM Application Architecture - Bornstein's Framework (2023)

Layer 3 - Application Layer
User-facing products, agents, copilots - where value is delivered but competitive moats are thin

Layer 2 - Orchestration Layer
Prompt management, context windows, chaining, memory - the connective tissue of LLM apps

Layer 1 - Data & Infrastructure Layer
Vector databases, embeddings, fine-tuning pipelines, model hosting - where Bornstein bets

What He Actually Backed

The portfolio tells the story more precisely than any thesis statement. Cursor - an AI coding assistant that became one of the most talked-about developer tools of 2024 - got its Series A from Bornstein and a16z in August 2024. Mistral, the Paris-based open-weight model company that became a serious alternative to OpenAI. Hex, the data analytics platform. Labelbox for AI training data. Tecton for ML feature stores (acquired by Databricks). Tabular, which was also acquired by Databricks in a deal that validated the infrastructure thesis in concrete terms.

Then there is the list of outcomes that reads like a scorecard: Character.ai, whose founders and core technology went to Google in a licensing deal. Replicate, acquired by Cloudflare. These are not lucky outcomes. They are the result of investing in picks-and-shovels infrastructure at the moment when the gold rush was just beginning - when the underlying technology was, as Bornstein has described it, "almost working."

When Bornstein posted about Cursor's growth after the Series A, he used the word "product" four times in three sentences. That detail says something about how he evaluates infrastructure companies - not by the elegance of the architecture alone, but by whether people actually use it.

The AI Canon - Curation as a Form of Thesis

In May 2023, Bornstein co-authored the "AI Canon" with Guido Appenzeller and Derrick Harris - a curated reading list that attempted to cut through the noise of generative AI coverage and point readers toward the papers, posts, and frameworks that actually matter. The piece was instantly bookmarked, shared in Slack channels, and forwarded by engineers to their managers with a "read this before our next meeting" note attached.

Curation at that level is a form of intellectual positioning. It tells you what someone thinks is signal and what is noise. The AI Canon, like the Modern Data Stack diagram before it, suggests someone who thinks in taxonomies and frameworks - who finds the underlying structure of a messy market more interesting than any individual company within it.

The AI Canon exists because there's so much to read and so little signal on what actually matters.

Matt Bornstein, on the motivation for the AI Canon (2023)

The Promotion That Wasn't Surprising

In January 2026, Bornstein was promoted to General Partner at Andreessen Horowitz. His colleague Martin Casado, who leads the infrastructure practice, described him as having been "instrumental in making the infrastructure team what it is today" and as a "key thought partner on investing process." These are careful words from a careful firm - but they track with the record.

The promotion came after six years of consistent output: the frameworks, the portfolio, the exits, the published research that engineers print out and tape to walls. What Bornstein built at a16z was not just a set of investments but a perspective - a coherent, infrastructure-first view of how AI would develop that has aged well against subsequent events.

Fact
Spanish Speaker
Bornstein lists Spanish on his LinkedIn profile without fanfare - an unexpected detail for a math-trained infrastructure investor.
Fact
DormSnacks CTO
Co-founded a dorm delivery service as CTO at Brown in 2004 - fifteen years before food delivery became a trillion-dollar conversation.
Fact
Modest Following
6,000+ LinkedIn followers for a GP at one of the world's top VC firms. The low-hype infrastructure thesis applies to his personal brand too.

What He Is Building Toward

Bornstein's investing thesis has been remarkably consistent: back the foundational infrastructure that AI applications will depend on, at the moment before most investors understand why that layer matters. The pattern held in data infrastructure in 2020. It held in LLM tooling in 2023. In 2025 and 2026, his investments in companies like Hedra (generative media infrastructure), Unconventional AI, and Inferact suggest the same logic applied to the next generation of AI capabilities.

The bet, consistently, is not on which AI application will win. It is on the infrastructure layer that every application will need regardless of who wins. That is a more patient, more structural bet than most - and it requires a different kind of conviction, one that is grounded in architecture rather than narrative.

The math major who, in 2020, drew diagrams of things nobody had a name for yet is now a General Partner at the firm that may define the AI era. The diagram got him here. The portfolio proves the diagram was right.