The Mathematician Who Bets on Open
There is a Mountain Project profile somewhere in the internet's long memory with Rajko Radovanović's name on it - real climbing routes, real ascents, a tick list that predates his time on Sand Hill Road. Most venture partners do not have one. Most venture partners did not also teach Advanced Mathematical Optimization at Harvard, spend time studying Mandarin in Beijing, and attend the United Nations International School before any of that. Rajko is not most venture partners.
His job title says Partner at Andreessen Horowitz. His portfolio tells a more specific story: he is the person at a16z who bets systematically on open-weights AI infrastructure - the models, the tools, the grant programs, the communities - that exist outside the walls of the major labs. If the AI economy is a bet on who controls the stack, Rajko's wager is that the answer will not be one of the hyperscalers.
The path to a16z ran through the kind of places that produce people who think in systems. He graduated from Harvard with a BA in Economics and Computer Science - the exact pairing that produces someone equally comfortable reading a cap table and debugging a training loop. Before Harvard he was at the United Nations International School in New York, an institution where your classmates come from 120 countries and you develop an instinct for operating across cultural contexts. Before that: Belgrade. The last name is not subtle.
After Harvard, he joined Boston Consulting Group, which is where you go if you want to understand how large organizations actually move - not how they claim to move in press releases. The consulting years gave him a framework for evaluating enterprise technology that pure technical backgrounds often miss: adoption curves, procurement cycles, organizational inertia. That framework mattered later.
The NEA Chapter: Learning to See Early
From BCG he moved to New Enterprise Associates (NEA), one of the older and larger venture franchises in the country. There, before the term "AI-native" was being applied to every pitch deck, he built a portfolio that reads like a forecast of what the market would care about two years later: Perplexity AI, before AI search was a category. Sentry, the developer observability tool that became load-bearing infrastructure for engineering teams. Metabase, the open-source BI platform that removed the BI team from the data stack. Weaviate, the vector database that the RAG boom made suddenly very relevant. TimescaleDB, for the time-series infrastructure underneath everything modern.
The common thread across those picks: developer-led, often open-source, infrastructure-layer companies that grow through usage before they grow through sales. Rajko was not inventing a thesis so much as discovering one through reps.
Open source leads to the best technical solutions winning - it is the ultimate 'building in public.'
- Rajko Radovanović, Medium

Andreessen Horowitz: First Call, First Investment
When Rajko joined a16z's infrastructure team, his first announced investment was Mistral AI - the French AI lab that chose to bet its entire identity on open-weight models at a moment when the prevailing wisdom was that model weights were a competitive moat to be hoarded. He co-led the Series A alongside Anjney Midha and Matt Bornstein, and he was excited enough to announce it on Twitter in Serbian-tinted all-caps enthusiasm.
The Mistral investment was a thesis in action. Rajko had written about the structural advantages of open source software for years - faster iteration loops, community ownership, transparency-driven security, global accessibility reducing barriers across geographies. Mistral was that argument applied to foundation models. Open weights meant developers could deploy without sending data to a third party. It meant the weights could be fine-tuned for specific domains. It meant Europe had a credible AI lab that was not beholden to American cloud pricing.
After Mistral came a run that made the thesis harder to dismiss. Cursor (Anysphere) - the AI-native code editor that has since become arguably the most consequential developer tool since GitHub. Udio - music generation from a founding team of former Google DeepMind researchers. Luma AI - multimodal intelligence for the physical world. Black Forest Labs - founded by Robin Rombach, Patrick Esser, and Andreas Blattmann, the researchers behind the latent diffusion architecture that powers Stable Diffusion. World Labs - spatial intelligence and 3D world generation. Braintrust - the developer platform for LLM product iteration.
What those companies share: they are mostly seed or early Series A. They are mostly technical-founder-led. They almost all have open-source components, open-weight models, or communities at the center of their distribution strategy. And they collectively cover the full stack of what "building with AI" means in 2025 - from the base models (Mistral, Black Forest Labs) to the development environment (Cursor, Braintrust) to the applications (Udio, Luma, World Labs).
The Grant Program Nobody Else Would Run
The most unusual thing on Rajko's resume is the a16z Open Source AI Grant Program, which he incubated and runs. It distributes grant funding - not equity investment - to hackers, researchers, and small teams doing important work to support AI development outside the major labs. Three cohorts in, the grantees include Common Crawl, Axolotl, SkyPilot, LMSys, LLaVA, Deforum, Lucidrains, ARC Prize, SWE-Bench, and a rotating cast of researchers working at the frontier of LLM evaluation and experimentation.
Grant programs at venture funds are common enough as marketing. This one is different in that the recipients are working on infrastructure that makes the open-source ecosystem more competitive with closed systems - benchmarking tools that reveal what proprietary models are actually capable of, evaluation frameworks that make it harder to obscure capability gaps, serving infrastructure that makes running open models economically viable. If the program works as intended, it raises the floor for what open-source AI can do, which in turn improves the market conditions for Rajko's portfolio companies. It is enlightened self-interest executed at a level of sophistication most grant programs do not approach.
The Person Behind the Portfolio
On LinkedIn, there is a post from Rajko after a ten-day trip that involved rock climbing, fly fishing, and what he described as "generally sleeping." Fifty-nine people commented. The comments are a mix of envy and admiration. He is someone who does this - actually disconnects, actually climbs walls and catches fish, actually comes back with a clearer head rather than a backlog of apologetic replies.
His Twitter feed (@rajko_rad) is a mix of investment announcements, conference organizing, Serbia references, and the kind of casual enthusiasm that does not perform being enthusiastic. The organizing is serious: a luncheon he hosted at NeurIPS, titled "LLM Scaling x Efficiency," required attendees to have "directly relevant work in the field" - not a networking event, an actual working session.
There is a tweet where he shares a comic the a16z infra team generated with Ideogram, captioning it "Mafioso a16z Infra" and explaining the inside joke: every new team member's name ended in the phonetic sound "oh" for over a year. Rajko, Guido, Marco, Yoko. He documents team culture with the same matter-of-fact specificity he applies to investment theses.
He grew up with an awareness of what it means to work in a place where the infrastructure - educational, economic, technological - is less mature than what the talent deserves. One tweet called Harvard's deep learning curriculum "1-2 years behind both Stanford and Berkeley," then pivoted to "growing up and working with talented youth in Serbia." He understands that geography still shapes opportunity, even in the internet age. That understanding is baked into his thesis on open source: the barriers go down when the code is free.
The aspiration implicit in the portfolio and the grant program is something like: make the best AI infrastructure available to whoever can build with it, regardless of where they are or which hyperscaler they can afford. That is not a pitch line. It is a pattern visible across a decade of consistent choices, from the open-source BI tools at NEA to the open-weight foundation models at a16z.