The platform that knows where every neural network has been.
Before W&B, AI experiments vanished into the ether. No record. No trail. No way to reproduce the model that actually worked. Lukas Biewald, Chris Van Pelt, and Shawn Lewis changed that - from a room behind a karate studio in San Francisco. Eight years later: 1 million developers, a $1.7 billion acquisition, and a catalog of the world's most important AI research.
Somewhere in the Mission District of San Francisco, behind a karate studio, three people recognized in 2017 that machine learning teams were working in the dark. Experiments ran, models trained, numbers flickered across terminal windows - and then most of it disappeared. No log. No audit trail. No way to recall, two weeks later, exactly which hyperparameters produced the run that actually worked.
The people who noticed this most acutely were the ones building AI at OpenAI. Lukas Biewald - Stanford CS, former Yahoo engineer, founder of the data labeling company CrowdFlower (which would later sell for $300 million) - had locked himself in an Airbnb to implement backpropagation by hand, built robots on Raspberry Pis in his garage, and then talked his way into an internship at one of the most competitive AI labs in existence.
What he found there was a universal frustration. Tracking experiments was manual, slow, and unreliable. The infrastructure for doing rigorous AI research well simply didn't exist as a product. So he built it - with Chris Van Pelt (his co-founder from CrowdFlower) and Shawn Lewis (a veteran Google engineer) - and called it Weights & Biases.
The name is a reference to the two core parameter types in any neural network. Simple, literal, and - if you've trained a model - immediately meaningful: these are the numbers you're always chasing.
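What those two parameter types actually do can be shown in a few lines of plain Python - a single neuron, where `w` holds the weights and `b` the bias (the values here are arbitrary illustrations):

```python
def neuron(x, w, b):
    """One neuron's forward pass: y = w . x + b.

    The weights w scale each input; the bias b shifts the result.
    Training a network means adjusting exactly these numbers."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

# Arbitrary example: two inputs, two weights, one bias.
y = neuron(x=[1.0, 2.0], w=[0.5, -0.25], b=0.1)
print(y)  # 1.0*0.5 + 2.0*(-0.25) + 0.1 = 0.1
```

Everything a training run does - every gradient step, every logged loss value - is in service of nudging these weights and biases toward values that work.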
"I spent time building robots in my garage and then locked myself in an Airbnb to implement backpropagation by hand. The thing I kept running into was: how do you actually track what's working?" - Lukas Biewald
Add five lines of Python to a training script and W&B takes over. It logs every metric, captures system stats, saves hyperparameters, versions your dataset, and produces interactive charts you can share with anyone on your team - no setup required on their end.
That's the original product. But the platform has grown considerably. W&B Weave handles the LLM side: you trace an agent's outputs, compare responses across model versions, run automated evaluations, estimate costs before they show up on a cloud bill. W&B Inference gives access to open-source foundation models through an OpenAI-compatible API. And W&B Training adds serverless infrastructure for reinforcement learning post-training - the technique behind most modern alignment and reasoning model improvements.
The platform is built around a simple conviction: if you can't reproduce a result, you don't really understand it. Every feature traces back to that idea.
W&B Models: The original. Experiment tracking, hyperparameter sweeps, model registry, dataset versioning. The reason 1 million developers installed the wandb Python package in the first place.
W&B Weave: LLMOps for the generative AI era. Trace your agents, evaluate outputs, compare models side by side, and catch regressions before they hit production.
W&B Inference: OpenAI-compatible API with access to leading open-source foundation models. Usage tracking and cost monitoring included. Fine-tune, evaluate, and deploy in one place.
W&B Training: Post-training infrastructure for LLMs via reinforcement learning. Serverless, auto-scaling, built for multi-turn agentic tasks. Currently in public preview.
W&B Registry: Collaborative model and dataset hub for teams. Version everything. Share internally or publicly. The artifact store that keeps your ML reproducible.
Mission Control: The new integration layer that connects W&B's developer tooling directly to CoreWeave's GPU infrastructure. Early preview, but this is where the acquisition thesis plays out.
OpenAI trained early models with W&B. Meta logs experiments there. NVIDIA is both a customer and an investor. Among the top 100 AI labs globally, seven in ten use the platform. And 80% of the top 50 universities send their PhD students into a world where W&B is already the default tool.
In March 2025, CoreWeave - the GPU cloud company that went public the same month - announced it was acquiring Weights & Biases for $1.7 billion. The deal closed in May 2025.
The strategic logic is clear: CoreWeave sells raw GPU compute. W&B is the software layer that makes GPU compute actually useful for AI developers. Together, they offer something neither could alone - infrastructure with intelligence baked in, from training run to model registry to deployment.
The first concrete product of the partnership was Mission Control, announced at the Fully Connected Conference in June 2025. It's an integration layer between W&B's tooling and CoreWeave's infrastructure - the beginning of a vertically integrated AI development stack that neither pure cloud providers nor pure software companies can currently match.
W&B Inference, launched in the same announcement, gives developers OpenAI-compatible API access to open-source models, with all the tracking and evaluation tools W&B is known for. For teams tired of black-box APIs, it's a genuinely different proposition.
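"OpenAI-compatible" means the standard openai client works once pointed at the service. A sketch of what that looks like - the base URL, API key, and model id below are placeholders, not W&B's actual values; substitute the ones from your account:

```python
from openai import OpenAI

# Placeholder endpoint and credential -- an OpenAI-compatible service
# is addressed by overriding base_url on the standard client.
client = OpenAI(
    base_url="https://inference.example.com/v1",  # placeholder endpoint
    api_key="YOUR_API_KEY",                       # placeholder credential
)

# Placeholder model id; the service lists the open-source models it hosts.
response = client.chat.completions.create(
    model="some-open-source-model",
    messages=[{"role": "user", "content": "Hello"}],
)
print(response.choices[0].message.content)
```

The appeal is that the same client code works against any compatible endpoint, while W&B layers its usage tracking and evaluation tooling on top.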
Stanford CS. Yahoo engineer. CrowdFlower co-founder ($300M exit, 2019). OpenAI intern. Robot builder. Backpropagation-by-hand practitioner. Second-time AI tools founder.
Co-founded CrowdFlower with Biewald. Brought a decade of experience building ML infrastructure tools and data labeling systems to W&B from day one.
Former Google engineer who brought deep systems engineering expertise to the founding team. The technical backbone of the early W&B infrastructure.
Lukas Biewald is, by any measure, a repeat founder. His first machine learning tools company - CrowdFlower, later rebranded Figure Eight - was founded in 2007 and sold to Appen in 2019 for $300 million. That's a long arc: twelve years building a data labeling and crowdsourcing platform before anyone was calling it MLOps.
What he did before founding W&B is the part worth paying attention to. Rather than jumping straight into another company, he went back to school - not literally, but close. He built robots in his garage. He implemented backpropagation by hand in an Airbnb. He took an internship at OpenAI. He wanted to understand deep learning not as an investor or executive but as a practitioner. The company he then built reflects that experience directly.
W&B's product decisions trace back to what Biewald found frustrating at OpenAI and what his own garage experiments revealed: reproducibility is not optional, tracking should be automatic, and the tools ML engineers use should be as good as the tools software engineers have had for decades.
W&B was founded behind a karate studio in San Francisco. The metaphor - discipline, repetition, getting the fundamentals right - turned out to fit the product perfectly.
Lukas Biewald took an internship at OpenAI as an adult, years after founding his first company. The experiment tracking problem he encountered there became the first W&B feature.
40% of W&B's team holds PhDs. For a SaaS company, this is unusual. For a company whose customers are AI researchers, it makes every support ticket slightly more interesting.
45% of the team is women - well above the industry average for technical AI companies. The company has maintained this without making it the centerpiece of every press release.
The wandb Python package has been installed hundreds of millions of times via PyPI. That's not a marketing claim - it's in the download logs. pip install wandb is a daily ritual for ML teams worldwide.
80% of the top 50 universities use W&B. This means the majority of the next generation of AI researchers are learning ML with W&B as a default assumption - similar to how earlier generations learned with MATLAB.
Nat Friedman (ex-GitHub CEO) and Daniel Gross co-led the 2023 strategic round. These are not passive financial investors - both are active in the AI ecosystem and had worked closely with the founders.
The acquisition by CoreWeave closed in May 2025 - just 8 years after the company was founded. The name "Weights & Biases" refers to the parameters in a neural network. The founders still think like engineers.