Long before the phrase "synthetic data" entered the AI industry's vocabulary, McNamara was spending his days at Pixar building systems to make light bounce convincingly off imaginary surfaces. Then at Microsoft he was making virtual soldiers cast realistic shadows in digital warzones. Then inside Apple's Special Projects Group - the euphemism for a lab where engineers were quietly building the technology that would become the company's autonomous vehicle ambitions - he was doing something adjacent but stranger: generating synthetic environments good enough to train the perception systems of a machine that would one day navigate real roads.
The through-line is hard to ignore. At every stop, McNamara was building fake worlds that had to be indistinguishable from the real one. In 2017, he decided to make that the product.
Parallel Domain, the company he founded and still runs as CEO, is what happens when a computer graphics expert with Oscar-winning film credits meets the most pressing data problem in artificial intelligence. The platform generates synthetic labeled datasets, simulation environments, and controllable sensor feeds - giving autonomous vehicle developers, drone companies, and robotics teams the one thing they can't easily collect: a billion miles of edge cases.
The pitch is direct. In the real world, you can drive a million miles and never encounter a child chasing a ball across a fog-soaked mountain road at 3am. In Parallel Domain's virtual world, you can generate that scenario a thousand times before lunch, across every combination of lighting condition, vehicle speed, sensor configuration, and weather pattern you can imagine. The math of autonomous vehicle safety demands it.
McNamara calls it the "combinatorial explosion" problem: the universe of scenarios a perception model must handle is too vast to be covered by any reasonable real-world dataset. The only solution that scales is one you generate yourself.
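The arithmetic behind that explosion is easy to sketch: scenario parameters multiply. A minimal illustration, using hypothetical axes and values rather than anything from Parallel Domain's actual system:

```python
from itertools import product

# Hypothetical scenario axes (illustrative values only,
# not Parallel Domain's real parameter space)
lighting = ["dawn", "noon", "dusk", "night"]
weather = ["clear", "rain", "fog", "snow"]
speed_mph = [15, 30, 45, 60]
sensors = ["camera", "lidar", "camera+lidar"]

# Every combination is a distinct scenario a perception model must handle
scenarios = list(product(lighting, weather, speed_mph, sensors))
print(len(scenarios))  # 4 * 4 * 4 * 3 = 192 variants from four small axes
```

Four coarse axes already yield 192 scenarios; real systems add dozens more, many of them continuous, which is why no feasible amount of real-world driving covers the space.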