He Didn't Wait to Be Invited
Most people discover the internet in their early teens. Tommy McGlynn discovered its seams. At 12, he found a gap in an ad referral platform and walked through it - not to cause chaos, but because the mechanism was interesting. He made money from it. He learned how the web worked not from a textbook but from the inside of a system that wasn't expecting him.
That instinct - to probe the edges, understand the architecture, and ship something useful - has defined every chapter since. He studied Interactive Media Design at The Art Institute of California-Los Angeles, but the degree was almost beside the point. By then he already knew how to build things.
Our goal is to help developers successfully create and publish VR experiences.
- Tommy McGlynn, announcing the Oculus Developer Hub

Games, Internships, and Earning the Right to Be in the Room
Before the app economy rewrote everything, McGlynn cut his teeth in Flash games. At Meteor Games LLC and Six Degrees Games, he built the invisible stuff that matters most: user onboarding flows and real-time multiplayer infrastructure. Onboarding is where games die or survive. Real-time multiplayer is where engineering debt compounds fastest. He learned both in an era where neither term had a best-practices playbook.
A stint at Fox Interactive Media - early enough that social networks still felt experimental - gave him community chops to go with the code. Then Burstly, a mobile ad and distribution platform in Santa Monica, where he worked as a Client Integration Engineer, helping developers get their apps seen at a time when the App Store was still a new idea.
The Early Stack
Python, PHP, JavaScript, HTML, Flash, Java. He published a technical guide on debugging HTML/JavaScript on iOS with Safari 5 and 6 - the kind of thing you write when you've spent enough time in the trenches that you want the next person to suffer slightly less.
Apple, TestFlight, and Three WWDC Stages
The Apple years are the pivot point. McGlynn joined Apple and did what good engineers there do: built infrastructure that millions of people would depend on without ever knowing his name. He designed scalable TestFlight server architectures and multi-datacenter web services - the backend machinery that lets developers test apps across thousands of devices before a public release. It is not glamorous work. It is load-bearing work.
Then the stage. WWDC - Apple's annual developer conference, where the announcements get transcribed and re-read for weeks. McGlynn appeared as a speaker for three consecutive years. At WWDC 2018, during Session 301, "What's New in App Store Connect," he stepped up to introduce TestFlight public links: a way for developers to recruit testers with a simple shareable link, no email invitations or developer portal gymnastics required. A small feature that developers actually wanted. A specific detail that proves someone was paying attention.
If it sounds interesting to build AI characters that can listen, speak and move around, there may be a role for you.
- Tommy McGlynn, recruiting for Meta Reality Labs, 2025

Meta, Oculus, and the Developer Ecosystem Nobody Saw Coming
When McGlynn joined Facebook's Reality Labs, VR was a platform in search of developers. His instinct - same as it was at 12 - was to reduce the friction for people trying to build. He wrote the official Oculus Developer Hub launch post, announcing a unified tool for device management, performance monitoring, and casting. The pitch was practical: "help developers successfully create and publish VR experiences." No hyperbole. Just the job.
He moved into engineering management and took on VR Developer Acceleration - a team name that is exactly what it says. The goal: compress the time between a developer having an idea for a VR experience and that experience actually existing. That's the compounding bet on the metaverse that nobody makes explicitly, but that McGlynn has been making quietly since he joined.
AI Characters and the Crossroads of Two Big Ideas
Since 2022, he's led the AI Character Platform team at Meta Reality Labs. The work sits where AI and mixed reality intersect - and that intersection is more complicated than either field's boosters usually admit. Building a character that can listen means real-time speech processing. Building one that can speak means voice synthesis that doesn't feel dead. Building one that can move means animation systems that respond to context rather than follow a script. McGlynn's team is solving all three at once, for an environment - mixed reality - where the stakes of getting it wrong are visceral in a way that flat-screen experiences are not.
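The decomposition in that paragraph - listen, speak, move - can be sketched as a single perception-action loop. This is a purely illustrative sketch: every class name and signature here is an assumption for clarity, not Meta's actual architecture.

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    text: str

class SpeechRecognizer:
    """'Listen': stands in for real-time speech processing."""
    def transcribe(self, audio: bytes) -> Utterance:
        # Placeholder: a real system would run streaming ASR here.
        return Utterance(text=audio.decode(errors="ignore"))

class VoiceSynthesizer:
    """'Speak': stands in for voice synthesis."""
    def speak(self, reply: str) -> bytes:
        # Placeholder: a real system would return synthesized audio.
        return reply.encode()

class AnimationController:
    """'Move': animation that responds to context, not a fixed script."""
    def react(self, utterance: Utterance) -> str:
        return "nod" if utterance.text else "idle"

class AICharacter:
    """Composes all three subsystems; each frame is one step of the loop."""
    def __init__(self) -> None:
        self.ears = SpeechRecognizer()
        self.voice = VoiceSynthesizer()
        self.body = AnimationController()

    def step(self, audio: bytes) -> tuple[bytes, str]:
        heard = self.ears.transcribe(audio)                      # listen
        audio_out = self.voice.speak(f"You said: {heard.text}")  # speak
        gesture = self.body.react(heard)                         # move
        return audio_out, gesture
```

The point of the sketch is the coupling: all three subsystems consume the same perceptual input per frame, which is why the article frames solving them "all at once" as the hard part.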
In 2025, he was actively recruiting for Realtime Engine Technology engineers at Meta Reality Labs. The call was specific: people interested in AI characters that can listen, speak, and move. The word "exciting" appeared, but it was followed by the actual description of the work. That's a tell. Hype is for people who don't know what they're building. McGlynn knows what he's building.
Joshua Tree and the Other Bet
In 2018, the same year he was speaking at WWDC, McGlynn bought 2.5 acres of land in Joshua Tree, California for $25,000. He wrote about it on Medium: two articles, earnest and specific, documenting the reasoning behind an investment made on instinct and curiosity rather than a spreadsheet. Joshua Tree was not the obvious play. That was the point.
The same pattern runs through everything: find something interesting before it's crowded, understand its mechanics, build something in it. The ad platform at 12. Flash games when Flash was still viable. Mobile dev tooling at Burstly. TestFlight server infrastructure before TestFlight was a household name among iOS developers. VR developer tools before VR developer ecosystems were a proven thing. AI characters in mixed reality while everyone else argues about 2D chatbots. Twenty-five thousand dollars in the desert.
The through-line is not a grand theory. It's a working preference for being slightly ahead of the obvious answer.