Jonathan Gornet
AI Systems Engineer · Ph.D. in Systems Science & Mathematics, Washington University in St. Louis
Modern AI systems are extraordinarily capable but remarkably inefficient. Training and deployment consume massive computational and labor resources, creating barriers to iteration and scalability. As noted in Scientific American, the energy used to power large AI models is projected to reach 85.4 terawatt-hours per year by 2027, enough to power more than ten million Tesla Model 3s annually. These costs highlight a deeper issue: AI progress is now constrained less by capability than by efficiency.
My work focuses on lowering those costs through automation and efficiency. I design systems that make AI training faster, more stable, and easier to deploy. My portfolio includes AI-Mission-Control, infrastructure for rapid training deployment across GPU server clusters and edge devices; RobotBandit, a suite of algorithms for black-box optimization; and HyperController, an autonomous trainer for efficient learning that achieves 1000× faster convergence while surpassing state-of-the-art baselines.
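To make the black-box-optimization setting concrete: the optimizer sees only noisy evaluations of an objective (e.g., validation loss for a hyperparameter choice), never its gradient. The sketch below is not RobotBandit or HyperController; it is a minimal epsilon-greedy bandit over a discrete candidate set, illustrating the family of problems these tools address. All names here (`epsilon_greedy_search`, `noisy_loss`) are hypothetical.

```python
import random

def epsilon_greedy_search(objective, candidates, steps=200, epsilon=0.1, seed=0):
    """Minimize a noisy black-box objective over discrete candidates.

    Explores a random candidate with probability epsilon; otherwise
    exploits the candidate with the lowest running-average loss.
    """
    rng = random.Random(seed)
    counts = [1] * len(candidates)
    means = [objective(c) for c in candidates]  # warm start: try each once
    for _ in range(steps):
        if rng.random() < epsilon:
            i = rng.randrange(len(candidates))                        # explore
        else:
            i = min(range(len(candidates)), key=means.__getitem__)    # exploit
        loss = objective(candidates[i])
        counts[i] += 1
        means[i] += (loss - means[i]) / counts[i]  # incremental mean update
    return candidates[min(range(len(candidates)), key=means.__getitem__)]

# Toy objective: a noisy loss minimized at learning rate 0.1.
def noisy_loss(lr, _rng=random.Random(42)):
    return (lr - 0.1) ** 2 + _rng.gauss(0, 0.001)

best_lr = epsilon_greedy_search(noisy_loss, [0.001, 0.01, 0.1, 1.0])
```

Real hyperparameter controllers differ in how they trade exploration against exploitation and in operating online during training rather than between runs, but the feedback loop (propose, observe a noisy score, update) is the same.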
The result: AI models that can be trained, tested, and deployed at scale in days, not months. By stabilizing learning and automating infrastructure, I make AI not just smarter but field-ready.
