AI has been hogging all the headlines over the last three years, but robotics has quietly made some impressive leaps as well.
Sony AI published research in Nature this week introducing Ace, the first robot to beat elite and professional table tennis players under official match conditions. The research, titled “Outplaying Elite Table Tennis Players with an Autonomous Robot,” marks the first time any robot has reached human expert-level competitive play in a commonly played physical sport — a benchmark that roboticists have been chasing since 1983.

What Makes Ace Different
Previous table tennis robots were largely limited to cooperative rallying and never surpassed amateur-level competitive play. Ace clears both bars by a wide margin.
The system combines nine frame-based cameras with three event-based vision sensors — hardware produced by Sony Semiconductor Solutions — to track the ball at 200 Hz with millimeter accuracy and measure spin at up to 700 Hz. Its end-to-end latency is 20.2 milliseconds, compared to roughly 230 milliseconds for elite human players. A custom-built eight-degree-of-freedom robotic arm, made from lightweight alloys, delivers the reach and acceleration needed to execute competitive shots.
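A quick back-of-envelope calculation puts those latency figures in perspective. The latencies below come from the numbers quoted above; the 25 m/s smash speed is an assumed value for illustration, not a figure from the paper:

```python
# Reaction-distance comparison. Latencies are from the figures quoted above;
# the ball speed is an assumed, illustrative value.
latency_robot_s = 0.0202   # Ace's end-to-end latency
latency_human_s = 0.230    # rough elite-human visuomotor latency
ball_speed_mps = 25.0      # assumed smash speed (not from the paper)

# Distance the ball covers before each player can react to new information.
robot_dist = ball_speed_mps * latency_robot_s   # about half a meter
human_dist = ball_speed_mps * latency_human_s   # several meters at this speed

# Sensor update periods, for comparison against the latency budget.
frame_period_ms = 1000 / 200   # 5 ms between position updates
spin_period_ms = 1000 / 700    # roughly 1.4 ms between spin updates

print(robot_dist, human_dist, frame_period_ms, spin_period_ms)
```

At that assumed speed, a ball would cross the entire table well inside a human's reaction window, which is why elite human play leans so heavily on anticipation; a system reacting in ~20 ms can instead respond within the ball's flight.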
Crucially, none of Ace’s striking skills were hand-coded. The control system was trained entirely in simulation using reinforcement learning, then transferred directly to the real robot without additional fine-tuning — a technically demanding feat given the extreme speeds involved.
The Results
In the evaluation underlying the Nature paper (April 2025), Ace faced five elite and two professional players under International Table Tennis Federation rules, with licensed umpires officiating. It won three of five matches against the elite players and kept the remaining matches close. Ace served 16 direct aces against its elite opponents, who managed just eight against it.
The team continued testing after publication. In December 2025, Ace defeated both elite players and one professional in a new round of matches. By March 2026, it had beaten each of the three professional opponents it faced at least once — with noticeably faster shot speeds and more aggressive placement near the table edges than in earlier evaluations.
The Architecture Behind It
The learning system draws on a technique Sony AI previously used in Gran Turismo Sophy, its superhuman racing AI: a privileged critic. During simulation training, this critic has access to perfect physics data the real robot will never see. The policy itself learns only from realistic sensor inputs — but with a better-informed critic guiding it, the policy gradually learns to fuse sensor data and anticipate ball trajectories on its own.
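The privileged-critic idea can be sketched in miniature. The toy problem below is an illustrative reconstruction, not Sony AI's actual training code: the "privileged" state holds a ball's true position and velocity, the policy's sensors report only a noisy position, and the critic (which sees the full state) supplies a low-variance baseline for the policy-gradient update:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the setup described above (illustrative only): the task is
# to aim at where the ball will be (pos + vel), but the policy observes only a
# noisy position. The critic gets the privileged full state.
def sample_state():
    return np.array([rng.uniform(-1, 1), rng.uniform(-1, 1)])  # (pos, vel)

def sensor_obs(state):
    return np.array([state[0] + rng.normal(0, 0.05), 1.0])     # noisy pos + bias

def reward(state, action):
    return -(action - (state[0] + state[1])) ** 2  # best when action hits the ball

theta = np.zeros(2)   # actor weights: sees sensor features only
w = np.zeros(3)       # critic weights: sees privileged (pos, vel, bias)
sigma = 0.3           # fixed exploration noise
alpha_actor, alpha_critic = 0.01, 0.1

def evaluate(n=2000):
    return float(np.mean([reward(s, theta @ sensor_obs(s))
                          for s in (sample_state() for _ in range(n))]))

before = evaluate()
for _ in range(10_000):
    s = sample_state()
    obs = sensor_obs(s)
    mean = theta @ obs
    a = mean + sigma * rng.normal()
    r = reward(s, a)

    feats = np.array([s[0], s[1], 1.0])   # privileged critic input
    adv = r - w @ feats                   # low-variance advantage estimate
    w += alpha_critic * adv * feats       # critic regresses toward the return
    theta += alpha_actor * adv * (a - mean) / sigma**2 * obs  # policy gradient
after = evaluate()

print(f"mean reward before: {before:.2f}, after: {after:.2f}")
```

The asymmetry is the point: the critic's privileged inputs never touch the deployed policy, which must act from sensors alone, yet the better-informed baseline steers it toward implicitly estimating the unobserved state, mirroring the sensor fusion described in the quote below.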
“I didn’t think this was possible at all,” said Peter Dürr, Sony AI’s Director in Zürich and the project lead. “But with this kind of privileged information fed to the critic, it turns out the policy can learn how to do sensor fusion and anticipate the trajectory of a table-tennis ball.”
The project took five years, beginning in 2020 as one of Sony AI’s earliest research efforts. It progressed from simply keeping the ball in play, to cooperative rallies, to competitive matches against increasingly stronger opponents.
Why It Matters Beyond Sport
The implications reach well past table tennis. OpenAI has begun hiring for robotics roles, and the broader industry is watching physical AI closely. Ace demonstrates something the field has long struggled with: a robot that can see, decide, and act within the same time window humans operate in — in an uncontrolled, adversarial environment.
“Once AI can operate at an expert human level under these conditions, it opens the door to an entirely new class of real-world applications that were previously out of reach,” said Peter Stone, Chief Scientist at Sony AI.
The team is candid that Ace hasn’t reached world-champion level. It still tends to hit the ball earlier after the bounce than top players would, narrowing its shot variety. And certain extreme smashes still expose subtle gaps between simulation and reality — the physics model slightly overestimates aerodynamic drag on very fast shots, causing the robot to expect the ball to drop faster than it actually does.
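That drag mismatch can be illustrated with a toy one-dimensional simulation (the coefficients are assumed, purely for illustration): with an overestimated drag constant, the simulated ball slows more quickly than the real one, so a policy trained on it expects very fast shots to dip sooner than they actually do.

```python
def distance_with_drag(v0, k, t, dt=1e-4):
    """Horizontal distance under quadratic drag dv/dt = -k*v*v (toy model)."""
    x, v = 0.0, v0
    for _ in range(int(t / dt)):
        v -= k * v * v * dt
        x += v * dt
    return x

# Assumed, illustrative numbers; not coefficients from the paper.
sim_x = distance_with_drag(v0=30.0, k=0.12, t=0.1)   # drag overestimated
real_x = distance_with_drag(v0=30.0, k=0.08, t=0.1)  # closer to the real ball

# sim_x < real_x: the simulated ball falls behind the real one, so a policy
# trained on the higher-drag model misjudges where very fast shots arrive.
print(sim_x, real_x)
```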
But the milestone stands. Sony AI has placed Ace in the lineage of landmark AI achievements — Deep Blue in 1997, AlphaGo in 2016, GT Sophy in 2022 — with one key distinction: this one happens in the physical world, in real time, against a human opponent standing on the other side of a net.