AI Is Helping Referee Games in Major Sports Leagues, but Limitations Remain

Basketball fans have recently gotten a glimpse of the future of artificial intelligence in sports. During many recent game broadcasts, the National Basketball Association (NBA) has displayed real-time calculations of how far a player was standing from the hoop as they attempted a three-point shot. Similar to how baseball broadcasts display pitch speeds, these graphics add an extra layer of intrigue for fans watching from home.

But in some cases, the figures have been wrong. During a game last December, viewers were told that Peyton Watson of the Denver Nuggets made a corner shot standing 30 feet from the hoop—a distance that, in reality, would have placed him off the court, behind the opposing team’s bench.
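A quick geometry check shows why that number was impossible. The sketch below uses the NBA’s published court dimensions (50 feet wide, with the hoop’s center about 5.25 feet in from the baseline); these are public specifications, not data from the broadcast system itself:

```python
import math

# Public NBA court specifications (feet), not broadcast-system data.
COURT_WIDTH = 50.0       # sideline to sideline
HOOP_TO_BASELINE = 5.25  # hoop center sits ~5.25 ft in from the baseline

# Put the hoop at the origin. A shooter in the corner can stand at most
# half the court's width away laterally, right at the baseline.
corner_x = COURT_WIDTH / 2   # 25 ft to the sideline
corner_y = HOOP_TO_BASELINE  # baseline sits behind the hoop

max_corner_distance = math.hypot(corner_x, corner_y)
print(f"Farthest in-bounds corner spot: {max_corner_distance:.1f} ft")
# Prints ~25.5 ft, so a "30-foot corner shot" would indeed put the
# shooter several feet out of bounds.
```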

Errors such as this are a stark illustration of the limitations of AI-based motion-capture technology, which is being rolled out for high-stakes uses across sports. Several major sports leagues, including the NBA, Major League Baseball (MLB), the Association of Tennis Professionals (ATP) and some European soccer leagues, have begun using or testing AI-based technology to help call the shots. While such AI systems can make calls more reliable and engage fans in new ways, inherent drawbacks could prevent them from being fully implemented in the big leagues.


For most of sports history, the task of officiating has fallen largely to humans. People have been responsible for determining whether a ball was out of bounds or whether a player was offside. Over time, technologies such as instant replay have provided referees with helpful information to make their calls. But the decisions still largely fall to human refs—and the human error that comes with them.

That’s where artificial intelligence comes in. In the mid-2000s, tennis became one of the first sports to use motion capture and computer algorithms to determine whether a ball landed out. Today’s system, maintained by the motion-capture company Hawk-Eye Innovations, is so much more accurate than humans that line judges will be eliminated entirely in ATP matches by 2025.

As the underlying tech improves, it’s no surprise that other leagues are looking for ways to leverage it. MLB’s automated ball-strike system (ABS), which has been undergoing Minor League testing since 2019, uses motion capture and AI algorithms to determine whether a pitch falls inside the strike zone, ostensibly more accurately than human eyes can manage.
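MLB has not published ABS’s internals, but once the tracking stage has estimated where the ball crosses the front of home plate, the final call reduces to a containment check. A minimal sketch, with the zone’s top and bottom heights assumed for illustration (the real system sets the zone per batter):

```python
from dataclasses import dataclass

@dataclass
class StrikeZone:
    """Simplified 2-D strike zone at the front plane of home plate.
    The plate's 17-inch width is a real spec; the vertical bounds
    here are illustrative stand-ins for a batter-specific zone."""
    half_width_ft: float = 17 / 2 / 12  # half of 17 inches, in feet
    bottom_ft: float = 1.5              # assumed knee height
    top_ft: float = 3.5                 # assumed upper bound

def call_pitch(x_ft: float, z_ft: float, zone: StrikeZone) -> str:
    """Classify a tracked pitch location: x is the horizontal offset
    from the plate's centerline, z is the height above the ground."""
    inside = (abs(x_ft) <= zone.half_width_ft
              and zone.bottom_ft <= z_ft <= zone.top_ft)
    return "strike" if inside else "ball"

zone = StrikeZone()
print(call_pitch(0.2, 2.8, zone))  # strike: over the plate, mid-zone
print(call_pitch(1.1, 2.8, zone))  # ball: roughly 5 inches outside
```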

But MLB and the NBA have both encountered a key problem with these real-time motion-capture applications: they must often either take too long to reach an accurate decision or sacrifice accuracy for speed, says Meredith Wills, a sports data scientist focusing on baseball at SMT (SportsMEDIA Technology), a tech company specializing in sports graphics and broadcasts. Depending on the complexity of the decision, these AI tools can’t always keep up with the fast-paced action on the field or court, she says. The robot umpire system will sometimes “spin its wheels” on these tougher calculations, some of which can take multiple seconds, whereas a human umpire can take less than one second to call a ball or a strike.

Delays are common and significant enough that, according to multiple sources who spoke with Scientific American, Minor League umpires using ABS in trials have been given discretion to abandon it and call the game themselves if they feel the system is interfering with the pace of play.
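That arrangement is essentially a deadline with a human fallback. Below is a minimal sketch of the pattern; the function names and the one-second deadline are assumptions for illustration, not anything MLB has described:

```python
import concurrent.futures

def call_with_deadline(automated_call, human_call, deadline_s: float = 1.0):
    """Hypothetical pattern: accept the automated system's call if it
    arrives within the deadline; otherwise defer to the human umpire."""
    pool = concurrent.futures.ThreadPoolExecutor(max_workers=1)
    try:
        future = pool.submit(automated_call)
        try:
            return future.result(timeout=deadline_s)
        except concurrent.futures.TimeoutError:
            return human_call()  # the system is "spinning its wheels"
    finally:
        # Don't hold up the game waiting for the slow computation.
        pool.shutdown(wait=False)
```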

A spokesperson for MLB told Scientific American that delayed calls represent a “small fraction” of pitches and that MLB testing has identified the causes of the slowness but declined to comment further.

These long processing times can be the result of a playing field full of visual “distractions,” Wills says. On the basketball court, for instance, in order to identify and track the ball, computer algorithms must separate it out from 10 moving players and their limbs.

“The computer might not find the ball as easily as you’d like,” Wills says. Visual quirks such as lighting changes, background color and spectator movements in the stands can also throw off the computer’s calculations. “It [might] misidentify somebody’s hat as the ball,” she says. “Because of that, your tracking can end up off.”
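The failure mode Wills describes is easy to reproduce with a naive tracker. The sketch below is a deliberately simplified, single-camera ball finder that keys on color and roundness; real systems such as Hawk-Eye fuse many calibrated cameras and learned detectors, but the toy version shows how a round orange hat could slip through:

```python
import cv2
import numpy as np

def find_ball_candidates(frame_bgr: np.ndarray) -> list[tuple[int, int, int]]:
    """Naive detector: threshold an orange-ish color band, then keep
    roughly circular blobs. Anything else orange and round in frame
    (a hat, a shoe, a logo) becomes a false candidate."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (5, 120, 120), (20, 255, 255))  # rough orange band
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for contour in contours:
        area = cv2.contourArea(contour)
        if area < 50:  # ignore tiny specks and noise
            continue
        (x, y), radius = cv2.minEnclosingCircle(contour)
        # A true circle fills ~100% of its enclosing circle's area.
        circularity = area / (np.pi * radius * radius + 1e-9)
        if circularity > 0.7:
            candidates.append((int(x), int(y), int(radius)))
    return candidates
```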

As a result, human intervention is often required to verify the calls. But human intervention can also lead to inaccuracies. That is the case with the video assistant referee (VAR) systems currently used in European soccer leagues such as the English Premier League and Spanish LaLiga to help officials determine whether a player is offside, explains Pooya Soltani, who studies games technology at Staffordshire University in England. With VAR, a separate (human) operator helps review video footage, and a referee is responsible for the final decision.

Judging offsides in soccer requires knowing the location of players at the precise moment a ball is kicked. But in a study presented at a 2022 conference, Soltani found that while viewing the same kinds of replay angles used by actual officials, respondents thought the ball was kicked 132 milliseconds (about an eighth of a second) later on average than it actually was because of a combination of limits in human perception and video technology.

“The delay might not seem significant, but at high speeds, it can result in considerable [errors]” and an incorrect call, Soltani says. “The interpretation of these close calls tends to be subjective, and human perception may introduce errors in judgment.”
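To put that figure in perspective, here is a back-of-the-envelope calculation, assuming an attacker sprinting at about 8 meters per second (roughly 29 km/h; the speed is an assumption for illustration, not a number from the study):

```python
# How far does a sprinting attacker move in the 132 ms by which
# viewers misjudged the moment of the kick?
delay_s = 0.132              # average misjudgment from Soltani's study
sprint_speed_m_per_s = 8.0   # assumed elite sprinting pace (~29 km/h)

error_m = sprint_speed_m_per_s * delay_s
print(f"Positional error: {error_m:.2f} m")  # ~1.06 m
# Offside calls are routinely decided by centimeters, so a meter-scale
# displacement is easily enough to flip a decision.
```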

Similar problems could arise from the NBA’s use of Hawk-Eye’s motion capture to aid with certain calls. The league has already begun using it this season for goaltending reviews and is set to expand to other judgments such as out-of-bounds calls in the future—though these goaltending calls currently still require human assessment for the final decision, according to multiple league sources.

In cases such as the three-pointer graphics broadcast during the Nuggets game, where human review isn’t possible, the automated system could still have accuracy problems. Live betting, which the NBA recently rolled out in its app, is another such real-time application: these features may use real-time on-court information from motion-capture technology to inform live betting odds as viewers place in-game bets directly through streaming apps. In these cases, any inaccuracies in the AI-based outputs could have monetary consequences.

“All new technologies present both opportunities and challenges, and some early bumps in the road with the Hawk-Eye rollout do not diminish what we see as enormous upside to the system,” an NBA spokesperson told Scientific American. “We remain confident in the technology’s ability to improve not only the speed and accuracy of officiating decisions but also revolutionize the way fans experience our game.”

As these motion-capture systems become more sophisticated and are trained on more data, some of these technological limitations could lessen or disappear. For example, the models might get better at ignoring visual “distractions” on the playing field. And improvements in both hardware and software could help transmit data faster and minimize processing delays.

For now, though, when it comes to calling the shots, humans still play a vital role.

“The expectation is evident, with a perception of technology as the ultimate problem-solver. However, the reality is different,” Soltani says. “I believe the technology should be utilized as a tool to aid the decision-making process and not [replace] it.”



Source: Scientific American
