
Major League Baseball used the cameras of two iPhones to analyze the potential of prospects at its draft combine in Arizona last week. The league partnered with biomechanics company Uplift Labs, which used artificial intelligence to assess the candidates who agreed to be evaluated with the technology.

According to The Wall Street Journal, the process relies on AI that can “translate the images captured by the phone cameras into metrics that can quantify elements of player movement.” The data is believed to be useful for documenting a player’s specific movement patterns, spotting weaknesses, and even flagging potential injuries.

“We have metrics on things like kinematic sequence, stride length, ball contact timing,” Uplift founder Sukemasa Kabayama told WSJ. “At the same time, we also have this new kind of very early injury warning detection. Let’s say if you have too much of an arm flare, you know there may be potential overload on the elbow, which can unfortunately lead to Tommy John surgery.”
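The report does not detail Uplift’s algorithms, but the general idea of turning pose keypoints from phone video into metrics like the ones Kabayama mentions can be sketched in a few lines. The snippet below is a toy illustration only: the joint names, coordinates, and the “arm flare” angle used here are assumptions for the example, not Uplift’s actual model.

```python
# Illustrative only: toy versions of two metrics mentioned above (stride length
# and an "arm flare" angle) computed from 2D pose keypoints. The keypoint
# source, joint names, and geometry are assumptions, not Uplift's pipeline.
import math

# Hypothetical per-frame keypoints: {joint_name: (x, y)} in a shared coordinate frame.
Frame = dict[str, tuple[float, float]]

def distance(a: tuple[float, float], b: tuple[float, float]) -> float:
    """Euclidean distance between two keypoints."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def stride_length(frame_at_foot_strike: Frame) -> float:
    """Distance between the ankles at the moment the lead foot lands."""
    return distance(frame_at_foot_strike["left_ankle"],
                    frame_at_foot_strike["right_ankle"])

def arm_flare_angle(frame: Frame) -> float:
    """Angle (degrees) of the throwing upper arm relative to the shoulder line.
    Used here only as a toy proxy for possible elbow stress."""
    shoulder_l, shoulder_r = frame["left_shoulder"], frame["right_shoulder"]
    elbow_r = frame["right_elbow"]
    shoulder_line = (shoulder_l[0] - shoulder_r[0], shoulder_l[1] - shoulder_r[1])
    upper_arm = (elbow_r[0] - shoulder_r[0], elbow_r[1] - shoulder_r[1])
    dot = shoulder_line[0] * upper_arm[0] + shoulder_line[1] * upper_arm[1]
    norm = math.hypot(*shoulder_line) * math.hypot(*upper_arm)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example frame with made-up, normalized coordinates at foot strike:
frame = {
    "left_ankle": (0.10, 0.95), "right_ankle": (0.62, 0.96),
    "left_shoulder": (0.30, 0.40), "right_shoulder": (0.48, 0.41),
    "right_elbow": (0.60, 0.30),
}
print(f"stride length: {stride_length(frame):.2f} (normalized units)")
print(f"arm flare angle: {arm_flare_angle(frame):.1f} degrees")
```

In a real system, the keypoints would come from a pose-estimation model running on the video frames, and the per-frame numbers would be tracked over time and compared against baselines before anyone drew conclusions about injury risk.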

The report does not specifically explain why Uplift used iPhones rather than dedicated high-end cameras, but it notes that “MLB is betting that the convenience of the product will give it an easy foray into offering biomechanical analysis at the draft combine.”

“Biomechanics and the analysis associated with it is something that we know is a pretty significant piece of where the game is headed,” said Bill Francis, MLB’s senior director of baseball operations. “Traditionally, it has been very hard to do at scale because of the expensive hardware.” 

The use of iPhone cameras for such demanding work is no longer surprising. Compared with the camera systems of other smartphones, Apple’s is widely regarded as one of the best. Rather than relying on optics alone, iPhone cameras lean heavily on “computational photography,” which uses digital processing in place of purely optical processes. The system corrects flaws and makes adjustments with machine learning algorithms to produce cleaner images, and it can also recognize elements of a photo such as faces, objects, and scenery.
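To make “digital computation instead of optical processes” concrete, here is a minimal, self-contained sketch of one classic computational-photography technique: merging several noisy exposures of the same scene into a cleaner image. It illustrates the general idea only; Apple’s actual pipeline adds alignment, tone mapping, and learned models, and the frame counts and noise levels below are made up.

```python
# Toy multi-frame merge: averaging N aligned captures of the same scene
# reduces random sensor noise by roughly sqrt(N). Not Apple's pipeline.
import numpy as np

rng = np.random.default_rng(0)

# Simulate a "true" scene and several noisy captures of it.
scene = rng.uniform(0.0, 1.0, size=(64, 64))
noisy_frames = [scene + rng.normal(0.0, 0.1, size=scene.shape) for _ in range(8)]

# Merge by averaging the (assumed already-aligned) frames.
merged = np.mean(noisy_frames, axis=0)

print("noise (single frame):", float(np.std(noisy_frames[0] - scene)))
print("noise (merged frame):", float(np.std(merged - scene)))
```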

MLB and Uplift are not the first to take advantage of these capabilities. Researchers have turned to iPhone cameras before; in 2021, for instance, a study investigated the technical capabilities of the built-in LiDAR sensors of the iPad Pro 2020 and the iPhone 12 Pro for geosciences applications.

iPhone cameras could find even more real-world applications in the future. According to rumors, Apple will introduce camera upgrades in its upcoming iPhone 15 lineup, including a telephoto camera with a variable zoom lens, a feature already available on the Galaxy S24 Ultra. More camera improvements along these lines are expected in the iPhone 16. None of these claims have been confirmed, but given Apple’s track record, it is safe to expect iPhone cameras to keep improving over the years.
