
Lidar is dull on iPads, but could go beyond AR on the iPhone 12 Pro

Submitted by
Style Pass
2020-09-05 06:22:21

While many of Apple’s investments in innovative technologies pay off, some just don’t: Think back to the “tremendous amount” of money and engineering time it spent on force-sensitive screens, which are now in the process of disappearing from Apple Watches and iPhones, or its work on Siri, which still feels like it’s in beta nine years after it was first integrated into iOS. In some cases, Apple’s backing is enough to take a new technology into the mainstream; in others, Apple gets a feature into a lot of devices only for the innovation to go nowhere.

Lidar has the potential to be Apple’s next “here today, gone tomorrow” technology. The laser-based depth scanner was the marquee addition to the 2020 iPad Pro that debuted this March, and has been rumored for nearly two years as a 2020 iPhone feature. Recently leaked rear glass panes for the iPhone 12 Pro and Max suggest that Lidar scanners will appear in both phones, though they’re unlikely to be in the non-Pro versions of the iPhone 12. Moreover, they may be the only major changes to the new iPhones’ rear camera arrays this year.

If you don’t fully understand Lidar, you’re not alone. Think of it as an extra camera that rapidly captures a room’s depth data rather than traditional photos or videos. To users, visualizations of Lidar look like black-and-white point clouds focused on the edges of objects, but when devices gather Lidar data, they know the relative depth of each individual point, and can use that depth information to improve augmented reality, traditional photography, and various computer vision tasks. Unlike a flat photo, a depth scan offers a finely detailed differentiation of what’s close, mid-range, and far away.
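To make that close/mid-range/far differentiation concrete, here is a minimal sketch in Python. The depth map values and the 2 m / 4 m band thresholds are invented for illustration; real Lidar output (e.g. from a depth API on a device) would be far denser, but the idea of bucketing per-point distances is the same.

```python
import numpy as np

# Hypothetical 4x4 depth map (in meters), standing in for Lidar output:
# each value is the measured distance from the sensor to that point.
depth_map = np.array([
    [0.5, 0.6, 2.1, 2.0],
    [0.7, 0.8, 2.2, 5.5],
    [0.9, 2.3, 5.0, 6.0],
    [1.0, 2.4, 5.8, 6.2],
])

# Bucket every point into close / mid-range / far bands.
# The 2 m and 4 m cutoffs are arbitrary choices for this sketch.
bands = np.digitize(depth_map, bins=[2.0, 4.0])  # 0=close, 1=mid, 2=far
labels = np.array(["close", "mid", "far"])[bands]

print(labels[0])  # first row of the scene: ['close' 'close' 'mid' 'mid']
```

A photo gives each point only a color; a depth scan like this lets software separate a foreground subject from the background by distance alone, which is what powers improved portrait effects and AR occlusion.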
