Apple Adds LiDAR Scanner to iPhone 12 Pro for "Instant AR"

Apple today introduced its latest lineup of smartphones, including the iPhone 12 Pro and iPhone 12 Pro Max, both of which are equipped with a LiDAR scanner which will bolster AR capabilities.

Like the iPad Pro introduced earlier this year, Apple is now bringing a LiDAR scanner to its high-end smartphones, the new iPhone 12 Pro and 12 Pro Max.

LiDAR is a 'time of flight' depth sensor: it measures how long emitted light takes to bounce off objects in the scene and return to the sensor. With precise timing, that round-trip time yields the depth of each point, and richer depth information makes augmented reality experiences faster and more accurate.
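The time-of-flight calculation itself is straightforward. As a rough sketch (not Apple's actual implementation, which is not public), a round-trip time converts to a distance like this:

```python
# Hypothetical sketch of time-of-flight depth: the sensor times how long
# light takes to travel to a surface and back, then halves the distance.
C = 299_792_458.0  # speed of light in a vacuum, meters per second

def tof_depth(round_trip_seconds: float) -> float:
    """Distance to the surface; light covers the path twice (out and back)."""
    return C * round_trip_seconds / 2.0

# A round trip of ~6.67 nanoseconds corresponds to a surface about 1 m away,
# which illustrates why the timing must be extraordinarily precise.
print(tof_depth(6.67e-9))
```

Note how small the numbers involved are: resolving depth at centimeter scale means timing light pulses to within tens of picoseconds.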

While existing iPhones are already capable of pretty good AR tracking, the current approach infers depth from computer-vision techniques like SLAM, which tracks points in the scene over time. Typically this means the system needs a few seconds and some movement from the camera before it can understand its frame of reference and begin to assess the depth of the scene.

Apple says that LiDAR in the iPhone 12 Pro and 12 Pro Max means the phones will be capable of “instant AR.” That’s because LiDAR captures depth information in the equivalent of a ‘single photo’, without any phone movement or the need to compare images across time.

One way to think about it is to consider the pixels in a photograph. When you take a picture, every pixel captures color and brightness information. Similarly, every pixel of a 'LiDAR snapshot' captures a distance value. So rather than needing to wave your phone around for a few seconds before an AR app can establish accurate tracking, tracking can start immediately.
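The photo analogy can be made concrete with a toy example (illustrative only; real sensor data has far higher resolution and richer metadata):

```python
# A color photo stores a color value per pixel; a 'LiDAR snapshot' stores
# one distance value per pixel, on the same kind of grid.
photo = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]  # 2x2 image of (R, G, B) tuples

depth = [[1.2, 1.3],
         [0.9, 2.4]]                      # 2x2 depth map, distances in meters

# Same grid shape, but each cell answers a different question:
# "what color is this point?" vs. "how far away is this point?"
assert len(photo) == len(depth) and len(photo[0]) == len(depth[0])
```

Because every cell already holds a distance, an AR app gets usable geometry from a single capture instead of inferring it over time.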

Of course, you can also compare LiDAR depth data over time so that instead of a simple snapshot of depth you can build an entire depth-map of the scene. With LiDAR, you can ‘scan’ a space to create an accurate 3D map which can be very useful for augmented reality experiences. Building such 3D maps was possible before, but the increased depth accuracy of LiDAR will make them faster and more accurate.
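The idea of accumulating depth snapshots into a larger map can be sketched as follows. This is a heavily simplified, hypothetical illustration: real scanning pipelines use full 6-DoF camera poses and sensor intrinsics, while here each frame's pose is reduced to a simple translation offset:

```python
# Hypothetical sketch: merge per-frame depth points into one point cloud
# as the camera moves through a space ("scanning"). Each frame pairs a
# camera position offset with the 3D points measured from that position.
def fuse(frames):
    """frames: list of ((ox, oy, oz), [(x, y, z), ...]) in local coordinates.
    Returns all points shifted into a shared world coordinate frame."""
    cloud = []
    for (ox, oy, oz), points in frames:
        for (x, y, z) in points:
            cloud.append((x + ox, y + oy, z + oz))  # local -> world space
    return cloud

# Two captures from different positions combine into one map of the room.
world = fuse([
    ((0.0, 0.0, 0.0), [(0.0, 0.0, 2.0)]),  # point 2 m ahead of start
    ((1.0, 0.0, 0.0), [(0.0, 0.0, 2.0)]),  # same reading, 1 m to the right
])
```

The accuracy benefit the article describes comes from the per-frame depth being measured directly rather than inferred, so errors accumulate more slowly as the map grows.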

You can be sure that this same tech will find its way into Apple’s upcoming AR glasses. Seeing the sensor come to the company’s latest iPhones means that Apple is one step closer to shrinking the tech and making it power efficient enough to fit into a head-worn device. We also wouldn’t be surprised to see other companies in AR and VR begin building LiDAR sensors into their own devices.

Apple’s iPhone 12 Pro is priced at $1,000 and launches on October 23rd, while the larger iPhone 12 Pro Max is priced at $1,100 with a release date of November 13th. The company’s other newly introduced phones, the iPhone 12 and iPhone 12 mini, do not include the LiDAR sensor.