ToF vs. LiDAR: What’s the Difference?


Lately, there has been so much buzz around LiDAR on new Apple devices that it’s easy to forget mobile Augmented Reality can work any other way. But it can and does, particularly with ToF sensors reaching new heights in Samsung phones.

[Image: iPhone 12 Pro camera array with its LiDAR sensor]

Whether you are a developer, in the market for a new device, or just curious, it’s worth taking some time to unpack these acronyms and learn the ins and outs of mobile phone depth sensing.

What Is ToF?

ToF is short for Time of Flight.

Technically, ToF refers to using the speed of light (or even sound) to determine distance. It measures the time it takes for light (or sound) to leave the device, bounce off an object or surface, and return to the device. Multiplying that round-trip time by the speed of the signal and dividing by two reveals the distance from the device to the object or surface.

Depth sensing has its basis in simple math.
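
To make that math concrete, here is a minimal Python sketch of the calculation. The function name and example timing are illustrative, not taken from any vendor’s SDK:

```python
# Sketch of the time-of-flight distance calculation described above.
# The pulse travels to the object and back, so the one-way distance is
# half the round-trip time multiplied by the speed of the signal.

SPEED_OF_LIGHT_M_PER_S = 299_792_458  # speed of light in a vacuum

def tof_distance(round_trip_seconds: float,
                 speed: float = SPEED_OF_LIGHT_M_PER_S) -> float:
    """Distance in meters from a round-trip time-of-flight measurement."""
    return speed * round_trip_seconds / 2

# Example: a pulse that returns after ~6.67 nanoseconds hit something ~1 m away.
print(tof_distance(6.67e-9))  # ≈ 1.0 m
```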

So, the relationship is that all LiDAR is a type of Time of Flight, but not all Time of Flight is LiDAR. To keep things simple, when we talk about “ToF,” we mean optical distance measurement, not including LiDAR.

So, if LiDAR and optical non-LiDAR ToF both use light for distance determination and 3D mapping, how are they different?

What Is LiDAR?

LiDAR is short for Light Detection and Ranging. This technology uses a laser, or a grid of lasers, as the light source in the equation detailed above.

A single LiDAR reading can be used to measure things like the width of a room, but multiple LiDAR readings can be used to create “point clouds.” These can be used to create three-dimensional models of objects or topographical maps of whole areas.
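
As a rough illustration of how individual readings become a point cloud, the sketch below converts assumed spherical readings (azimuth, elevation, range) into 3D points. The data format and names are hypothetical, since real scanners vary:

```python
# Hedged sketch: turning multiple LiDAR-style range readings into a
# "point cloud" (a plain list of 3D points).
import math

def reading_to_point(azimuth_rad: float, elevation_rad: float, range_m: float):
    """Convert one spherical range reading into a Cartesian 3D point."""
    x = range_m * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = range_m * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = range_m * math.sin(elevation_rad)
    return (x, y, z)

# A sweep of readings across 90 degrees at a fixed 2.5 m range.
readings = [(math.radians(a), 0.0, 2.5) for a in range(0, 90, 10)]
point_cloud = [reading_to_point(az, el, r) for az, el, r in readings]
print(len(point_cloud), "points")
```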

While LiDAR may be new to mobile devices, the technology itself has been around for quite a while. In non-mobile settings, LiDAR is used to do everything from mapping underwater environments to discovering archaeological sites.

How Are LiDAR and ToF Different?

[Image: a depth map generated from an image]

The functional difference between LiDAR and other forms of ToF is that LiDAR uses pulsed lasers to build a point cloud, which is then used to construct a 3D map or image. ToF applications create “depth maps” based on light detection, usually through a standard RGB camera.

The advantage of ToF over LiDAR is that ToF requires less specialized equipment, so it can be used in smaller and less expensive devices. The advantage of LiDAR comes from the ease with which a computer can read a point cloud compared to a depth map.
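
The contrast between the two representations is easier to see in code. The hedged sketch below un-projects a depth map (a 2D grid of distances per pixel) into a point cloud (an unordered list of 3D points) through assumed pinhole-camera intrinsics; all values are made up for illustration:

```python
# A depth map is a 2D grid of distances; a point cloud is a list of 3D
# points. Un-projecting the former through (assumed) camera intrinsics
# fx, fy, cx, cy produces the latter.

def depth_map_to_point_cloud(depth, fx=500.0, fy=500.0, cx=2.0, cy=2.0):
    points = []
    for v, row in enumerate(depth):      # v = pixel row
        for u, z in enumerate(row):      # u = pixel column, z = depth in meters
            if z > 0:                    # skip pixels with no depth reading
                points.append(((u - cx) * z / fx, (v - cy) * z / fy, z))
    return points

# A tiny 4x4 depth map (meters); zeros mean "no reading".
depth_map = [
    [1.2, 1.2, 1.3, 0.0],
    [1.2, 1.1, 1.3, 1.4],
    [1.1, 1.1, 1.2, 1.4],
    [0.0, 1.0, 1.2, 1.3],
]
cloud = depth_map_to_point_cloud(depth_map)
```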

The Depth API that Google created for Android devices works best on ToF-enabled devices and works by creating depth maps and recognizing “feature points.” These feature points, often boundaries between different light intensities, are then used to identify different planes in the environment. This essentially creates a lower-resolution point cloud.
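
Google hasn’t published the Depth API’s internals, but the idea of feature points as intensity boundaries can be sketched roughly like this; it’s a deliberately naive gradient threshold, not the actual algorithm:

```python
# Rough illustration (not Google's actual Depth API) of picking
# "feature points" where neighboring pixel intensities change sharply.

def find_feature_points(gray, threshold=40):
    """Return (row, col) pixels where the horizontal or vertical
    intensity difference exceeds the threshold."""
    points = []
    for r in range(1, len(gray)):
        for c in range(1, len(gray[r])):
            dx = abs(gray[r][c] - gray[r][c - 1])
            dy = abs(gray[r][c] - gray[r - 1][c])
            if max(dx, dy) > threshold:
                points.append((r, c))
    return points

# A small grayscale patch with a bright/dark boundary down the middle.
patch = [
    [200, 200, 30, 30],
    [200, 200, 30, 30],
    [200, 200, 30, 30],
]
print(find_feature_points(patch))  # boundary pixels around column 2
```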

How ToF and LiDAR Work with Mobile AR

Depth maps and point clouds are cool, and, for some people and applications, they’re enough. However, for most AR applications, this data has to be contextualized. Both ToF and LiDAR do this by working together with other sensors on the mobile device. Specifically, these platforms need to understand your phone’s orientation and movement.

Making sense of the device’s location within a mapped environment is called Simultaneous Localization and Mapping, or “SLAM.” SLAM is used in other applications, like autonomous vehicles, but it is essential for mobile AR applications that place digital objects in the physical environment.
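
A full SLAM system is far beyond a short example, but the localization half can be caricatured: the device keeps a pose estimate, updates it from motion-sensor deltas, and uses it to express world points in its own frame. Everything below is a simplified, illustrative sketch:

```python
# Drastically simplified "localization" sketch: a 2D pose updated by
# dead reckoning. Real SLAM fuses camera and IMU data with far more
# sophistication; all names and numbers here are illustrative.
import math

class DevicePose:
    def __init__(self):
        self.x, self.y, self.heading = 0.0, 0.0, 0.0  # world position + yaw

    def apply_motion(self, forward_m: float, turn_rad: float):
        """Update the pose from one motion-sensor sample."""
        self.heading += turn_rad
        self.x += forward_m * math.cos(self.heading)
        self.y += forward_m * math.sin(self.heading)

    def world_to_device(self, wx: float, wy: float):
        """Express a world-space point in the device's local frame."""
        dx, dy = wx - self.x, wy - self.y
        cos_h, sin_h = math.cos(-self.heading), math.sin(-self.heading)
        return (dx * cos_h - dy * sin_h, dx * sin_h + dy * cos_h)

pose = DevicePose()
pose.apply_motion(1.0, math.radians(90))  # walk 1 m while turning 90°
print(pose.world_to_device(0.0, 0.0))     # where the world origin now appears
```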

This is particularly true for experiences that remain in place when the user isn’t interacting with them and for placing digital objects that appear to be behind physical people and objects.

Another important factor in placing digital objects in both LiDAR- and ToF-based applications involves “anchors.” Anchors are digital points in the physical world to which digital objects are “attached.”

In world-scale applications like Pokémon Go, this is done through a separate process called “geotagging.” However, in mobile-based AR applications, the digital object is anchored to a point in a LiDAR point cloud or to one of the feature points on a depth map.
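
In code, anchoring can be sketched as a digital object storing an offset from a tracked point, so the object follows whenever tracking refines the anchor. The class names here are illustrative, not from ARCore or ARKit:

```python
# Hedged sketch of "anchoring": a virtual object stores its pose as an
# offset from an anchor point taken from a point cloud (or a depth-map
# feature point). If tracking refines the anchor, the object moves too.

class Anchor:
    def __init__(self, x, y, z):
        self.position = (x, y, z)  # a tracked point in the physical world

class VirtualObject:
    def __init__(self, anchor: Anchor, offset=(0.0, 0.0, 0.0)):
        self.anchor, self.offset = anchor, offset

    def world_position(self):
        ax, ay, az = self.anchor.position
        ox, oy, oz = self.offset
        return (ax + ox, ay + oy, az + oz)

table_corner = Anchor(0.5, 0.0, 1.2)        # picked from a point cloud
lamp = VirtualObject(table_corner, (0.0, 0.1, 0.0))
table_corner.position = (0.52, 0.0, 1.18)   # tracking refines the anchor
print(lamp.world_position())                # the lamp follows: (0.52, 0.1, 1.18)
```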

Is LiDAR Better than ToF?

Strictly speaking, LiDAR is faster and more accurate than Time of Flight. However, that difference only becomes significant in more technologically advanced applications.

For example, ToF and Google’s Depth API have difficulty understanding large, low-texture planes like white walls. This can make it difficult for applications using this method to accurately place digital objects on some surfaces in the physical world. Applications using LiDAR are less likely to have this problem.

However, applications involving larger or more texturally varied environments are unlikely to have this problem. Furthermore, most mobile-based consumer AR applications involve using an AR filter on the user’s face or body—an application that is unlikely to run into problems because of large untextured surfaces.

Why Do Apple and Google Use Different Depth Sensors?

In releasing its LiDAR-compatible devices, Apple said that it included the sensors, along with other hardware, for the sake of “opening up more pro workflows and supporting pro photo and video apps.” The release also called the LiDAR-compatible iPad Pro “the world’s best device for augmented reality” and touted Apple’s measurement apps.

Google hasn’t been as forthright about why its Depth API and the new line of supporting devices don’t use LiDAR. But in addition to keeping Android devices lighter and more affordable by working around LiDAR, there’s a major accessibility advantage as well.

Because Android works on mobile devices made by multiple companies, using LiDAR would favor LiDAR-compatible models at the expense of all others. Furthermore, because it only requires a standard camera, the Depth API is backward compatible with more devices.

In fact, Google’s Depth API is device-agnostic, meaning that developers using Google’s AR experience-building platform can develop experiences that work on Apple devices as well.

Have You Explored Depth-Sensing?

This article has primarily focused on LiDAR and ToF in mobile-based AR experiences. That is largely because these more complex experiences require the most explanation. It is also because these experiences are the most fun and the most promising.

However, depth-sensing approaches like these are the basis of many simpler and more practical experiences and tools you might use every day without giving it much thought. Hopefully, reading up on ToF and LiDAR will give you some more appreciation for these applications.
