LiDAR Technology on the New iPhone 12 Explained: Better Than ToF Sensors on Android Phones?

by Mayank

Back in March this year, Apple unveiled the 4th-generation iPad Pro, a big 12.9-inch tablet and the first Apple device to feature a LiDAR sensor. Fast forward to Tuesday, October 13th, when Apple finally unveiled its much-anticipated iPhone 12 lineup and confirmed the presence of a LiDAR sensor on the latest iPhones (Pro models only).

Now, LiDAR technology isn’t groundbreaking or anything, considering it has existed for a long time. In fact, it has been in use since 1961. NASA’s 1971 Apollo 15 mission even carried a laser altimeter (a LiDAR instrument) that astronauts used to map the surface of the Moon.

The last time you received a speeding ticket from the Highway Patrol, the officer was most likely using a LiDAR speed gun to determine whether you were speeding.

A cop holding a speed gun that uses LiDAR. Credits: Wikipedia

But why is Apple using it all of a sudden?

More on that later, but first you should know how LiDAR technology works.

Contents

  1. What is LiDAR?
  2. How does a LiDAR System Work?
  3. Components of a LiDAR System
  4. LiDAR System inside the new iPhone 12 Pro Models
  5. LiDAR on iPhones vs. ToF sensor on Android Phones

What is LiDAR?

LiDAR stands for Light Detection and Ranging. It is also sometimes expanded as Laser Imaging, Detection, and Ranging, which I think makes more sense. The principle behind LiDAR is very simple: shine light on a surface (various non-metallic objects, rocks, chemical compounds, clouds, etc.) and measure the time the light takes to return to its source.

This technology is similar to that of RADAR and SONAR. The best way to explain the similarity in a single line is this: RADAR is to radio waves and SONAR is to sound waves as LiDAR is to laser light.

How does a LiDAR System Work?

When you shine a torch at a surface, the light from the torch hits the surface and reflects back to you. The reflected light reaches the retina of your eye, which forms an image for your brain. LiDAR works in a similar way, except that a laser is used instead of ordinary light.

The LiDAR instrument fires rapid, short laser pulses at a surface. A sensor in the instrument then measures the time it takes for each pulse to return. The total time (from the moment a pulse leaves the laser emitter to the moment it hits the sensor) is recorded.

This total time is also called the Time of Flight (ToF). Since each pulse travels to the surface and back, the distance between the LiDAR instrument and the surface is given by:

Distance = (Time of Flight × Speed of Light) / 2, where the speed of light ≈ 3×10⁸ m/s
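
To make the arithmetic concrete, here is a minimal Python sketch of the calculation (the function name and the 20 ns pulse time are made up for the example):

```python
SPEED_OF_LIGHT = 3e8  # metres per second (approximate)

def distance_from_tof(time_of_flight_s: float) -> float:
    """Distance to the target given a round-trip Time of Flight.

    The pulse travels out to the surface and back, so the one-way
    distance is half the total path length.
    """
    return (time_of_flight_s * SPEED_OF_LIGHT) / 2

# Example: a pulse that returns after 20 nanoseconds
tof = 20e-9  # seconds
print(f"Distance: {distance_from_tof(tof):.2f} m")  # Distance: 3.00 m
```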

This process is repeated for a huge number of pulses, and the results build up a detailed map of the surrounding environment. This map can then be used for various purposes: 3D mapping of nearby moving vehicles in self-driving cars, AR (Augmented Reality) on devices such as the 4th-generation iPad Pro and the new iPhone 12 Pro and Pro Max, and military applications.

A 3D map of a room created using LiDAR
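
To give a rough idea of how per-pulse distances become a 3D map, here is a hedged Python sketch that converts measurements taken at known beam angles into Cartesian points. The angles and distances below are invented for the example; a real device samples far more densely.

```python
import math

def to_cartesian(azimuth_deg: float, elevation_deg: float, distance_m: float):
    """Convert one LiDAR measurement (beam direction plus measured
    distance) into an (x, y, z) point via spherical coordinates."""
    az, el = math.radians(azimuth_deg), math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.cos(az)
    y = distance_m * math.cos(el) * math.sin(az)
    z = distance_m * math.sin(el)
    return (x, y, z)

# Three made-up measurements: (azimuth°, elevation°, distance in metres)
measurements = [(0.0, 0.0, 2.0), (15.0, 5.0, 2.4), (-10.0, -3.0, 1.8)]
point_cloud = [to_cartesian(az, el, d) for az, el, d in measurements]
for point in point_cloud:
    print(tuple(round(c, 3) for c in point))
```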

Components of a LiDAR System

The complete LiDAR system consists of four main components:

  • Laser: Lasers at different wavelengths are used for different applications. Non-scientific applications generally use lasers in the 600–1000 nm range; at these wavelengths, the maximum laser power is restricted so the beam stays safe for the human eye. A popular alternative, the 1550 nm laser, is used primarily in military applications because, unlike shorter-wavelength lasers, it is not visible through night-vision goggles. 1550 nm light is also eye-safe because it is absorbed by the front of the eye before it can reach and damage the retina.

Shorter laser pulses achieve better target resolution, provided the receivers and detectors have ample bandwidth.

  • Illumination Mechanism: In general, two illumination techniques are used to fire laser pulses at the target:
  1. Phased Arrays: This method uses an array of microscopic antennas that together form a laser beam whose direction can be steered electronically, without moving the antennas. Because the antennas are fixed, LiDAR instruments that use phased arrays are often referred to as solid-state LiDAR. They are less expensive than their electromechanical counterparts, smaller, and more reliable. The new iPhone 12 Pro models and the 4th-gen iPad Pro use phased arrays.
  2. Micro-Electromechanical Systems (MEMS): This technique uses a rapidly spun mirror. The laser beam strikes the mirror, which reflects it toward the target, and the mirror can be redirected to any part of the target area. Usually two mirrors are used, with the second mirror sweeping up and down so the target area is scanned fully. Because this technique relies on moving mechanical parts, it is easily disrupted by vibration and shock; such systems need frequent recalibration, which makes them less reliable than phased arrays.
  • Scanner: The speed at which the LiDAR device produces an image depends on the speed at which the light pulses are scanned across the scene. Since light moves very fast (3×10⁸ m/s), the sensor must be fast enough to detect the reflected pulses accurately. Dual-axis scanners are used to sweep both azimuth (the horizontal angle) and elevation; laser printers also use dual-axis scanners. (A toy sketch of such a point-by-point scan appears after this list.)

NOTE: Not all LiDAR systems use scanners. The ToF sensors on Android phones are scanner-less systems, while the LiDAR system inside the new iPhone and iPad uses a scanner. In a later section, I will describe the merits and demerits of both systems.

The LiDAR Scanner on the new 12.9-inch iPad Pro
  • Sensor: LiDAR uses active sensors, which carry their own illumination source. Laser pulses from the source reach the object, and the reflected pulses are detected and timed by the sensor. The distance to the object is then determined using the formula above: Distance = (Time of Flight × Speed of Light) / 2.
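
To tie the scanner and sensor together, here is a toy Python sketch of a dual-axis, point-by-point scan: one single-point ToF reading is taken at each (azimuth, elevation) step, and the results fill a depth grid. The `measure_tof` function is a hypothetical stand-in I've invented for the example; a real system would read the hardware sensor there.

```python
SPEED_OF_LIGHT = 3e8  # m/s

def measure_tof(azimuth_deg: float, elevation_deg: float) -> float:
    """Hypothetical stand-in for a point-sensor reading: returns a fake
    round-trip Time of Flight (in seconds) for the given beam angles."""
    return 20e-9 + 1e-12 * abs(azimuth_deg) * abs(elevation_deg)

def scan_depth_grid(azimuths, elevations):
    """Raster-scan the scene one point at a time (a dual-axis scan) and
    return a 2D grid of distances, one entry per beam direction."""
    grid = []
    for el in elevations:                         # elevation axis
        row = []
        for az in azimuths:                       # azimuth axis
            tof = measure_tof(az, el)
            row.append(tof * SPEED_OF_LIGHT / 2)  # round trip -> one way
        grid.append(row)
    return grid

depth = scan_depth_grid(azimuths=range(-30, 31, 10), elevations=range(-10, 11, 10))
print(len(depth), "rows x", len(depth[0]), "columns")  # 3 rows x 7 columns
```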

LiDAR System inside the new iPhone 12 Pro Models

LiDAR technology has a wide range of applications. The iPhone 12 Pro, however, leverages it mainly to enhance the user’s AR experience. Using LiDAR, the powerful A14 Bionic chip inside the phone can map the surroundings faster and more effectively.

With better information about the position, elevation, and size of objects in a room, apps can integrate their digital artifacts with the physical objects more convincingly. Apple will also let third-party iOS apps use the map built by the LiDAR system to boost their AR experiences.

The iPad Pro’s LiDAR under operation.
The LiDAR system inside the new iPhone 12 Pro is similar in operation.

As you may recall, a range of flagship Android phones have been using a ToF depth-mapping sensor for the past few years. The application is the same as the LiDAR application on the iPhone: to improve the AR experience. But phones like the Samsung Galaxy S20 Ultra also use the ToF sensor to enhance the camera’s night mode and to help detect edges in bokeh mode.

We don’t know for sure yet whether Apple will use the LiDAR sensor for such applications.

LiDAR on iPhones vs. ToF sensor on Android Phones

The ToF (Time of Flight) sensors on some Android phones and the LiDAR sensor on the latest iPhone 12 Pro models are fundamentally similar. Both use lasers as an illumination source, and both use the same Time of Flight technique to measure distances and map the surroundings.

So, what’s different then?

The difference is in the way both of them capture 3D information.

As I described earlier, Android phones employ a scanner-less LiDAR device (also called Flash LiDAR), while the iPhone 12 Pro and the new iPad Pro use a scanner-type LiDAR.

Each of these systems has its own advantages and disadvantages. I’m not going to go into the specifics of the scanner-less LiDAR here because I covered it in another post.

Also Read: Time of Flight (ToF) Sensors on Smartphones Explained

Advantages of Scanner-less LiDAR (or Flash LiDAR)

  • Both scanner-type and Flash LiDAR systems use a ToF camera to collect information about the intensity of the reflected light. However, in a scanner-type LiDAR the ToF camera consists of just a point sensor, whereas in Flash LiDAR it consists of a complete 2D pixel array.
  • Since a Flash LiDAR (or scanner-less LiDAR) emits a single large laser flash instead of the small, timed laser pulses of a scanner-type LiDAR, the imagery is more accurate in the sense that the captured frames do not need to be stitched together.
  • Flash LiDAR also suffers from less motion distortion.
  • Flash LiDAR is cheaper to manufacture than scanner-type LiDAR because it does not need a scanner.

Advantages of Scanner-type LiDAR

  • When light is emitted from a source, it reflects off multiple surfaces. With a single flash, the sensor receives all of those reflections at the same time, which makes it difficult to calculate exact distances, so the data can be miscalculated. This miscalculation can be a problem for your bokeh-mode shots as well as for your AR experience.
  • A scanner-type LiDAR, like the one inside the 4th-generation iPad Pro and the iPhone 12 Pro, maps the complete area point by point, which minimizes this noise and makes the distance measurement more accurate (see the toy sketch after this list).
  • A scanner-type LiDAR can measure longer distances compared to a scanner-less LiDAR.
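
As a rough, assumed illustration of the multiple-reflection problem described above: when returns from two surfaces arrive in a single exposure, an estimator that blends them lands between the two true distances, whereas measuring one point at a time sees only the single return from that direction. Every number below is invented for the example.

```python
SPEED_OF_LIGHT = 3e8  # m/s

def distance(tof_s: float) -> float:
    return tof_s * SPEED_OF_LIGHT / 2

# Two surfaces at 2.0 m and 3.0 m: (round-trip time, return strength)
returns = [(2 * 2.0 / SPEED_OF_LIGHT, 0.7), (2 * 3.0 / SPEED_OF_LIGHT, 0.3)]

# Flash-style pixel: both returns land in one exposure, and a naive
# amplitude-weighted estimate blends them into a distance that matches
# neither surface.
blended_tof = sum(t * a for t, a in returns) / sum(a for _, a in returns)
print(f"Flash (blended): {distance(blended_tof):.2f} m")  # 2.30 m

# Point-by-point scan: each beam direction illuminates one surface,
# so each measurement is unambiguous.
for tof, _ in returns:
    print(f"Scanned point:   {distance(tof):.2f} m")      # 2.00 m, 3.00 m
```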
