
The next iPhone will get a ‘world facing’ 3D camera

Our source confirms Apple’s plans to put a 3D depth camera on the back of the iPhone. Here’s what it’ll do.

[Photo: Thai Nguyen/Unsplash]

By Mark Sullivan

At least one of this year’s iPhones will feature a 3D depth camera on its back, a source with knowledge of Apple’s plans confirms to Fast Company.

The camera—actually a laser, sensor, and software system—emits light to measure the distance between the phone and various objects and surfaces in front of it. This detailed depth information will enable new photo and video effects, as well as better augmented reality experiences.

iPhone engineers have been working on the rear-facing, or “world facing,” 3D camera for at least two years now. It’s been on a short list of possible feature additions for new iPhones, but until this year hasn’t made the cut. In truth, Apple could decide to nix it this year, too. For now, though, it’s in the design, which we’ll hopefully get to see for the first time this fall (if the coronavirus doesn’t get in the way of Apple’s plans).

Apple will buy the laser needed for the new 3D camera from San Jose-based Lumentum, the same company that currently supplies the laser for the iPhone’s front-facing 3D camera.

Lumentum says it doesn’t discuss the use of its laser tech in yet-to-be-announced devices.

Apple won’t be the first to add a rear-facing depth camera to its phones. Samsung’s Galaxy Note 10+, Galaxy S20+, and Galaxy S20 Ultra, as well as other Android phones, already have them. But Apple may find novel ways to leverage the technology for new user experiences. And it will likely be a bit more showy in the way it brands and markets those experiences, if history is a guide.

iPhones already feature depth cameras (TrueDepth) on their fronts. They’re used mainly for Face ID security and for some fun messaging effects such as Animoji.

The iPhone 11 Pro and iPhone 11 Pro Max, which came out last year, have three camera lenses on the back: a 12-megapixel wide-angle lens, a 12-megapixel 2X telephoto lens, and a 12-megapixel ultra-wide-angle lens. While the three lenses provide options for a breadth of photo-taking scenarios, the 3D camera system would add depth information.

Right now the main depth effect in the iPhone is Portrait mode, which gives photos the “bokeh” effect that blurs the background layer and places the foreground subject (a person, usually) in sharp focus. The addition of the depth camera data may create a better-looking bokeh effect by more accurately distinguishing between foreground and background layers, and perhaps adding more depth layers to blur or focus. It might become possible to adjust which layers of a photo are blurry and which are focused after the fact in editing mode.
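
To make that concrete, here’s a minimal sketch of how per-pixel depth could drive a bokeh-style effect, with the blur growing as a pixel gets farther from the chosen focus distance. The function names and the simple linear blur rule are illustrative assumptions, not Apple’s actual imaging pipeline.

```swift
// Illustrative sketch only: map a depth value (metres from the camera) to a
// blur radius. Pixels near the focus plane stay sharp; blur grows with the
// pixel's distance from that plane.
func blurRadius(forDepth depth: Float,
                focusDistance: Float,
                maxBlur: Float = 12.0) -> Float {
    let separation = abs(depth - focusDistance)
    return min(maxBlur, separation * 4.0)
}

// Build a per-pixel blur mask from a depth map (a 2D array of distances).
func blurMask(depthMap: [[Float]], focusDistance: Float) -> [[Float]] {
    depthMap.map { row in
        row.map { blurRadius(forDepth: $0, focusDistance: focusDistance) }
    }
}
```

Re-running the same calculation with a different focus distance is, in essence, what after-the-fact refocusing in an editing mode would amount to.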

The upside of depth

The 3D mapping might be used in concert with the iPhone’s other photo software features. Imagine a video where a skateboarder in the middle of a jump is disconnected from the background, rendered in full 3D and slow motion.

It’s likely that the depth camera will have the greatest effect on the quality of augmented reality apps. Apple released its ARKit framework for developing AR apps three years ago, but so far consumers haven’t exactly been clamoring for them.

“When you use AR apps without depth information, it’s a bit glitchy and not as powerful as it ultimately could be,” says Andre Wong, Lumentum’s VP of 3D Sensing. “Now that ARKit and (Google’s) ARCore have both been out for some time now, you’ll see new AR apps coming out that are more accurate in the way they place objects within a space.”


Apple is now developing an AR app for iOS 14 that will let users point their iPhone at items in Apple Stores and Starbucks, and see digital information displayed around the items on the phone screen, 9to5Mac’s Benjamin Mayo reports.

Wong believes that rear-facing 3D cameras could enable users to create digital content that lends itself well to social media sharing. For instance, people might like to share images of themselves interacting with holographic images of animals or celebrities in their own living rooms. This is similar to what the Holo app by the developer 8i already offers, but with the depth data those experiences might be greatly improved.

For example, a holographic image of a person standing on top of a real-world table might show the edges of the hologram’s feet in relation to the tabletop in a clean and convincing way. That’s tough to create without the depth data, when relying only on the phone’s existing cameras and software.
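
In rough terms, that “clean and convincing” edge comes down to a per-pixel occlusion test: compare how far away the virtual pixel is with how far away the sensor says the real surface is, and draw whichever is closer. The sketch below is an illustrative assumption, not Apple’s or ARKit’s actual renderer.

```swift
// Illustrative depth-based occlusion test: a virtual (hologram) pixel is drawn
// only if it is closer to the camera than the real surface the depth sensor
// measured at the same screen position. Names are hypothetical.
func shouldDrawVirtualPixel(virtualDepth: Float,
                            measuredSceneDepth: Float,
                            tolerance: Float = 0.01) -> Bool {
    // If the real-world surface (say, the tabletop) sits in front of the
    // hologram's foot, the real surface wins and the virtual pixel is hidden.
    return virtualDepth < measuredSceneDepth + tolerance
}

// A hologram foot 1.20 m from the camera, on a tabletop measured at 1.21 m:
let visible = shouldDrawVirtualPixel(virtualDepth: 1.20,
                                     measuredSceneDepth: 1.21) // true
```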

How the tech works

The core technology behind depth camera systems is a rear-facing vertical-cavity surface-emitting laser (VCSEL) that sends out light waves at a consistent rate and then measures the time each takes to bounce off objects in the environment and return to a sensor. Light that returns from objects near the phone has a shorter “time of flight,” while light returning from objects farther away has a greater time of flight. The light emitted by the iPhone’s front-facing depth camera travels only a few feet—which is fine for locating the person holding the phone—but the Lumentum laser used in the rear-facing system will have a much longer range.
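
The arithmetic behind time of flight is straightforward: distance is the speed of light multiplied by the round-trip time, divided by two. A back-of-the-envelope sketch (not Apple’s or Lumentum’s implementation) looks like this:

```swift
// Back-of-the-envelope time-of-flight arithmetic:
// distance = (speed of light × round-trip time) / 2
let speedOfLight: Double = 299_792_458.0 // metres per second

func distanceFromRoundTrip(seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// Light that comes back after roughly 6.7 nanoseconds has bounced off
// something about one metre away.
let roughlyOneMetre = distanceFromRoundTrip(seconds: 6.7e-9) // ≈ 1.0 m
```

The hard part isn’t the math; it’s that each metre of range corresponds to only a few nanoseconds of delay, which is why the job falls to a dedicated laser-and-sensor system rather than an ordinary camera.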

In Samsung phones, the 3D camera is the technology behind two features. Live Focus allows users to blur out the background in still images and video to put added emphasis on the person (or pet) in the foreground. Quick Measure lets you estimate the width, height, area, and volume of an object within the camera frame. Apple may take a similar approach and highlight two or three marquee features—such as the AR functionality reported by 9to5Mac—that are enabled by the rear-facing 3D camera.
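
To illustrate why depth data makes that kind of measurement possible: once two points in the frame have known distances from the camera, they can be placed in 3D space and the straight-line distance between them computed directly. The sketch below is a simplified assumption with made-up values; it is not Samsung’s or Apple’s implementation.

```swift
// Illustrative sketch: measuring between two points recovered from a depth map.
struct Point3D { var x: Float; var y: Float; var z: Float }

func distanceBetween(_ a: Point3D, _ b: Point3D) -> Float {
    let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

// Example: two corners of a tabletop, both about 1.2 m from the camera
// (coordinates assumed for illustration).
let leftCorner  = Point3D(x: -0.4, y: 0.0, z: 1.2)
let rightCorner = Point3D(x:  0.4, y: 0.0, z: 1.2)
let widthInMetres = distanceBetween(leftCorner, rightCorner) // 0.8 m
```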

A measure of skepticism is in order about the impact rear 3D sensing may have on this year’s iPhone, which will also be the first 5G iPhone. The photo effects will probably get more attention than the improved AR, at least in the near term. AR developers will indeed create better experiences with the new depth data. But even with better technology at their command, they still have to create experiences consumers will love and use often. And the huge limiting factor there is the awkward experience of viewing AR through a smartphone screen held in front of your face.

But that situation might not last much longer. It’s one of Silicon Valley’s worst-kept secrets that Apple is working on an AR headset or pair of glasses that—in the long run—could become the company’s main spatial computing device.



ABOUT THE AUTHOR

Mark Sullivan is a senior writer at Fast Company, covering emerging tech, AI, and tech policy. Before coming to Fast Company in January 2016, Sullivan wrote for VentureBeat, Light Reading, CNET, Wired, and PCWorld.

