From Samsung to Apple, device manufacturers are striving to optimize security and functionality with dual-sensor cameras.

3D sensing has long been used in smartphones for biometric scanning, gesture sensing, 3D modeling, and photo enhancement. And while inarguably cool, these systems are far from perfect, sometimes creating the very security gaps they were meant to close: a photograph of a person can sometimes be used to fool facial recognition programs.

As a new generation of smartphones works its way into the market, it promises the ability to do plenty more, such as 3D scans of rooms that can be used in virtual reality applications.

The Samsung Galaxy Note 8 is set to release on September 15, an attempt to win back customers after the exploding battery fiasco that overshadowed the last model.

 

The Samsung Galaxy Note 7. Image courtesy of Samsung

 

While the phone has some cool new features, the camera seems to be what’s garnering the lion’s share of the attention. This isn’t specific to the Samsung Galaxy Note 8—we’ve long been infatuated with smartphone cameras, especially as we move toward more dual camera setups that promise advances in security and capability.

 

How Dual Sensors Work

The Independent explains the various ways dual cameras can be implemented on smartphones. The Nokia 8 has two 13-megapixel sensors, one monochrome and one color; the monochrome sensor gathers light more quickly than its color counterpart. The two sensors work together to capture sharper images, while still giving the photographer the option to shoot with either one alone.
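One simple way to combine the two streams (a hypothetical sketch for illustration, not Nokia's actual pipeline) is to keep the color image's chrominance but rebuild it around the cleaner luminance captured by the monochrome sensor:

```python
import numpy as np

def fuse_mono_color(mono, color):
    """Replace the color image's luminance with the monochrome sensor's
    luminance, keeping the color image's per-channel chrominance."""
    # Luminance of the color image (simple average of R, G, B).
    luma = color.mean(axis=-1, keepdims=True)
    # Per-pixel chrominance: how each channel deviates from luminance.
    chroma = color / np.maximum(luma, 1e-6)
    # Rebuild the image around the monochrome luminance.
    return chroma * mono[..., None]

# Toy 1x2 image: a pure-red pixel and a gray pixel.
color = np.array([[[0.9, 0.0, 0.0], [0.5, 0.5, 0.5]]])
mono = np.array([[0.2, 0.4]])  # luminance as seen by the mono sensor
fused = fuse_mono_color(mono, color)
```

The fused image keeps the red/gray hues but its brightness now comes entirely from the monochrome channel.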

The Samsung Galaxy Note 8 camera uses a pair of 12-megapixel rear sensors, one wide-angle and one telephoto, to provide telephoto zoom. The iPhone 7 Plus operates the same way; in both devices, the second lens helps overcome the optical zoom limitations of a slim phone body. Shooting with both sensors creates a bokeh effect, in which the subject is in focus and the background is blurred.

 

An example of the bokeh effect. Image courtesy of Daniel Timm [CC-BY 2.0]

 

Some mirrorless cameras operate in a similar way, simulating an out-of-focus background by using a second camera or shot to gather information about where objects sit relative to the subject, then blurring the background with an algorithm (as opposed to optically, as a wide-aperture lens on a DSLR would).
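The algorithmic approach can be sketched in a few lines, assuming a grayscale image and a per-pixel depth map as NumPy arrays. The box blur and depth threshold here are illustrative simplifications, not any vendor's actual pipeline:

```python
import numpy as np

def box_blur(img, radius=2):
    """Naive box blur via a sliding-window average (edge-padded)."""
    padded = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    size = 2 * radius + 1
    for dy in range(size):
        for dx in range(size):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / size ** 2

def synthetic_bokeh(img, depth, subject_depth, tolerance=0.5):
    """Keep pixels near the subject's depth sharp; blur everything else."""
    blurred = box_blur(img)
    in_focus = np.abs(depth - subject_depth) <= tolerance
    return np.where(in_focus, img, blurred)

# Toy 4x4 image: the left half is "near" (depth 1), the right half
# is "far" (depth 5), so only the right half gets blurred.
img = np.arange(16, dtype=float).reshape(4, 4)
depth = np.where(np.arange(4) < 2, 1.0, 5.0) * np.ones((4, 1))
result = synthetic_bokeh(img, depth, subject_depth=1.0)
```

Pixels whose depth matches the subject pass through untouched, while everything else is replaced by its blurred counterpart, which is essentially what "portrait" modes do with a real depth map and a far more sophisticated blur.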

While improved zoom is cool, dual camera setups can also work to create 3D images by capturing depth information from the slight offset between the two lenses. And though the dual camera setup is relatively new, detachable lenses have offered similar capabilities since the iPhone 4.
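That depth information comes from disparity: a nearby object appears shifted between the two views, and the size of the shift reveals its distance. A toy block-matching sketch along a single scanline (illustrative only; real phone pipelines are far more sophisticated):

```python
import numpy as np

def disparity_1d(left, right, patch=3, max_disp=4):
    """For each pixel in the left scanline, find the horizontal shift
    that best matches a patch in the right scanline, using sum of
    absolute differences. Larger disparity means a closer object."""
    half = patch // 2
    disp = np.zeros(len(left), dtype=int)
    for x in range(half, len(left) - half):
        best, best_d = float("inf"), 0
        for d in range(min(max_disp, x - half) + 1):
            cost = np.abs(left[x - half:x + half + 1] -
                          right[x - d - half:x - d + half + 1]).sum()
            if cost < best:
                best, best_d = cost, d
        disp[x] = best_d
    return disp

# Toy scanlines: a bright "object" sits at index 6 in the left view and
# 2 pixels to the left in the right view, so its disparity is 2.
left = np.zeros(12); left[6] = 10.0
right = np.zeros(12); right[4] = 10.0
d = disparity_1d(left, right)
```

Running this over every scanline of a stereo pair yields a full disparity map, which is the raw material for both depth-based bokeh and 3D capture.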

 

Camera specs for the Samsung Galaxy Note 8. Image courtesy of Samsung.

 

What This Means for the Future of 3D Images

Sony, Samsung, and Apple are just a few of the industry leaders rumored to be working on devices with dedicated 3D sensors, which would expand the ways a smartphone can be used. Apple is rumored to be developing a rear-facing laser system that would enable facial recognition and augmented reality features, as well as 3D selfies and even the ability to put a 3D version of your head on a character in a video game.

Sony subsidiary SoftKinetic demoed a 3D sensor on a Sony Xperia smartphone using facial recognition software from a Swiss company. The sensor, located on the front-facing camera, isn’t as simple as the face unlock from previous versions of smartphones. Instead, it uses 3D facial recognition to provide a security system that is much less hackable than its predecessors.

Of course, users who’d rather cover the front-facing lens may find a future in which 3D, front-facing cameras are the norm a little unsettling from a privacy standpoint. Still, this technology offers an undoubtedly more secure way of protecting your device.

 
