
The Google Pixel 6 Pro’s camera

Credit: Google

Google surprised all of us this week when it revealed a few official Pixel 6 series details, ranging from its new in-house Tensor SoC to the design and more. But one of the most important tidbits is that the new phones finally deliver major camera upgrades.

The Pixel 6 Pro, in particular, is getting a triple rear camera, a first in the Pixel series’ history, albeit, in typical Google fashion, a couple of years later than rival Android brands. That means we’ve got a standard camera, an ultra-wide shooter, and a telephoto lens on the same phone, as opposed to previous flagship Google phones that made you choose between an ultra-wide and a telephoto camera.

There are more upgrades than just a triple-camera system though, and these could enable some really nifty additions and improvements to the Google Pixel camera experience. Here’s what we’re expecting from the Pixel 6 and Pixel 6 Pro cameras.

Related: The best camera phones you can get

The main sensor finally gets an upgrade

The Pixel 4’s dual camera module

Perhaps the most notable photography upgrade is that the Pixel 6 series will be getting a brand new main camera sensor. This is a major deal because Google has used the same 12MP Sony IMX363 sensor since the Pixel 3 days, with the Pixel 2’s IMX362 being very similar as well.

A new, larger sensor could potentially deliver several improvements over the old IMX363, such as improved dynamic range and better detail. The latter in particular was a complaint in our Pixel 5 review, as we said images of detail-rich scenes “look busy and overly contrasted.”

Google says the new camera sensor will capture 150% more light than the old one (via The Verge), which means we should expect improved photos at night. The Pixel camera app might not need to fall back on Night Sight mode as often as its predecessors did, as the increased light-gathering capability and faster processing should deliver brighter shots in the standard photo mode.
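For a sense of why more light per frame matters, night modes like Night Sight merge a burst of aligned frames, and averaging N frames cuts random noise by roughly the square root of N. Here’s a toy sketch of that principle with made-up numbers; the scene, noise level, and frame count are all illustrative, not Google’s actual pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dim scene: flat brightness of 10 (on a 0-255 scale).
true_scene = np.full((64, 64), 10.0)

def capture_frame(scene, noise_sigma=8.0):
    """Simulate one noisy low-light exposure (noise level is made up)."""
    return scene + rng.normal(0.0, noise_sigma, scene.shape)

# Night-mode-style merge: averaging 16 aligned frames cuts noise by ~sqrt(16) = 4x.
frames = np.stack([capture_frame(true_scene) for _ in range(16)])
merged = frames.mean(axis=0)

single_noise = np.std(capture_frame(true_scene) - true_scene)
merged_noise = np.std(merged - true_scene)
print(f"single frame noise ~{single_noise:.1f}, 16-frame merge ~{merged_noise:.1f}")
```

A sensor that captures 150% more light per frame shifts that tradeoff: fewer frames are needed for the same noise level, so a shot that once required Night Sight’s multi-second capture might be handled in the standard photo mode.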

We previously saw Google offer a Night Sight Portrait mode on the Pixel 5 too, so improved light capture could mean low-light portrait shots that don’t need Night Sight at all, as well as usable Night Sight Portraits in situations that would have been too dark for the Pixel 5.

A new main camera sensor means Google might not have to lean on Night Sight as often, for one.

We’d also expect better results for astrophotography images and astro time-lapses. Here’s hoping we don’t need to wait 15 seconds to get a single astrophotography frame and four minutes for the full shot. Improved light-capturing capabilities open the door for more ultra-low light features too, such as low-light video recording and night time-lapses.

There’s no word on the specific camera sensor yet, but sensors like the 50MP Samsung ISOCELL GN2 also offer Dual Pixel Pro autofocus tech and support more video quality options (e.g. 4K at 120fps and 8K recording). Google has generally been late to high-resolution video options compared to rivals, but we’re definitely expecting improved autofocus tech if it adopts a sensor like this.

Improved autofocus (and exposure) would also be in line with the philosophy espoused by Google camera engineer Isaac Reynolds around the time of the Pixel 4 series launch. That is, users should be able to simply open the camera app and get the correct shot without tapping on the viewfinder.

Great software zoom meets great hardware

The Super Res Zoom feature on the Google Pixel 3.

The Pixel 6 Pro is also getting a 4x telephoto camera — a major upgrade from the Pixel 4 series’ 2x telephoto lens. There’s no word on other details regarding the 4x camera, but this still has us excited about zoom on the new Pixels.

Google is right up there with Huawei and Samsung when it comes to great hybrid zoom technology thanks to its Super Res Zoom feature. In fact, our own Rob Triggs found that the Pixel 5’s software-only zoom offered good images up to 3x. But the Pixel 4 still showed us that pairing Super Res Zoom with a dedicated 2x telephoto camera could deliver even better results, with solid 4x and 5x images too.

Mega shootout: The best camera phones of 2021 so far tested

The combination of Super Res Zoom and a 4x telephoto camera means we can expect the Pixel 6 Pro to deliver great images well beyond 4x zoom too. Every phone has a threshold where zoom quality tips from good to bad, but the Pixel 6 Pro’s threshold should sit significantly higher than the Pixel 5’s and Pixel 4’s by virtue of the higher native zoom.

One potential challenge we’ve seen with phones packing a 4x or 5x telephoto camera (in addition to a main camera) is that zoom at short range tends to suffer. We’ve seen companies like Huawei, Oppo, and others rely on image fusion tech to combat this, combining results from the main and telephoto cameras to deliver solid 2x to 3x shots.

We’re guessing Google could do the same for shots below 4x, combining the main camera, telephoto camera, and Super Res Zoom. We have seen some phones suffer from reduced detail at the edges when using image fusion techniques like this though, so that might also be something Google needs to watch out for.
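To illustrate the idea, here’s a hypothetical zoom-dependent blend between an upscaled main-camera crop and the telephoto frame. The 2x cutoff, the linear ramp, and all function names are our own assumptions for illustration, not anything Google has described:

```python
import numpy as np

NATIVE_TELE_ZOOM = 4.0  # the Pixel 6 Pro's telephoto magnification

def fusion_weight(zoom: float) -> float:
    """Hypothetical blend weight for the telephoto frame.

    Below ~2x the telephoto's narrow field of view can't cover the frame,
    so the main camera dominates; by the native 4x zoom the telephoto
    takes over entirely. The 2x cutoff and linear ramp are assumptions.
    """
    return float(np.clip((zoom - 2.0) / (NATIVE_TELE_ZOOM - 2.0), 0.0, 1.0))

def fuse(main_upscaled: np.ndarray, tele_frame: np.ndarray, zoom: float) -> np.ndarray:
    """Blend two aligned frames; real pipelines fuse per-region, not globally."""
    w = fusion_weight(zoom)
    return (1.0 - w) * main_upscaled + w * tele_frame

# At 3x, halfway between the cutoff and the native zoom, the blend is 50/50.
main_crop = np.full((4, 4), 0.2)   # soft, upscaled main-camera detail
tele = np.full((4, 4), 0.8)        # sharper telephoto detail
blended = fuse(main_crop, tele, zoom=3.0)
print(blended[0, 0])  # 0.5
```

A global blend like this is the simplest case; the edge-detail problems mentioned above arise precisely because real implementations have to fuse per-region, where the two cameras’ parallax and sharpness differ.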

The renewed promise of machine learning silicon

Google’s Tensor processor

Credit: Google

The Pixel 6 series will also be the first Google phones powered entirely by an in-house chipset, dubbed the Tensor processor. We’re expecting Google to use Arm CPU and GPU tech, but the big news here is that the chipset has a TPU (tensor processing unit). This is a machine learning processor that promises to enable a host of mobile features that usually require an internet connection.

This isn’t the first time we’ve seen dedicated Google silicon in a Pixel phone though, as the company previously used the Pixel Visual Core and Pixel Neural Core. These chips did everything from speeding up HDR+ processing and voice inference to enabling 3D face unlock. But the TPU promises an even bigger upgrade, delivering more horsepower while also being a more integral part of the overall chipset than Google’s earlier chips.

See also: Why the Pixel 6’s Tensor chip is actually a big deal (and why it isn’t)

Google is using the TPU to improve camera capabilities too, with the firm showing several demos to The Verge. One such demo saw a blurry photo of a toddler, with Google running this image through the TPU to deblur the child’s face. The Pixel maker says that in addition to existing multi-frame processing with the main camera, it’s able to use the ultra-wide camera as part of the process, resulting in the deblurred face.

Another demo touted by Google for the TPU is improved HDR video recording. This demo pitted the Pixel 6 against the Pixel 5 and iPhone 12 Pro Max, shooting an HDR video of a beach (in 4K/30fps). The Verge noted that the Pixel 6 came out on top because it didn’t artificially brighten the shadows the way Apple’s device did, while also looking more natural than both rivals.
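As a rough illustration of what “artificially brightening the shadows” means, here’s a toy global tone curve where a larger lift parameter pulls dark pixels up more aggressively. The curve shape and the numbers are invented for illustration; real HDR pipelines use far more sophisticated local tone mapping:

```python
import numpy as np

def tone_map(linear, shadow_lift):
    """Toy gamma-style tone curve: larger shadow_lift brightens dark regions more."""
    return np.power(np.clip(linear, 0.0, 1.0), 1.0 / (1.0 + shadow_lift))

shadows = np.array([0.02, 0.05, 0.10])   # dark beach-scene pixels, linear 0-1 scale
conservative = tone_map(shadows, 0.5)    # gentler, more natural-looking lift
aggressive = tone_map(shadows, 2.0)      # heavily brightened shadows

for s, c, a in zip(shadows, conservative, aggressive):
    print(f"{s:.2f} -> conservative {c:.2f}, aggressive {a:.2f}")
```

The aggressive curve makes every dark pixel noticeably brighter than the conservative one, which is the kind of shadow handling The Verge flagged on Apple’s device.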

Google told the outlet that it was using the same HDR process for HDR video that’s used for its still images. We can’t help but feel that this opens the door for an improved burst mode, which has been missing from Pixel devices for a while now.

Google demonstrated features like ‘deblurring’ faces and improved HDR video, but more powerful machine learning could bring plenty more features.

Another possibility raised with faster AI silicon is that we could finally see object removal tech as touted by Google back in 2017. Back then, it showed an image taken behind a chain-link fence, with machine learning used to remove the fence to deliver a clear shot. We haven’t seen this tech since then, but the TPU could theoretically enable it as well as features like object erasing in general (as seen on Samsung phones) and reflection removal (as seen on Huawei devices).

Google has also previously used machine learning hardware to enable enthusiast-focused features like Dual Exposure Controls. This nifty feature debuted on the Pixel 4 line and allowed users to adjust shadow levels prior to taking a shot. So hopefully the TPU offers more convenient features in the viewfinder, giving users one less reason to visit an editing app.

Former Googler Marc Levoy admitted back in 2019 that the company had yet to solve the challenge of capturing both a detailed moon and a moonlit landscape in one shot. The former camera chief said the dynamic range between the bright moon and the dark landscape was simply too great, while urging people to “stay tuned” for developments. It stands to reason that Google either shelved this work or was simply waiting for better camera hardware and machine learning silicon. And guess what we’ve got with the Pixel 6 series?


There are still plenty of unknowns surrounding the Pixel 6 cameras, but there’s a lot to be excited about. Are you impressed by the Pixel 6 Pro camera hardware so far? Let us know in the comments.