The magic behind Google Pixel Camera

Pixel phones have always been known for their cameras. That led me to take a glance at their tech specs, and the specs were simply mediocre. A 12 MP rear camera that still outperforms phones with 48 MP sensors? Crazy.

Pixels have always been fast and smooth, with class-leading cameras, even though their specs don't suggest it, at least on paper. That speed and snappiness comes from heavy software optimization.

The cameras are optimized in the same way: a great deal of post-processing goes into making their photos shine.

In this article, let's take a look at why Pixel cameras are so good and how you can get the Pixel camera app on your Android phone.


Making things clear

Let's go back to 2014, when the all-new Samsung Galaxy K Zoom was hitting the shelves. That phone used hardware-level optical zoom, so you could get crazy amounts of zoom out of it.

The Galaxy K Zoom wasn't alone. There was also the historic Nokia Lumia 1020 with its 41 MP sensor, a phone that seemed impossible to beat in terms of photography.

But to be such photographic pioneers, sacrifices were made. The K Zoom skimped on basic smartphone essentials, and the Lumia 1020 had such a huge camera module that the phone wobbled a lot.

The main thing is that this kind of hardware was impractical to put in every phone at the time; it was too expensive to do across all budget tiers.

Now it's 2021, and if we compare those phones with the Samsung flagship, the S21 Ultra, we'll be wowed by how much we've improved. It zooms to similarly crazy levels, and in fact takes better-quality photos, without the old compromises. That shows how far software improvements have carried us.

We've also raised the bar for photo and video quality on every smartphone, including budget ones. Low-priced phones now ship with far better hardware and software than before.

To attain those results, Google relies heavily on computational photography. It is the process of using digital filters, like de-noising, and other processing techniques to build an image that looks outstanding.
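
To make that less abstract, here's a toy sketch in Python of one classic trick: merging a burst of frames to cut noise. To be clear, this is not Google's actual pipeline (HDR+ also aligns frames and works on raw sensor data); it only shows the core idea that averaging N aligned frames reduces random noise by roughly the square root of N.

import numpy as np

def merge_burst(frames):
    # Average a burst of already-aligned frames; random sensor noise
    # drops by roughly sqrt(N) for N frames.
    stack = np.stack([f.astype(np.float32) for f in frames])
    return np.clip(stack.mean(axis=0), 0, 255).astype(np.uint8)

# Demo: simulate 8 noisy shots of a flat grey patch (true value 128).
rng = np.random.default_rng(0)
truth = np.full((4, 4, 3), 128.0)
burst = [np.clip(truth + rng.normal(0, 20, truth.shape), 0, 255).astype(np.uint8)
         for _ in range(8)]
print(burst[0][0, 0], "->", merge_burst(burst)[0, 0])  # merged pixel sits much closer to 128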

Google uses pretty much the same camera hardware each year yet delivers results that are hard to believe, thanks to its software improvements.

On the Google Pixel product page, they mention software features like HDR+ and Night Sight, which make their photos DSLR-like, and that's why people love them.

Pixel phones rely heavily on software rather than on raw hardware and megapixels. Google knew this early on, while other companies, like Samsung, learned it the hard way.

Computational photography isn't used only by Google's Pixels; Apple's iPhones lean on it too. Their specs aren't flashy either, yet the phones manage to take amazing photos.

There are some caveats, though. You can't rely entirely on software and computational photography.

Marc Levoy, a distinguished engineer at Google, says: “The hardware is separate from computational photography. Yes, it will always matter. For instance, the aperture will control how much SuperZoom you can get. If the lenses used are not good, there will be aberrations with the software. Optics do matter.”

Both sets of phones are also software-optimized to feel snappy and capable. The takeaway is that hardware isn't everything: a company that combines optimized software with high-quality hardware would be near perfect.

That is easier said than done, though. Today's images carry more pixels and colour data, so they take more compute power to process; a 48 MP frame, for instance, holds four times the data of a 12 MP one.

That's why phone makers are often better off with lower-resolution sensors. However, reports suggest that the new Pixel 6 may have a 48MP sensor, paired with Google's own 'Whitechapel' chip, the Tensor. Since the chip is built for AI and machine-learning tasks, Google can optimize its software and performance around it. Therefore, it is not wrong to expect more performance from it.

A quick thing to note: Google doesn't rely only on pure software. They have engineered separate chips, like the Pixel Visual Core, to process those images.


The Principle of Computational Photography

Now, let's dive into computational photography a little, using HDR (High Dynamic Range) as an example. Don't worry, I won't bore you with theory.

Dynamic range is the ratio between the brightest and darkest areas of a scene. In a photo, those are obviously the near-white highlights and the near-black shadows.
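
To put a rough number on it: if the brightest highlight in a scene is 10,000 times brighter than the deepest shadow, the dynamic range is 10,000:1. Photographers usually count this in stops, where each stop doubles the light, so that's log2(10,000) ≈ 13.3 stops, more than a single smartphone exposure can typically capture at once.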

Using HDR, you can get those richly saturated, almost fantasy-like photos. But how does it work?

First of all, several photos are captured at various exposures: some with very high exposure, others with very low. Then, with the help of computational photography, they are all combined into a single photo with balanced exposure everywhere.

Doing this also expands the colours, adds an extra bit of saturation, and makes the shadows look deeper. The result is a more eye-pleasing photo.
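
Here's a minimal sketch of that merging step in Python, using OpenCV's built-in Mertens exposure fusion. Note that this is just one off-the-shelf merge algorithm, not Google's HDR+ pipeline, and the three file names are placeholders for your own bracketed shots.

import cv2
import numpy as np

# Placeholder file names for an under-, normally and over-exposed shot.
exposures = [cv2.imread(name) for name in ("under.jpg", "mid.jpg", "over.jpg")]

# Mertens fusion scores every pixel for contrast, saturation and
# well-exposedness, then blends the frames: highlights come from the
# darker shot, shadow detail from the brighter one.
fused = cv2.createMergeMertens().process(exposures)  # float32 image, values in [0, 1]

cv2.imwrite("hdr_result.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))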

Computational videography?

According to Levoy, "If you want to try and do these things responsively in software, then you need the compute power. The limitation on computational video is the computing power".

That means videography with this kind of processing isn't feasible yet: it demands more time and computing power than current hardware can comfortably provide.
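
A rough back-of-the-envelope calculation shows why: one 4K frame is 3840 × 2160 ≈ 8.3 million pixels, and at 30 frames per second that's about 250 million pixels to process every second, in real time. A single 12 MP still, by contrast, can be aligned and merged over a leisurely second or two after you tap the shutter.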

Google Camera

The Google Camera app is what we have to thank for the Pixel's amazing photos and videos. But is it available for everyone?

Unfortunately, not. Google's ultimate camera app is officially available only on Pixel devices. Even the Pixel Launcher is Pixel-only. The launcher is so good, so quick and snappy, that you might want to buy a Pixel just for it. With Pixel devices, you get the stock Android experience, and I love it.

However, I managed to get 90% of the stock Android experience on my own phone. If you're interested, check it out here.

But some generous members of the internet have ported the app and made it available for devices with Camera2API enabled. You can check that out in this XDA Developers article here.

There are a few ways to check whether it's enabled:
  1. Through Command Prompt/PowerShell or a terminal (Linux and macOS), over ADB
  2. Through a terminal emulator on the phone
  3. Through the Camera2API Probe app. Download here.

For the first 2 methods, run the following command:

getprop | grep HAL3

If Camera2API is enabled, you should see one or both of these lines with the value set to [1]:

[persist.camera.HAL3.enabled]: [1]
[persist.vendor.camera.HAL3.enabled]: [1]

Note that for the first method you have to enable ADB and type adb shell before running the command. For more details, refer to the link above.

My old phone did have it enabled, but luck wasn't on my side: I couldn't get the app installed, at least not the generic build.

Final Verdict

Now that you know how Pixels and iPhones can produce stunning photos without heavily specced hardware, try the Google Camera app yourself. I won't be responsible for any damage caused to your device in the process; viewer discretion is advised. Thank you.

Sources

Some of the text quoted here is from the Indian Express.
