Picture settings 16 MP and 64 MP: why is the 16 MP image better?
TechCreator
lvl.3
Flight distance : 187274 ft
Malaysia
Offline

Hi all experts,

I just tried capturing pictures with the DJI Pocket 2 (Auto mode) at 16 MP and 64 MP respectively. The scene and the shooting distance were identical.
The 16 MP image I got was better than the 64 MP one. I'm not able to share them on this forum because the files are too big.

I thought a higher resolution would give a better result. Does anyone have any ideas?

2020-11-30
djiuser_rNZW2vX7ZRVm
New

South Korea
Offline

I think so...
2020-11-30
CemAygun
Second Officer
Flight distance : 810 ft
Philippines
Offline

That is normal, and there are two main reasons for that:

First, the sensor is not a "full" 64 MP sensor. While there really are 64M effective pixels on it, the color resolution is only 16 MP, so it has 1/4 the color resolution of a true 64 MP sensor. When you take 16 MP photos this is no problem; but when you take 64 MP ones, the colors of the individual pixels are heavily interpolated.

Secondly, because the sensor is too small for this kind of resolution, the individual photosites (pixels) in native 64 MP mode are not big enough to capture much light. If a bunch of them are used together as a single pixel instead (4 of them binned and used like a much bigger photosite in the case of 16 MP), you end up getting better quality, less noise, etc.
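Roughly speaking, the binning is just a 2x2 average of neighbouring photosites. A minimal Python/numpy sketch of the idea (a toy illustration with made-up numbers, not DJI's actual pipeline):

```python
import numpy as np

def bin_2x2(raw):
    """Average each 2x2 block of photosites into one output pixel,
    e.g. a 64 MP readout becomes a 16 MP image."""
    h, w = raw.shape
    return raw.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Toy example: an 8x8 "sensor" readout becomes a 4x4 binned image.
sensor = np.random.poisson(lam=50, size=(8, 8)).astype(float)
print(bin_2x2(sensor).shape)  # (4, 4)
```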

This is not specific to the Pocket 2 though. All these so-called "Quad Bayer filter" sensors share the same properties.

Hope this helps...
2020-11-30
fansfe82067d
First Officer
Australia
Offline

So - in essence there's no point in using that mode?
2020-11-30
CemAygun
Second Officer
Flight distance : 810 ft
Philippines
Offline

fansfe82067d Posted at 11-30 16:02
So - in essence there's no point in using that mode?

I would not say so. Some people are quite happy with it, especially under good, bright light. So I guess it boils down to personal preferences.

And the 64 MP-to-16 MP binned nature of the sensor enables some functions like HDR video.
2020-11-30
TechCreator
lvl.3
Flight distance : 187274 ft
Malaysia
Offline

CemAygun Posted at 11-30 15:52
That is normal, and there are two main reasons for that:

First, the sensor is not a "full" 64 MP sensor. While there really are 64M effective pixels on it, the color resolution is only 16 MP, so it has 1/4 the color resolution of a true 64 MP sensor. When you take 16 MP photos this is no problem; but when you take 64 MP ones, the colors of the individual pixels are heavily interpolated.

Thanks, that was a good explanation. So if I'm using 64 MP, I suppose I need to make some adjustments, such as the illumination conditions, etc. I did observe that some shooting angles with 64 MP give a better image.
2020-12-1
John Walker
Second Officer
United Kingdom
Offline

Out of interest:
Reduce the 64 MP image to the same pixel dimensions as the 16 MP image and then compare them side by side. Only a guess, but I think the reduced 64 MP will look better.
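If anyone wants to try that, here is a quick sketch using Pillow (the filenames are only placeholders for your own 64 MP and 16 MP shots of the same scene):

```python
from PIL import Image

# Placeholder filenames - substitute your own shots of the same scene.
img64 = Image.open("pocket2_64mp.jpg")
img16 = Image.open("pocket2_16mp.jpg")

# Downscale the 64 MP shot to the 16 MP pixel dimensions with a
# high-quality filter, then compare the two files side by side.
img64.resize(img16.size, Image.LANCZOS).save("pocket2_64mp_downscaled.jpg", quality=95)
```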
2020-12-1
CemAygun
Second Officer
Flight distance : 810 ft
Philippines
Offline

TechCreator Posted at 12-1 04:53
Thanks, that was a good explanation. So if I'm using 64 MP, I suppose I need to make some adjustments, such as the illumination conditions, etc. I did observe that some shooting angles with 64 MP give a better image.

You are welcome. And yes, given how small the pixels are at 64 MP, a good amount of light is a must, I would say...
2020-12-1
CemAygun
Second Officer
Flight distance : 810 ft
Philippines
Offline

John Walker Posted at 12-1 05:11
Out of interest:
Reduce the 64 MP image to the same pixel dimensions as the 16 MP image and then compare them side by side. Only a guess, but I think the reduced 64 MP will look better.

That would be interesting to see. Most of these "Quad Bayer" devices brag about using sophisticated binning algorithms to achieve higher image quality (than a regular 16MP sensor, for instance), so it would be nice to have a comparison.
2020-12-1
TechCreator
lvl.3
Flight distance : 187274 ft
United States
Offline

CemAygun Posted at 12-1 15:38
That would be interesting to see. Most of these "Quad Bayer" devices brag about using sophisticated binning algorithms to achieve higher image quality (than a regular 16MP sensor, for instance), so it would be nice to have a comparison.

I googled the term Quad Bayer... interesting how the filter works and how the footage is captured... so many things to learn and understand... thanks.
2020-12-1
CemAygun
Second Officer
Flight distance : 810 ft
Philippines
Offline

TechCreator Posted at 12-1 20:11
I googled the term Quad Bayer... interesting how the filter works and how the footage is captured... so many things to learn and understand... thanks.

You are welcome again. The best article I have found so far is this one:

https://www.gsmarena.com/quad_ba ... ined-news-37459.php

I have no affiliation with the site or the author, by the way; I just think they did a great job summing everything up.
2020-12-1
TechCreator
lvl.3
Flight distance : 187274 ft
Malaysia
Offline

CemAygun Posted at 12-1 21:39
You are welcome again. The best article I have found so far is this one:

https://www.gsmarena.com/quad_ba ... ined-news-37459.php

Great sharing... Thanks, Aygun... I paste it below:


What’s a Bayer filter?

Let’s start from the beginning – a Bayer filter is a colorful mosaic of Red, Green and Blue filters that allows a digital sensor to capture color photos. Semiconductor pixels don’t “see” color, they only capture the amount of light that hits them, so without a filter you will get a Black & White photo. The Bayer filter makes sure that the light reaching each pixel is of one of the three primary colors.

The way it works out is that a 12MP sensor, for example, has 6 million pixels that see green, and 3 million pixels each for red and blue. Green gets more pixels because the human eye is the most sensitive to that color. An algorithm called demosaicing is used to interpolate a full 12MP resolution image.


A Quad Bayer filter is a bit of a misnomer as it’s actually the same as a regular Bayer filter. What really changes is not the filter but the sensor behind it – these new sensors put four pixels behind each color square instead of just one.

So, really these 48MP Quad Bayer sensors can’t offer much more detail than a 12MP sensor. Sensor and phone makers alike will tell you that smarter demosaicing algorithms can capture more detail, but our experience is that the gain is small – if there’s a gain to be had at all.
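To make the two layouts concrete, here is a small sketch of the repeating colour-filter patterns (a toy illustration, not any particular sensor's layout):

```python
import numpy as np

# Standard Bayer: each 2x2 tile is R G / G B - one photosite per colour cell.
bayer_tile = np.array([["R", "G"],
                       ["G", "B"]])
bayer = np.tile(bayer_tile, (4, 4))  # an 8x8 patch of the mosaic

# Quad Bayer: the same R G / G B layout, but every colour cell now covers
# a 2x2 block of photosites, so the repeating unit is 4x4.
quad_tile = np.repeat(np.repeat(bayer_tile, 2, axis=0), 2, axis=1)
quad_bayer = np.tile(quad_tile, (2, 2))  # an 8x8 patch of the mosaic

print(bayer[:4, :4])
print(quad_bayer[:4, :4])
# Both patches have the same colour share - 50% green, 25% red, 25% blue -
# which is the 6M/3M/3M split described above; in the Quad Bayer, though,
# samples of the other colours sit further away from any given photosite,
# which is why demosaicing the full-resolution image is harder.
```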

Where can you find Quad Bayer filters

The 48MP sensors are the most popular lately - they are already on several dozen phones. Yet it all started with Huawei's 40MP sensors on the P20 Pro, Mate 20 Pro and Mate 20 X. The Chinese maker even came up with a second version of its sensor for the Huawei P30 and P30 Pro, which switches to Red, Yellow, Yellow, Blue instead of RGGB, but the principles are the same.

There are also several phones with 32MP selfie cameras with a Quad Bayer filter (e.g. vivo V15 Pro and Samsung Galaxy A70). These have the same pixel size (0.8µm), but are physically smaller and so have a lower resolution.

Samsung recently announced a 64MP Quad Bayer sensor, which again keeps the same pixel size but changes the sensor’s dimension – it’s 33% larger than the current crop of 48MP sensors.

What a Quad Bayer sensor can do

As we said, the true strength of a Quad Bayer lies elsewhere – it can treat a group of four pixels sharing a color filter square as one or as separate sensors.

Right out the gate, these are some of the largest sensors ever put in a phone. E.g. the Sony IMX586 – the first 48MP sensor and one of the most popular – measures 8mm in diagonal. The IMX363 (used in the Pixel 3) and the Samsung S5K2L4 (used in the S10 phones) measure 7.06mm in diagonal. That’s about a 30% gain in surface area.
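That area figure follows from the diagonals, assuming the two sensors share the same aspect ratio, since area scales with the square of the diagonal:

```python
# Area scales with the square of the diagonal (same aspect ratio assumed).
gain = (8.0 / 7.06) ** 2 - 1
print(f"{gain:.0%} more surface area")  # ~28%, roughly the 30% quoted above
```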


The pixel sizes are vastly different, however: 0.8µm for the 48MP sensors and 1.4µm for the traditional sensors. All the marketing material about the 48MP sensors boasts that they can use pixel binning to work like they have 1.6µm pixels.

This creates a 12MP image. The “1.6µm” number is repeated often, but shouldn't get all the credit for improving the low light performance of the sensor. Noise is a random process and if the large pixel of a traditional sensor captures noise instead of signal, there’s little to be done (other than covering it up by interpolating data from neighboring pixels).

If one of the four pixels on a Quad Bayer sensor captures noise, however, that’s only 25% of the information lost – a 4x noise reduction that doesn’t diminish the sharpness of the image.
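The noise claim is easy to check numerically: averaging four independent noisy readings halves the noise standard deviation, i.e. cuts the noise variance by 4. A rough simulation with toy numbers (not real sensor data):

```python
import numpy as np

rng = np.random.default_rng(0)
signal = 100.0                                            # "true" light level per photosite
reads = signal + rng.normal(0, 10, size=(1_000_000, 4))   # four noisy 0.8 um photosites

binned = reads.mean(axis=1)                               # 4-to-1 binning
print(round(reads.std(), 1))    # ~10.0 noise per small photosite
print(round(binned.std(), 1))   # ~5.0  noise after binning (std halved, variance /4)
```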

Alternatively, the sensor can be split up into two logical sensors – one that captures a short exposure and one a long exposure. This is used in daylight for real-time HDR capture.
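Conceptually, half of the photosites in each colour square take a short exposure while the other half take a long one, and the two frames are merged. A very simplified sketch of such a merge (an illustration only, with made-up values - not a real ISP pipeline):

```python
import numpy as np

def merge_dual_exposure(short, long, exposure_ratio=4.0, clip=0.95):
    """Toy HDR merge: trust the long exposure in the shadows, and fall back to
    the (rescaled) short exposure wherever the long exposure has clipped."""
    return np.where(long < clip, long, short * exposure_ratio)

scene = np.linspace(0.0, 2.0, 1000)         # true scene brightness, 0..2
long_exp = np.clip(scene, 0.0, 1.0)         # long exposure clips the highlights
short_exp = np.clip(scene / 4.0, 0.0, 1.0)  # short exposure keeps the highlights
hdr = merge_dual_exposure(short_exp, long_exp)
print(hdr.max())                            # ~2.0: highlight detail is recovered
```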


You could do noise reduction and HDR with a single non-quad Bayer sensor by taking two (or more) photos one after another and combining them. That’s what the Pixel phones do and they are quite good at it.

But there’s a problem – moving objects change position between sequential exposures. A Quad Bayer filter takes two photos at the same time, so there’s no need to use AI to correct for artifacts caused by moving objects. Here’s what it looks like when the correction fails.

HDR done with sequential exposures • HDR shot with a Quad Bayer sensor

The simplest way to think about a Quad Bayer filter is that it allows the camera software to capture two photos at the same time. This enables the image processing (HDR and night mode) that is the real reason modern smartphones capture great quality images – the hardware isn’t all that different.

And what it can’t do

Unlike an LCD, which has R, G and B sub-pixels for each pixel, a classic image sensor has only one sub-pixel per pixel. But they can get away with claiming the resolution that they do because these pixels are close together and demosaicing mostly does a solid job of reconstructing the original image (though pixel peepers will know it’s not perfect).

Samsung Galaxy A80: 12MP • 48MP

In a Quad Bayer filter, the pixels of different colors are further apart, so demosaicing is less effective (despite what makers claim). So, you're definitely not getting 4x the detail in 48MP mode compared to 12MP. In fact, since HDR and other image processing modes are disabled at 48MP, the 12MP photos sometimes come out with better detail (and a much smaller file size, win-win).

Running the demosaicing algorithm on the raw 48MP data may result in a sharper image, but it changes from phone to phone and from scene to scene. If detail levels are critical to a particular shot we've found that the best strategy is to shoot in both modes and then pick whichever comes out better. Most of the time, however, you’re better off sticking to 12MP mode.

Oppo Reno 10x zoom: 12MP • 48MP

Not to mention that reading out the full 48MP image was also beyond the capabilities of some early sensors and chipsets, so they just got the 12MP image and upscaled it – this is just a waste of storage.

One of the most frequently mentioned advantages of Quad Bayer sensors is superior zoom. While the Nokia 808 PureView did have an impressive zoom, its enormous 41MP sensor had a classic Bayer filter. As discussed, Quad Bayer can offer only a limited gain in sharpness (if that), so it's really no different from doing digital zoom on a 12MP sensor.

Asus Zenfone 6: 12MP • 48MP

Marketing departments really want you to believe you’re getting a Hasselblad-like image sensor, but the reality is that the Quad Bayer filter is just a clever (and effective) way of getting better-quality 12MP shots.

Huawei 20 Pro: 12MP • 48MP


Then there's also the matter of optics. We won't get into the details, but such high-resolution cameras are often diffraction limited – meaning that the smallest spot of light can't be focused onto an area smaller than the pixel. You can read about Airy disks for more detail. Long story short, there is a limit imposed by physics on the maximum resolution that small optics and sensors can have.
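For a rough sense of the numbers, here is the standard Airy-disk estimate, assuming green light (550 nm) and an f/1.8 aperture, which is typical for these small cameras (the exact optics vary from device to device):

```python
# Airy disk diameter ~ 2.44 * wavelength * f-number
wavelength_um = 0.55   # green light, 550 nm
f_number = 1.8         # assumed aperture, typical for small phone-class cameras
airy_um = 2.44 * wavelength_um * f_number
print(round(airy_um, 2))  # ~2.42 um - roughly three 0.8 um photosites wide
```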
2020-12-2