
Reducing noise in post-processing for high-megapixel cameras
Hi,

I was wondering if it was possible to reduce noise by reducing the resolution of the picture in post-processing from RAW files. For example, if I had a 4000x3000 picture in RAW format, could I make it a 2000x1500 picture with the signal-to-noise ratio being 4x stronger?

Stated otherwise, could I make a 12Mp camera have the noise figure it would have had with the same sensor size if it had been a 3Mp camera, through RAW file post-processing (at the cost of reduced resolution)?

Regards.

Comments (33)

Beemer76 wrote:

Hi,

I was wondering if it was possible to reduce noise by reducing the resolution of the picture in post-processing from RAW files. For example, if I had a 4000x3000 picture in RAW format, could I make it a 2000x1500 picture with the signal-to-noise ratio being 4x stronger?

The noise would be less visible, but it would not decrease.

Stated otherwise, could I make a 12Mp camera have the noise figure it would have had with the same sensor size if it had been a 3Mp camera, through RAW file post-processing (at the cost of reduced resolution)?

Ah, if only! By reducing the resolution, you are keeping the signal-to-noise ratio the same, as you are dividing the numerator and the denominator by the same amount. Fewer photosites/pixels is a good thing because it often means larger photosites, which have a better S/N ratio.


Comment #1

Beemer76 wrote:

Hi,

I was wondering if it was possible to reduce noise by reducing the resolution of the picture in post-processing from RAW files. For example, if I had a 4000x3000 picture in RAW format, could I make it a 2000x1500 picture with the signal-to-noise ratio being 4x stronger?

No. Theoretically, the S/N ratio will improve, as you are averaging several pixels together (this assumes the same noise is not in all these pixels, i.e., the noise is random). But the improvement is much less than 400%! Perhaps 10%?

Stated otherwise, could I make a 12Mp camera have the noise figure it would have had with the same sensor size if it had been a 3Mp camera, through RAW file post-processing (at the cost of reduced resolution)?

You can approach this, but all the tests I have seen and done show that it doesn't quite reduce the noise that much.

Try a good NR program?

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #2

You already have answers. Just thought it would make sense to add that noise is much less visible on smaller prints. The same applies to screen images.

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #3

Chuxter wrote:

You can approach this, but all the tests I have seen and done show that it doesn't quite reduce the noise that much.

True. One reason being that the fill factor of a 3Mp sensor is usually better than the fill factor of a 12Mp sensor.

Comment #4

AmanitaM wrote:

The noise would be less visible, but it would not decrease.

Noise less visible effectively means less noise.

Ah, if only! By reducing the resolution, you are keeping the signal-to-noise ratio the same, as you are dividing the numerator and the denominator by the same amount. Fewer photosites/pixels is a good thing because it often means larger photosites, which have a better S/N ratio.

The signal-to-noise ratio is improved by downsampling. Averaging 4 pixels together is similar to a photosite that is four times larger.

Noise is a random + or - value added to the signal. 4 pixels together is 4x signal +- (noise of pixel 1) +- (noise of pixel 2) +- (noise of pixel 3) +- (noise of pixel 4). So you end up with 4 times the signal with the noise having a 50% smaller standard deviation. So yes, you have a higher signal-to-noise ratio at the cost of resolution.
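To make the arithmetic concrete, here is a minimal simulation of Clint's point (a sketch assuming independent Gaussian noise on every pixel, which real sensors only approximate):

    import numpy as np

    rng = np.random.default_rng(0)

    signal = 100.0   # true scene value, arbitrary units
    sigma = 10.0     # per-pixel noise standard deviation
    img = signal + rng.normal(0.0, sigma, size=(2000, 2000))

    # 2x2 box average: each output pixel is the mean of 4 input pixels,
    # i.e. a 4000x3000 -> 2000x1500 style downsample
    binned = img.reshape(1000, 2, 1000, 2).mean(axis=(1, 3))

    print(round(img.std(), 2))     # ~10.0 -> SNR = 100/10 = 10
    print(round(binned.std(), 2))  # ~5.0  -> SNR = 100/5  = 20

Averaging N independent samples divides the noise standard deviation by sqrt(N), so a 4:1 pixel reduction buys a 2x SNR improvement, not the 4x the OP hoped for.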

Comment #5

Clint Sanders wrote:

Noise is a random + or - value added to the signal. 4 pixels together is 4x signal +- (noise of pixel 1) +- (noise of pixel 2) +- (noise of pixel 3) +- (noise of pixel 4). So you end up with 4 times the signal with the noise having a 50% smaller standard deviation. So yes, you have a higher signal-to-noise ratio at the cost of resolution.

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference. 4x signal will contain pixels of R-G-B-G on a Bayer pattern. Thus the 2x Green may sometimes average out, but not the Red and the Blue? So does that mean 25% less standard deviation for a reduction to 25% of resolution?

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #6

Chris Elliott wrote:

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference. 4x signal will contain pixels of R-G-B-G on a Bayer pattern. Thus the 2x Green may sometimes average out, but not the Red and the Blue? So does that mean 25% less standard deviation for a reduction to 25% of resolution?

Good point. The Bayer pattern will probably change that a bit.

*grabs calculator*

Let's see. 2569 + 55439.

*turns calculator upside down*

Ha ha. It says BOOBS.

Comment #7

Let me help out.

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

Chris Elliott wrote:

Clint Sanders wrote:

Noise is a random + or - value added to the signal. 4 pixels together is 4x signal +- (noise of pixel 1) +- (noise of pixel 2) +- (noise of pixel 3) +- (noise of pixel 4). So you end up with 4 times the signal with the noise having a 50% smaller standard deviation. So yes, you have a higher signal-to-noise ratio at the cost of resolution.

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference. 4x signal will contain pixels of R-G-B-G on a Bayer pattern. Thus the 2x Green may sometimes average out, but not the Red and the Blue? So does that mean 25% less standard deviation for a reduction to 25% of resolution?

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #8

Clint Sanders wrote:

... Noise is a random + or - value added to the signal...

I was wondering about that.

There are different kinds of noise. To me, one of the most irritating and most common sorts is when the photosite registers a photon falsely, whether because of electronic interference, heat, or some other cause. This would mean that this kind of noise is mostly +, not -.

Of course, there is also the situation where a sensor doesn't register a photon even though it should, but if I look at noisy images, the most distracting things to me are the intense coloured dots, not the dots that are slightly less bright than the rest.

Therefore a very basic approach to reducing noise by reducing the resolution might be to just take the lower of two values, on the basis that + noise is more distracting than - noise.

Of course, the next step would be to evaluate not only in the horizontal or vertical direction, but in both dimensions, and add an evaluation function to use the middle of the four values.

If you further sophisticate this approach, you will end up with current demosaicing and denoising algorithms.

(The main difference from just averaging the values would be that you take the median and not the mean value, and thereby hopefully discard the noisy pixels completely instead of just reducing their influence - see the sketch below.)

Bye,
Philip
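To make Philip's mean-versus-median point concrete, here is a toy sketch (assuming an otherwise flat signal hit by occasional bright impulse noise; the 5% rate and +80 spike are made-up numbers for illustration):

    import numpy as np

    rng = np.random.default_rng(1)

    # Flat signal of 100 with ~5% of samples falsely bright ("+" noise)
    noisy = np.full(4000, 100.0)
    noisy[rng.random(4000) < 0.05] += 80.0

    blocks = noisy.reshape(1000, 4)         # 4:1 downsample
    mean_ds = blocks.mean(axis=1)           # averaging keeps 1/4 of each spike
    median_ds = np.median(blocks, axis=1)   # median usually rejects the spike

    print(abs(mean_ds - 100).mean())    # ~4: spikes still leak through
    print(abs(median_ds - 100).mean())  # well under 1: most spikes discarded

The median only fails when two or more of the four samples in a block are hot, which is rare at low impulse rates; the mean is biased by every single hot sample.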

Comment #9

Chuxter wrote:

Let me help out.

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

That is not correct. Your 10 Mpixel camera has 10 million(ish) photosites.

The terms are almost interchangeable, but I suppose strictly "photosite" describes the receptor and "pixel" describes the output from those sensors; the numbers are the same. There are approximately 2.5 million each of red- and blue-filtered photosites and 5 million green on a 10 Mpixel camera.

To be pedantic, there are pixels/photosites around the edge of a sensor that are used for reference purposes and not to capture the photo image. These usually occupy less than 1 Mpixel.

So really your 10 Mpixel camera is only a 3.3 Mpixel colour camera.

The Foveon sensor is different: it reads all 3 colours from the same photosite.

I suggest you go and do some reading, Charlie.

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #10

Charlie,

You may care to read this for starters:

http://www.photosa.co.za/tips/digital_b.php

Not the most definitive source, but it is the first I came across.

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #11

Chuxter wrote:

Let me help out.

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

Actually, you are the one that is confused. On a Bayer sensor, while they use an RGBG array, all photosites become a pixel (except for the outer edge ones, as noted by Chris). It just happens that each pixel color is interpolated from the surrounding pixels. You might want to read up on things before you post.

Comment #12

Chris Elliott wrote:

Charlie,

You may care to read this for starters:

http://www.photosa.co.za/tips/digital_b.php

Gosh, that was a really bad reference! And dated too...

Not the most definitive source, but it is the first I came across.

Yep.

After reading your previous post, I think our difference is semantic. Everything you said I agree with! Except the parts about me.

I must not have expressed myself well?

I was commenting on an earlier post, on Tuesday:

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference.

By definition, a "pixel" is the final combination of the RGBG photosites. A pixel can assume ANY color and luminance (within the color space selected).

Since this is the Beginners Forum, we should teach more than argue. Here is a simplified diagram that explains graphically how de-Bayering is done:

[diagram]

Note that I ignored luminance. I'm not sure if anyone actually uses a simple algorithm like this. It might be useful for computationally challenged machines? Note also that the center of a matrix with an even number of cells is not the center of any of these cells. This is NOT a problem!

A better, but still simple algorithm:

[diagram]

It should be clear why some extra cells must be provided OUTSIDE the stated sensor field... it is to let the de-Bayering algorithm start working so that it produces good results at the edges.

We don't seem to know what de-Bayering algorithm is used by a specific camera. Often, the manufacturer hides this detail. Some computer picture-editing programs tell us about the optional de-Bayering choices they offer. A clue can be inferred from the width of the border around the stated sensor field. Some manufacturers have a rather wide border, implying that their de-Bayering algorithm extends that far.
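Since the diagrams do not survive the re-post, here is a toy bilinear de-Bayering sketch in Python (the RGGB layout and the averaging kernels are the standard textbook baseline, not any particular camera's algorithm):

    import numpy as np
    from scipy.ndimage import convolve

    def bilinear_demosaic(raw):
        """Fill in the two missing colors at every photosite by averaging
        the nearest photosites that did sample that color.
        Assumes an RGGB mosaic: R at (even, even), B at (odd, odd)."""
        raw = np.asarray(raw, dtype=float)
        r = np.zeros_like(raw)
        g = np.zeros_like(raw)
        b = np.zeros_like(raw)
        r[0::2, 0::2] = raw[0::2, 0::2]   # red photosites
        g[0::2, 1::2] = raw[0::2, 1::2]   # green photosites (two per 2x2 block)
        g[1::2, 0::2] = raw[1::2, 0::2]
        b[1::2, 1::2] = raw[1::2, 1::2]   # blue photosites

        # Normalized neighbor-averaging kernels for the sparse channels:
        # at a sampled site they return the original value, elsewhere the
        # mean of the nearest samples of that color.
        k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
        k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
        return np.dstack((convolve(r, k_rb), convolve(g, k_g), convolve(b, k_rb)))

Note that the convolution has to reach past the frame edge (SciPy mirrors the border here), which is exactly the reason for the extra photosites outside the stated sensor field mentioned above.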

The semantic issue we have (I think) is whether the "pixel" is the 2x2/3x3/4x4 matrix of values used to "estimate" the center color, or the virtual 1x1 representation of that color externally. It can be argued either way...

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #13

Chuxter wrote:

Chris Elliott wrote:

Charlie, you may care to read this for starters: http://www.photosa.co.za/tips/digital_b.php

Gosh, that was a really bad reference! And dated too...

It was the first I came across. Could not be bothered to look further, your comment was so plainly wrong.

Not the most definitive source, but it is the first I came across.

Yep. After reading your previous post, I think our difference is semantic. Everything you said I agree with! Except the parts about me.

I must not have expressed myself well? I was commenting on an earlier post, on Tuesday:

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference.

By definition, a "pixel" is the final combination of the RGBG photosites. A pixel can assume ANY color and luminance (within the color space selected).

The fundamental point is that a pixel is the output from one photosite, not a combination of four of them. If it were otherwise, then my D80 would be a 2.5 Mpixel camera and the specification of literally every digital camera (except for those with a Foveon sensor) would be a reckless misrepresentation. If by "the final combination" you mean the 2x2 matrix that is RGBG, then you are talking nonsense.

Since this is the Beginners Forum, we should teach more than argue. Here is a simplified diagram that explains graphically how de-Bayering is done:

[diagram]

Beginners should expect to be educated, not baffled.

Note that I ignored luminance. I'm not sure if anyone actually uses a simple algorithm like this. It might be useful for computationally challenged machines? Note also that the center of a matrix with an even number of cells is not the center of any of these cells. This is NOT a problem!

And this is helpful to the beginner?

A better, but still simple algorithm:

[diagram]

It should be clear why some extra cells must be provided OUTSIDE the stated sensor field... it is to let the de-Bayering algorithm start working so that it produces good results at the edges.

And this is helpful to the beginner?

We don't seem to know what de-Bayering algorithm is used by a specific camera. Often, the manufacturer hides this detail. Some computer picture-editing programs tell us about the optional de-Bayering choices they offer. A clue can be inferred from the width of the border around the stated sensor field. Some manufacturers have a rather wide border, implying that their de-Bayering algorithm extends that far.

And this is helpful to the beginner?

The semantic issue we have (I think) is whether the "pixel" is the 2x2/3x3/4x4 matrix of values used to "estimate" the center color, or the virtual 1x1 representation of that color externally. It can be argued either way...

Bunkum! A pixel is the output of a single photosite that can, according to the filter applied to it, be used to produce a particular colour. You are just plain wrong!! We all are some times! Have a nice day!

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #14

Chris Elliott wrote:

Clint Sanders wrote:

Noise is a random + or - value added to the signal. 4 pixels together is 4x signal +- (noise of pixel 1) +- (noise of pixel 2) +- (noise of pixel 3) +- (noise of pixel 4). So you end up with 4 times the signal with the noise having a 50% smaller standard deviation. So yes, you have a higher signal-to-noise ratio at the cost of resolution.

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference. 4x signal will contain pixels of R-G-B-G on a Bayer pattern. Thus the 2x Green may sometimes average out, but not the Red and the Blue? So does that mean 25% less standard deviation for a reduction to 25% of resolution?

I think the demosaicing issue is probably a red herring. The signal-to-noise ratio is a property of the capture, which happens before the RAW data is subject to any processing.

Regarding the actual numbers, it's beyond my limited knowledge to calculate this from first principles, but here is a brilliant resource:

http://www.microscopyu.com/...ls/java/digitalimaging/signaltonoise/index.html

In particular, use the Java applet near the beginning to compare SNR results when you toggle the pixel binning value between 1 and 4.

If you examine photon noise only, simply by moving the Read Noise slider down to the minimum (unfortunately it doesn't go to zero, but the trend is obvious), you'll find that 4x pixel binning results in 2x SNR.
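For reference, the same result in symbols. This is a standard shot-noise model, assuming the noise in the N binned photosites is uncorrelated, with S the signal in electrons per photosite and r the read noise:

    \mathrm{SNR}_1 = \frac{S}{\sqrt{S + r^2}},
    \qquad
    \mathrm{SNR}_N = \frac{N S}{\sqrt{N S + N r^2}} = \sqrt{N}\,\mathrm{SNR}_1

With photon noise only (r = 0) this reduces to SNR_1 = sqrt(S), and binning N = 4 photosites doubles the SNR, matching the applet.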

Comment #15

Steve Balcombe wrote:

I think the demosaicing issue is probably a red herring.

I tend to agree, now that I think about it. While each photosite has an SNR, and these are combined with interpolation to form a pixel, the pixel will end up having a certain SNR whose value is determined by how it was interpolated.

Once you determine its SNR, further calculations don't really care where the SNR came from.

Now that we have the SNR for a pixel, the standard deviation of the noise of each pixel can be reduced by 50% by combining 4 of them.

Comment #16

Well, I said this was not my strength. I shall have to go and read that article slowly several times!!!!

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #17

Chris Elliott wrote:

Chuxter wrote:

Chris Elliott wrote:

Charlie, you may care to read this for starters: http://www.photosa.co.za/tips/digital_b.php

Gosh, that was a really bad reference! And dated too...

It was the first I came across. Could not be bothered to look further, your comment was so plainly wrong.

The point in offering link references is to strengthen your position, as without them, your opinion is just your opinion. My comment was that your reference didn't comment on the issues you and I are discussing. It's like you Googled the subject, but didn't "bother" to actually read the site. Did you know that they gave a recipe for chili?

After reading your previous post, I think our difference is semantic. Everything you said I agree with! Except the parts about me.

I must not have expressed myself well? I was commenting on an earlier post, on Tuesday:

This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters, so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference.

By definition, a "pixel" is the final combination of the RGBG photosites. A pixel can assume ANY color and luminance (within the color space selected).

The fundamental point is that a pixel is the output from one photosite, not a combination of four of them.

That would only be true of a monochrome sensor.

If it were otherwise, then my D80 would be a 2.5 Mpixel camera and the specification of literally every digital camera (except for those with a Foveon sensor) would be a reckless misrepresentation.

It IS otherwise. Depending on the particular "demosaicing" algorithm used, each output "pixel" is influenced by MANY sensor photosites.

If by "the final combination" you mean the 2x2 matrix that is RGBG, then you are talking nonsense.

I can assure you that I'm not "talking nonsense".

Since this is the Beginners Forum, we should teach more than argue. Here is a simplified diagram that explains graphically how de-Bayering is done:

[diagram]

Beginners should expect to be educated, not baffled.

I agree. It's obvious that YOU are baffled. Perhaps that is because you don't consider yourself a "beginner"? You came here to educate others and you are having difficulty switching roles. You need to be educated before you try to educate others!

Note that I ignored luminance. I'm not sure if anyone actually uses a simple algorithm like this. It might be useful for computationally challenged machines? Note also that the center of a matrix with an even number of cells is not the center of any of these cells. This is NOT a problem!

And this is helpful to the beginner?

I hope and think so.

A better, but still simple algorithm:

[diagram]

It should be clear why some extra cells must be provided OUTSIDE the stated sensor field... it is to let the de-Bayering algorithm start working so that it produces good results at the edges.

And this is helpful to the beginner?

Perhaps a beginner can comment?

We don't seem to know what de-Bayering algorithm is used by a specific camera. Often, the manufacturer hides this detail. Some computer picture-editing programs tell us about the optional de-Bayering choices they offer. A clue can be inferred from the width of the border around the stated sensor field. Some manufacturers have a rather wide border, implying that their de-Bayering algorithm extends that far.

And this is helpful to the beginner?

This is an important concept. Until users understand what Dr. Bayer invented, it can be tough to cope with other concepts... like understanding how the Foveon sensor and the recent Nikon sensor patent are different.

The semantic issue we have (I think) is whether the "pixel" is the 2x2/3x3/4x4 matrix of values used to "estimate" the center color, or the virtual 1x1 representation of that color externally. It can be argued either way...

Bunkum! A pixel is the output of a single photosite that can, according to the filter applied to it, be used to produce a particular colour. You are just plain wrong!! We all are some times!

That's just your misguided opinion.

Have a nice day!

In spite of your hostility and sarcasm, my day is going well.

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #18

Chris Elliott wrote:

Well, I said this was not my strength.

Agreed.

I shall have to go and read that article slowly several times!!!!

And then go look at my nice diagrams that show simple Bayer demosaicing. You really need to understand how this is done...

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #19

Chuxter wrote:

Chris Elliott wrote:

The fundamental point is that a pixel is the output from one photosite, not a combination of four of them.

That would only be true of a monochrome sensor.

This is complete rubbish.

Comment #20

Steve, you and Chris have different ideas about what a "pixel" is than I do. Saying that what I say and believe is "rubbish" without trying to explain WHY is cruel and pointless. Rather than call each other names, why don't we calmly try to:

1. Understand what each other are thinking.
2. Explain what we are thinking.

My best guess is that you think that a "pixel" on a Bayer-sensor camera is one of the RGB photosites. Can you confirm this?

Steve Balcombe wrote:

Chuxter wrote:

Chris Elliott wrote:

The fundamental point is that a pixel is the output from one photosite, not a combination of four of them.

That would only be true of a monochrome sensor.

This is complete rubbish.

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #21

Chuxter wrote:

Steve, you and Chris have different ideas about what a "pixel" is than I do. Saying that what I say and believe is "rubbish" without trying to explain WHY is cruel and pointless.

A mild understatement. Steve and I and the rest of the world have a different idea of what a pixel is from you.

Rather than call each other names, why don't we calmly try to: 1. Understand what each other are thinking. 2. Explain what we are thinking.

I have not called you names. I have rubbished your thinking, which is wholly wrong. You can believe what you like, but please do not expect to pass it on to others on the Beginners Forum without criticism from me and others.

I would have thought my thinking was crystal clear from what I have said already. You are the one that is saying there is only a semantic difference. I do not accept that.

My best guess is that you think that a "pixel" on a Bayer-sensor camera is one of the RGB photosites. Can you confirm this?

Your error can be summed up in the following three-way exchange:

The fundamental point is that a pixel is the output from one photosite, not a combination of four of them.

That would only be true of a monochrome sensor.

This is complete rubbish.

To quote you from an earlier post of yours in this thread:

By definition, a "pixel" is the final combination of the RGBG photosites. A pixel can assume ANY color and luminance (within the color space selected).

The first sentence is complete rubbish. I have no quarrel with the second.

To quote your first erroneous post:

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

Again, this is rubbish. Each photosite produces a pixel.

A pixel is the "dot" - a single point of light and colour - of which all digital photographic images are made. The original, unaltered source of each of those pixels is a single photosite (it is best to leave scanned images out of the equation). When discussing the reduction of noise by reducing resolution - the subject of this thread - one need go no further.

Each pixel gets modified along the way by the demosaicing process, camera settings, etc. If you reduce resolution (or increase it by interpolation), the original pixel will get lost or at least is likely to be modified.

Thus a pixel is not the same as "one of the RGB photosites". (Your language is very loose.) It has a much broader definition. But each photosite produces a pixel, and you do not need four of them for one pixel. Just one.

Is that clear enough so there is no wriggle room for you?

Chris Elliott

*Nikon* D Eighty + Fifty - Other equipment in Profile

http://PlacidoD.Zenfolio.com/

Comment #22

Thanks for responding again, Chris - I didn't really want to get involved in the detail as you are covering it more than adequately!

Just one small, possibly rather pedantic point:

Chris Elliott wrote:

Each pixel gets modified along the way by the demosaicing process, camera settings etc etc. [snip]

Thus a pixel is not the same as "one of the RGB photosites". (Your language is very loose.) It has a much broader definition. But each photosite produces a pixel...

There's more than one way to skin the cat that is Bayer-sensor RAW data. Some (most, I believe) algorithms place the processed pixels in the same positions as the originals, using a number of surrounding pixels (not necessarily the immediate eight) to interpolate RGB values. It would be true to say "each photosite produces a pixel". But (and this is the pedantic bit) in theory you can place the calculated pixels 'out of phase' with the originals, so that they fall in between. Here the 1:1 relationship is less obvious, because each processed pixel doesn't correspond directly to a specific raw pixel.

Here is a somewhat technical but very good paper on the subject, which I found a year or two ago when I was looking for material to learn from:

http://graphics.cs.msu.ru/en/publications/text/prog2004lk.pdf

My maths is not good enough to understand it 100%, but I find I can extract most of the meaning by accepting the mathematics and concentrating on the accompanying description.

The normally excellent Cambridge in Colour tutorials by Sean McHugh include one on sensors, which gives an introduction to the concept of Bayer demosaicing. Unfortunately, in this case Sean has dumbed it down to the point where it is too simplified to be of great value - but what it does show is how, when the 'out of phase' technique is used, there is still one processed pixel for each raw one.

http://www.cambridgeincolour.com/tutorials/sensors.htm

... and you do not need four of them for one pixel. Just one.

I wouldn't have phrased it that way - because Bayer demosaicing requires reference to numerous pixels, of course. But there is a 1:1 correspondence between raw and processed pixels.

Comment #23

Steve, your two links were good and appreciated. They clearly show what is going on. I made a comment or two below...

Steve Balcombe wrote:

Thanks for responding again, Chris - I didn't really want to get involved in the detail as you are covering it more than adequately!

Just one small, possibly rather pedantic point:

Chris Elliott wrote:

Each pixel gets modified along the way by the demosaicing process, camera settings etc etc. [snip]

Thus a pixel is not the same as "one of the RGB photosites". (Your language is very loose.) It has a much broader definition. But each photosite produces a pixel...

There's more than one way to skin the cat that is Bayer-sensor RAW data. Some (most, I believe) algorithms place the processed pixels in the same positions as the originals, using a number of surrounding pixels (not necessarily the immediate eight) to interpolate RGB values. It would be true to say "each photosite produces a pixel". But (and this is the pedantic bit) in theory you can place the calculated pixels 'out of phase' with the originals, so that they fall in between. Here the 1:1 relationship is less obvious, because each processed pixel doesn't correspond directly to a specific raw pixel.

That is why I wanted to isolate the "input pixel" and the "output pixel", as they are quite different.

... and you do not need four of them for one pixel. Just one.

I wouldn't have phrased it that way - because Bayer demosaicing requires reference to numerous pixels, of course. But there is a 1:1 correspondence between raw and processed pixels.

I too think Chris's statement is not only confusing, but wrong. As both your links show, some demosaicing algorithms consider the values of MANY photosites in order to create an "output pixel". I believe that the operative word here is "create"... the output pixel is CREATED by processing MANY input pixels. That I choose to call these "input pixels" "photosites" doesn't seem a big leap to me.

See 2nd part, to follow...

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #24

Looking for somebody to agree with me, I went to the Vincent Bockaert pages here on DPR. I removed the diagrams, BTW.

Each "pixel" on a digital camera sensor contains a light-sensitive photodiode which measures the brightness of light. Because photodiodes are monochrome devices, they are unable to tell the difference between different wavelengths of light. Therefore, a "mosaic" pattern of color filters, a color filter array (CFA), is positioned on top of the sensor to filter out the red, green, and blue components of light falling onto it.

The GRGB Bayer Pattern shown in this diagram is the most common CFA used. Mosaic sensors with a GRGB CFA capture only 25% of the red and blue and just 50% of the green components of light.

As you can see, the combined image isn't quite what we'd expect, but it is sufficient to distinguish the colors of the individual items in the scene. If you squint your eyes or stand away from your monitor, your eyes will combine the individual red, green, and blue intensities to produce a (dim) color image.

Note that Vincent calls the "input pixels" "photodiodes".

Continuing with Vincent's quote:

The missing pixels in each color layer are estimated based on the values of the neighboring pixels and other color channels via the demosaicing algorithms in the camera. Combining these complete (but partially estimated) layers will lead to a surprisingly accurate combined image with three color values for each pixel.

This is a pretty good explanation, especially if you understand what he's talking about before you read it. I dislike his reference to "layers", as there is only one layer in a Bayer sensor (Foveon sensors have 3 layers).

I believe that what Vincent calls missing pixels are not really missing. What is missing is complete RGB data for each pixel! I'm saying that at least 4 photosites (and in most high-IQ cameras, more than 4 photosites) are used to estimate the RGB values for ALL photosites, even though each photosite can, at best, only detect 1/3 of the RGB data. In doing this, the algorithm "creates" RGB pixel data.

My biggest semantic point is that it's CONFUSING to call both the photosites on the sensor (which only detect one color) and the RGB picture elements in a computer file "pixels". At least 3 photosites must be sampled to have RGB data. I don't think it makes sense to refer to a "Red Pixel"! A pixel (as most beginners think of them) has information about both luminance and color. And it's equally confusing to call a red photosite a red pixel and then later (after interpolation of the green and blue values) call the composite RGB value at that red photosite an RGB pixel.

I know that to most people, the details of how a Bayer sensor works are bizarre. But they need explanations that non-technical users can understand. Googling demosaic algorithms only finds stuff about bilinear, bicubic, spline, etc. Reading a bicubic equation doesn't help much!

There is scant information about what demosaicing algorithm a particular camera uses. Yes, I know that when saving as RAW, the demosaicing is not performed in the camera, but later when the RAW data is converted. But a JPEG file requires a composite RGB value for each photosite, and that means the camera must demosaic the Bayer data. But HOW? I've never seen a spec that says that some camera uses a bicubic spline demosaic algorithm. Some reviewers DO comment when a camera creates poor-quality JPEG images, but they never seem to connect it to the way that camera does demosaicing. Yes, I know that it's difficult to connect the dots when the camera manufacturers hide them for competitive reasons.

If you guys object to my usage of photosite and pixel, I apologize. I was trying to make it simple for the Beginners.

I worked for 15 years in the OCR industry. While my part was mostly the "datalift"... the illumination, imaging, detection, and analog-to-digital processing of the data... I was aware of the digital processing that was going on downstream. Our images were monochrome, but we did several matrix/mosaic operations, mostly to find edges and normalize the background levels.

From that industry, I tend to call the sensor elements "photosites", although technically they are photodiodes or phototransistors. When there is processing between the detector and the final image, I reserve the term "pixel" until later in the stream, to avoid confusion.

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #25

Chris Elliott wrote:

Chuxter wrote:

Steve, you and Chris have different ideas about what a "pixel" is than I do. Saying that what I say and believe is "rubbish" without trying to explain WHY is cruel and pointless.

A mild understatement. Steve and I and the rest of the world have a different idea of what a pixel is from you.

Rather than call each other names, why don't we calmly try to: 1. Understand what each other are thinking. 2. Explain what we are thinking.

I have not called you names. I have rubbished your thinking, which is wholly wrong.

I agree that it is wrong to "rubbish my thinking".

You can believe what you like, but please do not expect to pass it on to others on the Beginners Forum without criticism from me and others.

OK. Now that I have been criticised for the way I think, can we discuss this?

I would have thought my thinking was crystal clear from what I have said already. You are the one that is saying there is only a semantic difference. I do not accept that.

My best guess is that you think that a "pixel" on a Bayer-sensor camera is one of the RGB photosites. Can you confirm this?

Your error can be summed up in the following three-way exchange:

The fundamental point is that a pixel is the output from one photosite, not a combination of four of them.

That would only be true of a monochrome sensor.

This is complete rubbish.

But by your comment, I and others don't understand what you believe is factual. You are being too emotional.

To quote you from an earlier post of yours in this thread:

By definition, a "pixel" is the final combination of the RGBG photosites. A pixel can assume ANY color and luminance (within the color space selected).

The first sentence is complete rubbish.

I disagree. You may choose to think it's rubbish, but it's not. It agrees well with what Vincent Bockaert, Kubasov and Lukin, and Sean McHugh think and say. Go look at the diagrams on Sean's site:

http://www.cambridgeincolour.com/tutorials/sensors.htm

And note how similar his are to mine! Sean avoids the "pixel" vs "photosite" issue by calling the photosites "cavities".

I have no quarrel with the second.

To quote your first erroneous post:

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

Again, this is rubbish. Each photosite produces a pixel.

No, each photosite produces PART of the data needed to create a "pixel".

A pixel is the "dot" - a single point of light and colour - of which all digital photographic images are made. The original, unaltered source of each of those pixels is a single photosite (it is best to leave scanned images out of the equation). When discussing the reduction of noise by reducing resolution - the subject of this thread - one need go no further.

Yes, a "pixel" is one "picture element"... a dot. In a color picture, this "dot" is a multi-color dot. Get up close to a TV screen... those red, blue, and green "dots" are NOT "pixels"! It takes 3 of them to be a "pixel".

Each pixel gets modified along the way by the demosaicing process, camera settings, etc. If you reduce resolution (or increase it by interpolation), the original pixel will get lost or at least is likely to be modified.

Thus a pixel is not the same as "one of the RGB photosites". (Your language is very loose.) It has a much broader definition. But each photosite produces a pixel, and you do not need four of them for one pixel. Just one.

Wrong again. In case you are thinking that I'm saying that the data from a sensor with (for example) 10,000,000 total photosites (5,000,000 green, 2,500,000 red, and 2,500,000 blue) is processed to give 2,500,000 total RGB pixels, that's not true. I know and am saying that the data from a sensor with 10,000,000 photosites can be processed to produce 10,000,000 pixels. I am further saying that the data from one photosite does not resemble the final pixel data associated with that photosite. They are DIFFERENT!!! I choose to make that difference clear by assigning different names to them.

Is that clear enough so there is no wriggle room for you?

Yep... I'm not wiggling.

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #26

Statross wrote:

http://www.steves-digicams.com/techcorner/aug_2007.html

Yes, that's a good reference too. Chris, you should read this! In case you don't, and for others' use, I'll quote a couple of sentences from Mike Chaney's referenced article:

"Wikipedia defines a picture element, or pixel, as 'the smallest complete sample of an image'. Others use similar terminology such as 'the smallest discrete component of an image'. The key here is that we are talking about the smallest element in an image: that is, the final picture or photograph."

"A pixel, then, must have all three (red, green, and blue) components to be a complete sample of the final image."

"While a '10 megapixel' claim is accurate with respect to how many pixels are in the final (developed) image, somewhere along the way, the megapixel moniker has gotten confused with 'camera resolution'. A typical camera claimed to be a 10 megapixel digital camera may produce 10 megapixel images, but by definition, the camera itself (the sensor) does not contain 10 million pixels. Far from it in fact! This '10 megapixel digital camera' actually contains no pixels whatsoever on its sensor. Instead, the sensor is a conglomerate of 5 million green photosites, 2.5 million red photosites, and 2.5 million blue photosites."

Did you notice the sentence which reads, "This '10 megapixel digital camera' actually contains no pixels whatsoever on its sensor"?

Mr. Chaney understands: Bayer sensors have photosites, and their images, after demosaicing, have pixels!

Please study this technology some more and resist commenting about things that you don't understand on the Beginners Forum. Or if that is too much to ask, then at least slow down, read what others have to say, and comprehend it before you start to argue.

Thanks again to statross for the link to Steve's...

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #27

Chuxter wrote:

Statross wrote:

http://www.steves-digicams.com/techcorner/aug_2007.html

Yes, that's a good reference too.

No it's not, it's the standard nonsense trotted out to try and justify the Foveon sensor.

Comment #28

Steve Balcombe wrote:

Chuxter wrote:

Statross wrote:

http://www.steves-digicams.com/techcorner/aug_2007.html

Yes, that's a good reference too.

No it's not, it's the standard nonsense trotted out to try and justify the Foveon sensor.

Steve, I realize that Mike Chaney's article (this month) is about how the Foveon sensor is different from a Bayer sensor. I'm not picking sides in this argument.

He DOES seem to present the differences well and takes a reasonable position. I didn't notice him "selling" the Foveon idea over the Bayer idea. His article is mostly about the industry and their bastardizing of language and ideas.

I don't have a Sigma camera and don't plan to buy one. But I have a friend with one. It does take good pix.

I was just saying that his remarks in the 2nd and 3rd paragraphs that I quoted were relevant to our discussion. If you reject everything Mr. Chaney says because he dares to talk about Foveon sensors, that says a lot! He has been publishing monthly articles since July 2004. I've read some other articles he wrote; he is a pretty good writer and seems informed about photographic technology. One of his articles (Jan 2006) tries to "sell" one of his products, Qimage. It uses a unique interpolation algorithm called "pyramid resampling".

He DID quote Wikipedia, which is respected but occasionally can be slanted. If you object to the definition of "pixel" there, go change it!

I googled "pixel definition" and Google compiled a list of 28 definitions found on the Internet. I looked through all of them and none even mentioned a "sensor". A typical definition is the first one: "The information stored for a single grid point in the image." Universally, these definitions referred to a pixel as part of an IMAGE. While you may object to that usage, you and Chris are in the minority.

I googled "photosite definition" and Google returned 3 definitions. The first one was relevant: "A single photosensitive element in a CCD which translates to one pixel in the resultant image." It was a bit narrow, however, as other types of sensors (CMOS, for example) have photosites too.

Instead of jabbing at me, why not do the work to show us evidence that even a minority think and talk the way you guys do... and that it is more correct than the way the rest of us think and talk?

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/

Comment #29

Talking to you is a waste of time. I tried once before, a month or so ago. I'm done with it.

Comment #30

Correct me if I'm wrong... but the past page and a half or so has been arguing whether a photodiode is a pixel?

If so: http://dictionary.reference.com/browse/pixel

And in any case, one photodiode cannot possibly be a pixel, as a pixel is a picture element (as recorded and displayed, i.e. in JPEG form after the data is compiled).

In the same vein, nor are Foveon sensors' photodiodes pixels, because a pixel needn't be red, green, and blue and nothing else.

I could have 10,000,000 light receptors with 10,000,000 slightly differing filters on each, and all that data could be used to form just 1 pixel in a final image.

Pixel = picture element.

Photodiode = magic light-catching bucket.

On the other hand, I may be missing the point; it was a lot to wade through :P

Comment #31

Statross wrote:

Correct me if I'm wrong... but the past page and a half or so has been arguing whether a photodiode is a pixel?

[snip]

Pixel = picture element.

Photodiode = magic light-catching bucket.

The issue began simply as whether there is a 1:1 relationship between the magic light-catching buckets and the picture elements. I (and I think I can speak for Chris Elliott here too) really couldn't care less what people call them in the privacy of their own homes, although chuxter seems to have created his own issue out of that.

There is, of course, a 1:1 relationship. This is true of all conventional Bayer sensors, and it is equally true of Foveon sensors. The only exception I can think of is Fuji's SuperCCD technology.

Chuxter, way back, claimed there was a 4:1 relationship. Here are his exact words:

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

Comment #32

Steve Balcombe wrote:

Statross wrote:

Correct me if I'm wrong... but the past page and a half or so has been arguing whether a photodiode is a pixel?

[snip]

Pixel = picture element.

Photodiode = magic light-catching bucket.

The issue began simply as whether there is a 1:1 relationship between the magic light-catching buckets and the picture elements. I (and I think I can speak for Chris Elliott here too) really couldn't care less what people call them in the privacy of their own homes, although chuxter seems to have created his own issue out of that.

There is, of course, a 1:1 relationship. This is true of all conventional Bayer sensors, and it is equally true of Foveon sensors. The only exception I can think of is Fuji's SuperCCD technology.

Chuxter, way back, claimed there was a 4:1 relationship. Here are his exact words:

Chris is confusing a "photosite" with a "pixel". A photosite is the smallest photosensitive element and, in Bayer sensors, has a colored filter on top. 4 of these photosites (in a Bayer sensor) comprise a pixel.

That last sentence was clearly bad. In hindsight, I wish I had used another word in place of "comprise". If you read my subsequent posts, I think it's clear what I meant.

Let's go back to the beginning of this. Chris, responding to an idea from Clint, said:

"This is not my strength but I am not sure I agree. Adjoining pixels have different colour filters so surely averaging 4 of them will not average out the plus and minus noise because of that colour difference..."

I was commenting on his "...Adjoining pixels have different colour filters...", which is obviously wrong. I was simply trying to convey that those things that have color filters are "photosites" and that many of them are combined to produce a "pixel". I used 4 because he had suggested it. However, I think Chris thought I meant that the number of output pixels was 1/4 of the total number of photosites. It went down from there...

Now it's gotten to the point where pride is in the way of judgement. Not talking about me, of course!

And in the process, we have demonstrated to the OP the value of this new forum, where anybody can answer their questions.

Charlie Davis
Nikon 5700 & Sony R1
HomePage: http://www.1derful.info
Bridge Blog: http://www.here-ugo.com/BridgeBlog/
