A few weeks ago I wrote an article explaining why the angle of view is more helpful than the focal length when comparing lenses of different camera systems. I also pointed out why there is the need for a reference and why the 35mm format (36x24mm, also called “full frame”) is the one we tend to use. If you haven’t read it, I invite you to check it out!
This time I want to talk about another topic that often causes confusion and triggers fierce discussions when comparing camera systems: aperture equivalence.
I admit that it took me some time to write this post. At first, equivalence might appear easy to explain. However if we start to scratch beneath the surface, we discover that there are many more technical aspects to take into account.
The reason I was inspired to write this article is that I often encounter online comparisons such as “an Olympus 300mm f/4 ‘in reality is’ a 600mm f/8” or “a Fuji 56mm f/1.2 ‘in reality is’ an 85mm f/1.8”.
Well let’s talk about this “reality.”
When it comes to choosing a lens, photographers usually check two specifications first: the focal length and the fastest aperture. I am pretty sure that many of you know what the aperture is and what it does but let’s just summarise for the sake of this post. The aperture is basically a hole inside the lens whose diameter can be narrowed or enlarged (thanks to the diaphragm iris mechanism) depending on the f-stop value you choose. A fast aperture (for example f/1.4) will produce a larger hole while a slower aperture (f/11) will have a smaller hole.
This aperture affects two things: the amount of light that hits the sensor when you take the picture and the depth of field of your image. Take note of the first point (the amount of light) and save it somewhere because I want to talk about depth of field first. Why? Because this is usually what matters to people when it comes to comparing camera systems.
Aperture and Depth of Field
Depth of field refers to the distance between the nearest and farthest points from the focal plane that appear in focus. In other words, it is the area that appears sharp (in focus) in the photograph. A fast aperture like f/1.4 will produce a shallow depth of field so that distance will be shorter and your image will have more out-of-focus areas. A smaller aperture like f/11 will produce a deeper depth of field so the distance will be longer and more areas will be in focus.
Depth of field matters because it can play an important role in the look/style you want to achieve. It’s not just a technical aspect but also an artistic/creative element.
Photographers are interested in knowing how sharp a lens is wide open, how beautiful the bokeh (quality of the out-of-focus area) can be and how capable the lens is at isolating a subject. They are also interested in the overall look and shallow depth of field a fast lens can deliver. Therefore it has become common practice to compare depth of field between lenses designed for different sensor sizes to get a clearer picture of how your image will look when taken with a smaller sensor. The easiest way to do this is to compare equivalent f-numbers.
The question is: is it correct to compare lens apertures between camera systems by considering just the depth of field? Let’s look into this a little more.
Let’s get out of the way what has already been explained many times. We have three lenses with different focal lengths but almost identical angles of view on their respective formats:
- 50mm f/1.4 on 36×24
- 35mm f/1.4 on APS-C
- 25mm f/1.4 on m4/3
First let’s look at the aperture. In the introduction, I referred to it as a hole inside of our lens. To be more precise, aperture refers to three distinctive terms:
- The Physical aperture a.k.a. the iris mechanism inside the lens
- The Virtual aperture (also called entrance pupil): the optical image of the physical aperture seen through the front element of the lens
- The Relative aperture (f-ratio or f-stop): the focal length divided by the diameter of the virtual aperture.
The virtual aperture is the one that interests us here because it is the measurement we can compare across lenses designed for different formats. We can calculate its diameter with the following formula: virtual aperture = focal length / f-stop. So for the three lenses mentioned above, it works this way:
- 36×24 format: 50/1.4 = 35.7mm
- APS-C format: 35/1.4 = 25mm
- m4/3 format: 25/1.4 = 17.9mm
The 50mm lens has a larger aperture for the same f-stop number and that makes sense. The virtual aperture depends on the focal length, and the focal length for the 36×24 format is longer than the other two lenses because it has to cover the same angle of view on a larger sensor. What if I do the math a second time to match the same virtual aperture of the m4/3 lens?
- 36×24 format: 50/17.9= 2.8
- APS-C format: 35/17.9 = 2.0
- m4/3 format: 25/17.9 = 1.4
This second calculation tells us that if you take a picture at the same distance with the 25mm at f/1.4 on the m4/3 camera, that aperture corresponds to f/2 on the 35mm and f/2.8 on the 50mm as far as depth of field is concerned. This last part of this sentence is very important so I will repeat it again and make it bold: as far as depth of field is concerned.
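If you prefer to see the arithmetic in one place, here is a minimal Python sketch of the two calculations above. It only reuses the example focal lengths and f-stop from this article; rounding explains any small differences you might see elsewhere.

```python
# A minimal sketch of the two calculations above:
# 1) virtual aperture (entrance pupil) diameter = focal length / f-stop
# 2) the f-stop each lens would need to match the m4/3 lens's ~17.9mm pupil

lenses = [
    ("36x24", 50),
    ("APS-C", 35),
    ("m4/3", 25),
]
f_stop = 1.4
reference_pupil = 25 / f_stop  # ~17.9mm, the m4/3 lens wide open

for fmt, focal in lenses:
    pupil = focal / f_stop                    # pupil diameter at f/1.4
    matching_fstop = focal / reference_pupil  # f-stop giving the same pupil (and DoF)
    print(f"{fmt}: {focal}mm at f/{f_stop} -> pupil {pupil:.1f}mm; "
          f"same pupil (and DoF) as the m4/3 lens at f/{matching_fstop:.1f}")
```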
You can also calculate the depth of field with some relatively complicated equations, but fortunately there are expert websites that let you do so quickly by inputting a few basic parameters. The simplest calculator is at Cambridge in Colour and you can see a numerical comparison below with the same focal lengths mentioned before.
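For the curious, those “complicated equations” are only a few lines long. Below is a simplified sketch using the standard thin-lens depth of field approximations, with circle of confusion values commonly assumed for each format (0.030mm, 0.020mm and 0.015mm are conventions, not official figures, so the exact results will differ slightly from one calculator to another).

```python
def depth_of_field(focal_mm, f_stop, distance_mm, coc_mm):
    """Approximate near/far limits of acceptable sharpness (thin-lens formulas)."""
    hyperfocal = focal_mm ** 2 / (f_stop * coc_mm) + focal_mm
    near = distance_mm * (hyperfocal - focal_mm) / (hyperfocal + distance_mm - 2 * focal_mm)
    far = (distance_mm * (hyperfocal - focal_mm) / (hyperfocal - distance_mm)
           if distance_mm < hyperfocal else float("inf"))
    return near, far

# Same f/1.4 and same 2m subject distance, with the CoC scaled to the format
examples = [
    ("36x24  50mm", 50, 0.030),
    ("APS-C  35mm", 35, 0.020),
    ("m4/3   25mm", 25, 0.015),
]
for name, focal, coc in examples:
    near, far = depth_of_field(focal, 1.4, 2000, coc)
    print(f"{name} at f/1.4, 2m: DoF ≈ {far - near:.0f}mm")
```

Run it and the m4/3 combination shows roughly twice the depth of field of the 36×24 one at the same f-stop, which is exactly what the equivalence calculation above predicts.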
Because of the results shown above, many people believe that the fastest aperture indicated on lenses designed for smaller sensors is “misleading”. One year ago, Tony Northrup made a series of popular videos where he explains why all camera companies are “cheating” when it comes to aperture. Given the popularity of Tony’s channel, you can imagine the success (and controversy) these videos attracted. Now I don’t want to go into Tony’s argument here but I’ll take one of his points which is:
- Why does the specs list always include the focal length equivalence but not the aperture equivalence?
For example Panasonic writes that its 25mm f/1.4 has an equivalent focal length of 50mm on 36×24 format but doesn’t include the equivalent aperture of f/2.8 for the same format.
Is there any reason to not make this aperture equivalence official on every specification list? Well, to answer that we need to turn to our second chapter.
Do you remember what I asked you to save at the beginning of the article? Yes, it’s time to talk about the amount of light passing through the aperture.
Aperture and Exposure
With the focal length, the comparison is simple: you can refer to the angle of view, which is a single, easy-to-grasp measurement. Aperture equivalence is a little more complicated because the f-stop number affects both the depth of field and your exposure in a real-world shooting situation.
The reason you choose a fast lens is not necessarily just for the depth of field. You might want one because it allows you to gather more light. If you often work in low-light conditions, a faster aperture allows you to get a better exposure, keep your ISO down or maintain a faster shutter speed. A quick example: an f/2.8 or f/2 telephoto lens can be more useful than an f/4-5.6 zoom lens when shooting indoors.
You might need a fast wide angle lens for astrophotography: a larger aperture will make it easier to capture all the very distant points of light in the sky.
Below is a different example. With this long exposure shot I wanted to blur the water as much as I could. I used an ND filter, set my ISO to 100 (Low on the E-M1) and the aperture to f/14 so that I could use a 50s shutter speed. Here again the depth of field didn’t cross my mind while setting up the camera, although f/14 rendered the entire image in focus.
So now let’s go back to our 3 lens example. If I say that a 25mm 1.4 on m4/3 is like a 50mm f/2.8 on full-frame, you could also think that the amount of light passing through is involved.
Actually, it is a common belief that Micro Four Thirds lenses gather less light than full frame lenses. The latest example comes from the well-respected and experienced Dpreview site. This is what they state about the Olympus 300mm f/4 Pro:
- Although its F4 maximum aperture is equivalent to F8 on full frame in terms of depth-of-field and light gathering (in total image terms)… From Dpreview closer look article
Dpreview’s statement above is not completely incorrect but it oversimplifies and might easily confuse those who don’t have extensive knowledge of how exposure and image sensors work. The easiest way to explain this is to start by seeing how this applies in real-world shooting.
Below you can see three images taken with three lenses I own that are the closest in terms of angle of view. I matched the exposure for the three cameras: f/1.8 aperture, 1/100 shutter speed and 200 ISO:
- Sony 55mm f/1.8 on A7r II (43°)
- Fuji XF 35mm f/1.4 on X-T1 (45°)
- Panasonic 25mm f/1.4 on GX8 (47°)
Are we seeing an overexposed image by two stops from the Sony or an underexposed image from the Micro Four Thirds sensor? Nope.
For reference purposes, here is how the X-T1 and A7r II images would look if I set them to the “equivalent” aperture of f/2.5 and f/3.5 respectively, giving both lenses the same virtual aperture (and therefore the same depth of field) as the m4/3 lens.
Now let’s play a little more. Below I included two additional images. One is the A7r II with the same lens at f/2.8, the second was taken with the Sony RX10 II at f/2.8. The latter has a smaller sensor than Micro Four Thirds (1 inch). See for yourself!
The test above shows the following: if the focus distance, ISO and shutter speed are the same across camera systems with different sensor sizes, and the virtual aperture inside the lenses is also the same (which means different f-stop numbers), you get the same depth of field but not the same exposure. If the virtual aperture diameters are different (but the f-stop values are the same), the depth of field is different but the exposure is the same.
I can summarise this by referring once again to the calculations shown in the first chapter:
- Same exposure, different DoF:
- 36×24 format: 50/1.4 = 35.7mm
- APS-C format: 35/1.4 = 25mm
- m4/3 format: 25/1.4 = 17.9mm
- Different exposure, same DoF
- 36×24 format: 50/2.8= 17.9mm
- APS-C format: 35/2 = 17.9mm
- m4/3 format: 25/1.4 = 17.9mm
Now you might ask: isn’t a camera/lens combo with a smaller sensor supposed to gather less light than an equivalent combo with a larger sensor? How can the exposure match if with the same f-stop value the aperture inside the full frame lens is twice as large and the sensor is four times larger than the m4/3 camera?
Well, the first answer is that the focal length on the 36×24 camera is double that of the m4/3 camera (50mm vs 25mm), so the full frame camera needs to collect more total light to achieve the same exposure (light per unit of area) across its larger sensor. That’s why the virtual aperture is larger.
This also helps us explain why we have f-stop numbers in the first place. They were introduced as a unit of measurement to compare the speed of lenses (a.k.a. the amount of light they can transmit). This means that with the same f-stop value, we can expect lenses to give us the same exposure on any camera regardless of the focal length or sensor size.
Note: I am purposely skipping a few things here to make this article as easy as possible to read. If I wanted to be strictly scientific, I should also mention the T-stop for example (the effective light transmission of a lens, for which the f-stop value is often but not always a good approximation). For instance, the Pana/Leica 25mm f/1.4 has a transmission of T1.7 while the Sony 55mm f/1.8 has a T1.8 transmission (based on DXOmark results).
This also tells us that the sensor size itself is not directly responsible for our exposure. Let’s dig a little bit more into this, shall we?
Aperture and Noise Ratio – Or: this is where I learned to stop caring
Let’s have another look at the Dpreview statement seen before (note that I have nothing against Dpreview, it is just one of the most recent examples that came to mind):
- Although its F4 maximum aperture is equivalent to F8 on full frame in terms of depth-of-field and light gathering (in total image terms)…
Now let’s extract the last part of that sentence: “light gathering (in total image terms)” which we can simplify by calling it Total Light. What does Total Light mean exactly? Well, it is the total number of photons (elementary particles of light) that fall onto the sensor and can also be explained with the following equation: Total Light = Exposure x Sensor Area.
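To put rough numbers on that equation, here is a tiny sketch. The sensor dimensions are the usual approximate values and the exposure is just an arbitrary relative number, since only the ratio between formats matters here.

```python
# Total Light = Exposure x Sensor Area.
# With the exposure matched (same scene, f-stop and shutter speed), the only
# thing that changes between formats is the sensor area.

sensor_area_mm2 = {
    "36x24": 36.0 * 24.0,   # ~864 mm^2
    "APS-C": 23.6 * 15.6,   # ~368 mm^2 (typical APS-C dimensions)
    "m4/3":  17.3 * 13.0,   # ~225 mm^2
}
exposure = 1.0  # same relative exposure on every format

for fmt, area in sensor_area_mm2.items():
    total_light = exposure * area
    print(f"{fmt}: relative Total Light = {total_light:.0f} "
          f"({total_light / sensor_area_mm2['m4/3']:.1f}x the m4/3 value)")
```

With real sensor dimensions the 36×24 format collects roughly 3.8x the total light of m4/3 at the same exposure, which is usually rounded up to the “4x” (two stops) figure you see quoted.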
What is important to explain here is this: the total light hitting the sensor doesn’t determine your exposure. It is the amount of light passing through the lens that determines it.
The exposure is in fact determined by the following factors:
- Scene luminance: how much light there is in your scene
- F-stop: how large you set your aperture inside the lens
- Shutter speed: how much time you let that light hit the sensor
The density of light that enters your lens remains the same because it is determined by how much light there is in your scene in the first place, not by how large your sensor is. Then you decide how much of that light reaches the sensor by closing or opening the aperture and by increasing or decreasing the shutter speed. I repeat: the sensor size itself doesn’t affect your exposure.
You might also have noticed that I didn’t include the ISO setting in the list above. This is because ISO doesn’t affect the amount of light passing through the lens and hitting the sensor. It can be used indirectly to change the exposure (for example, decreasing the ISO forces you to open the aperture or slow down the shutter speed) or to increase the brightness of your photograph when there isn’t enough light in the scene. It works on an electrical level, not a light level: it raises the brightness of your image, and brightness is not the same thing as exposure.
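A rough way to see that the sensor never enters the picture: in the usual photometric model, exposure (light per unit of sensor area) depends only on scene luminance, f-number and shutter speed. The sketch below ignores constants, lens transmission and vignetting, so treat it as an illustration rather than an exact formula.

```python
def relative_exposure(scene_luminance, f_stop, shutter_s):
    """Light per unit of sensor area, up to a constant factor.
    Note: sensor size does not appear anywhere in this formula."""
    return scene_luminance * shutter_s / (f_stop ** 2)

# Same scene, same settings -> same exposure on any format
print(relative_exposure(scene_luminance=100, f_stop=1.8, shutter_s=1 / 100))
# Stopping down to the "equivalent" f/3.5 costs roughly two stops of exposure
print(relative_exposure(scene_luminance=100, f_stop=3.5, shutter_s=1 / 100))
```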
Ok, so the exposure remains the same with the same F-stop number, but the Total Light factor must affect something, right?
Yes, of course it does. Because the larger sensor captures more Total Light (Photons) on a larger surface, it will gather more “data” to increase the Dynamic Range and decrease the Noise density (NSR – Noise to Signal Ratio). In other words, your picture will have less noise, look “cleaner” and have more information in the highlights and shadows.
Let’s analyse another comparison below with low-light images shot at ISO 6400 with the three cameras. We can notice that the exposure once again is similar but the A7r II has a cleaner image. That is where the larger sensor helps.
If we decide to compare the same shots with the same amount of noise, we can calculate the following: the 36×24 sensor has roughly 4 times the area of Micro Four Thirds and about 2.25 times the area of APS-C (the square of the 2x and 1.5x crop factors). For a sensitivity of 6400 on the full-frame sensor, the equivalent sensitivity would be (the short sketch after the list does the same arithmetic):
- 6400/4= 1600 ISO on m4/3
- 6400/2.25 ≈ 2844 ISO on APS-C
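```python
def equivalent_iso(full_frame_iso, crop_factor):
    """ISO on a smaller format giving roughly the same noise density,
    assuming comparable sensor technology (a big assumption in practice)."""
    return full_frame_iso / crop_factor ** 2

print(equivalent_iso(6400, 2.0))   # m4/3  (2x crop)   -> 1600
print(equivalent_iso(6400, 1.5))   # APS-C (1.5x crop) -> ~2844
```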
So to have the same noise density in my three images above, and therefore match the Total Light gathering, I should have taken my GX8 shot at 1600 ISO and my X-T1 shot at 3200 ISO (the closest available value to 2844). Of course, the two shots would then have been underexposed. To return to the same exposure, I would have had to slow down my shutter speed or open my aperture to f/1.4 for the m4/3 camera and f/2 for the X-T1.
Note: I’ll leave out the fake ISO controversy concerning Fujifilm cameras so as not to make this article more complicated.
So after all these analyses, can we finally say that a 25mm f/1.4 on m4/3 “in reality” is a 50mm f/2.8? Well, it does seem to be correct if we consider Depth of Field and Total Light a.k.a. Noise Density. But here lies my personal disagreement.
First, the f-stop value is related to exposure and we’ve seen that exposure is not influenced by the sensor size or even the ISO sensitivity. It is only related to the amount of light passing through the lens. So to say that “25mm f/1.4 = 50mm f/2.8” means we are mistakenly using the f-stop to compare Noise Density as well.
To me, not only does this argument start to appear a little forced, but we also lose perspective of what matters most when you are out shooting. I think the f-stop value is enough to compare depth of field without making everything more complicated.
Second, this quick comparison between lenses and f-stop numbers doesn’t take the sensor technology itself into account. Granted, most image sensors nowadays are CMOS, but some of them use BSI technology for example, which means they gather more light than standard CMOS chips. Some sensors are even larger than 36×24 but are based on CCD technology, which performs worse at high ISOs. I’m not sure how the noise density would compare there. The same reasoning can be applied to the Sigma Foveon APS-C sensor: what would be the equivalent ISO, knowing that past ISO 400 this sensor produces very noisy images? As you can see, this reasoning doesn’t apply universally.
Third, it comes down to simple logic. When you are out there shooting, you don’t set your camera according to the Noise Density you will get in comparison to another system.
The same old rule applies to every camera, from your smartphone to your medium format gear: you try to keep your ISO value as low as possible and give your sensor as much light as possible for an optimal exposure.
If you can’t gather enough light with either your aperture or your shutter speed, then you increase the sensitivity. Finally if we come back to the ISO equivalence calculated above, I mentioned that to match the same exposure I could also have changed the shutter speed, which is something that you can do in real life.
Preface to the conclusion: Specs vs. Marketing
What I find interesting is that after this in-depth analysis concerning equivalence between camera systems, we come back to two basic arguments. A smaller sensor will give you a deeper depth of field and more noise with the same aperture or ISO settings in comparison to a larger sensor. This is true for smartphones, compact cameras, bridge cameras, m4/3 cameras, APS-C cameras and 35mm format cameras.
Actually, just recently Phase One announced the first full-frame medium format camera. Which one is the real full-frame then? Should we do all the math again?
I understand that the 35mm format remains a reference because film used to be so popular but today in a digitally dominated era, some of the most popular cameras are smartphones, so more or less the 1/3 inch format. Maybe we should reset everything and start from that (it would be fun for sure!).
You might have noticed that I don’t use the term “crop” for a smaller sensor and I try to avoid the term “full frame” as much as possible. Of course, they are a popular way to refer to these different sensor formats so I can’t ignore them completely, but I feel they are overused. In actual fact, “full frame” could refer to any sensor format because it also means that the entire sensor surface is used to take a picture. Put that way, a m4/3 sensor is a full frame sensor. “Crop sensor” simply refers to sensors smaller than 36×24 and the fact that there is a crop factor (1.5x for APS-C, 2x for m4/3). It is used because, once again, “full frame” (the 35mm format) is the reference.
Perhaps companies could add additional specs for Transmission, Depth of Field, Total Light and Noise Density and include them in the official spec sheets of their photography products so that you get “100% transparent” information. However, I am not sure that doing so would create less confusion. In fact, I can already imagine the increased number of heated discussions on forums and in social media groups because there would be so many more numbers to compare!
Besides, further mention of all these specs and numbers and how they compare to the full-frame format will only penalise smaller systems that find their virtues elsewhere.
Another thing worth remembering is that companies make products to sell them. A brand will always show the positive side of its products, not the negative. Otherwise it would be marketing suicide. It’s up to reviewers, camera testers, specialised magazines and websites to list the pros and cons. That is true not only for cameras but for any product you can find on the market (smartphones, computers, cars, etc.).
Do you think that every phone commercial shows you images or video footage taken with that specific product? We could talk about the “world’s fastest AF” quote that we often see in press releases. I remember some banners for the launch of the first A7 and A7r that were plastered with the slogan “Faster than a DSLR”.
Or maybe try to watch various promotional videos from different camera brands. You will often discover that many of them start with an ambassador photographer taking a picture with emotional music in the background and his voice-over saying “I am a photographer and I tell stories with my images“. As much as this sounds banal and repetitive, in the end, it’s what matters.
Olympus, Panasonic, Fujifilm, Sony, Nikon, Canon – they all create boxes that convert light into an electrical signal. Each one has its own set of advantages and disadvantages. It’s up to the photographer to do the rest.
Yes, it’s true that some product specs can appear confusing especially for someone who doesn’t know a lot about gear. But I also think that the additional information can confuse people even more. Should we condemn Olympus or Panasonic because they don’t state the equivalent aperture or Total Light gathering? Or should we focus more on the advantages a smaller system can give you? A Sony RX10 II advertised with a 24-200mm f/2.8 will appear misleading to some but it also describes a compact camera with a fast lens that gives you the same zoom range (and the same exposure) as more expensive and larger cameras. Of course, with the sensor being smaller, you’ll have more noise and less dynamic range.
One last question could be: why don’t these mirrorless brands produce faster lenses to compensate for their so-called “inferiority”?
Now, that is actually a very good question and I would love to ask a proper lens engineer for a more complete answer. But we can definitely list some of the potential reasons: faster lenses are more expensive to produce, which means they will also be more expensive for the customer. And let’s face it, lots of people complain about price. These lenses would also be larger and heavier, and lots of people complain about size and weight. Look at the size and weight of the Voigtlander f/0.95 series for m4/3 and remind yourself that those lenses don’t even have OIS or AF, and they aren’t perfectly sharp wide open either.
Conclusion – Or why I love smaller sensors as much as larger sensors
Is there an explanation we have yet to discover that would make everyone agree? I don’t think so because once again it comes down to points of view and what is more important to each individual.
I am sure that this article won’t necessarily make people change their minds about the subject. It’s a topic that will continue to divide people into two groups that disagree.
The first group will consider a good photograph to be one with perfect exposure, no noise at all and the shallowest possible depth of field. The second group will consider a good photograph to be one where the author captured the light in the best way he or she could, regardless of the sensor size or the “real” aperture used.
I respect everyone’s opinion, and actually some excellent essays written by photographers who would disagree with me completely helped me fill this article with more accurate information. But personally, building this website with Heather and learning to use all these mirrorless cameras has made me choose the second group.
One thing that is often missing from these kinds of discussions is that a deeper Depth of Field is not always a negative thing. This is one of the reasons I started using Micro Four Thirds cameras for my event photography two years ago. Shooting at f/2.8 with an equivalent depth of field of f/5.6 can be helpful indeed. A camera system with a smaller sensor offers advantages: this is why I like them and why I learned to not worry about the technical differences.
It’s 2016 and every camera has a good sensor that allows you to do everything you need to do. I’ve said it once and I’ll say it again: when I test a camera, image quality is often the most boring part unless there is something particular to talk about. However I can understand that what’s irrelevant to me might be relevant to you.
If you are curious to read more about equivalence from a purely technical point of view, I invite you to check out this excellent essay written by Joseph James. It’s really long, requires some knowledge but is also very interesting. I’ve also included this excellent YouTube video by John P. Hess of Filmmaker IQ. (Thanks to our readers for bringing it to our attention.) John explains Depth of Field by referring to the Circle of Confusion, which is an extra piece of information worth knowing.
sandeep krishnan says
Thanks Mathieu. I couldn’t read the article completely, but I realised that you have put a great amount of effort into explaining a concept that confuses a lot of amateur photographers. I have been following this topic and I have seen the video which you mentioned, where Tony has debated the same topic.
Your explanation was very appropriately positioned, without being offensive to any other professional who might have a different opinion on this topic. At the end of it, I go back a little wiser and clearer on the topic of aperture equivalence. Thanks very much.
Mathieu says
Totally agree with you! There is a lot you can do with a camera if you learn to exploit its full potential. Sometimes it even takes a year before you know it perfectly 🙂
any old name says
Thanks Mathieu. This has to be the best discussion I’ve seen. I especially thank you for keeping it free of the camera system bias that usually creeps into this discussion.
I would like to state that all of this discussion is rendered moot if you simply learn the characteristics of the camera that you’re using. Comparing characteristics in this way is like trying to learn a second
language but still doing the translation in your head all the time. There’s no need (in fact, it’s needlessly confusing) to keep comparing the characteristics of the camera you are using to the one you didn’t buy, or to the one that you left at home (or whatever).
Pick the tool that’s right for the job that you want to do, learn its characteristics (both strengths and weaknesses), and spend your shooting time on your composition rather than worrying about what you might have been able to do with a different camera system.
Mathieu says
I think that comparisons between camera systems make sense to give some reference, especially to those who aren’t familiar with a smaller sensor.
To answer your question, your focus distance is still at 5m. It is independent from the crop factor.
Soup says
On point, Mathieu! And finally we have a comprehensive overview to link to when this silly discussion comes up again and again. Now we know why an iPhone can’t render DoF even with an f/2.2 lens and has heavy noise even at ISO 200: because its small 1/2.7″ sensor has an FF equivalent of an f/16 aperture and its ISO 200 has a similar noise pattern to ISO 1600. And that could have been seen by everybody!
On another popular blog, I once read that we should just stop comparing specs between different sensor sizes because in the real world it makes no sense. Each of us has a preferred system. We should know that a particular focal length gives me a certain angle of view and that a certain aperture gives me a certain DoF, noise pattern or shutter speed. And it doesn’t matter that these values are different on another sensor size because we own just one. FF shooters know what they have to consider a normal lens, MF shooters do the same and LF shooters do the same for their system. I have never heard of any LF shooter comparing their lenses to a 35mm equivalent. Nor do iPhone users. Why do APS-C or m4/3 users? It only makes sense if you need to compare focal lengths and apertures across multiple sensor sizes. Who needs that? Only gear-heads who talk about specs rather than features, rendering and context. Let’s talk about how much we love our fast normal lenses: I know that for my APS-C camera I’m talking about a 35mm f/1.8, the next guy thinks of his 50mm f/1.4 and the other guy thinks of his 25mm f/1.4 – and it doesn’t even matter that f/1.8 on APS-C is not the same as or equivalent to f/1.4 on FF or f/1.4 on m4/3, because we are just talking about the fastest lens available/in our kit.
However, I have one question (just because I have been too lazy to measure it myself): what happens to the focus distance marks of a (FF) lens on APS-C? Let’s say I have an old MF lens (because modern lenses don’t have focus distance marks), use it on an APS-C sensor with a 1.5x factor and set the focus distance to 5m by turning the focus ring to the 5m mark. Is my focus distance now at 5m or 7.5m? I expect it’s still 5m because I’m just cropping from an image, which is independent of focusing. Or not?
Mathieu says
“Since the pixels are of the same size and the light gathered by each pixel is also the same, how will a larger sensor give you better noise performance, is the doubt I have!”
The answer is in what you wrote in the previous paragraph: the 24×36 sensor will have four times the pixels compared to the MFT camera. It is easier to understand if you think of the sensor’s area: the 24×36 is four times larger, so even if the pixel size is the same, the 24×36 sensor will gather more light over a larger area.
“If you crop the image to one fourth to simulate a MFT sensor will the cropped image have different noise characteristics at pixel level, compared to uncropped image?”
The crop would be 2x. Then it depends on whether you resize the cropped image to look as large as the original. In that case the noise will appear larger.
Ashok Kandimalla says
Very interesting and thought provoking. I have the following doubt which has been intriguing me for long.
You said, quote ” Yes, of course it does. Because the larger sensor captures more Total Light (Photons) on a larger surface, it will gather more “data” to increase the Dynamic Range and decrease the Noise density (NSR – Noise to Signal Ratio). In other words, your picture will have less noise, look “cleaner” and have more information in the highlights and shadows.”
Now let us consider this case. There are two sensors, one 24×36mm and one MFT. Assume both have been built using exactly the same sensor tech and that the pixel size of each is the same. Also assume that at each pixel the noise characteristics, S/N, etc. are the same. Let us also assume that the rest of the camera circuitry like A-to-D conversion, amplifiers, firmware, etc. is the same.
Since the pixels are of the same size, the 24×36 sensor will have four times the pixels compared to the MFT camera. Let us assume that these figures are, as an example 48MP and 12MP. Let us say these cameras are equipped with 50mm f/1.4 and 25mm f/1.4 lenses.
The doubt I have is as follows. The 24×36 camera’s larger sensor will gather more light as you say, but the light per pixel has to be the same IMHO, as it is spread over a larger area. (That is why both cameras have the same exposure, as you have so nicely explained.) Since the pixels are of the same size and the light gathered by each pixel is also the same, how will a larger sensor give you better noise performance? That is the doubt I have!
I can reframe my doubt in another way. Let us consider just one camera a 24x36mm. If you crop the image to one fourth to simulate a MFT sensor will the cropped image have different noise characteristics at pixel level, compared to uncropped image?
The only way I can think of this 48MP camera can give better noise performance is simply due to down scaling. If you are making a print of size that needs let us say 24MP at 300dpi, then the 24X36 camera’s image has to be down scaled thus improving the noise (I guess there is no doubt that down scaling reduces noise). Conversely, the MFT image has to be up scaled thus causing image deterioration. This could give perhaps a two stop advantage but I am not sure.
If the 24×36 camera say has 24M (or even 36M) pixels then of course each pixel will be larger and will give better noise performance. No questions about this!
I will be delighted to hear your comments on my doubt.
Anthony White says
I learnt photography on film, at high school in the early nineties. So aperture equivalence discussions always amuse me.
It seems to me that the easy adjustability of ISO, and the development of high functional ISO with acceptable noise, has led people to forget (or not learn) that aperture is primarily a ‘quantity of light’ control that also has an effect on depth of field. It’s not a ‘depth of field control’ that also affects the quantity of light.
We used to put a film in the camera, and then we would be fixed at, say, 100 ASA for the next 36 frames. And we didn’t waste frames, because film was expensive. Then we had to make decisions about shutter speed and aperture to capture enough light, and trade off motion blur and depth of field. Or add lighting.
When I went properly digital, the flexibility of being able to adjust ISO between frames was one of the things that took time to get used to. I could suddenly change my ISO by four full stops to change my depth of field, without changing my shutter speed?! Amazing.
Mathieu says
Yes the sensor technology plays an important part.
Espen Braathen says
As such, will diffraction issues get worse as the megapixel count of the sensor goes up, e.g. from 16MP (GH4) to 20MP (GX8)?
Guido Gloor Modjib says
I think the potential confusion when it comes to total light, noise levels and ISO comes down to a simple fact:
* Focal length is a physical measurement, which mathematically interacts with sensor size to produce angle of view. It’s completely independent of camera technology and sensor – the image circle and the mount determine what cameras the lens can be mounted on, that’s all.
* f-stop is a ratio between virtual aperture size and focal length. It’s also physically measurable, and independent of camera technology and sensor.
* Exposure times are just exposure times. A second is always a second. Also independent of camera technology and sensor.
Thus two of the three things in the exposure triangle are only determined by physics, just like the angle of view.
ISO, on the other hand, is completely different. ISO is determined by the correct exposure with a given sensor and processing pipeline, when focal length, f-stop and exposure time are given. An f/4 image at 1/200s of the same scene at the same ISO should look the same with any camera. But those cameras detect a different number of photons in each of their pixels. So essentially, the number of detected photons is multiplied by the ISO and then by a sensor-and-camera-specific factor, and that number is the detected brightness.
So there is actually a factor involved that is never published: The factor between the number of detected photons in a pixel and the brightness that should result, at any given ISO. This factor is determined by sensor technology and sensor size alone. And this factor is directly proportional to the amount of noise.
A very interesting article on equivalency, by the way, is the dpreview one – it’s the one I like best on the subject: http://www.dpreview.com/articles/2666934640/what-is-equivalence-and-why-should-i-care
Guido Gloor Modjib says
I absolutely agree that m43 optics start having diffraction issues at earlier f-stops than 35mm optics. The reason for that is very simple: Diffraction is not related to f-stops, but to physical aperture size in relation to either print size and viewing distance, or pixel pitch, depending on what you’re looking at.
As the article correctly pointed out, f/16 on m43 produces an entrance pupil that’s exactly as large as f/32 on 35mm sensor, given equivalent focal lengths and thus equal field of view.
When pixel peeping, diffraction is also more impactful the smaller the pixel pitch is. A m43 16MP camera starts having noticeable diffraction while pixel peeping starting from f/7 (that’s where the airy disk is larger than the pixel pitch). So a 16MP 35mm camera like the Nikon D4S will have noticeable diffraction starting at f/14. But a 36MP 35mm camera has a much earlier diffraction limit, when you’re pixel peeping, and starts having noticeable diffraction at a pixel level at f/8 again.
When printing, the same viewing distance and print size has the same diffraction limits independent of resolution (given “high enough” resolution, that is), because our eyes don’t necessarily notice pixel-level diffraction if it’s beyond the resolving power of the eye.
Ayatollyahso says
Agree: I’ve had my arguments with Northrup and co. No handheld light meter has settings for different formats or sensor sizes.
Exposure-wise, 1/125th at f/5.6 and ISO 200 is 1/125th at f/5.6 and ISO 200, whether it’s m4/3, 35mm, 6×7, 4×5 or 8×10. If any of his b.s. were true we’d have seen it shooting 35mm and 4×5 on the same day. The difference in exposure (light gathering), if any, should have been much more pronounced going from 35mm film to 4″×5″ cut film than going from APS-C to “FF” digital…
Espen Braathen says
Seriously, for the smaller m4/3 lenses diffraction-related problems are far worse than for e.g. 35mm systems. That’s why the lenses tend to have good sharpness over a smaller f-stop range than 35mm lens designs. Going beyond f/16, for instance, is not a good idea with most m4/3 optics, whereas with a 35mm lens design you can happily shoot at up to f/32 without diffraction limiting resolution too much.
The faster-than-f/2.8 m4/3 optics are all primes, and the ones with really low f-numbers like the 42.5mm f/1.2 are heavy and expensive. As a system, the alternative 42.5mm f/1.7 lens is probably the reasonable limit of price and weight for most m4/3 shooters (like myself).
A constant 12-35mm or 14-42mm f/1.4 zoom could be a dream on paper to optimise DoF effects, but seriously, there is probably no practical way of designing such a thing without it being insanely expensive, large and heavy.
Mathieu says
Thanks Antonio. I know the video you mentioned, it does add interesting info. Personally I skipped the circle of confusion topic here because I wanted to keep things simple. I think it is more interesting for other subjects, like calculating the hyperfocal distance.
Mathieu says
Thanks 😉
Mathieu says
Thanks. I agree with you, it’s a useful “rule of thumb” to get an idea because we have to compare these different systems with something. However, if we dig too much into it there are so many things to consider that we lose perspective of what really matters 🙂
Mathieu says
Yes but in that case John is talking about the same lens used on different sensor sizes so the angle of view changes (narrower on the smaller sensor). In my article I decided to keep the angle of view as reference so the same composition on the three systems because I believe it is more interesting from a photographic point of view.
But what he says is not incorrect and just brings a different point of view on the matter. For videomakers his example can be more interesting because often they adapt full frame lenses or cine lenses on smaller sensors. For example I remember shooting video with the Panasonic AF100 (m4/3) using Nikkor Fx lenses and there I had a shallower DoF because of the crop factor.
There are other arguments, but I didn’t want to make the article too long and I wanted to stick to a specific case. For example, the perception of DoF also varies according to the distance at which you view a photograph (or a print).
Turbofrog says
Thanks for the thorough discussion!
I think that aperture equivalence can be a useful “rule of thumb” or “back of the napkin” reference for photographers who want to understand a little bit more about differences in formats. But it is super important to remember that it is only a rule of thumb, and as you mentioned, the individual sensor technology matters a whole lot more. Once you get into the real world, you need to make comparisons based on specific cameras if you are trying to talk about equal “image quality” (in terms of SNR, dynamic range, etc…).
As a case in point, we all know that FF cameras let in 2 EV more Total Light than M4/3 cameras, as in your above examples. And yet, if you look at DXOMark (take with a pinch of salt, of course):
Olympus E-PM2: 932 ISO
A7S = 3702 (+1.99 EV)
A7R II = 3434 (+1.88 EV)
Nikon Df = 3279 (+1.8 EV)
Nikon D750 = 2956 (+1.67 EV)
A7 II = 2449 (+1.39 EV)
Canon 5D III = 2293 (+1.29 EV)
Leica Q = 2221 (+1.25 EV)
You can see that the only camera that actually matches equivalence theory in terms of performance is a 24×36 sensor with 25% fewer pixels than the 3-year-old M4/3 camera with 1/4 the sensor size. And the next best performer is a brand new $3200 camera with a state-of-the-art BSI sensor.
…So if SNR equivalence for bread-and-butter FF cameras like the A7 II or 5D III is “only” 1 1/3 stops, to match the DoF and exposure of a 25mm/1.4 on M4/3 you need to shoot your 50mm lens at f2.8 (and 4x the ISO) on FF, but you’ll actually be compromising on the technical image quality. Probably not why you decided to shoot FF!
Anyway, since the real world is complicated, that’s where equivalence breaks down. Like I said, I enjoy using it for a quick rule of thumb, but anyone who lets it dominate their photography will be disappointed when the results don’t match the theory.
dasar says
Very clear and well thought out.
Thank you
Albert says
Talking about DoF, you need to take the Circle of Confusion (CoC) into consideration, as Filmmaker IQ explains well here https://youtu.be/lte9pa3RtUk?t=339
He says: “With Digital the COF is constrained by the pixel size” and “Higher Resolution means Shallower DOF”, so “All other factors equal – the smaller sensor will have the shallower DOF” https://youtu.be/lte9pa3RtUk?t=606
Antonio Samagaio says
Mathieu
Thank You for another great article.
I appreciate your work here and on the new “curation” as well.
People – myself included – keep forgetting that 35mm film (the 135 format) was always, in photography terms, a small format. It just became popular, affordable, and so on.
Some time ago I let myself get immersed in those kinds of thoughts and discussions. But when I pick up my trusty “outdated” old book, “Basic Photography” by Michael Langford, it’s all there!
I guess people waste more time (again, I’m also a bit guilty of this) writing words than taking pictures and mastering what they have in their hands and in their heads.
I found this video a very nice addition to what you’ve written and a good complement to Tony Northrup’s series. Pay attention to the “errata” in the description.
https://www.youtube.com/watch?v=lte9pa3RtUk
Just my 2 cents here.
All the best
António
Mathieu says
Haha yes Optical vs EVF could be a good one 😉
Lazlo says
Fantastic article, thank you. A lot of ‘equivalence’ nonsense is just schoolyard-level arguments about who’s got the ‘biggest’…
You are absolutely right, all cameras these days have good quality sensors. So instead of arguing about specs, let’s argue about photo quality. If we do this, a lot of those who talk loudly will disappear into the shadows, since they don’t understand that photography is an art form and the camera is just a tool.
Does anyone see photography museums and galleries saying ‘this picture was taken at 1/125, f/1.8, ISO 400’?
Leica or iPhone, if you can take good pictures, then good for you!
Thanks for your articles.
Richard says
Love this sort of discussion and another good read. It might be a good idea to think of a camera and lenses in terms of “characteristics.” F.O.V, sensitivity, noise all come into it. In my experience usability is right up there. Or should I say “be there” and as to whether it was f8 or not is another question based on format size! Thanks Mathieu. How about a review of the best viewfinders for an article? Now let me think….Optical or EVF……
TomTom says
I went with an m43 because I always wanted more depth of field! My prior camera was film, and I was always fighting for more.
Not sure if your readers would find this interesting, but the reason for full stops going 1, 1.4, 2, 2.8, 4, 5.6 etc is that the f number is defined as the focal length divided by diameter of the aperture. As area of a circle is pi * r^2, decreasing the diameter of the aperture by the root of 2 (about 1.414) halves its area.
Mathieu says
Well, dynamic range and noise especially are also influenced by the sensor tech itself and by how the camera’s processing extracts the information from each pixel. For example, when the first A7s was announced a lot of people were hoping for a substantial difference in dynamic range, but then we saw that the A7r was doing even slightly better (except at high ISOs of course). So if we want to go more in-depth there are definitely more factors to take into account, which would make this whole equivalence topic even more complicated 🙂
Lyn Rees says
Replying to myself… can’t be good…
Or to put it another way, shoot the same scene with a Sony A7Rii, once in full-frame and once in crop. The aperture, shutter-speed and ISO will stay the same, but you’d obviously need to change the focal length to get a different field of view.
The dynamic range and noise characteristics will be exactly the same. It’s the same sensor, you’re just ignoring much of it by only using a portion in the middle. Now, if you choose to make billboards from both images, they’ll look different because one has been enlarged more than the other.
Mathieu says
Yes, that is the advantage of a full frame camera. I remember that some photographers who always shoot at f/1.4 or f/1.2 with prime lenses (35mm, 50mm) asked me for advice about switching to Fuji or m4/3. I told them that for their style, what they already owned was perfect.
Lyn Rees says
“Yes, of course it does. Because the larger sensor captures more Total Light (Photons) on a larger surface, it will gather more “data” to increase the Dynamic Range and decrease the Noise density (NSR – Noise to Signal Ratio). In other words, your picture will have less noise, look “cleaner” and have more information in the highlights and shadows.”
I don’t believe that to be true. Neither noise nor dynamic range is influenced by the size of the sensor if all things are equal, i.e. same sensor tech and pixel densities. The eventual enlargement will influence the perceived noise however: for a given image/print size, you don’t need to enlarge the larger sensor’s image as much as the image from the smaller sensor (which will have fewer pixels), so it will appear less noisy.
That’s how I understand it anyway… But then, It’s been a while since I read the following in detail:
http://www.cambridgeincolour.com/tutorials/digital-camera-sensor-size.htm
Riccardo Campaci says
I am an MFT user. I used to have a full frame camera but I dropped it because it was too heavy. But I do still miss the FF ability to easily get a creamy shallow depth of field with a wide 35mm lens. This is the only thing I really miss about the full frame format, something that I cannot find in the equivalent MFT gear.