Understanding the basic physics of light will help us achieve the best photographic results. Here are some simplified explanations on how the science we learned at school can influence what we do with cameras.
There’s a lot more to light than meets the eye. Okay, that’s an awful pun, but it is both metaphorically and literally true. Light is a complex topic that we don’t need to understand in order to live our daily lives. More light hits you than your retina is capable of converting into nerve impulses to send to the brain, and even less of the light landing on your camera’s sensor gets turned into an electronic signal for the camera’s processor.
In other words, our eyes can’t see all the light there is; much of it is invisible to us. Even so, they see more than our cameras do.
Take Control of Your Camera’s Dynamic Range
On a sunny day, you can see detail both in the bright highlights of the sky and in the shadows of objects on the ground. Your camera, in a single exposure, can’t capture as wide a range of tones. Still, modern sensor technology is much improved, and as you can see in the image above, sensors are now able to capture detail in both the highlights and the darker areas. It wasn’t many years ago that I would have had to bracket the exposures of that scene and then combine them into a high dynamic range (HDR) image to be able to show the top of the lighthouse while pointing the camera directly at the sunrise.
Our eyes lose the ability to distinguish as many tones as they could when we were young, but we should still be able to see around 18-20 stops between black and white. Cameras at the top of the range are capable of 12 to 15 stops, although a new sensor announced this year boasts 24.6 stops. Don’t worry too much about those numbers, though; your camera will still take fabulous pictures.
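Each photographic stop is a doubling of light, so those stop counts translate directly into contrast ratios. A minimal sketch (the function name is my own, and the stop figures are the rough ones quoted above):

```python
# Each photographic "stop" doubles the amount of light, so a dynamic
# range of n stops corresponds to a contrast ratio of 2**n between the
# brightest and darkest tones a device can distinguish.
def contrast_ratio(stops: float) -> float:
    return 2.0 ** stops

# Rough figures from the article: ~20 stops for a young eye,
# ~14 for a good camera sensor.
print(f"Human eye (~20 stops): {contrast_ratio(20):,.0f}:1")
print(f"Camera   (~14 stops): {contrast_ratio(14):,.0f}:1")
```

The exponential scale is why a few extra stops of dynamic range make such a dramatic difference in what a sensor can record.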
What Is the Importance of the Visible Spectrum?
The visible spectrum is only a small fraction of the electromagnetic spectrum. I find it quite incredible that we can differentiate all the colors of the spectrum within a range that’s only about 320 nanometers wide. White light is made up of seven named colors, plus all the combinations in between.
We are lucky that the majority of photons hitting the Earth fall within the range from 380-700 nanometers. This is the result of our planet’s location in the Goldilocks zone: it is exactly the right distance from the right kind of star. In addition, our atmosphere contains an ozone layer that prevents most of the harmful ultraviolet light from reaching us. We would be cooked if we were exposed to more energetic radiation such as UV and gamma rays. At the other end of the spectrum, if we saw in longer wavelengths, we would need larger eyes and would find it difficult to thread a needle.
What Is the Impact of Light Bending on Our Images?
As the Earth spins and dawn arrives, we see the sun appear before it’s physically above the horizon. That’s because, as when passing through a prism or raindrops, light bends when it enters the atmosphere. This bending, or refraction, of light allows us to see below the horizon. It’s the same effect as when you put a spoon into a glass of water and it appears bent.
White light is composed of seven colors: red, orange, yellow, green, blue, indigo, and violet. You can see them in a rainbow or on the cover of the Pink Floyd album The Dark Side of the Moon.
Each color has its own wavelength, and so travels through glass or water at a different speed. Each color therefore bends by a different amount, splitting the white light into its constituent parts. Red light slows the least and is refracted the least; violet light slows the most and is refracted the most. This splitting of light is called dispersion, and the stronger the refraction, the greater the dispersion. Diamond, as you might expect, has one of the highest refractive indices, which is what makes it sparkle.
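The wavelength dependence of the refractive index can be sketched with Cauchy’s empirical equation, n(λ) = A + B/λ². The coefficients below are approximate values often quoted for BK7 crown glass; they are illustrative assumptions, not exact figures:

```python
# A minimal sketch of dispersion using Cauchy's empirical equation
# n(wavelength) = A + B / wavelength**2.  The coefficients are rough
# values for BK7 crown glass, used here purely for illustration.
A, B = 1.5046, 0.00420  # B in micrometres squared

def refractive_index(wavelength_um: float) -> float:
    return A + B / wavelength_um ** 2

n_red = refractive_index(0.700)     # red light, 700 nm
n_violet = refractive_index(0.400)  # violet light, 400 nm

# Violet sees a higher index, so it travels slower in the glass and
# is refracted more than red: the root of dispersion.
print(f"n(red)    = {n_red:.4f}")
print(f"n(violet) = {n_violet:.4f}")
```

The small numerical gap between the two indices is all it takes to fan white light out into a spectrum.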
This splitting of light is generally unwanted in photography; a rainbow is the last thing we want from a lens. We see it as color fringing, or chromatic aberration, around high-contrast edges, a fault common in cheap lenses. A perfect lens would be free of aberrations, with all wavelengths converging on a single point on the sensor. To get close to this, lens manufacturers use multiple glass elements within the lens that help bring the wavelengths back together. Lens technology is constantly improving, and modern professional lenses show virtually no visible aberrations.
When light strikes an edge, it can also be diffracted. Imagine water ripples hitting an obstacle: they bend around the obstruction. Light behaves in the same way.
Why We Avoid Small Apertures
Look at your shadow on a sunny day and you’ll notice that the outer edges aren’t as dark as the center. Light bends around you, causing that lighter edge, which is known as the penumbra; the darkest part is called the umbra. The further you are from the light source, the bigger the umbra in relation to the penumbra, and the sharper the shadow. This is something to take into consideration when using flash or studio lights.
This light bending also occurs as photons pass the edges of the aperture blades. The effect becomes more prominent as the aperture gets smaller, because the proportion of diffracted to undiffracted light is higher at small apertures.
That increase in diffracted light is the reason most photographers avoid using the smallest apertures.
Why the Sky Is Blue
This bouncing, properly called scattering, occurs not only when light hits boundaries but also when it encounters particles. Blue light has shorter wavelengths and is therefore scattered far more readily than red. This is why the sky is blue during the day.
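For the small molecules in the air, this is Rayleigh scattering, whose strength falls off as the fourth power of wavelength. A small sketch shows how much more strongly blue light scatters than red (the wavelengths are representative values I have chosen):

```python
# Rayleigh scattering strength is proportional to 1 / wavelength**4,
# so shorter (bluer) wavelengths scatter far more than longer (redder)
# ones.  Wavelengths below are representative, not exact.
def relative_scattering(wavelength_nm: float, reference_nm: float = 700.0) -> float:
    # How many times more strongly this wavelength scatters than the
    # reference (here red light at 700 nm).
    return (reference_nm / wavelength_nm) ** 4

blue_vs_red = relative_scattering(450.0)  # blue light at ~450 nm
print(f"Blue scatters about {blue_vs_red:.1f}x more than red")
```

That factor of roughly six is enough to fill the whole sky with scattered blue while the red carries straight on.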
When we look towards the horizon, the sky can seem whiter. That’s because the blue light has more air to pass through, scattering repeatedly until it loses its blueness. Furthermore, looking obliquely through the atmosphere, the extra particles scatter the other colors too, and light reflected from the planet’s surface adds to the mix.
All these factors mix the different wavelengths back together and produce white light. Over the sea, though, the sky can appear bluer as the blue of the water is reflected back into the air.
CPL Filter Use
That scattered light is polarized: instead of oscillating in random directions, the blue light waves oscillate in a particular plane, perpendicular to the direction of the sunlight. Fitting a circular polarizing (CPL) filter to your lens allows you to cut much of that polarized light from the part of the sky 90 degrees away from the sun, making the sky appear darker.
Polarizing filters are also great for taking away reflections from the water’s surface, because the reflection is polarized; cutting it lets you see more clearly what lies beneath. You can likewise remove the glare from damp autumn leaves, allowing their colors to appear richer.
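The physics of why rotating the filter works is captured by Malus’s law: a polarizing element transmits a fraction cos²θ of already-polarized light, where θ is the angle between the light’s polarization and the filter’s axis. A minimal sketch:

```python
import math

# Malus's law: a polarizing filter transmits I = I0 * cos(theta)**2 of
# already-polarized light, where theta is the angle between the light's
# polarization plane and the filter's axis.  Rotating a CPL therefore
# dials polarized skylight or glare from full strength down to near zero.
def transmitted_fraction(theta_degrees: float) -> float:
    return math.cos(math.radians(theta_degrees)) ** 2

for angle in (0, 45, 90):
    frac = transmitted_fraction(angle)
    print(f"{angle:>2} degrees: {frac:.2f} of the polarized light passes")
```

Unpolarized light, by contrast, is simply halved on average, which is why a polarizer darkens reflections and sky far more than the rest of the scene.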
The Physics Behind Those Glorious Sunsets
As the sun sinks lower in the sky, its light must pass through more of the atmosphere before it reaches you. The lower atmosphere is usually filled with dust, water vapor, and other particles, so the blue light is scattered away even more, and only the warm reds and oranges reach our eyes.
Warm Colors Aren’t Warm at All
When I refer to warm colors, I mean those that are psychologically warm. We tend to think of reds, oranges, and yellows as warm colors, while blues and greens are cool. In physics, it is the opposite. Imagine a blacksmith heating a piece of metal: it glows first red, then yellow, and as it gets hotter still, a bluish white. The gas torch used by welders is hot enough to melt steel, and its flame is blue.
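This relationship between temperature and color is captured by Wien’s displacement law: a glowing body emits most strongly at a wavelength of b/T, where b ≈ 2.898 × 10⁶ nm·K. A quick sketch (the example temperatures are approximate):

```python
# Wien's displacement law: a hot body's emission peaks at a wavelength
# of b / T, where b is about 2.898e6 nm.K.  Hotter objects peak at
# shorter, bluer wavelengths, which is why "cool" blue light is
# physically hotter than "warm" red light.
WIEN_B = 2.898e6  # nanometre-kelvins

def peak_wavelength_nm(temperature_k: float) -> float:
    return WIEN_B / temperature_k

# Approximate temperatures, for illustration only:
print(f"Candle flame (~1800 K): peaks near {peak_wavelength_nm(1800):.0f} nm")
print(f"The sun      (~5800 K): peaks near {peak_wavelength_nm(5800):.0f} nm")
print(f"Hot blue star (~10000 K): peaks near {peak_wavelength_nm(10000):.0f} nm")
```

The candle peaks in the infrared and glows reddish; the hotter star peaks towards the ultraviolet and looks blue, exactly the blacksmith's progression in reverse.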
There’s a Reason Your Camera Uses Red, Green, and Blue
A question often asked: why do photographic sensors and computer monitors reproduce colors using just red, green, and blue, rather than red, orange, yellow, green, blue, indigo, and violet? This is where engineering meets science. Our eyes contain three types of color-sensitive cone cells, roughly tuned to red, green, and blue, so combining only those three colors is enough to produce white and most of the colors we perceive. Combining all seven colors on a computer monitor or camera sensor would be expensive and complex.
There is, however, a compromise, as with all things in photography: cameras and computer screens that use the primary colors red, green, and blue can’t reproduce the full range of colors we see in real life.
Even that isn’t as simple as it first seems, because the palette of colors varies from device to device. The most accurate version is the one your camera records when shooting raw, which is what your editing software understands your image to be. The colors you view on a computer monitor, or in camera if you shoot JPEGs, are another version, and a print can be different again.
Color management is how we deal with this: with it, we define the maximum and minimum values of red, green, and blue. Whole books have been written about the subject, and there’s far too much information to include in this article. If you do not shoot raw, you should at least set your camera, printer, and screen to the same color space.
The most widely used color space is sRGB. Adobe RGB is a wider profile with more colors, and it used to be the standard for high-end digital printing. ProPhoto RGB is wider still, covering almost all the colors most printers can produce. But things have changed, and the printers I use now create color profiles for every combination of paper and print type. These profiles provide the best possible color accuracy.
For most photographers, it is sufficient to remember to use a color space no larger than the gamut of your device.
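A color space defines more than a gamut; it also defines a transfer ("gamma") curve. As a concrete example, here is the standard sRGB encoding curve that maps linear light to the 0-1 values stored in a JPEG:

```python
# The standard sRGB transfer curve: linear light in the range 0-1 is
# encoded with a small linear toe and a power segment, brightening
# shadow tones to roughly match human perception.
def srgb_encode(linear: float) -> float:
    if linear <= 0.0031308:
        return 12.92 * linear
    return 1.055 * linear ** (1 / 2.4) - 0.055

# 18% linear grey (a photographic mid-grey) encodes to roughly 0.46,
# not 0.18: the curve lifts the darker tones considerably.
print(f"linear 0.18 -> sRGB {srgb_encode(0.18):.3f}")
```

Two devices that disagree about this curve, or about the gamut, will show the same file differently, which is exactly the problem color management exists to solve.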
What’s the Difference Between Additive and Subtractive Color?
When we combine projected light, the primary colors are red, green, and blue. Combined in pairs, they create the secondary colors: cyan, magenta, and yellow. The light is additive, so mixing green and red produces yellow. By mixing these colors in different proportions, we can create a wide range of colors.
Printer inks, by contrast, remove or subtract color from white light: the inks reflect some wavelengths while absorbing others. By mixing magenta, cyan, and yellow inks in various proportions, we can create another range of colors. In color management, such a range of colors is called a gamut.
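The two systems mirror each other: each subtractive ink is, in the idealized model, the complement of an additive primary. A toy sketch of both (function names are my own):

```python
# Additive mixing sums light per channel; the idealized subtractive
# inks are the complements of the additive primaries:
# cyan = white - red, magenta = white - green, yellow = white - blue.
def add_light(*colors):
    # Sum each RGB channel across the light sources, clipping at 1.0.
    return tuple(min(1.0, sum(c[i] for c in colors)) for i in range(3))

RED, GREEN, BLUE = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)

print(add_light(RED, GREEN))        # red + green light make yellow
print(add_light(RED, GREEN, BLUE))  # all three together make white

def ink_for(rgb):
    # The idealized subtractive complement of an additive primary.
    return tuple(1.0 - c for c in rgb)

print(ink_for(BLUE))  # yellow ink: absorbs blue, reflects red and green
```

Real inks absorb imperfectly, which is why printers add black (the K in CMYK) and why printer gamuts differ from screen gamuts.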
By using a single color space, we can ensure that we only use colors where the two gamuts intersect. We would get strange results if we tried to display or print colors outside the capabilities of our screen or printer.
You Don’t Have Complete Control Over Color
In a similar vein, you’ll find controls on your screen to adjust, at a minimum, brightness and contrast, and so will everyone else. Calibrating your screen is an important step in ensuring that the colors and tones of your prints match what you see on screen. Of course, when you share an image online, most other people won’t have calibrated screens. Your images could appear darker or brighter, more or less saturated, and with different contrast, and you can’t do anything about that. Still, you should calibrate your screen if you want to print photos accurately or share them with photographers who have calibrated screens.
Read More
This article, of course, just scratches the surface, and there is plenty more to be learned under each topic I’ve covered. There’s an abundance of information in the 33,000+ Fstoppers articles, some of which go into greater detail on the topics I briefly touched upon here.