A camera's "vision" has more contrast than human vision does.
As we learned in the previous section, sensors "see" the world differently than we do.
We see this with our eyes:
While our photographs are like this:
What we see is NOT what we get (WWSINWWG).
The increase in contrast is more pronounced when you're using light coming from the side—sidelighting—or from behind your subject—backlighting.
That's because the shadows created by sidelighting and backlighting appear much darker to a sensor than they appear to your eyes.
This has been a problem since the beginning of photography.
Eadweard Muybridge, famous for his studies of movement, was also an accomplished landscape photographer.
When photographing Yosemite, he could not record both the sky and the landscape on the same wet plate.
Muybridge solved the problem using two different methods.
First, he combined a negative of clouds with a negative of a landscape when making a print.
Today, we use software to combine two files: one exposed for the clouds, the other exposed for the landscape.
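Under the hood, this kind of exposure blend is just a weighted average of the two frames, with the weights shifting from the sky exposure at the top of the frame to the landscape exposure at the bottom. Here is a minimal sketch of that idea, assuming two tiny synthetic grayscale "exposures" rather than real files; the array names and values are illustrative only:

```python
import numpy as np

# Hypothetical example: two 4x4 grayscale "exposures" of the same scene.
# sky_exposure is the darker frame that holds sky detail;
# land_exposure is the brighter frame that holds landscape detail.
h, w = 4, 4
sky_exposure = np.full((h, w), 0.3)
land_exposure = np.full((h, w), 0.8)

# A vertical gradient mask: 1.0 at the top (use the sky frame),
# falling to 0.0 at the bottom (use the landscape frame).
mask = np.linspace(1.0, 0.0, h).reshape(h, 1)

# Weighted blend, row by row. Top rows favor the sky exposure,
# bottom rows favor the landscape exposure.
blended = mask * sky_exposure + (1.0 - mask) * land_exposure
```

Real blending tools use hand-painted or luminosity-based masks instead of a simple linear gradient, but the arithmetic is the same.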
Muybridge also used a board flap inside his camera to block the brighter light from the sky during a portion of an exposure.
He called the board a sky shade.
This is similar to how we use a graduated neutral density filter today.
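A graduated neutral density filter works the same way as Muybridge's sky shade: it cuts the light reaching the top of the frame by a fixed number of stops, fading smoothly to clear partway down. As a rough sketch, assuming a uniform synthetic frame and a hypothetical 2-stop filter with its transition at the middle of the image:

```python
import numpy as np

# Hypothetical sketch of a 2-stop graduated neutral density filter.
# Each stop halves the light, so the top of the filter transmits 1/4.
h, w = 6, 4
image = np.full((h, w), 0.9)  # uniform bright frame, for illustration

stops = 2
top_transmission = 0.5 ** stops  # 0.25

# Transmission ramps from 0.25 at the top edge to 1.0 at mid-frame,
# then stays at 1.0 (clear glass) over the lower half.
ramp = np.linspace(top_transmission, 1.0, h // 2)
transmission = np.concatenate([ramp, np.ones(h - h // 2)]).reshape(h, 1)

filtered = image * transmission  # sky darkened, foreground untouched
```

The sensor then sees a scene whose sky-to-land contrast already fits within its range, which is exactly what the sky shade accomplished on a wet plate.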
The increase in contrast can be both detrimental and beneficial to your photography.