
Why don't cameras of any sort have the same view as the human eye?

Whenever I look at myself in a mirror, the lighting, definition, and angle seem perfect. However, when I take a picture, it comes out with lighting that fails to show the same definition and skin tone, and it only looks realistic when the light is extremely bright.

7 Answers

  • 1 year ago

    Because when you view a scene by eye, your eye can automatically change exactly where you're looking, alter the perspective, and adjust to the lighting, and the mind makes its own selection of what to notice. All of this happens automatically, without your being conscious of it.

    A camera lens cannot do any of that, hence the frequent frustration with the result.


  • joedlh
    Lv 7
    1 year ago

    The camera freezes the image in a brief moment of time. Our brains, by contrast, are constantly processing input from the eyes, forming a complex representation of the scene that integrates commonalities over time and eliminates minor, temporary variations. A good example of this is depth of field. The eye's lens/cornea structure, like any other lens, has a depth of field. In a photo, out-of-focus parts are blurry. Yet the brain's representation lacks this blur, because the brain ignores out-of-focus regions: when you look at a person against a distant background, the background should be blurry, but each time you refocus on the background, the brain folds the sharp version into its representation. It's as if the brain focus-stacks images in real time.
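
    The focus-stacking idea can be made concrete in code. Below is a minimal sketch in Python with OpenCV and NumPy; it assumes a few pre-aligned frames focused at different distances, and the filenames are placeholders. For each pixel it keeps the frame where that pixel is sharpest, which is roughly what dedicated stacking software does (real tools also align the frames first).

        # Minimal focus-stacking sketch (assumes pre-aligned input frames).
        import cv2
        import numpy as np

        def focus_stack(paths):
            imgs = [cv2.imread(p) for p in paths]
            grays = [cv2.cvtColor(i, cv2.COLOR_BGR2GRAY) for i in imgs]
            # Local sharpness: magnitude of the Laplacian (second derivative).
            sharpness = [np.abs(cv2.Laplacian(g, cv2.CV_64F)) for g in grays]
            # For each pixel, pick the frame in which that pixel is sharpest.
            best = np.argmax(np.stack(sharpness), axis=0)
            stack = np.stack(imgs)               # shape: (n, height, width, 3)
            rows, cols = np.indices(best.shape)
            return stack[best, rows, cols]

        # Placeholder filenames - frames focused near, middle, and far.
        result = focus_stack(["near.jpg", "mid.jpg", "far.jpg"])
        cv2.imwrite("stacked.jpg", result)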

  • snafu
    Lv 7
    1 year ago

    The camera lens roughly equivalent to the human eye's field of view would be around 50-55mm on a full-frame camera, which is why 50mm lenses are called "normal" lenses. Obviously the human eye is a very complex piece of biological engineering, attached to a brain that works it all out.

    The quality of the lens, the exposure system, the dynamic range of the monitor, sensor or film, and the skill of the photographer all have a bearing on the final image. There are so many variables in the image-making process that calibrating all of them to correspond as closely as possible to the human eye would be a truly Herculean task.
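
    For a sense of where a number like 50-55mm comes from, here is a small sketch of the angle-of-view arithmetic (Python; 36mm is the standard full-frame sensor width, and the comparison to the eye is only approximate):

        import math

        def angle_of_view(focal_mm, sensor_width_mm=36.0):
            # Horizontal angle of view for a rectilinear lens.
            return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

        for f in (35, 50, 55):
            print(f"{f}mm lens: {angle_of_view(f):.1f} degrees horizontal")
        # A 50mm lens covers about 40 degrees, in the same ballpark as the
        # eye's region of sharp, attentive vision.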

  • Sumi
    Lv 7
    1 year ago

    The sensor in a camera doesn't have the same dynamic range as the human eye. It's quite stunning just how shallow a camera's dynamic range actually is by comparison. Furthermore, a sensor doesn't see color the way humans see it, either. Most sensors are made up of pixels behind color filters arranged in a Bayer pattern. This checkerboard pattern doesn't have an equal number of red, blue, and green pixels, so some colors are going to be more or less saturated than others.
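
    The Bayer layout is easy to see in a toy example (a sketch in Python/NumPy; real cameras demosaic this pattern back into full-color pixels):

        import numpy as np

        # One RGGB tile: two green filters for every red and blue.
        tile = np.array([["R", "G"],
                         ["G", "B"]])
        mosaic = np.tile(tile, (2, 2))     # a 4x4 patch of the sensor
        print(mosaic)
        values, counts = np.unique(mosaic, return_counts=True)
        print(dict(zip(values, counts)))   # {'B': 4, 'G': 8, 'R': 4}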

    I would argue that it's not necessarily the brightness of the light, but rather the quality of the light in combination with the exposure, the dynamic range of the sensor, and how the camera "cooks" the raw data into a JPEG. If your exposure is off, the resulting image will be too dark, or the highlights will be blown out. The dynamic range of the sensor will greatly affect how much detail is rendered.
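
    The blown-highlights point can be shown in a couple of lines (a sketch with made-up linear light values, where 1.0 is the most the sensor can record):

        import numpy as np

        scene = np.array([0.05, 0.20, 0.60, 1.50, 3.00])   # linear light
        recorded = np.clip(scene, 0.0, 1.0)                # clips at full well
        print(recorded)   # [0.05 0.2 0.6 1. 1.] - the two bright tones merge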

    You also have to consider how the camera processes the raw data from the sensor. Raw data is rather unattractive: its contrast, color saturation, and overall sharpness are all very low. Cameras take the raw data and add contrast, color correction (e.g. white balance and color modes such as vivid or neutral), and sharpening. Not all cameras or smartphones do this the same way.
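
    As a rough illustration of that "cooking" step, here is a heavily simplified sketch (Python/NumPy): white balance, a gamma curve, then sharpening. Real camera pipelines are far more elaborate, and the gains and gamma value here are made up.

        import numpy as np

        def develop(raw_rgb):              # raw_rgb: float RGB array in [0, 1]
            # Per-channel white-balance gains (made-up values).
            wb = np.clip(raw_rgb * np.array([2.0, 1.0, 1.5]), 0.0, 1.0)
            toned = wb ** (1 / 2.2)        # gamma curve lifts the flat raw data
            # Crude unsharp mask: exaggerate the difference from a blurred copy.
            blurred = (toned + np.roll(toned, 1, axis=0)
                       + np.roll(toned, -1, axis=0)) / 3
            sharpened = np.clip(toned + 0.5 * (toned - blurred), 0.0, 1.0)
            return (sharpened * 255).astype(np.uint8)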

    I would be willing to bet that you're not using a camera with an APS-C or full-frame sensor, but rather a point-and-shoot camera or a smartphone. These cameras, especially the $5 camera modules inside $1,000 phones, have very inexpensive, low-quality lenses which aren't capable of resolving the detail that you see in the mirror. Plus the coatings (if they even have coatings) will affect the overall color of the image. Nikon lenses, for example, produce more neutral or cooler images than Canon lenses.

    So it's not one thing causing the issues you're experiencing, but a variety of technical factors that have to do with the camera, the lens, and your choices in exposure. Unlike a sensor or film, the human eye can't accumulate light over time. A sensor can make a scene that appeared dark to the eye look fairly bright, or even very bright if overexposed.
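
    The light-accumulation point fits in a few lines (a sketch; the photon rate is a made-up number in full-well units per second):

        photon_rate = 0.02                             # made-up scene brightness
        for shutter in (1 / 60, 1, 30):
            signal = min(photon_rate * shutter, 1.0)   # clips at full well
            print(f"{shutter:8.3f}s exposure -> {signal:.4f} of full scale")
        # The same dim scene goes from near-black to quite bright as the
        # sensor is allowed to accumulate light for longer.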

    If you were to use a better camera (APS-C or full frame) with a better lens and get the exposure right, you'd end up with an image that comes much closer to what you saw than what you're currently getting. And, of course, shoot in RAW and properly process your images yourself. It won't ever match what you saw, but it can come close enough to be more than acceptable to you.

  • keerok
    Lv 7
    1 year ago

    Mirrors are easy to use. Just stand in front of them and look.

    Cameras, on the other hand, are complicated instruments which never capture a scene exactly as human eyes see it. They never did with film, and they never will with digital. They require a good set of skills to get the picture right, so when you get a bad picture, don't blame the camera. It's always the photographer's fault. The camera is only a tool.

  • ?
    Lv 7
    1 year ago

    What is missing in every camera you can buy is a human brain.

  • jehen
    Lv 7
    1 year ago

    Any number of reasons:

    Dirty lens.

    Quality of the camera.

    Color temperature of the light (artificial light can look 'normal' to your eye but actually has a color cast that your camera may not compensate for very well - see the sketch after this answer).

    And lastly, the quality of the mirror, if that is the image you are shooting. A mirror can add its own color variances and filter out some colors of light.

    Your vision compensates for all of this variation because you know what the scene is supposed to look like. The camera, not so much - but cameras are getting better and better at fixing our mistakes and compensating for less-than-ideal conditions that look fine to our eyes.
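
    The color-cast point above is what automatic white balance tries to solve. Here is a minimal sketch of the classic "gray world" approach (a toy built on one assumption - that the scene averages out to gray - not what any particular camera actually does):

        import numpy as np

        def gray_world(img):                 # img: float RGB array in [0, 1]
            means = img.reshape(-1, 3).mean(axis=0)
            gains = means.mean() / means     # push each channel toward gray
            return np.clip(img * gains, 0.0, 1.0)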
