
Why can't NASA use color cameras?

Seeing the new pictures of Mars in black and white is pretty ridiculous. I see no reason why they couldn't use color cameras. This would also help them determine whether they really found ice or not. Please enlighten me on why NASA is still using black-and-white cameras.

Update:

Even compressed color images would be better than black and white (and probably just as small). Although this is probably the reason, it's a bad one.

4 Answers

  • spumn
    Lv 5
    1 decade ago
    Favorite Answer

    The camera on the Phoenix lander's robotic arm has three color channels: green, red, and near-infrared. When NASA wants to combine the data from all three channels into a single, false-color image, it does so, and it has done so on several occasions. Below is a link to such an image.

    Although as humans we have a strong urge to see Mars just as we would with our own eyes, that view is not terribly valuable scientifically. The three channels chosen do not exactly correspond to the red, green, and blue channels of human color vision, because NASA determined that more compelling data could be collected elsewhere in the EMR spectrum, for example in the near-infrared region. When you spend many millions of dollars on an instrument and send it hurtling to another planet, you think these things through to get the most bang for your buck, not the prettiest pictures.
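
    As a rough illustration of how such a composite can be built (a sketch only, not NASA's actual pipeline; the file names, channel mapping, and scaling below are hypothetical), three monochrome frames taken through different filters can simply be stacked into the display channels of an RGB image:

        import numpy as np
        from PIL import Image

        def load_channel(path):
            # Load one monochrome filter frame as floats in [0, 1].
            frame = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
            return frame / 255.0

        # Hypothetical file names for three separate exposures taken
        # through the green, red, and near-infrared filters.
        green = load_channel("phoenix_green.png")
        red   = load_channel("phoenix_red.png")
        nir   = load_channel("phoenix_nir.png")

        # False colour: near-IR shown as red, red as green, green as blue.
        # The mapping is a display choice, not a physical one.
        composite = np.dstack([nir, red, green])

        # Linear stretch so the composite spans the full display range.
        lo, hi = composite.min(), composite.max()
        composite = (composite - lo) / (hi - lo + 1e-12)

        Image.fromarray((composite * 255).astype(np.uint8)).save("false_color.png")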

  • 1 decade ago

    Any digital camera sensor actually records black-and-white images; colour comes from filters placed over the pixels. "One-shot" colour cameras have a patterned mask of filters, which means each colour channel is sampled at only a fraction of the CCD or CMOS cells (half of them for green, a quarter each for red and blue in a typical Bayer mosaic). A black-and-white image preserves the full resolution of the imaging chip.

    The cameras on Phoenix carry a number of filters, and black-and-white images taken through them can be combined to give "true-colour" images (some have been composed and released). However, there are also filters that pass other wavelengths, and combining those into "false-colour" images can be important for demonstrating various properties of the soils and rocks around the lander. The point is that the equipment carried is there to achieve a scientific purpose; making pretty pictures for the rest of us to put up on our walls or computer screens is only a secondary concern.
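
    A minimal sketch of the resolution point (hypothetical sensor size, assuming a standard RGGB Bayer mosaic): a one-shot colour sensor dedicates each pixel to a single colour, while a monochrome sensor behind a filter wheel records every pixel for every filter.

        import numpy as np

        # Hypothetical sensor: 8 x 8 pixels with an RGGB Bayer mosaic.
        H, W = 8, 8
        rows, cols = np.indices((H, W))

        red_sites   = (rows % 2 == 0) & (cols % 2 == 0)   # 1 pixel in 4
        green_sites = (rows % 2) != (cols % 2)            # 2 pixels in 4
        blue_sites  = (rows % 2 == 1) & (cols % 2 == 1)   # 1 pixel in 4

        print("Total pixels:                   ", H * W)              # 64
        print("Red samples (one-shot colour):  ", red_sites.sum())    # 16
        print("Green samples (one-shot colour):", green_sites.sum())  # 32
        print("Blue samples (one-shot colour): ", blue_sites.sum())   # 16

        # A monochrome sensor behind a filter wheel records every pixel
        # for every filter, so each channel keeps all H * W samples.
        print("Samples per filter (monochrome):", H * W)              # 64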

  • ?
    Lv 7
    1 decade ago

    The cameras on the lander, like all serious astronomical research cameras, synthesize colour images by combining images made through different coloured filters. This is necessary to maintain the scientific integrity of the images. Colour cameras are notoriously inaccurate for serious scientific purposes. Virtually _all_ astronomical images you see, whether by the Hubble, the Mars landers, or advanced amateur astronomers, are made from multiple monochrome images shot through carefully calibrated filters. That's the way scientific photography works.
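
    As a toy illustration of the calibration idea (the reference-patch approach, values, and frame sizes below are hypothetical, not the lander's actual procedure): each monochrome filter frame can be scaled so that a reference target of known reflectance reads the same in every channel before the frames are combined.

        import numpy as np

        def calibrate(frame, patch, known_reflectance):
            # Scale a filter frame so a reference patch of known
            # reflectance lands at its expected value.
            measured = frame[patch].mean()
            return frame * (known_reflectance / measured)

        # Hypothetical raw frames in arbitrary sensor units.
        rng = np.random.default_rng(0)
        frames = {f: rng.uniform(0.2, 0.8, size=(64, 64)) for f in ("red", "green", "blue")}

        # Hypothetical reference-patch location and its known reflectance.
        patch = (slice(0, 8), slice(0, 8))
        reference = 0.5

        calibrated = {f: calibrate(img, patch, reference) for f, img in frames.items()}

        # After calibration the patch reads ~0.5 in every channel, so the
        # channels can be combined on a common scale.
        print({f: round(calibrated[f][patch].mean(), 3) for f in calibrated})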

  • 1 decade ago

    Bandwidth is the simple answer. A colour image takes a lot more time to transmit, and they can reconstruct the colours anyway using the known characteristics of the CCD and a reference plaque in view (against which they calibrate).

    The processing is done on Earth and is called "false colour". It is a good approximation of the real situation without clogging the channels with information that is not relevant to the *science* of the mission. Remember that colour might make good tourist snapshots, but things like spectrometry and infrared and ultraviolet imaging are much more useful for understanding that environment.
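
    For a back-of-the-envelope sense of the bandwidth cost (all numbers below are illustrative assumptions, not actual Phoenix mission figures): an uncompressed three-channel frame simply takes three times as long to downlink as a single monochrome frame.

        # Rough downlink cost of one uncompressed frame.
        # All figures are illustrative, not real mission parameters.
        width, height = 1024, 1024      # pixels
        bits_per_pixel = 12             # assumed raw sensor bit depth
        downlink_bps = 128_000          # assumed relay data rate, bits per second

        mono_bits  = width * height * bits_per_pixel
        color_bits = mono_bits * 3      # three filter channels instead of one

        print(f"Monochrome frame:    {mono_bits / downlink_bps:.0f} s to transmit")
        print(f"Three-channel frame: {color_bits / downlink_bps:.0f} s to transmit")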
