Initial Publication Date: April 3, 2007
LuAnn's Responses to Your Questions from the Measuring Distance and Area in Satellite Images Workshop
- I am still unclear if the pixels are 500 m by 500 m (or 250,000 sq. m) or 500 sq. m. The SERC site says the latter, but LuAnn said the pixels were 500 m on a side (the former). Could you please clarify? There is a very big difference between the two.
Each pixel represents a square on Earth's surface that is 500 m on a side. The area represented by each pixel is 500 m x 500 m or 250,000 square meters.
I apologize for the confusion I caused: the EET chapter contained the error and I have corrected it. Thanks for bringing the issue to my attention!
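If it helps to see that arithmetic spelled out, here is a minimal Python sketch; the feature's pixel count below is a made-up example, not a workshop result.

```python
# Converting a pixel count to ground area for this image,
# where each pixel covers a 500 m x 500 m square.
PIXEL_SIZE_M = 500                       # length of one pixel side, in meters
PIXEL_AREA_SQ_M = PIXEL_SIZE_M ** 2      # 500 x 500 = 250,000 sq. m per pixel

feature_pixels = 1200                    # made-up example count
area_sq_m = feature_pixels * PIXEL_AREA_SQ_M
area_sq_km = area_sq_m / 1_000_000       # 1 sq. km = 1,000,000 sq. m

print(f"one pixel covers {PIXEL_AREA_SQ_M:,} sq. m")
print(f"{feature_pixels} pixels cover {area_sq_km:,.0f} sq. km")
```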
- Why didn't some of us have enough memory available to open the landsat_aral_triptych.jpg image into ImageJ?
It turns out the triptych image that was distributed on the CD was the BIG one that is available from Earth Observatory. Even with ImageJ set to use as much memory as possible when it launches, most computers would never be able to open that 8.1 MB image (the JPEG is compressed on disk, so ImageJ must expand it to a much larger uncompressed image in memory). Again, I apologize that you received the wrong image file.
If you go to the Earth Observatory story and download the image that appears right on that page (NOT the one you can access by following the link to the full-size image), you shouldn't have any problem opening it into ImageJ.
- I'm just wondering how the satellite coloring works, why some images show features in certain colors and other images use a different color scheme? What do the colors correspond to?
This is a fairly complex question, but I will try to answer it as simply as possible.
Grayscale images are the easiest to understand; each pixel value is essentially a brightness measurement. For instance, when we changed the landsat_aral_triptych image to grayscale (an 8-bit image*), the dark pixels (the ones that represented water) had very low pixel values because water is dark, and the light pixels (such as the ones that represent sand along the beach) had relatively high pixel values because they are bright.
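If you'd like to check pixel values outside of ImageJ, here is a minimal Python sketch. It assumes you have the Pillow imaging library installed and the smaller triptych image saved locally; the pixel coordinates are placeholders, so point them at water and sand in your own copy.

```python
# Reading grayscale pixel values with the Pillow library.
from PIL import Image

# Convert to 8-bit grayscale ("L" mode), as we did in the workshop.
img = Image.open("landsat_aral_triptych.jpg").convert("L")

# Placeholder coordinates -- point these at water and sand in your copy.
water_value = img.getpixel((100, 200))
sand_value = img.getpixel((300, 50))

print(f"water pixel value: {water_value}")   # low, because water is dark
print(f"sand pixel value: {sand_value}")     # high, because sand is bright
```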
In the "color" images, the color of each pixel represents the amount of light reflected from its area in three different wavelengths. For instance, if you measure the intensity of light reflected from an area at the wavelengths of red, green, and blue light, you could recreate that color on an RGB computer screen by showing appropriate amounts of each color of light for that pixel.
To see how this works, open a color image in ImageJ and place your cursor on any pixel. You can read the "recipe" for the pixel's color in the status bar. The three numbers reported after "value=" show how much red, green, and blue light (respectively) are used to make that color. The values for each color range from 0 (no light) to 255 (maximum light).
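Here is a similar sketch for a color image, again assuming Pillow; it mirrors the "value=" readout in ImageJ's status bar. The coordinates are again placeholders.

```python
# Reading a pixel's red-green-blue "recipe" with Pillow.
from PIL import Image

img = Image.open("landsat_aral_triptych.jpg").convert("RGB")

# getpixel() returns the (red, green, blue) triple for one pixel;
# each value runs from 0 (no light) to 255 (maximum light).
r, g, b = img.getpixel((150, 150))           # placeholder coordinates
print(f"value=({r}, {g}, {b})")
```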
That said, I'll now reveal that satellite images have some additional complexity over images from digital cameras... Satellite-based instruments usually measure light intensity in a combination of infrared and visible wavelengths, but a computer screen can only represent those measurements using visible light. Hence, satellite images are usually called false-color images. In case you want to explore this issue further, you can access an interactive tutorial from NASA that compares photographs and satellite images.
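To make the false-color idea concrete, here is a minimal sketch of how a composite is assembled, assuming NumPy and Pillow are installed. The three "bands" are synthetic arrays standing in for real instrument measurements, and the band-to-color mapping is just one common scheme, not the recipe used for any particular image in the workshop.

```python
# Building a false-color composite from three single-band images.
import numpy as np
from PIL import Image

height, width = 100, 100

# Synthetic stand-ins for band measurements, scaled 0-255.
near_infrared = np.full((height, width), 200, dtype=np.uint8)
red_band = np.full((height, width), 80, dtype=np.uint8)
green_band = np.full((height, width), 60, dtype=np.uint8)

# Show infrared as red, red as green, and green as blue -- one common
# scheme, in which strongly infrared-reflective vegetation appears red.
composite = np.dstack([near_infrared, red_band, green_band])
Image.fromarray(composite, mode="RGB").save("false_color_example.png")
```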
Please keep in mind that you can still use the measuring capabilities of ImageJ as you continue developing deeper knowledge about the images.
*You don't need to worry about this, but if you're trying to understand the name, 8 bits refers to 8 digits in the binary number system: in an 8-bit image, each pixel's value is some number from 0 to 255, represented by 8 zeros and ones.
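If you're curious, a couple of lines of Python will show you those 8 zeros and ones directly:

```python
# Each 8-bit pixel value is 8 binary digits (zeros and ones).
for value in (0, 128, 255):
    print(value, "->", format(value, "08b"))
# prints: 0 -> 00000000, 128 -> 10000000, 255 -> 11111111
```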
- I was not sure about the answers given in Part 3.
The results shown in the Teaching Notes section were used to calculate the sample answer that follows the Questions in Step 3 of Part 3.
- Can you set a scale based on a known distance that you can find on the image? How do you set the scale from images that do not tell you the scale?
Additional info on setting scales is provided on the Going Further page under Other Techniques.
Also, Peter Garrahan offered this advice that will allow you to set the scale on images you take yourself: When our lab took video images from microscopes or of fish tanks, we included calibration scales (an item of known length) in each image we took. To set the scale for each image, we measured the known length on the screen (in pixels), then entered the appropriate distance (in microns, meters, mm, etc.) in the Set Scale box. This way, we were able to accurately measure fish larvae, copepods, or squid statoliths from images taken at any magnification.
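The arithmetic behind that set-scale step is simple enough to sketch in Python; the numbers below are made-up examples, not measurements from Peter's lab.

```python
# The set-scale arithmetic: pixels of known length -> real units per pixel.
known_length_pixels = 240       # measured scale-bar length on screen, in pixels
known_length_microns = 1000.0   # the scale bar's actual length

scale = known_length_microns / known_length_pixels   # microns per pixel

# Any later measurement on the same image converts the same way.
larva_pixels = 365
larva_microns = larva_pixels * scale
print(f"scale: {scale:.3f} microns per pixel")
print(f"larva length: {larva_microns:.1f} microns")
```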