Abstract
Remote sensing images acquired in various spectral bands are used to estimate certain geophysical parameters or to detect the presence or extent of geophysical phenomena. In most cases, the raw image acquired by the sensor is processed using operations such as filtering, compression, and enhancement. In performing these operations, the analyst attempts to maximise the information content in the image so as to fulfill the end objective. The information content of a remote sensing image for a specific application depends strongly on the gray‐scale resolution of the image. One measure for quantifying information content is classification accuracy. In classification applications, the loss in information content caused by degraded gray‐scale resolution may not be significant: although the value of a pixel may change when the resolution is reduced, the pixel will, in most cases, still be correctly classified. Our research reveals that the loss in information is exponential with respect to the number of gray levels. The model is seen to be applicable to Landsat TM and SIR‐C images. Using our mathematical model for the information content of images as a function of gray‐scale resolution, one can specify an “optimal” gray‐scale resolution for an image.
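The experiment behind this abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual method: it requantizes a synthetic 8-bit image (a stand-in for a Landsat TM band, which is not included here) to progressively fewer gray levels and measures histogram entropy as a simple proxy for information content, since the paper's exact information measure and model coefficients are not given in the abstract.

```python
import numpy as np

def quantize(img, levels):
    """Requantize an 8-bit image to the given number of gray levels,
    mapping each pixel to the midpoint of its quantization bin."""
    step = 256 / levels
    return (np.floor(img / step) * step + step / 2).astype(np.uint8)

def entropy(img):
    """Shannon entropy (bits/pixel) of the gray-level histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

# Synthetic stand-in image; a real study would use actual TM/SIR-C bands.
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)

# Information (here: entropy) falls as gray-scale resolution degrades.
for levels in (256, 64, 16, 4):
    print(levels, round(entropy(quantize(img, levels)), 2))
```

In the paper's setting, classification accuracy on the requantized images would replace the entropy proxy, and an exponential curve would be fitted to accuracy versus the number of gray levels.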