All eyes on the blue planet

Where exactly are the Earth observation satellites? What do they actually see? What is really meant by spatial and spectral resolution? We present a brief guide to remote sensing.

Water vapour captured by Météosat 8, a geostationary satellite that “sees” the whole of the Earth’s surface. 6 March 2004.
Image of the Betania dam and surroundings (Colombia) taken in the visible and infrared bands. The vegetation is shown in red and the water in black.

As you may imagine, there is more to remote sensing than space satellites alone. The term refers to all the many technologies that make it possible to obtain information on an object using instruments that are not in direct contact with it. The instruments fitted to aircraft so that they can observe our planet are also remote sensing devices. As for Earth observation satellites, the term covers a variety of devices with very different characteristics, located at varying distances from the subject of their scrutiny.

The first observation satellites were equipped with film cameras that, once in orbit, took a series of photos before returning to earth where the film was developed. These were later replaced by satellites equipped with TV cameras that were able to transmit pictures live from space. These sensors have since been further refined and have become more specialised. Today we have digital cameras that are able to cover the visible and invisible spectrum, such as the infrared, as well as scanners and orbiting radars.

The orbits

Depending on what they are observing, satellites are placed in different orbits. A geostationary meteorological satellite such as Météosat 5, for example, ‘floats’ above the Earth at a height of around 36 000 km. Its ground resolution is far from excellent and it is unable to pick out details – but it is not expected to. At this altitude it stays above the same point on the planet’s equator observing it continuously. This stability is very useful for monitoring the atmosphere and any changes to it.

Remote sensing satellites generally operate at lower altitudes, between 450 and 1000 km. At these altitudes an orbit is completed in around an hour and a half to an hour and three quarters. Spot 4 is one such satellite: it completes its path every 101.5 minutes at a height of 830 km.
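
The figures above can be checked with Kepler’s third law, which links a circular orbit’s altitude to its period. A minimal sketch in Python (the constants are standard textbook values, and a perfectly circular orbit is assumed):

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_378_137         # Earth's equatorial radius, m

def orbital_period_minutes(altitude_m: float) -> float:
    """Period of a circular orbit at the given altitude, in minutes."""
    a = R_EARTH + altitude_m                        # semi-major axis
    return 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60

print(round(orbital_period_minutes(830_000), 1))    # Spot 4's altitude: about 101.5
```

Running it for 830 km reproduces Spot 4’s period of roughly 101.5 minutes.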

The plane of the orbits chosen forms an angle with the plane of the equator. The satellites can move on a polar orbit (passing over the poles), a direct orbit (the plane is inclined at an angle of between 0° and 90° compared with the equator, and movement is from West to East) or a retrograde orbit, where the angle is between 90° and 180° (East to West movement).

The inclination of a satellite’s orbital plane also determines what part of the Earth’s surface can be observed. Placed in a 50° orbit, the satellite will travel between 50° latitude North and 50° latitude South. That means it will never fly over Oslo (NO) that lies at 60° N.
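
This rule of thumb is easy to encode. A toy sketch in Python (for retrograde orbits, beyond 90°, the highest latitude reached is 180° minus the inclination):

```python
def max_latitude(inclination_deg: float) -> float:
    """Highest latitude a satellite's ground track reaches, in degrees."""
    return min(inclination_deg, 180.0 - inclination_deg)

def can_overfly(latitude_deg: float, inclination_deg: float) -> bool:
    """True if the ground track ever passes over this latitude."""
    return abs(latitude_deg) <= max_latitude(inclination_deg)

print(can_overfly(60.0, 50.0))   # Oslo at 60° N, 50° orbit: False
```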

What is known as the heliosynchronous (Sun-synchronous) orbit is particularly interesting, as its orientation remains constant with respect to the Sun. Throughout the year, a satellite travelling on this orbit will see every point on the planet at the same local time of day. This permits a comparison of images obtained under the same light conditions. This is the type of orbit used by the various Spot satellites.
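
The inclination that makes an orbit heliosynchronous depends on its altitude: the Earth’s equatorial bulge (the J2 term) slowly rotates the orbital plane, and the inclination is chosen so that this drift completes exactly one turn per year. A sketch in Python using the standard J2 nodal-precession formula (textbook constants, circular orbit assumed):

```python
import math

MU = 3.986004418e14     # Earth's gravitational parameter, m^3/s^2
R_E = 6_378_137         # Earth's equatorial radius, m
J2 = 1.08263e-3         # Earth's oblateness coefficient

def sun_sync_inclination_deg(altitude_m: float) -> float:
    """Inclination giving a Sun-synchronous circular orbit at this altitude."""
    a = R_E + altitude_m
    n = math.sqrt(MU / a**3)                         # mean motion, rad/s
    # Required nodal drift: one full turn per year (rad/s)
    omega_dot = 2 * math.pi / (365.2422 * 86400)
    cos_i = -omega_dot / (1.5 * n * J2 * (R_E / a) ** 2)
    return math.degrees(math.acos(cos_i))

print(round(sun_sync_inclination_deg(830_000), 1))   # Spot's altitude: about 98.7°
```

The negative cosine means such orbits are slightly retrograde, which is why Sun-synchronous satellites are inclined at a little over 90°.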

Spectral and spatial resolution

Satellite sensors record radiation (light of various wavelengths, both visible and invisible) reflected or emitted by the ground and the various objects of which it consists. Spectral resolution is a measure of the ability of sensors to distinguish electromagnetic radiation of different wavelengths. The more sensitive the sensor is to fine spectral differences (narrow wavelength intervals), the higher the sensor’s spectral resolution.

Two types of imagery result. First there is panchromatic imagery, obtained from a single broad band covering all visible wavelengths. In a way, it is the intensity of the light that provides the black-and-white picture of what is observed. It is this type of image that provides most details. The spatial resolution of the objects observed is high but the spectral resolution is low. In remote sensing, spatial resolution refers to the size of an area observed that is covered by a single pixel. Each image pixel thus corresponds to a part of the Earth’s surface. Today’s most precise satellites have a spatial (panchromatic) resolution of around 60 centimetres.
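
The arithmetic behind these figures is straightforward: dividing the ground distance covered by the resolution gives the number of pixels. A back-of-the-envelope sketch in Python (the 60 km scene size is an illustrative assumption, not a property of any particular satellite):

```python
def scene_pixels(scene_km: float, resolution_m: float) -> int:
    """Pixels along one side of a square scene at a given ground resolution."""
    return round(scene_km * 1000 / resolution_m)

print(scene_pixels(60, 10))    # 10 m resolution:  6000 pixels per side
print(scene_pixels(60, 0.6))   # 60 cm resolution: 100000 pixels per side
```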

Multispectral images, on the other hand, are colour pictures. These are produced by a number of sensors, each of which is sensitive to a part of the electromagnetic spectrum (red, green, blue for the visible, but also infrared). It is by combining the information from these different spectral bands that a coloured image is obtained.

Are plants really red?

Most satellite images used for scientific purposes show vegetation cover in red, no doubt an odd choice of colour to represent forests, meadows and gardens. So what is the reason?

Most satellites that supply multispectral images are equipped with sensors that are sensitive to various bands of the electromagnetic spectrum. Thus, in the visible radiation field (ranging from blue to red and including green), satellites have sensors sensitive to three spectral bands: blue, green and red. From these three bands it is then possible to reconstitute all the ‘true’ colours of an image by varying their chromatic intensity.

Blue is not very useful for the purposes of remote sensing, however, as this spectral band is very sensitive to changes in the atmosphere. It is better to remove the ‘blue’ sensors and to add others sensitive to the near infrared (which follows red in the electromagnetic spectrum), a ‘colour’ that is invisible to the naked eye. The near infrared is also a particularly interesting band because vegetation reflects it strongly.

On remote sensing satellites, the allocation of colours to the different sensors does not therefore correspond to the band colour. For sensors sensitive to green, scientists use the colour blue to reconstitute their images. Those sensitive to red are rendered green and those that capture infrared are translated into red. These are therefore responsible for producing vegetation in a colour that we may find unnatural. There are also other aberrations, such as red roof tiles that become yellow and the sea that is shown as black.
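
The band-to-colour shuffle described above can be sketched in a few lines of NumPy; the tiny 2 × 2 ‘scene’ and its reflectance values are invented for the example:

```python
import numpy as np

# Synthetic reflectance values for a 2x2 scene (made up for illustration)
green = np.array([[0.10, 0.08], [0.05, 0.30]])
red   = np.array([[0.08, 0.06], [0.04, 0.25]])
nir   = np.array([[0.60, 0.55], [0.02, 0.30]])  # vegetation reflects NIR strongly

# False-colour composite: NIR shown as red, red as green, green as blue
false_colour = np.dstack([nir, red, green])     # shape (rows, cols, 3)

# Pixel (0, 0) is vegetation: its high NIR value lands in the red channel,
# so it appears red. Pixel (1, 0) is water: dark in every band, hence black.
print(false_colour[0, 0])
print(false_colour[1, 0])
```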

Hyperspectral imagery

Hyperspectral images are obtained by sensors able to record information in a multitude of much narrower spectral bands (often more than 200) in the visible, near-infrared and medium-infrared parts of the electromagnetic spectrum.

All objects reflect, absorb or emit electromagnetic rays that correspond to their composition and structure. Hyperspectral data therefore provide more detailed information on the spectral properties (spectral signature) of a scene and permit a more precise identification and discrimination of objects than broad-band multispectral sensors.
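
One common way of exploiting such spectral signatures is to measure the angle between a pixel’s spectrum and a reference spectrum, treating both as vectors (the ‘spectral angle’ similarity measure). A sketch in Python with invented band values:

```python
import math

def spectral_angle(a, b):
    """Angle in radians between two spectra treated as vectors (0 = same shape)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return math.acos(max(-1.0, min(1.0, dot / (norm_a * norm_b))))

# Five-band toy signatures (values invented for illustration)
vegetation_ref = [0.05, 0.08, 0.06, 0.50, 0.45]   # small green peak, strong NIR
water_ref      = [0.08, 0.06, 0.04, 0.01, 0.01]   # dark, especially in NIR
pixel          = [0.06, 0.09, 0.07, 0.48, 0.44]

# The pixel's spectrum is closer in shape to vegetation than to water
print(spectral_angle(pixel, vegetation_ref) < spectral_angle(pixel, water_ref))
```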

Hyperspectral imagery has many applications. The most important are perhaps geology (identification of mineral deposits), precision farming, forestry (health and identification of species) and the management of aquatic environments (water quality, composition of phytoplankton, etc.).