SAR (Synthetic Aperture Radar) Processing Overview


An antenna is the equivalent of an optical lens, with gain (magnification) achieved by forming an aperture capable of a coherent (phased) summation of all appropriate points in the image field. Synthetic Aperture Radar (SAR) "synthesizes" an antenna -- a very long antenna -- by taking radar samples looking sideways along a flight path, taking advantage of the fact that the target area of interest is essentially stationary during the fly-by time.

Spatial diversity is achieved by means of the different look angles from the flight path to the target, causing non-coherent echoes (noise or interference) to be defocused during the coherent summation process.

The intent of the information presented here is to provide a practical (not a theoretical) overview of how raw SAR data is generally processed, in particular with a view towards implementing the processing in real time. In its simplest form, the formation of a SAR image can be described as follows:

where, referring to the first figure shown above, the hyperbola represents range (the so-called "cross-track" dimension) as a function of azimuth (the so-called "along-track" dimension). One of the chief complications encountered in processing SAR images is what is referred to as "spatial variance" -- that is, the shape of the hyperbola changes as a function of range: for a scatterer at broadside (closest-approach) range R0, the slant range at along-track offset x is sqrt( R0**2 + x**2 ), and the curvature of that hyperbola depends on R0.

If it were not for this complication, the processing of raw SAR data to produce an image would be greatly simplified.

Because the transformation of raw radar data to a physical image is basically a problem in geometry, the approach given here is from that point of view. In particular, three different approaches to the DATA PROCESSING block used in SAR image formation will be considered:

Regardless of the approach used, the basic geometry is the same. For purposes of illustration, a configuration is shown below, scaled to units that can be easily extrapolated. The basic parameters are:

Items to note are:

Assuming the transmitter and receiver are co-located on the aircraft, the basic operations involved in forming the image are:

     - finding the distance from the transmitter to any point in the image plane
     - computing the number of wavelengths from the transmitter to the point
     - rotating the point back to the transmitter using the fractional wavelength (residue)
     - vectorially adding the rotated point to form the sum
     - interpolating onto the output grid, if a more sharply focused image is desired

Each of these is treated briefly below.

FINDING THE DISTANCE FROM THE TRANSMITTER TO ANY POINT IN THE IMAGE PLANE

The distance in question is simply the straight-line (slant-range) distance from the antenna to the image point. If a 32-bit floating-point format is used (24-bit mantissa), this computation may require the use of double-precision arithmetic.
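A minimal sketch in C follows; the coordinate names, and the assumption that the image point lies in the z = 0 plane, are illustrative only and are not taken from the configuration above:

    #include <math.h>

    /* Slant range from the antenna phase center at (xa, ya, za) to an      */
    /* image point at (xp, yp, 0); all coordinates in meters.               */
    /* Double precision is used throughout: with a 24-bit mantissa, a range */
    /* of tens of kilometers is resolved only to roughly millimeters, which */
    /* is already a noticeable fraction of a centimeter-scale wavelength.   */
    double slant_range(double xa, double ya, double za, double xp, double yp)
    {
        double dx = xp - xa;
        double dy = yp - ya;
        double dz = -za;

        return sqrt(dx * dx + dy * dy + dz * dz);
    }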

COMPUTING THE NUMBER OF WAVELENGTHS FROM THE TRANSMITTER TO THE POINT

ROTATING THE POINT BACK TO THE TRANSMITTER USING THE FRACTIONAL WAVELENGTH (RESIDUE)

This operation is basically the following:

           angle = 2. * pi * ( length / wavelength )

which, in order to make the residue more visible, can also be computed as

           angle = 2. * pi * ( length / wavelength - int(length/wavelength) ) 

Again, if a 32-bit floating-point format is used (24-bit mantissa) and a large number of wavelengths is involved, double-precision arithmetic may be required.
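A sketch of both forms in C follows (the names are illustrative). The second form makes the fractional number of wavelengths -- the residue -- explicit; note that the division itself still needs enough precision to retain that fraction, hence the caution above:

    #include <math.h>

    static const double TWO_PI = 6.283185307179586;

    /* Direct form: the full number of wavelengths times 2*pi.              */
    double phase_direct(double length, double wavelength)
    {
        return TWO_PI * (length / wavelength);
    }

    /* Residue form: discard the whole wavelengths first, keeping only the  */
    /* fractional part that actually determines the rotation.  floor() is   */
    /* used for the integer part; for positive lengths it matches int().    */
    double phase_residue(double length, double wavelength)
    {
        double cycles = length / wavelength;
        return TWO_PI * (cycles - floor(cycles));
    }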

VECTORIALLY ADDING THE ROTATED POINT TO FORM THE SUM

This is the transformation-of-coordinates operation:

         sum = sum  +  ( i + j q ) * ( cos(angle) + j sin(angle) )

with the real and imaginary parts kept separate:

         sumreal = sumreal + i * cos(angle) - q * sin(angle)
         sumimag = sumimag + i * sin(angle) + q * cos(angle)

from which the final power will eventually be computed as

         power = sumreal**2 + sumimag**2

Note that the direction of rotation can be clockwise or counter-clockwise, depending on the number of mixers in the receiver and whether the mixing operation is an up-conversion (positive frequency) or a down-conversion (negative frequency). An odd number of down-conversions will result in a (cos - j sin) rotation, while an even number of down-conversions will result in a (cos + j sin) rotation. If this detail is not observed, the image will be wildly out of focus, since the vectors will not have been added coherently. (If Fourier transforms are used to perform a cross-correlation operation by multiplying transforms, this sign can also be complicated by the fact that the cross-correlation operation is A x A*, that is, one of the images being correlated must be complex-conjugated.)
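As a sketch of the accumulation with the rotation sense made explicit (the parameterization is an illustration, not a prescription; the names are assumptions):

    #include <math.h>

    /* Accumulate one complex sample (i, q), rotated by 'angle' radians,    */
    /* into the running sum for a pixel.  'sense' is +1.0 or -1.0 according */
    /* to the net up/down-conversion in the receiver chain; the wrong sign  */
    /* destroys the coherent addition and defocuses the image.              */
    void accumulate(double i, double q, double angle, double sense,
                    double *sumreal, double *sumimag)
    {
        double c = cos(angle);
        double s = sense * sin(angle);

        /* (i + j q) * (cos + j sin), real and imaginary parts separately.  */
        *sumreal += i * c - q * s;
        *sumimag += i * s + q * c;
    }

    /* After all samples along the synthetic aperture have been summed,     */
    /* the pixel power is the squared magnitude of the sum.                 */
    double pixel_power(double sumreal, double sumimag)
    {
        return sumreal * sumreal + sumimag * sumimag;
    }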

INTERPOLATION

The operations described above can be used to form an image at various angles to the ground. The ground itself is usually the desired plane, unless the scene being viewed is skewed. The radar-return samples, relative to the ground, will not, as a rule, coincide exactly with the points in the map being constructed. Most real-time applications simply ignore this and use the closest sample without correction. This effect is illustrated below:
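The sketch below shows the uncorrected nearest-sample lookup alongside a simple linear interpolation between the two adjacent complex range samples; the array layout and names are assumptions, and linear interpolation is used only to show the mechanics (the page cited in the next paragraph covers more accurate techniques):

    /* One pulse's complex range line: data_i[] and data_q[] hold samples   */
    /* spaced bin_spacing apart in range, starting at range_start.          */
    /* Bounds checking is omitted for brevity.                              */

    /* Nearest-sample lookup, as used by most real-time applications.       */
    void sample_nearest(const float *data_i, const float *data_q,
                        double range, double range_start, double bin_spacing,
                        float *out_i, float *out_q)
    {
        int k = (int)((range - range_start) / bin_spacing + 0.5);

        *out_i = data_i[k];
        *out_q = data_q[k];
    }

    /* Linear interpolation between the two adjacent samples: a sharper     */
    /* result at the cost of extra arithmetic per output point.             */
    void sample_linear(const float *data_i, const float *data_q,
                       double range, double range_start, double bin_spacing,
                       float *out_i, float *out_q)
    {
        double pos  = (range - range_start) / bin_spacing;
        int    k    = (int)pos;
        double frac = pos - (double)k;

        *out_i = (float)((1.0 - frac) * data_i[k] + frac * data_i[k + 1]);
        *out_q = (float)((1.0 - frac) * data_q[k] + frac * data_q[k + 1]);
    }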

If a more sharply focused image is desired, then some form of interpolation can be used. A discussion of single-point and array-based interpolation techniques can be found in Interpolation in the Time or Frequency Domains. The following considerations are apropos when applying these techniques to SAR image formation:
