(Note about the coloration of the images in the following article: When I added SRTM and ASTER file handling capability, I chose to map the elevation values directly to RGB space without any transformations. As a result, elevations near zero map to black. In addition, smooth transitions in elevation will not necessarily produce smooth transitions in RGB space, resulting in some bizarre DEM renderings. I will probably map the elevations to a grayscale ramp in future versions. This is more complicated than it should be, mainly because I am loading BLACKART with many more features and capabilities than I had originally planned for. As a result, each new addition requires a lot of programming to prevent unwanted side effects in the originally planned modules.)
(Editor's Note, February 15, 2004: I have applied a partial update to this article. Please consult the News section and the BLACKART Help file for current information about the program. I will update the article further as I add features.)
The last several articles have included BLACKART tutorials aimed at creating DEMs from "scratch", i.e. from contour line data. However, interpolation applications have other uses as well. One of these is repairing DEMs (as opposed to creating them), by which I mean filling in missing data areas that occur mainly as a result of data acquisition problems. Cloud cover and areas of low contrast such as snow and ice (in the case of visible-wavelength sensors), and shadowing and phase-unwrapping anomalies (in the case of microwave sensors), can leave the DEM that you are interested in with a decidedly moth-eaten appearance.
However, many of these problem DEMs can be improved, depending on the degree of data fidelity that you require. In many cases simple techniques such as filtering, clipping and interpolating can make problematic data more usable.
BLACKART version 3.45 and higher has the ability to read and write SRTM data and read ASTER data. As a result, all of the image processing and interpolation algorithms are available for repairing and improving SRTM and ASTER DEMs. I will illustrate some of the capabilities of the program by working a couple of examples.
Before I start the tutorial, a couple of words about the SRTM DEM data format. I have not had a whole lot of good things to say about the SRTM experience on these pages. However, NASA certainly got the data format issue exactly right. Working with SRTM data is a joy compared to most other data sets, for example SDTS, ASTER and shapefiles, because it is simple I16 fixed-format flat binary data. It is compact, easy to machine process, and unencumbered with unnecessary information that is a pain to parse through and discard. SRTM file readers and writers can be written with just a few lines of code. After the SDTS nightmare (I am still looking for the application that writes SDTS format data), one government agency got it right. Here are a few things that almost make me love NASA:
1) The tiles are all the same size, 1201 by 1201 elevation postings. No need for embedded record length data.
2) The tiles are all oriented north/south. No need for embedded rotation angle information.
3) All data is referenced to the WGS84 EGM96 geoid. No need to haul this information along in every data granule.
4) Elevation postings are 3 arc-seconds apart (approximately 90m at the equator) for all tiles, which gives the 1201 by 1201 grid over one degree. Ditto above.
5) The corner coordinates are the same as the file name. No need to embed this data and the file names are at the same time both non-ambiguous (no duplicate file names like SDTS) and useful. Bravo!
6) No FP32 or FP64 data types. At least one government agency figured out that it does not make sense for your file format to have more precision than your sensor, (especially when it means doubling file storage requirements and significantly increasing file processing time to decode the FP representation.)
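To back up the "few lines of code" claim, here is a minimal reader and writer sketch in Python with NumPy. The function names are mine, not from any official toolkit; the facts assumed are the ones listed above plus the tiles' big-endian byte order: raw signed 16-bit integers, 1201 by 1201 postings, no header.

```python
import numpy as np

SRTM3_SIZE = 1201   # postings per side of a one-degree tile
SRTM_NULL = -32768  # value marking missing data

def read_srtm_hgt(path):
    # .hgt tiles are raw big-endian signed 16-bit integers,
    # 1201 x 1201, row-major from the northwest corner.
    data = np.fromfile(path, dtype=">i2")
    if data.size != SRTM3_SIZE * SRTM3_SIZE:
        raise ValueError("not a 3 arc-second SRTM tile: %s" % path)
    return data.reshape(SRTM3_SIZE, SRTM3_SIZE)

def write_srtm_hgt(path, tile):
    # Write back in the same fixed format: no header, no metadata.
    tile.astype(">i2").tofile(path)
```

That really is the whole format; compare this with the parsing machinery that SDTS or HDF requires.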
The best testimony is in the results. SRTM DEM data has become arguably the most used data set of its type in the world, which is remarkable given that it has only been available for a few months. The flight simulation community has embraced this data set almost exclusively, with flight simulator scenery disks seemingly available almost before the data was released. The friendly file format is one reason for the rapid utilization of the data. If only NASA/NIMA could fully understand the importance of the SRTM data set and release the (unaveraged) 30m data. Terrain modeler fulfillment! But now back to our story.
Despeckling SRTM Water Areas
One common type of problem that occurs with SRTM radar imagery over water is speckling, or noise. Water surfaces often produce radar backscatter that results in noisy images. The surface of the DEM may thus be "speckled" with positive and negative deviations from the actual zero sea-level elevation. (In addition, coastlines may not be well defined.) An example of this problem is shown in the first image to the right. It is tile N00E073 from the Eurasia data set, an area in the Indian Ocean south of Sri Lanka. There are actually two problems. The first is that sea level is a little off, averaging about 3m. The second is the noise. Both problems are relatively easy to fix.
The oceans can be calmed by shifting the entire data set a few meters closer to the center of the earth (actually, the reference geoid). To do this, select Image|Array Operations|Offset Arrays. Enter -3 into the text field in the dialog box and hit 'Submit'. All the elevations will be decremented by this amount.
Next we can deal with the noisy sea-level elevation values. The BLACKART 'Clip Negative Elevations' feature has been modified recently to handle sea-level radar backscatter anomalies better. It is now called 'Fix Sea-Level Elevations'. Instead of just setting negative elevations to zero, it sets all elevations greater than lowerSeaLevelElevation and less than upperSeaLevelElevation to zero. This utility is designed to eliminate both the positive and negative deviations from sea level produced by the SRTM sensors. (The upper limit selection may also affect coastlines.) Select Image|Array Operations|Set Sea-Level Clipping Parameters to define the clipping limits. (Note: this feature generally leaves null data areas unaffected, so those areas should be interpolated in a separate operation after fixing the sea-level elevations.)
After setting the clipping limits, select Image|Array Operations|Fix Sea-Level Elevations. Any elevations between the clipping limits will be set to zero. Normally, this would not be a sufficient fix but since the negative elevations are all in the ocean for this example, this is what we want them to be so no further processing is necessary. The result of the processing is shown in the second figure at the right. (Note: this is only a chip from the larger image. This and all subsequent examples were created by processing the entire 1201 by 1201 SRTM file, although it is also possible to perform the same operations on subsetted images as well.)
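In array terms, the two fixes above amount to a constant offset followed by a windowed clip. Here is a sketch of the equivalent operations; the function names, parameter names and default limits are my own guesses, since BLACKART's internals are not published.

```python
import numpy as np

SRTM_NULL = -32768  # SRTM missing-data marker

def offset_elevations(dem, offset):
    # Shift every valid posting by a constant; leave null cells alone.
    out = dem.copy()
    valid = out != SRTM_NULL
    out[valid] += offset
    return out

def fix_sea_level(dem, lower=-10, upper=2):
    # Zero every elevation strictly between the clipping limits,
    # flattening both positive and negative speckle around sea level.
    out = dem.copy()
    speckle = (out > lower) & (out < upper) & (out != SRTM_NULL)
    out[speckle] = 0
    return out
```

For tile N00E073, the recipe in the text corresponds to `offset_elevations(tile, -3)` followed by `fix_sea_level(tile)` with limits chosen to bracket the speckle.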
It is also possible to deal with noisy data using low-pass filtering. My tests have demonstrated that this does not work as well as clipping for this type of problem. Several passes of the BLACKART low-pass filter algorithm improved the data but did not completely remove all the negative elevations.
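For comparison, one pass of a simple low-pass filter looks like this (a generic 3 x 3 mean filter of my own, not BLACKART's algorithm):

```python
import numpy as np

def low_pass_3x3(dem):
    # One pass of a 3 x 3 mean filter, a simple low-pass kernel.
    # Repeated passes smooth the speckle but only attenuate it toward
    # the local mean; they never force values to zero, which is why
    # clipping works better for sea-level noise.
    work = dem.astype(np.float64)
    out = np.zeros_like(work)
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            out += np.roll(np.roll(work, dr, 0), dc, 1)
    return out / 9.0
```

(np.roll wraps at the tile edges; for a single tile's interior that is harmless.)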
Removing Null Data Areas from SRTM Files
The next example shows another problem typical of this data set, and one that is more likely to be encountered by SRTM users. The image shows tile N00W077.hgt of an area on the border of Ecuador and Peru. This is a pretty good DEM, except that it has some small areas of missing data in the mountainous region to the northwest of the tile, perhaps due to lakes or snow (the orange pixels). The missing data is indicated by an elevation of -32,768. These areas can be fixed pretty easily as well.
Unlike the previous example, however, these null data regions occur in the mountains where the elevations are as high as 3000m. We don't want to settle for 0m, and we don't have to. We will interpolate the entire file to fill in the missing data regions. (There are also some null data areas in the non-mountainous areas, caused by what may be a tributary of the Rio Caqueta.)
It is now possible, and indeed necessary with the latest BLACKART versions, to set the definition of null data. For SRTM files, null data is -32,768, so a definition of all elevation values less than, say, -32,000 will work. ASTER files, however, may mark null data as -150. In order for BLACKART to find the null data without replacing valid data, set the null data definition by selecting Image|Array Operations|Set Null Data Definition and entering the desired value.
Select File|Input Data. I used 0 LSQR iterations and 1000 Laplacian iterations. We can get away with this because we are really not interpolating much: the only unknown elevations between known elevations are the null data areas, which are relatively small. So the interpolation should be fast and effective. (In fact, even as few as 200 iterations produced very acceptable results.)
The most recent version of BLACKART has been modified slightly to handle null data in a way that I hope makes more sense and will work better. Formerly, the program converted all the negative elevations to zero, interpolated the file, and replaced only the zero values with the interpolated values. The problem with this approach is that any legitimate zero elevations, for example oceans, were also interpolated. Although this would appear to cause little harm, it is a side effect that is neither needed nor wanted. I modified the program so that it copies the target file to a temporary file before zeroing the negative elevations (those less than nulDataElevation, as defined above). It then interpolates the temporary file. When the interpolation is done, it scans the target and replaces the null data elevations with the corresponding elevations from the interpolated file. This leaves even "naturally occurring" zero elevations unmodified.
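The copy-interpolate-replace sequence can be sketched as follows. This is my own minimal re-creation using a plain Laplacian relaxation (repeated 4-neighbour averaging), not BLACKART's actual code, and the parameter names are mine.

```python
import numpy as np

def fill_null_data(dem, null_definition=-32000, iterations=1000):
    # 1) Find the null cells and zero them in a working copy.
    # 2) Relax the copy: repeatedly replace each unknown cell with the
    #    average of its four neighbours (a Laplacian interpolation).
    # 3) Copy the relaxed values back into the null cells only, so
    #    legitimate zero elevations (oceans) are left untouched.
    nulls = dem < null_definition
    work = dem.astype(np.float64)
    work[nulls] = 0.0
    for _ in range(iterations):
        avg = 0.25 * (np.roll(work, 1, 0) + np.roll(work, -1, 0) +
                      np.roll(work, 1, 1) + np.roll(work, -1, 1))
        work[nulls] = avg[nulls]  # only unknown cells are updated
    out = dem.astype(np.float64)
    out[nulls] = work[nulls]
    return out
```

Because only the cells flagged as null are ever written, a real zero elevation in the input comes through unchanged, which is the whole point of the revised behavior.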
The result of the interpolation is shown to the right. This is a chip cropped from the interpolated DEM in the area where the missing data occurred. The interpolation has done a nice job of filling in the holes. However, the holes were small, so perhaps it is not surprising that the results were good. What will happen if a file with greater problems is encountered?
Correcting Larger Null Data Problems in ASTER Files
The next image shows a typical ASTER DEM. ASTER is notorious for its huge null data regions. (Sometimes there is more missing data than good data.) This is probably because the ASTER instrument is essentially a visible-spectrum sensor: it is easily fooled by low-contrast areas like snow fields, and it cannot see through clouds. All of the null data areas in this file (ASTER_DEM20020108123405.hdf) have elevation values of -150m. Unfortunately, ASTER images include a collar area because, although the borders of the image are approximately aligned with true north, the valid data region is aligned with the path of the Terra satellite. The collar is always null data too, of course. To remove the collar, the whole file needs to be rotated so that the valid data region can be cropped out. This is a somewhat tricky operation and is at the moment beyond the capabilities of BLACKART. For this exercise I had to be content with working on a subset of the larger image, cut from an area completely inside the valid data region. However, the last section of this article shows how to run a subset interpolation without cutting the subset from the DEM.
The next image shows that subset, taken around the area with the largest holes. The following picture shows the same subset with the negative elevations clipped to zero. The final image shows the repaired holes after 10,000 Laplacian iterations and 1000 LSQR iterations. This is probably overkill, but the subset was relatively small (625 by 614), so I could afford to be a little extravagant in the interest of producing a good image. Although the patched holes are obvious because of their smoothness relative to the valid data, they appear to be technically correct, with no drooping or other unwanted artifacts. The result appears especially suitable for overlay work. (I have seen one application that applies a texture to the interpolated surface to try for a better match. I may give this a shot with the next BLACKART release.)
BLACKART versions higher than 3.45 include a utility for manually adding elevation points to the unknown data regions. (Do not try to add missing data using the standard drawing tool. Since drawn pixels are written in little-endian byte order, adding an elevation like 56m will result in an elevation of 14,336m in the data file. You can do it if you figure out what number, when byte-reversed, will give the desired value, but I have written a tool that does this for you automatically.) This feature is enabled by selecting Graphics|Set Elevation and then Graphics|Draw Elevation|Enabled. My tests have indicated that this is not necessary or beneficial in most cases, but it may prove useful in stubborn cases of abrupt elevation changes.
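As a generic illustration of this byte-order trap (my own code, not BLACKART's), this shows what a 16-bit elevation becomes when it is written in one byte order and read back in the other:

```python
import struct

def byteswapped_int16(value):
    # Pack the value as big-endian (the SRTM file order), then read
    # the same two bytes back as little-endian -- the mistake a naive
    # drawing tool makes when writing directly into the tile.
    return struct.unpack("<h", struct.pack(">h", value))[0]
```

The swap is its own inverse, which is why precomputing the reversed value (as the tool above does) works: write the swapped number, and the file ends up containing the one you wanted.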
Correcting SRTM Files by Merging DTED0 Data
Another BLACKART feature for repairing DEMs is a utility that allows the user to merge DTED0 DEM data with SRTM data. There are several problems with this approach; however, it may give better results than interpolating the SRTM file alone in some cases.
The advantage of this approach is that DTED0 tiles match SRTM tiles exactly. That is, they share a common corner tie point and orientation, and they are referenced to the same geoid. Both cover one degree of latitude and longitude. In addition, coverage is extensive for both data sets. So if you have an SRTM file with a lot of missing data, you can search the NIMA database (see previous articles) for the corresponding DTED0 tile and use its data to repair the SRTM tile.
The obvious problem with this approach is the significant difference in resolution. DTED0 tiles are 121 by 121 elevation postings, while SRTM tiles are 1201 by 1201 postings. BLACKART extracts data from the DTED0 file as follows. First, it expands the DTED0 file by inserting nine zeros between each row and column, converting the 121 by 121 DTED0 file to a sparse 1201 by 1201 elevation postings intermediate file. It then interpolates this file, filling in the zeros with approximated elevations using a Laplacian interpolation. In the final step, it scans the SRTM file. When it detects a null data elevation it replaces the null data with the corresponding elevation taken from the interpolated intermediate file.
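The expansion step is simple in array terms: the 121 known postings land on every tenth row and column of the 1201 grid (121 + 120 × 9 = 1201), and the zeros in between are what the Laplacian interpolation then fills. A sketch of that placement (again my own code, not BLACKART's):

```python
import numpy as np

def expand_dted0(dted0):
    # Place the 121 x 121 DTED0 postings on every tenth posting of a
    # zero-filled 1201 x 1201 grid. The zero cells between them are
    # the ones a Laplacian interpolation subsequently fills in.
    if dted0.shape != (121, 121):
        raise ValueError("expected a 121 x 121 DTED0 tile")
    sparse = np.zeros((1201, 1201), dtype=np.float64)
    sparse[::10, ::10] = dted0
    return sparse
```

Note that the corner postings of the DTED0 tile land exactly on the corner postings of the SRTM tile, which is what makes the final null-for-interpolated substitution line up without any resampling.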
The procedure for doing this is as follows. First, load the SRTM image by selecting File|Open from the main menu. Then open the DTED0 file by selecting File|Open Merge DEM also from the main menu. Then, interpolate the DTED0 DEM by selecting Image|Array Operations|Interpolate DTED0 DEM, specifying the desired number of iterations when requested. When the interpolated DEM is computed, merge the DEMs by selecting Image|Array Operations|Assign Interpolated Elevations.
The next few images show the result of repairing SRTM file N00W077.hgt using this technique. The first image shows DTED0 data file n00.dt0, taken from the 77 degrees west longitude folder downloaded from the NIMA database. The next image shows this file interpolated to 1201 by 1201 postings. (The image is a little smaller because there is no way to save this image in BLACKART, so I had to screen capture it instead. The rendering shows as much of the image as I could fit on the screen at one time. However, the entire file was processed.) The final image shows N00W077 merged with this image, such that the missing data areas are replaced with the interpolated values from the DTED0 file as described above.
Another obstacle to this technique is the sometimes distressing lack of agreement between DTED0 and SRTM elevations. Often they differ by a significant amount. It may be possible to salvage a merge by offsetting the DTED0 DEM to more closely match the SRTM data. To do this, select Image|Array Operations|Offset DTED0 DEM. (It is also possible to go the other way if you feel that the DTED0 elevations are more accurate. In that case, select Image|Array Operations|Offset Arrays.)
Correcting Null Data Areas by Subset Interpolation
BLACKART now features subset interpolation for patching holes in SRTM and ASTER files. This means that rather than interpolating the entire file, it is possible to identify a subset of the target image to interpolate using a rubber band box. This has the obvious advantage of significantly decreasing the time it takes to compute the interpolation. Of course it is possible to apply subset interpolation several times in order to fix widely separated null data areas. This feature can be accessed by selecting Image|Array Operations|Interpolate Subset Array.
After selecting your subset area, you will be prompted to enter the number of interpolation iterations to run. (You can afford to be more extravagant when subsetting, since only a portion of the image is interpolated.) When you are finished entering this information, select Run|Run Blackart from the main menu. At the conclusion of the interpolation, null data areas within the rubber band box will be replaced with interpolated values.
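Under the same assumptions as the earlier sketches (a Laplacian relaxation, null data defined as values below a threshold), subset interpolation amounts to running the fill on a slice of the array and writing only that slice back. Again, the names and signature are mine:

```python
import numpy as np

def interpolate_subset(dem, top, bottom, left, right,
                       null_definition=-32000, iterations=500):
    # Restrict the Laplacian fill to the rubber-band box; every cell
    # outside [top:bottom, left:right] is left exactly as it was.
    box = dem[top:bottom, left:right].astype(np.float64)
    nulls = box < null_definition
    work = box.copy()
    work[nulls] = 0.0
    for _ in range(iterations):
        avg = 0.25 * (np.roll(work, 1, 0) + np.roll(work, -1, 0) +
                      np.roll(work, 1, 1) + np.roll(work, -1, 1))
        work[nulls] = avg[nulls]  # relax only the unknown cells
    box[nulls] = work[nulls]
    out = dem.astype(np.float64)
    out[top:bottom, left:right] = box
    return out
```

Because the relaxation only ever touches the box, the cost scales with the box size rather than the full 1201 by 1201 tile, which is the speed advantage described above; widely separated holes are handled by repeating the operation with different boxes.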