Tomaž Podobnikar
Research Fellow
Institute of Anthropological and Spatial Studies
Scientific Research Centre of the Slovenian Academy of Sciences and Arts
Novi trg 2
SI-1000 Ljubljana
Slovenia
Tel: +386 1 470 64 93
Fax: +386 1 425 77 95
Email: [email protected]
The digital elevation model (DEM) is one of the most important datasets for the greater part of spatially based studies and research. A high quality DEM can in principle be used as an all-purpose dataset, but unfortunately its production can be very expensive. If we know the nature of the application that uses the DEM, and if our demands on the final result are clear, then we can adjust the selection of the DEM or simplify its production.
The generation or selection of a digital elevation model (DEM) suitable for different spatial analyses or visualisation purposes is discussed here. First we should stress that a DEM is only a model, that is, an approximation of nature and its nominal ground. The models, in our case DEMs, may differ with respect to their purpose of use, the quality of the data sources or interpolation algorithms, the experience of the operator, etc. Our starting point is that the DEM should be carefully produced or chosen with regard to the purpose of our applications.
In general, coarser analyses require a lower accuracy DEM than detailed ones. We assume that a DEM for producing contour lines (at any scale) or for emphasising the main characteristics of the geomorphology should be coarser and smoother than a DEM for calculating slopes or aspects. Modelling of hydrographical networks requires a geomorphologically correct DEM. To get an overview of the geomorphology or a visualisation of the whole Alps, the DEM can be carefully generalised from more detailed data or modelled appropriately. For analyses of the natural landscape or ancient environment, recent anthropogenic changes (stone quarries, dykes) should be eliminated. For palaeo-landscape analyses, geological changes should be considered…
In any case, we assume that the final user should be aware of the characteristics, cost and usability for the particular application before deciding between using an existing DEM or undertaking their own DEM production. A third option also exists: adapting an existing DEM to one's own needs. For a correct decision it is also important to consider many other elements of the required DEM, such as quality parameters including positional and temporal accuracy, completeness, lineage, etc.
If we have decided to produce our own DEM or to adapt an existing one, then it is important to select suitable interpolation software and to take into account the quality of the data sources for DEM modelling. In the next chapters we will demonstrate simple tests of different algorithms using the same basic data sources. Further on we will enhance the DEM production with a fusion of data sources of different quality and type. These procedures are instructive for understanding the management of data sources, and of DEMs themselves, in order to obtain a final DEM suitable for the required needs.
A DTM (digital terrain model) is mathematically defined as a spatial distribution of heights described by a continuous and regionally (within segments) smooth surface. In practice, a good approximation of the DTM is the digital elevation model (DEM), recorded as a two-dimensional discrete matrix of heights, better known as a grid structure. In what follows we will concentrate on the DEM.
A DEM is usually produced from sampled data used as its source. Ideally the data sources would be used without interpolation. For example, contour lines by themselves may represent a model of the terrain. They can be acquired directly, for instance photogrammetrically from a stereo model, or indirectly, for example from analogue cartographic data, satellite images, by surveying, etc. Interpolation is also unnecessary if the data source is very precise and of high density, and especially if the data are acquired directly into a regular grid (DEM). Interpolation of data sources to produce a DEM is, however, necessary if the data sources by themselves do not predict the treated landscape phenomena.
Interpolation techniques are based on the principle of spatial autocorrelation, which assumes that objects close together are more similar than objects far apart. At the edges of the interpolated area, extrapolation is also reasonable. Unfortunately, none of the interpolation techniques is universal for all data sources, geomorphologic phenomena or purposes. We should be aware that in practice, different interpolation methods and parameters applied to the same data sources lead to different results. A well chosen algorithm applied to fair data sources should not deviate much from the nominal ground, which is an idealisation of our desired model and is commonly similar to the actual Earth’s surface. Divergences between the results of interpolation and the nominal ground are in particular consequences of the following circumstances:
- the available data sources do not approximate the terrain (their distribution, density, accuracy, etc. is not appropriate)
- the selected interpolation algorithm is labile (not robust enough) on the employed data sources
- the chosen interpolation algorithm or data structure is not suitable for the selected terrain geomorphology or application
- the perception or interpretation of the Earth’s surface (better: the nominal ground) is not the same when several DEM operators work on the same problem; the operator’s own imagination is a common and understandable problem in DEM production.
Application requirements play an important role in the expected characteristics of the DEM used. For example, we do not need a DEM of high geomorphologic quality for regional, small scale analyses or for calculating average altitudes. But geomorphologic accuracy is more critical for visibility analyses, and even more so for analyses that use algorithms based on derivatives, such as slope, aspect, cost surface, drainage, path simulation, etc.
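To make the sensitivity of derivative-based analyses concrete, the following is a minimal sketch, assuming a NumPy array dem with a known cell size; the function and variable names are illustrative only and not part of the method described in this paper. It computes slope and a simple aspect estimate with central differences, where any error in a cell feeds directly into the derivatives of its neighbours.

```python
import numpy as np

def slope_aspect(dem, cell_size):
    """Slope (degrees) and a simple aspect estimate (degrees) from a DEM grid
    using central differences; errors in neighbouring cells propagate straight
    into these derivatives, which is why derivative-based analyses are more
    sensitive to DEM quality than, say, average altitudes."""
    dz_dy, dz_dx = np.gradient(dem, cell_size)                 # gradients along rows and columns
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    aspect = np.degrees(np.arctan2(dz_dy, -dz_dx)) % 360.0     # aspect conventions vary between packages
    return slope, aspect

# tiny synthetic example: a plane rising 1 m per 10 m cell towards one side
dem = np.fromfunction(lambda r, c: 1.0 * c, (50, 50))
slope, aspect = slope_aspect(dem, cell_size=10.0)
print(round(float(slope.mean()), 1))                           # about 5.7 degrees
```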
In most cases, a very high quality DEM should satisfy all application demands. It is therefore preferable to find a good and robust interpolation algorithm, which is unfortunately a difficult task. Even if a more generalised surface is required, a highly detailed DEM can be simplified to the required quality. It should be noted that appropriate generalisation methods are very important for producing the required DEM, and these methods are commonly complex.
DEM modelling with common interpolation algorithms
We tested some of the most common interpolation algorithms, based on inverse distance weighting (IDW), kriging and splines, using the same data sources. The IDW methods apply the idea that influence decreases with increasing distance from a particular point. The method can be good for interpolating geomorphologically smooth areas. Kriging methods take into consideration the autocorrelation structure of the elevations in order to define optimal weights for different distances from a point, and then automatically evaluate the results. The method requires a skilled user with geostatistical knowledge. Spline-based methods fit a minimum-curvature surface through the input points. The interpolation ensures a continuous and differentiable (smooth) surface. Rapid changes in gradient or slope may occur in the vicinity of the data points.
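As a concrete illustration of the IDW idea – influence decreasing with distance – the sketch below interpolates heights at query points from scattered spot heights. The power parameter and the toy points are illustrative assumptions, not the settings used in the tests described next.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse distance weighted interpolation: each known point contributes
    with weight 1 / d**power, so nearby points dominate the estimate."""
    xy_known = np.asarray(xy_known, dtype=float)
    z_known = np.asarray(z_known, dtype=float)
    est = np.empty(len(xy_query))
    for k, q in enumerate(np.asarray(xy_query, dtype=float)):
        d = np.hypot(*(xy_known - q).T)
        if np.any(d < eps):                  # query coincides with a sample point
            est[k] = z_known[np.argmin(d)]
            continue
        w = 1.0 / d**power
        est[k] = np.sum(w * z_known) / np.sum(w)
    return est

# four spot heights around a query point in the middle of the square
pts = [(0, 0), (100, 0), (0, 100), (100, 100)]
z = [500.0, 510.0, 505.0, 520.0]
print(idw(pts, z, [(50, 50)]))               # equidistant from all points: the plain average 508.75
```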
We employed all three of the described algorithms using contour-line data sources on a study area that is geomorphologically variable (see Fig 1). All of the algorithms were used in a standardised way and with default parameters. First of all we decided to assess the results with a visual approach, which is suitable for a general overview of the consequences of the interpolation methods.
Fig 1: Contours with an interval of 10 m and Lake Bled in western Slovenia (a). The DEM is produced with IDW – smooth (b), kriging – more detail (c) and a spline-based method – smooth but with recognisable characteristic features (d) (area of 5000 by 5000 m).
For general purposes it is difficult to decide which algorithm produces the best DEM from contour lines. The IDW algorithm is optimal if we need results produced in a short time and if the real terrain is smooth. The kriging method is useful in this case, but some problems occur, mainly in areas with a low density of data sources. The spline-based algorithm produces a smooth surface, fortunately without many badly interpolated areas. If we had to decide on only one of the three basic methods for contour-line interpolation, then we could reason in the following way:
- to get an optimal result without much effort: use the spline algorithm
- to get the best general result for more advanced analyses and visualisation: use the kriging algorithm
- to get the fastest result: use IDW.
We can stress that there are no bad DEM interpolation algorithms. Some of them simply have more advantages in certain circumstances. The algorithms are actually the most flexible part of the whole modelling process, because usually nobody has the opportunity to use ideal data, and one can therefore only select the algorithm that is most suitable for the data sources and application at hand.
If the operator or user knows the purpose of the DEM’s application, then they can decide on the importance of particular quality parameters. Generally, the optimal DEM is the one that gives good results after evaluation of many geomorphologic and statistical quality parameters. Let us allow a combination of the three proposed basic algorithms. Then the best DEM from contour lines would be produced as a combination of kriging and spline: kriging would be applied in the areas around characteristic features such as peaks, sinks, valleys, ridges, edges, etc., while the spline algorithm would be preferred in the other areas, as sketched below.
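A minimal sketch of such a combination, assuming both surfaces have already been interpolated to the same grid and that a boolean mask of cells near characteristic features (peaks, sinks, ridges, edges) is available; deriving that mask is itself a separate, non-trivial step.

```python
import numpy as np

def combine_surfaces(dem_kriging, dem_spline, near_feature_mask):
    """Use the kriging surface near characteristic features and the spline
    surface elsewhere; both grids must share the same size and alignment."""
    return np.where(near_feature_mask, dem_kriging, dem_spline)
```

In practice the transition along the mask boundary would also need to be blended, for example with a distance-weighted mix, to avoid artificial steps between the two surfaces.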
We are now going to put more effort into enhanced DEM modelling and then visually compare the results with the previous interpolations. The main idea here is to produce as good a DEM as possible, one that would be useful for most applications. The main data source is still the same, contour lines, but the DEM is enhanced by a fusion of other data sources, such as local datasets, lower quality DEMs, geodetic network points and others. One could even use datasets without a height attribute, such as lines of the hydrological network, roads, railways, standing-water polygons, etc.
A method of weighted sum of data with geomorphologic enhancement was developed. With this method, the DEM is modelled through averaging and fusion of individual datasets, taking their quality into account. Grid-based datasets are thus overlaid according to the weights of particular grid cells. After overlaying, geomorphologic enhancement is applied. At the beginning, a single grid size is determined for all data sources – the same as for the final DEM.
Furthermore, each particular data source should be precisely evaluated against a reference dataset (points) within standard test areas delineated by standard regionalised layers. The result is a predicted quality for each grid cell, denoted by a random error σ. For the sake of simplicity we will continue the discussion with only two data sources. The height of the DEM (Hi+j), given weights wi and wj and variances σi² and σj², is then

$$H_{i+j} = \frac{w_i H_i + w_j H_j}{w_i + w_j}, \qquad w_i = \frac{1}{\sigma_i^2}, \quad w_j = \frac{1}{\sigma_j^2}.$$
Weighted sums of pairs of surfaces (i+j)k are calculated iteratively by adding independent datasets to the previous ones (Fig 2). The random error of the computed DEM (σi+j) decreases incrementally with every iteration. For two datasets it is obtained by error propagation (differentiation) of the height expression as

$$\sigma_{i+j} = \sqrt{\frac{1}{w_i + w_j}} = \frac{\sigma_i \,\sigma_j}{\sqrt{\sigma_i^2 + \sigma_j^2}}.$$
The best practical solution is to start DEM modelling with data sources of the lowest quality (lowest weights) and to finish with the best data.
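The sketch below is one possible implementation of this iterative weighted sum, assuming per-cell standard deviation grids from the evaluation step; the inverse-variance weights and the resulting error follow the expressions above, and the datasets are sorted so that the lowest quality one is processed first.

```python
import numpy as np

def fuse_pair(h_i, sigma_i, h_j, sigma_j):
    """Weighted sum of two height grids with inverse-variance weights;
    returns the fused heights and the (smaller) error of the result."""
    w_i, w_j = 1.0 / sigma_i**2, 1.0 / sigma_j**2
    h_fused = (w_i * h_i + w_j * h_j) / (w_i + w_j)
    sigma_fused = np.sqrt(1.0 / (w_i + w_j))
    return h_fused, sigma_fused

def fuse_all(grids_and_sigmas):
    """Iterative fusion, starting with the lowest-quality (largest sigma)
    dataset and finishing with the best one."""
    ordered = sorted(grids_and_sigmas, key=lambda gs: -np.mean(gs[1]))
    h, s = ordered[0]
    for h_next, s_next in ordered[1:]:
        h, s = fuse_pair(h, s, h_next, s_next)
    return h, s
```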
Fig 2: DEM interpolated as a weighted sum of data sources
A DEM derived iteratively with weighted sums of data is smoother than the geomorphologically highest quality data source (Fig 3). This is usually a consequence of the nature of the weighted sum. Geomorphologic enhancement of such a derived DEM is therefore required. The best solution seems to be to apply the enhancement only once the DEM has already been derived from all weighted data sources.
Fig 3: DEM geomorphologically enhanced from the weighted sum of data sources
The main step of geomorphologic enhancement is the generation of trend surfaces as low frequency functions – by generalising the DEMs. Trends are produced under the same conditions for the statistically best DEM derived by weighting (i+j) and for the DEM with appropriate geomorphology (j) (Fig 3). Relative elevations (Δj), as the high frequency part, are computed from j and then added to the trend surface i’+j’ of the dataset i+j. In this way the final geomorphologically enhanced DEM is produced. Statistically it is slightly worse than the weighted one (i+j), but geomorphologically it is much better. The main problem of the described enhancement lies in finding a suitable filter to calculate the appropriate trend surfaces. The optimal solution is a compromise between geomorphologic improvement and retaining statistical quality.
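A sketch of this enhancement step, under the assumption that the trend (low frequency) surfaces are obtained with a simple moving-average filter; choosing the actual filter is exactly the compromise discussed above, so the window size here is purely illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def geomorph_enhance(dem_weighted, dem_geomorph, window=15):
    """Add the high-frequency detail of the geomorphologically better DEM (j)
    onto the trend of the statistically better weighted DEM (i+j)."""
    trend_weighted = uniform_filter(dem_weighted, size=window)   # i'+j'
    trend_geomorph = uniform_filter(dem_geomorph, size=window)   # j'
    relative = dem_geomorph - trend_geomorph                     # high frequency part (Δj)
    return trend_weighted + relative                             # enhanced DEM
```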
The described method of DEM modelling was tested on the same study area as in the previous chapter. The modelled DEM visually appears to be of high geomorphologic quality, with clear and reasonable detail (Fig 4). As earlier tests had shown, the method also yields a DEM with high precision and accuracy. If we aim to produce a DEM of the highest geomorphologic and statistical quality, we do not need to think much about the intended application.
Fig 4: DEM generated with more advanced modelling using a weighted sum of data with geomorphologic enhancement (area of 5000 by 5000 m around Lake Bled).
Effective and suitable DEM modelling from variable datasets is a complex, rather iterative process that cannot be carried out intuitively or in a single step. In this sense, experience comes from the execution of stacks of tests and analyses, as well as from a better understanding of the nature (parameters) of the data. The quality of the DEM is evaluated for every data element, and the portion of every data source element used for DEM modelling is known. With the proposed approach we have full control of the production and can effectively inform the final user about the characteristics of the DEM.
Conclusions
Before starting to model our DEM we should ask ourselves: do we need to produce an all-purpose, high quality DEM, or do we need to produce a DEM just for our application? It is known that a high quality DEM can consume even more than 100 times as much time as production using basic algorithms such as IDW or spline. Furthermore, we need advanced software, higher quality hardware, an experienced team and advanced know-how. For our decision it is also important to have an overview of existing DEMs, which can either be used unchanged for our applications or serve as sources within a more advanced DEM production.
Specialisation in the information society is currently so high that we cannot master all the processes at a sufficiently high level, so it is necessary to trust particular specialists and to organise them into a reasonable team. It is nice to hear that the quantity of digital spatial data is increasing, but unfortunately the quality of the data does not follow the quantity. Many producers do not test the produced data sources or models carefully enough. The result of such work is data of unpredictable quality. Similarly, software always offers more than the hardware can manage.
After such experiences, we propose a set of instructions that should accompany interpolation algorithms and that suggest the most important steps of DEM production for the required application. The instructions should be prepared at a basic and at a higher level, and may include the following steps: preparation for DEM production, pre-processing of data sources, processing the DEM from the sources, and managing the DEM data. Furthermore, we suggest that the user manuals of DEM interpolation algorithms should offer more tips and tricks for common users. Most often they include only general information, such as a description of the algorithm and its parameters, the common purpose of use, and some simple examples. We suggest at least hints regarding the appropriate algorithm and parameter settings when data sources are differently distributed or of different types. For example, contour-line interpolation algorithms differ from algorithms for scattered points. A tip suggesting which parameters might be used for the production of a high quality DEM from particular datasets would also be important.
A significant aim, given the nature of DEMs, is to find a balance between the users’ demands and the capability of the developed production process. High quality DEM production using advanced methods can be very expensive. Users always demand higher quality than is offered, but this is not always reasonable. We should stress that even if we produce a more sophisticated DEM, and even if more experienced producers are employed for the job, we would still get different models. The solution proposed in this paper was confirmed through applied experimentation; it enables cost-effective, high quality production and assumes closer collaboration between producers and users. We can produce or choose a better DEM with greater knowledge of the terrain characteristics and if we are aware of the application for which the DEM is being used. Nevertheless, the model should look reasonable!