Eyeing the change
Dr A.R. Dasgupta
Bhanu Rekha
High resolution remote sensing from air and space has seen an explosion of new sensors and associated software for processing the data. The end user is spoilt for choice and perhaps a bit confused as well. What can we expect in the future? What are the promises and pitfalls? Will satellites take over from aerial surveys?
GIS Development spoke to a few experts from industry, government and academia to feel the pulse and provide a snapshot of the status of the technology and the associated techno-managerial issues.
AERIAL IMAGING SENSORS
The vital element in all photogrammetric surveys – the camera – underwent a digital makeover a few years ago owing to technological advancements, and digital cameras are replacing their analogue counterparts so fast that in some countries analogue aerial images are no longer accepted for topographic mapping. With better radiometric resolution and ease of use, digital cameras eliminate the complications of film handling, retrieval and processing associated with analogue cameras, cutting costs significantly. Pointing out this trend, Prof Karsten Jacobsen, Institute of Photogrammetry and Geoinformation, University of Hannover, says, “There are two trends – one goes to an extension of the number of pixels per image, the other goes to the use of medium format digital cameras, which may be more economic for small projects.
Both the trends are caused by the economic requirements.” He predicts that the separation between the two groups will soon disappear because of multi-head medium format cameras using two or four sub-cameras, such as those produced and announced by DIMAC, RolleiMetric, IGI and Applanix. In fact, Intergraph has recently announced a medium format camera with metric capabilities, aimed at small survey companies and at smaller jobs in big companies.
“However,” Prof Gottfried Konecny, Institute of Photogrammetry and Geoinformation, University of Hannover, points out, “it must be remembered that digital frame cameras use multiple objectives, and therefore have a calibration problem.” A digital camera based on the pushbroom principle, he opines, instead offers better possibilities for imaging at a wider angle.
Also, these new systems require the development of new algorithms to exploit their advantages. In particular, large format cameras almost always use multiple linear or frame CCDs and thus have a much more complicated interior geometry, which calls for new, comprehensive sensor modelling and calibration software. Looking into the future, Dr PK Srivastava, Deputy Director, Signal and Image Processing Area, Space Applications Centre, Ahmedabad and President, INCA, says, “Imaging sensors are becoming complex. Technologies like time-delay integration (TDI) circuits, combination of imaging devices, as well as registration of the multi-spectral and the high resolution panchromatic data may become part of the onboard hardware itself. Onboard satellite systems may approach a resolution of 20-30 cm in the civilian domain.” Another trend that will dominate the industry is the clustering of sensors. New electronic fabrication processes, software implementations and new application fields will dictate the future growth of image sensor technology.
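As a rough, self-contained illustration of the time-delay integration principle Dr Srivastava refers to, the Python sketch below (all numbers hypothetical) shows why summing N synchronised detector stages raises the signal-to-noise ratio by roughly the square root of N:

```python
import numpy as np

# Minimal TDI sketch: the same ground pixel is exposed by N detector
# stages in succession and the charges are summed on-chip. Signal grows
# as N while uncorrelated noise grows as sqrt(N), so SNR gains ~sqrt(N).

rng = np.random.default_rng(42)
signal = 100.0       # hypothetical photoelectrons collected per stage
noise_sigma = 20.0   # hypothetical per-stage noise (electrons, 1-sigma)
n_stages = 64        # hypothetical number of TDI stages

# One exposure per stage, each with independent noise, then summed.
stage_reads = signal + rng.normal(0.0, noise_sigma, size=n_stages)
tdi_value = stage_reads.sum()

snr_single = signal / noise_sigma
snr_tdi = (n_stages * signal) / (noise_sigma * np.sqrt(n_stages))
print(f"single-stage SNR: {snr_single:.1f}")
print(f"{n_stages}-stage TDI SNR: {snr_tdi:.1f} (~{np.sqrt(n_stages):.0f}x gain)")
```

This is why TDI lets a fast-moving platform collect enough light for fine ground resolution without smearing the image.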
DIGITAL ELEVATION MODELS
From being a trend, automatic DEM generation has become a standard procedure today. People are trying to generate DEMs with about 30 m posting in automatic mode. It is not 100% automation, but semi-automation is definitely in place. Wherever the behaviour of ground features is regular, there is no serious problem in generating a DEM automatically. But wherever the ground breaks, the elevation model also breaks, and that is where automation becomes difficult.
“Even in such cases, context-based and slope-based algorithms are being tried out, leading to more and more automatic DEM generation. At resolutions of about 2.5 m, with 10 m posting of data, up to 90% automation is possible today,” Dr Srivastava says.
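At the core of such automatic generation sits image matching. The Python sketch below (function names, window size and search range are illustrative, not from any particular package) finds, for a single pixel, the horizontal disparity between two epipolarly aligned images by maximising normalised cross-correlation; production systems add image pyramids, regularisation and break-line handling on top of this, and convert disparity to elevation through the sensor geometry:

```python
import numpy as np

def ncc(a, b):
    """Normalised cross-correlation of two equally sized image patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def match_disparity(left, right, row, col, half=7, max_disp=32):
    """Disparity at (row, col) that maximises NCC along the epipolar line."""
    patch = left[row - half:row + half + 1, col - half:col + half + 1]
    best_d, best_score = 0, -1.0
    for d in range(max_disp):
        c = col - d
        if c - half < 0:
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        score = ncc(patch, cand)
        if score > best_score:
            best_d, best_score = d, score
    return best_d, best_score
```

A low best score is exactly the situation described above: at ground breaks and occlusions the correlation peak collapses, and the operator has to step in.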
It also depends on the accuracy needs and the funds available. Says Prof Konecny, “While airborne laser scanning is the most accurate tool in the decimetre range, it may well cost over $100 per square km. Airborne radar (NEXTMap Britain or Europe) may cost less than half of that, at an accuracy in the metre range. DEMs from medium and large scale aerial photography are in the 0.2 to 0.5 m accuracy range.” Nowadays, they are produced by automatic image matching techniques in digital photogrammetric workstations. The matching technology is under constant improvement (for the generation of true orthophotos).
HIGH AND LOW OF IMAGERY RESOLUTION
The technological advancements in imaging sensors have led to a glut of high resolution imagery, presenting an entirely different set of issues to tackle. “High and low resolution images are used for totally different applications.
They are complementary. While high resolution imaging is required for topographic maps (in the form of GIS), low resolutions have advantages for classification,” substantiates Prof Jacobsen. One interesting question often posed in this context is: why not have a geo-synchronous imaging sensor at 50 m resolution, taking imagery whenever needed, as a substitute for 1 m resolution imagery from low earth orbit? Space agencies are seriously considering this proposition but are bogged down by complexities at this point of time. “The complexities are related to trade-offs between large area coverage with imaging devices and actively controlled scanning of the earth’s disc. The advantages of pushbroom scanning are not available at this altitude. But people are serious about it, and in future we will see these kinds of resolutions on geo-synchronous platforms,” predicts Dr Srivastava. Experts also consider satellite constellations, such as the one in preparation by RapidEye, as another suitable solution.
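A back-of-the-envelope diffraction calculation makes the trade-off concrete. Using the standard Rayleigh criterion (ground resolution ≈ 1.22 × wavelength × altitude / aperture diameter; the 700 km low-earth-orbit altitude below is an assumed, typical value):

```python
# Rayleigh diffraction limit: gsd ≈ 1.22 * wavelength * altitude / aperture.
WAVELENGTH = 550e-9   # green light, metres

def required_aperture(gsd_m, altitude_m):
    """Aperture diameter (m) needed for a given ground sample distance."""
    return 1.22 * WAVELENGTH * altitude_m / gsd_m

GEO = 35_786e3  # geo-synchronous altitude, metres
LEO = 700e3     # typical imaging-satellite altitude (assumed), metres

print(f"50 m from GEO needs a {required_aperture(50, GEO):.2f} m aperture")
print(f" 1 m from LEO needs a {required_aperture(1, LEO):.2f} m aperture")
print(f" 1 m from GEO needs a {required_aperture(1, GEO):.1f} m aperture")
```

Roughly half a metre of optics suffices for 50 m imaging from geo-synchronous orbit or 1 m imaging from low earth orbit, whereas 1 m imaging from geo-synchronous orbit would demand an aperture of over twenty metres, well beyond civilian systems.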
Another question that often crops up in discussions is whether high resolution data is meaningful only when used in stereo mode for generating DEMs. While data in stereo mode is good for interpretation and improves data extraction and feature identification significantly, experts suggest that stereo is not a requisite for generating a DEM.
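For context, the height information that stereo adds rests on a classic relation: object height follows from the differential parallax measured between two overlapping images. A minimal Python sketch with purely illustrative numbers:

```python
def height_from_parallax(flying_height_m, base_parallax_mm, d_parallax_mm):
    """Classic stereo-pair relation: h = H * dP / (Pb + dP),
    with H the flying height, Pb the absolute (base) parallax and
    dP the differential parallax measured between top and bottom."""
    return flying_height_m * d_parallax_mm / (base_parallax_mm + d_parallax_mm)

# Illustrative: 3000 m flying height, 90 mm base parallax, 1.5 mm of
# differential parallax between a tower's top and its base.
print(f"{height_from_parallax(3000, 90.0, 1.5):.1f} m")  # ≈ 49.2 m
```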
STATE CONTROL OF DATA
Google Earth opened a can of worms for high resolution imagery, and the issue of State control over data access, to deal with security and privacy concerns, became a topic of debate. This notwithstanding, there is a near-unison of opinion in favour of open data access, owing to the fierce competition. “The influence of the State can make or break the usefulness of satellite imagery,” argues Prof Konecny. Echoing similar sentiments, Prof Jacobsen feels, “The State should never control data. We see in all countries with State control of data access a slower development of the application and of the use of the unlimited available high resolution images.”
SATELLITE IMAGERY VS AERIAL PHOTOGRAPHY
The ever-raging debate at all major discussion fora is whether high resolution satellite imagery can ever replace aerial photography. “This is a question of economy and access to the data,” opines Prof Jacobsen. In truth, only small and medium scale maps, from 1:50,000 up to 1:10,000 scale, can be replaced by high resolution satellite imagery, and even then some objects would still need ground survey. This becomes feasible only when the cost of high resolution satellite imagery is low, or at least comparable to that of aerial photography. According to a survey in Europe, however, the current commercial price of high resolution satellite imagery is about two times that of aerial photography. The advantage is that mapping with high resolution satellite imagery is much simpler than with aerial photography, and efficient in terms of mathematical modelling and area coverage.
Taking a judicious stance, Dr Srivastava says, “There are certain applications where aerial photography is getting replaced by satellite imagery. On the other hand, it is the same sensor which can be put on a satellite or on an aerial platform. So, on an aerial platform, it definitely gives much higher resolution and much more availability to the people who own the system.
The quality of aerial photography from the same sensor is bound to be better at any time. So, in that sense, aerial photography is not replaceable. While the sensors keep improving, it is the aerial platform which is not replaceable.” So, the opinion is divided, and the verdict at this moment is ‘yes and no’.
SOFTWARE SPAR
The trend in photogrammetric software is definitely towards higher automation, such as automatic recognition of objects. Nevertheless, the state-of-the-art is computer-supported object identification; human interaction today cannot be avoided. The debate between open source and proprietary or customised software continues, as contrasting views emerge from experts and industry.
Mostafa Madani, Chief Photogrammetrist and Product Manager, Z/I Imaging, votes for universal, standards-based software. He opines, “Software is becoming universal so that it can handle data from any sensor, provided that the data follows international standards (ISO). Data processing is on-the-fly, eliminating or at least minimising post-processing tasks.” Contradicting Madani’s idea, Dr Srivastava says, “People are not happy with general purpose application software. They want the application carried out the way they want it, or the way they are used to doing it. So, software packages are getting customised.”
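One concrete expression of that universality is the open-source GDAL library, whose single API reads well over a hundred raster formats. A minimal sketch (the file name is hypothetical):

```python
from osgeo import gdal  # GDAL abstracts 100+ raster formats behind one API

gdal.UseExceptions()

# The same few calls work whether the file is a GeoTIFF from an aerial
# camera or a JPEG2000 scene from a satellite provider.
dataset = gdal.Open("scene.tif")                 # hypothetical file name
print("driver:", dataset.GetDriver().ShortName)  # e.g. GTiff, JP2OpenJPEG
print("size:", dataset.RasterXSize, "x", dataset.RasterYSize)
print("geotransform:", dataset.GetGeoTransform())
band = dataset.GetRasterBand(1)
pixels = band.ReadAsArray()                      # NumPy array of pixel values
```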
Another major trend is the move towards Web-based software, which makes collaboration with other data sets and data sources possible. The third trend is building in interactive elements to control data quality.
“The feedback to the system, in terms of quality, should be built into the software. Interaction in terms of field data collection should be possible as an extension to the software package, and it should feed directly into whatever computation is taking place,” surmises Dr Srivastava.
CONCLUSION
No technology is worth its salt if it does not find use in humanitarian applications. Fortunately, imaging, image analysis, LiDAR and photogrammetry find use in societal applications, including disaster management, in a big way.
However, advance data preparation and the creation of last-mile infrastructure, in terms of communication and relief, hold the key. So, the one aim of all stakeholders should be to create an enabling environment that brings the fruits of technology to the general public.