As geospatial technology advances and becomes critical to almost every element of the world around us, we are saturated with geospatial data. It is phenomenal how much accurate data can be obtained from governing bodies, environmental agencies and even transport bodies.
A geospatial specialist can open their GIS or spatial software and have access to thousands, if not millions, of datasets which can be used to model, analyze or visualize.
Almost all of these data are snapshots of now, or of the last known occurrence. The real world is always changing, and it isn’t good practice to compare last year’s conservation data against this year’s aerial imagery, or against decade-old forestry data.
Yet it happens all the time. This is not an enormous problem if the real world hasn’t changed too much; after all, if accuracy is important to a project, it can be captured through a survey. But the issue doesn’t stop there.
It is important to consider the data which came before. How quickly is the city growing, and is it likely to affect the new area being considered? Where is the coastline disappearing? These are issues which can only be addressed by using the data as it changes.
Geospatial data standards do a great job of ensuring the right names and terms are used. ISO 19115 even includes a few lines about temporal resolution in the metadata, but how many data providers also provide their historic data? Further to the point, whose responsibility is it to ensure this historic data exists?
Looking at almost all the national data, which is provided openly to support the millions of businesses using geospatial information to deliver their projects, there is no link or reference to the availability of historic or past data. This includes providers like Natural England, Defra, Historic England and Ordnance Survey.
History
The irony is that weather data, natural hazard data, earth observation and tidal data are readily available as monthly and annual data for periods of a decade or more.
Some of this data is more granular than others, but it is essentially modelled so that valuable information may be extracted; techniques like regression analysis and AI may be applied to predict what might happen in the near or distant future.
This temporal geospatial data is more valuable now than ever. There is growing pressure to deliver affordable and green housing as well as to ensure environmental targets are met.
This can only be done by understanding the current problems and then using how the information has changed over time to predict where future change may occur and how it may be influenced.
This applies not only to housing, but also to the energy sector, through understanding how community acceptance, weather and land availability may change, and to the transport industry, through predicting urban sprawl and the requirement for new stations as working habits change.
It even applies to the leisure industry, through the use of environmental, historic, customer-habit and transport data over time. The list could go on endlessly, for almost any industry.
What needs to change
In an ideal world, geospatial policy would change to ensure that providers of geospatial data are responsible for maintaining historic snapshots of their data and making them available alongside the current data. Of course, this would be supported with metadata.
There are many legal reasons for the definitive source data to be recorded in snapshots, but the most compelling is where a project is built and then, years on, an accident or complete failure occurs and the information needs to be analyzed.
When this happens, the data used will need to be verified as the best available at the time, and it will need to be shown that nothing was missed. Without the definitive historic data, this is impossible to verify.
Currently within the UK there are many abandoned renewable-energy projects due to a change in government policy in 2016. Some of these projects have already been approved, and multiple interested parties require validation of the information used, yet there is no access to the data that was available at the time.
As more geospatial data is used to secure and validate projects, it is more important than ever to have a way to perform due diligence on them. One only needs to look at the growing housing market, the new infrastructure required to support an increasing population, or the military services who face constantly changing dangers.
At present, there is no mandate or requirement for data providers to maintain or provide historic records. For many providers, users can only build a history themselves, by timestamping data in their own systems or by archiving old versions alongside the newest. This works well, but only from the point at which it was first implemented.
A further issue is that there is little or no way to “back fill” these data: there is no central repository or helpdesk to call who will agree to provide a decade’s worth of data. This is partly due to the size of the data and the intractability of sharing it, but also due to the time which would need to be invested in recalling it.
What next?
The ISO 27001 business certification covers data retention and storage, which extends to geospatial data in that data stores are archived and retained for periods of time defined by the policy.
This means that data is recorded historically, but not in a useful manner. Ideally, a change-only update approach needs to be implemented, whereby new records are inserted with timestamps and removed records are also timestamped (with an identifier to say whether each record was inserted or deleted).
By using a change-only update (COU) method, the historic data may be recorded in the most efficient manner while still being readable in geospatial software.
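To make the idea concrete, here is a minimal sketch of what a COU record set and snapshot reconstruction could look like. The record structure, field names and the `snapshot_at` helper are illustrative assumptions, not any provider’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Illustrative change-only update (COU) record: one row per change,
# not one row per feature. Field names are hypothetical.
@dataclass
class ChangeRecord:
    feature_id: str              # stable identifier for the real-world feature
    geometry_wkt: Optional[str]  # geometry as WKT (None for deletions)
    attributes: dict             # attribute values at the time of the change
    timestamp: datetime          # when the change took effect
    op: str                      # "insert" (new/updated) or "delete" (removed)


def snapshot_at(changes: list[ChangeRecord], when: datetime) -> dict[str, ChangeRecord]:
    """Reconstruct the dataset as it existed at a given moment by
    replaying every change made on or before that moment."""
    state: dict[str, ChangeRecord] = {}
    for change in sorted(changes, key=lambda c: c.timestamp):
        if change.timestamp > when:
            break
        if change.op == "insert":
            state[change.feature_id] = change   # insert or supersede
        elif change.op == "delete":
            state.pop(change.feature_id, None)  # feature no longer exists
    return state


# Example: a woodland parcel mapped in 2015 and cleared in 2021.
history = [
    ChangeRecord("wood_001", "POLYGON((0 0, 0 1, 1 1, 1 0, 0 0))",
                 {"landcover": "woodland"}, datetime(2015, 3, 1), "insert"),
    ChangeRecord("wood_001", None, {}, datetime(2021, 6, 15), "delete"),
]
print(snapshot_at(history, datetime(2018, 1, 1)).keys())  # parcel present
print(snapshot_at(history, datetime(2022, 1, 1)).keys())  # parcel gone
```

Storing only the changes keeps the archive small, while any past snapshot can be rebuilt on demand.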
For over a decade, GIS such as Esri ArcMap and QGIS have had time viewers (often called time managers) which allow data to be viewed over time by reading an allocated time field, and even allow the export of videos showing change over time.
Web mapping can also render time-enabled data; platforms like CesiumJS have time sliders out of the box which allow data to be viewed over time.
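As an illustration of what such viewers consume, the sketch below writes features into a GeoJSON-style structure with an ISO 8601 timestamp property. The property name and layout are assumptions; the point is simply that a temporal control in a desktop GIS or web viewer can be bound to a field like this.

```python
import json
from datetime import datetime, timezone

# Hypothetical observations of the same parcel at two points in time,
# each carrying the timestamp a time slider would read.
features = [
    {"id": "wood_001", "lon": 0.5, "lat": 51.2,
     "landcover": "woodland", "observed": datetime(2015, 3, 1, tzinfo=timezone.utc)},
    {"id": "wood_001", "lon": 0.5, "lat": 51.2,
     "landcover": "cleared", "observed": datetime(2021, 6, 15, tzinfo=timezone.utc)},
]

feature_collection = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [f["lon"], f["lat"]]},
            "properties": {
                "id": f["id"],
                "landcover": f["landcover"],
                # ISO 8601 string; this is the field a temporal control reads.
                "observed": f["observed"].isoformat(),
            },
        }
        for f in features
    ],
}

with open("landcover_over_time.geojson", "w") as fp:
    json.dump(feature_collection, fp, indent=2)
```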
As we move towards AGI (Artificial General Intelligence) and the need to train and inform AI models grows, there will be an increasing need to use more historic data to truly understand trajectories of growth and change.
In AI and ML, this data has started to move away from GIS and is being called “spatiotemporal data”, though in reality it is still geospatial with a time element.
My prediction is that there will need to be a change in the way public bodies provide data, and this may lead to a new data format. Much in the way we embraced Z (height) enabled data, whereby Z was not a field but a datum reference, we will need to embed time and changes within the data, not only to remove data bloat but also to ensure time zone references are kept and linked to the coordinate systems.
Could we see Z, M and T enabled data? I hope so; this would help with better BIM integration and planning projects.
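Purely as a thought experiment, a T-enabled vertex might carry something like the following. The class, field names and reference identifiers here are illustrative only, not an existing or proposed standard.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical X, Y, Z, M, T vertex: each value is interpreted against an
# explicit spatial, vertical or temporal reference rather than a loose field.
@dataclass(frozen=True)
class XYZMT:
    x: float                 # easting / longitude
    y: float                 # northing / latitude
    z: float                 # height, interpreted against the vertical datum
    m: float                 # measure value (e.g. chainage along a route)
    t: datetime              # timestamp, interpreted against the temporal reference
    horizontal_crs: str      # e.g. "EPSG:27700" (assumed identifier)
    vertical_datum: str      # e.g. "ODN" (assumed identifier)
    temporal_reference: str  # e.g. "UTC" (assumed identifier)


vertex = XYZMT(400000.0, 300000.0, 52.3, 1250.0,
               datetime(2021, 6, 15, 12, 0), "EPSG:27700", "ODN", "UTC")
print(vertex)
```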
The motto of GIS is that everything happens somewhere. I suggest we now mature to the next evolution of GIS and change the motto to “everything happens somewhere, at some time”.
Disclaimer: Views expressed are the author's own. Geospatial World may or may not endorse them.