In business, much of the data we use has a geospatial element. The reports we write reference places, they often include assets with defined locations, and even travel and staffing are geospatially driven. As businesses strive to be more intelligent, they (mostly) use this geospatial data to great effect through business intelligence solutions to enhance their product offerings.
Unfortunately, this awareness doesn't always filter through to the rest of the business, as geospatial data is not always easily quantifiable. Because business is a money-driven beast, it is not uncommon to find a procurement division or process that is not as geospatially aware, or skilled in the nuances of geospatial data analytics, spatial accuracy, or industry standards. When purchasing data for reports, a request for 1:1,250 topography vector data goes in and the Geographic Information System (GIS) team ends up with 1:2,500 raster data.
The best analogy a mentor once gave me was:
“Imagine that you had to hire a car for a business trip and you asked for a Hummer. The business will turn you down flat as they only look at the cost and the high level need. It doesn’t consider that you are driving into a war zone in the middle of winter.”
Geospatial data isn't special, but it can be made effective. Choosing the right data can deliver downstream benefits and support further parts of a project. This article looks at both where we source the data for our projects and how investing in it can provide a better return.
Buy once, use many times
Often, data is purchased initially as part of a scoping exercise. The business wants to ensure that its thinking is right and that the project will be profitable before investing more money. At this stage, the business wants to invest the minimum amount, whereas the analyst needs the best data at hand to provide a sound result. Even before they have started, both parties have lost.
To provide the most accurate picture of all the issues, risks, and potential benefits, the geospatial analyst requires a full understanding of the project, the parameters of the output (how detailed it should be, what counts as a 'no', what counts as a 'yes'), and how the work will be presented in the final output. Without this guidance, the analyst is simply throwing data together and hoping something sticks. This is why managers often find the team asking questions to which they may not always have answers.
Commonly, GIS data falls into definitive, large-scale, or small-scale categories. Definitive scale is used for planning and may be required when submitting plans or information to local government. Working at this scale is ideal for working through the whole project, though due to the high cost it may not be ideal at the scoping stage. Large-scale mapping ranges from definitive scale to city-wide or sometimes county-wide data; it carries less detail, omitting features such as trees, exact building footprints, and points of interest because of the amount of information that must be condensed into the area shown. Many national mapping agencies provide free or open data at around 1:10,000 scale, which is more than good enough for scoping, though one has to keep in mind that certain detail needed for precise analysis may not be present. The last, small scale, is country-wide or world-wide mapping. This is ideal for scoping wider issues, for example wind or tide regimes, disease, or shipping. Such data is available from many sources for free or as open data, though again it may not be detailed enough for analysis at city level.
Another consideration is the temporal aspect, or currency, of the information. In many scenarios it does not matter if the data is a few months old, though in the case of crime, disease, or construction, data can change by the week or even the day. Imagine purchasing coronavirus data: how useful would it be if it were a month or more old? This extends to aerial imagery too, as many incorrectly believe that imagery from Google, Bing, or Esri is current. The truth is that these aerial images take time to process and edit, and they are often up to, or over, three years old.
Buy or build?
Geospatial data can be a commodity that carries value, returned both financially and through holding information that others may not have. So one of the toughest questions is whether to capture the data through the GIS team or purchase it from a third party. For many businesses, capturing data isn't even considered an option; there is no assessment of the value of the data or of how financially viable capturing it may be, especially given the wealth of techniques and tools now at hand.
Consider a project that requires some 3D GIS modelling of a small town in the UK. One might immediately think of an off-the-shelf product like OS MasterMap, which includes building heights (to create massing models) and 1:1,250 scale topography, and can be paired with 5-meter terrain. It is trusted, accurate data and claims to be updated every six weeks. Unfortunately, it could cost in excess of GBP 20,000 (USD 26,155). By comparison, an aerial capture company could fly the area and capture new imagery along with 25-centimeter terrain and surface data for around GBP 10,000 (USD 13,078). With detailed imagery and a surface model (a Digital Surface Model, or DSM), it would be easy to extract detailed tree cover, building footprints, and roads, and even to generate a simple 3D mesh that could be used for shadow and flood analysis.
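As a rough illustration of why captured surface data is so useful, the sketch below subtracts a bare-earth terrain model (DTM) from the DSM to derive feature heights above ground, from which candidate buildings and trees can be masked. It is a minimal sketch assuming Python with rasterio and numpy; the file names, the height threshold, and the assumption that the two rasters are aligned are illustrative, not details from the example above.

import numpy as np
import rasterio

# Placeholder file names; both rasters are assumed to share the same grid,
# extent, and coordinate reference system.
with rasterio.open("dsm_25cm.tif") as dsm_src, rasterio.open("dtm_25cm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")   # tops of buildings, trees, etc.
    dtm = dtm_src.read(1).astype("float32")   # bare-earth elevations
    profile = dsm_src.profile

# Normalised DSM: height of features above ground level.
ndsm = dsm - dtm

# Anything taller than roughly 2.5 m is treated as a candidate building or
# tree canopy; the threshold is an assumption that would be tuned per project.
feature_mask = ndsm > 2.5
print(f"Candidate feature pixels: {int(feature_mask.sum())}")

# Write the height surface out for footprint extraction or 3D meshing.
profile.update(dtype="float32", count=1)
with rasterio.open("ndsm_25cm.tif", "w", **profile) as out:
    out.write(ndsm, 1)

From a height surface like this, footprints can be vectorised and extruded into the kind of simple 3D mesh mentioned above.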
Sharing is caring
A modern GIS is designed to share information: most platforms offer live web maps where results and information can be viewed as they are updated, and if yours doesn't, there is a plethora of free online systems where it can be done. If your GIS team can't manage this, you may want to consider a refresh of staff, as at worst a simple QGIS2web map can be stored on a shared drive to view!
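A minimal sketch of that worst-case fallback, using the Python folium library as a stand-in for a QGIS2web export: it builds a single self-contained HTML file and saves it to a shared location so anyone with a browser can review the data. The layer file, coordinates, and output path are illustrative placeholders.

import folium

# Base map centred on the study area (placeholder coordinates).
m = folium.Map(location=[51.5, -0.1], zoom_start=12, tiles="OpenStreetMap")

# Overlay the project data so reviewers can inspect it directly in a browser.
folium.GeoJson("project_sites.geojson", name="Project sites").add_to(m)
folium.LayerControl().add_to(m)

# Saving to a shared drive gives the wider team read-only access with no GIS
# software installed (path is a placeholder).
m.save("//shared-drive/projects/town_model/web_map.html")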
Immediate access to data and live interaction makes for a much better experience for the team: those writing the project reports can examine and review potential issues and detail before receiving the analysis or maps, allowing more detailed questions and more focused analysis. With access to live mapping, experts can question missing or incorrect data in the first instance, removing time-consuming reviews and refreshes of the data. Subject matter experts on the project can also provide quality assurance, pointing out issues should there be any.
Management is key
The truth is that if you are not a geospatial user, it can be hard to ensure that projects are using the best data and that the right considerations are being made. In many businesses, projects fall into 'types', which makes it easier to narrow down the factors involved. In a business where every project is unique, it is harder to become more efficient, but there are still controls that can be put in place.
When projects fall into types, even if it is as high-level as 'city level', 'county level', or 'including buildings', it is possible to start creating a menu. The menu should be a simple spreadsheet with each project type as a sheet or a header, and under each project type a list of the common maps or pieces of work that are expected. Although the data or analysis may differ every time, there will be common elements, and these may vary depending on the scale, temporal, or accuracy requirements of the specific project. The potential data is therefore listed, making a menu of sorts.

What is actually being created is a structured workflow, which enables engagement between the manager, the geospatial team leads, and the analysts, and allows all levels to comprehend whether the data is suitable for the project. It is this engagement that ensures the questions over the data are answered by the whole project team. The project lead or manager is the only one who can specify the detail and currency required for the project; the geospatial team can only advise on availability, current licenses, and the potential risks of each data option. The menu approach provides a method for weighing these risks and costs.
Where there is no specific project type, it is harder to specify any of the above, but a menu of sorts may still be created based on expected outputs, though it may need to be built on a project-by-project basis.
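To make the menu concrete, here is a minimal sketch that writes such a menu out as a simple spreadsheet (CSV) from Python. The project types, outputs, datasets, and update cycles are illustrative placeholders rather than recommendations.

import csv

# Each project type maps to the expected outputs and the candidate data that
# could support them (all entries are illustrative).
menu = {
    "City level": [
        ("Site context map", "Open 1:10,000 national mapping", "Quarterly"),
        ("Flood screening", "National flood zone polygons", "Annual"),
    ],
    "Including buildings": [
        ("3D massing model", "Large-scale topography with building heights", "Six-weekly"),
        ("Shadow analysis", "25 cm DSM from aerial capture", "Per capture"),
    ],
}

with open("data_menu.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Project type", "Expected output", "Candidate data", "Update cycle"])
    for project_type, rows in menu.items():
        for output, dataset, currency in rows:
            writer.writerow([project_type, output, dataset, currency])

The same structure could live in a shared spreadsheet; the point is that licenses, costs, and risks can be attached to each row so the choice of data is visible to the whole project team.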
Conclusion
As a manager, you should not need to keep abreast of geospatial data formats, quality, or even the currency of the data, but the risk of using the data and its impact on the project are paramount. Putting controls in place to ensure understanding between teams and leaders is therefore vital. Even if things do go wrong, by using shared web maps containing the data, issues can be picked up early and effectively; if more up-to-date or higher-resolution data is needed, that can be caught early too.