In the previous article, Collaboration Begins at Home, I discussed the benefits of a Geospatial Strategy and of getting more involved with the wider business. For those working in a small business, or churning out large volumes of maps in a team, that advice may be less useful than practical guidance on how to work more efficiently and control quality.
Many businesses do not fully understand the geospatial capability they have. This leads to under-utilization and to missing the true value and power of what is at hand. Because of the repetitive nature of GIS work and the need for some managerial control, it may seem as though some GIS teams are lazy or let simple errors through. This can frustrate a business, but it is rarely down to an individual alone; misunderstandings around output are often the result of poor instruction.
Here, we discuss methods that both GIS team leaders and businesses can use to control workflows efficiently, to ensure that information is clear to everyone involved in the process, and to gain quality and control by building helpful stages into the workflow.
There are two primary areas. The first is the workflow: how data or information is obtained, processed, and related in order to produce the required output. We describe it as a model so that we can modify it for each project while maintaining core elements for consistent control. For the workflow, the prime elements focus on ensuring the message is understood by all parties, making the process efficient, and verifying that the output meets the requirement.
The second area focuses on the data itself: how to describe what the data is, how current and costly it is, how to ensure its quality, and how to make it relatable for non-GIS users. Personally, I find that workflow and data control work better as two separate areas; this allows authoritative control of information/data and the setup of workflow models independently of each other. That doesn't mean you can't intertwine the two so that data models, workflows, and licensing are captured on a per-project basis.
Workflow
Managing requests
As a project manager or person requesting a map, you may not understand all the intricacies that go into making it. You probably won't know about projection systems, graticules, or which north arrow to use. For the map creator, there is never enough information. To capture it, the request should be put into a request form, including at least the paper size and whether the map is to go in a report or be shared as a web map. Luckily, we are in a time where this can be done with Google Forms, an email request, a spreadsheet, or even Jira (for a DevOps team).
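Whatever tool captures the request, the fields matter more than the format. As a rough illustration only (the field names and values below are assumptions, not a prescribed form), the request could be held as a small structured record:

```python
# Illustrative map request record; field names and values are placeholders.
from dataclasses import dataclass
from datetime import date

@dataclass
class MapRequest:
    """Minimal map request; adapt the fields to your own form or tracker."""
    title: str
    requested_by: str
    purpose: str        # e.g. "figure for the feasibility report"
    paper_size: str     # e.g. "A3"
    output: str         # "report figure" or "web map"
    deadline: date
    notes: str = ""     # anything else the requestor thinks matters

request = MapRequest(
    title="Proposed cable route overview",
    requested_by="Project manager",
    purpose="Figure for the feasibility report",
    paper_size="A3",
    output="report figure",
    deadline=date(2024, 6, 1),
)
print(request)
```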
Another benefit of doing this is that you also have evidence of the request: the requestor can't come back at the end and deny that it is what they asked for. To be efficient, though, the requestor should always see the first draft to make sure it is in the right ballpark. Make this a face-to-face meeting so you can capture the reasoning and openly discuss any changes or deviations.
Production specification
Just as a geospatial strategy helps guide the management team on how the geospatial function can be successful, a production specification (the core recommended parameters within which the geospatial team works) guides day-to-day output. It makes it easy for the management team to present the common working parameters to clients when initiating projects, and it makes it much easier to create consistency and best practice within the geospatial team. Commonly this specification will differ for offshore and onshore work and will be country-specific, setting out the expected coordinate system, datum, base mapping, and graticule or grid to be used.
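As an assumed illustration only (the regions, parameter names, and values below are hypothetical, not recommendations), the specification can be kept as a small, versionable config that both management and the GIS team can read:

```python
# Illustrative production specification; values are placeholders, not recommendations.
PRODUCTION_SPEC = {
    "UK onshore": {
        "coordinate_system": "EPSG:27700",   # British National Grid
        "datum": "OSGB36",
        "base_mapping": "OS OpenMap Local",
        "grid_or_graticule": "1 km grid",
        "default_paper_size": "A3",
    },
    "North Sea offshore": {
        "coordinate_system": "EPSG:23031",   # ED50 / UTM zone 31N
        "datum": "ED50",
        "base_mapping": "bathymetry with admiralty-style charting",
        "grid_or_graticule": "graticule at 0.5 degree intervals",
        "default_paper_size": "A1",
    },
}
```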
Daily Stand-ups
If you're not working on your own, a 15-minute daily catch-up with the team is a huge help. It is an approach used in DevOps to capture any hold-ups or issues that may delay the product, and it works just as well for GIS, even if you are working on different projects, because knowledge can be shared to overcome issues and speed up output. Management can then catch up with the team lead daily or weekly to understand where issues may lie and to better manage expectations with the client (or clients).
Modeling analysis where possible
For many projects, there may be data that needs creating or analysis that needs performing, for example a cable route or a hillshade. Both for evidence to management (and the client) and for repeatability, this should always be modeled. GIS systems like Esri and QGIS provide workflow modeling tools, and these models can be exported or saved as flowcharts (PostGIS/SQL models and scripts are also easily saved). By doing this, not only can you re-run the analysis easily if required, but the model can also be re-used for similar projects with little setup. From a management perspective, it builds IP and evidences to the client how the work will be performed. A saved model is also a time saver that can be run by junior GIS users.
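As a minimal sketch of the idea, assuming GDAL's Python bindings are installed and a DEM exists at the hypothetical path shown, a repeatable hillshade step can live in a small script saved alongside the project:

```python
# Repeatable hillshade step; paths and parameters are illustrative assumptions.
from osgeo import gdal

def make_hillshade(dem_path: str, out_path: str, z_factor: float = 1.0) -> None:
    """Run a hillshade over a DEM so the step can be re-run or reused on other projects."""
    gdal.DEMProcessing(
        out_path,
        dem_path,
        "hillshade",
        zFactor=z_factor,
        azimuth=315,   # light from the north-west, a common cartographic default
        altitude=45,
    )

if __name__ == "__main__":
    make_hillshade("data/project_dem.tif", "outputs/project_hillshade.tif")
```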
Automate where possible
Building on the above, automation is where the entire workflow for each stage runs without manual intervention. As stated above, this can be done readily in QGIS, Esri, and PostGIS. Python can also be used to link processes together, but in my opinion FME, made by Safe Software, provides the best way to automate both data maintenance and workflows. By automating, even on the smallest project, replicating similar work can take an hour rather than a day for a complex piece of work. It also provides a "safety net": if a process needs re-running, it can be done by another person.
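As a hedged sketch of "linking processes together" in Python (the step names are placeholders, not a prescribed pipeline), the whole job can be expressed as an ordered list of functions so that anyone on the team can re-run it and see where it stopped:

```python
# Illustrative pipeline runner; the individual steps are placeholders for your own processes.
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def update_base_data():
    logging.info("Refreshing base data (placeholder step)")

def run_analysis():
    logging.info("Running the saved analysis model (placeholder step)")

def export_outputs():
    logging.info("Exporting map products (placeholder step)")

PIPELINE = [update_base_data, run_analysis, export_outputs]

def run_pipeline():
    """Run every step in order; if one fails, execution stops so the problem is visible."""
    for step in PIPELINE:
        logging.info("Starting %s", step.__name__)
        step()
        logging.info("Finished %s", step.__name__)

if __name__ == "__main__":
    run_pipeline()
```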
QA (Quality Assurance)
Before you begin any work, the QA should already be written. It should be a tick-box list of items based on the standard processes and the request for the map/output. This will include items such as whether the legend box conforms to the agreed standards, whether all the text is correct, and even whether the correct data was used.
The first QA should be done by the creator, who should go through and tick off every item they have checked. The output should then go to a team lead or manager, who works through the same tick-box list and provides constructive feedback. Although this may seem overly complicated, it ensures the GIS team has understood what was asked, it confirms the client's and requestor's needs, and it reduces the number of times an output needs remaking or adjusting.
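A minimal sketch of such a checklist with a two-stage sign-off might look like the following; the item names are invented for illustration, and the real list should come from your own production specification and request form:

```python
# Illustrative two-stage QA checklist; the items and roles are assumptions.
CHECKLIST = [
    "Correct data sources used (per the request form)",
    "Coordinate system matches the production specification",
    "Legend conforms to house standards",
    "All titles, labels and copyright text are correct",
    "Paper size and orientation match the request",
]

def record_signoff(role, results):
    """Print the tick-box results for one reviewer and return True only if everything passed."""
    print(f"QA sign-off by {role}:")
    for item in CHECKLIST:
        passed = results.get(item, False)
        print(f"  [{'x' if passed else ' '}] {item}")
    return all(results.get(item, False) for item in CHECKLIST)

creator_ok = record_signoff("map creator", {item: True for item in CHECKLIST})
reviewer_ok = record_signoff("team lead", {item: True for item in CHECKLIST})
print("Ready to issue" if creator_ok and reviewer_ok else "Needs rework")
```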
When making maps, part of the QA should involve printing the map onto paper at the size and resolution at which it is intended to be used. Years of experience have shown that a printed map can look quite different from how it appears on screen: colors can bleed, fonts sometimes don't render properly, and I have even had colors look too similar because the printer couldn't reproduce a particular shade.
Data Control
Glossary
Business language is confusing enough before technical and geospatial terms are added. For this reason, it is good practice to maintain a project glossary and a business glossary; the two may even have overlapping terms. This provides a dictionary for the management team, project leads, technical teams, and geospatial teams, and removes confusion over what you mean by a "line of sight map" or what the client means by "standard scale". These glossaries often end up becoming part of the client proposal and, as understanding grows, even part of the client discussion.
Data Catalogue
This is the bane of every geospatial manager's existence. The product and management teams will require evidence of the data being used for a project; they may even need to reassure the client that the data is the most current available or meets a specific standard. This means having immediate access to a complete list of the data being stored or used.
For the average GIS/geospatial manager, there may be more than a hundred datasets being maintained, at different stages of maturity and update, from different countries, with different coverages and resolutions. Keeping track of it all and adding it to a spreadsheet is extremely time-consuming.
To complicate matters further, each dataset will have its own license terms and lifecycle, so it is important to let a client know in advance when a data license will require renewal.
Unfortunately, as far as I am aware, there is no all-encompassing system that pulls all this information together. What I have built myself for several employers is a "license database": a database in Microsoft Access (though it could equally be built in SQL or PostgreSQL) containing a list of the datasets, the license start date, end date, cost, requirements, copyright text, and comments. Using the Access forms, new information can be entered and reports produced on when items are due to expire.
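A minimal sketch of the same idea, here using Python's built-in sqlite3 rather than Access (the table and column names are assumptions based on the fields listed above):

```python
# Illustrative license database; the schema mirrors the fields described above.
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect("license_database.sqlite")
conn.execute("""
    CREATE TABLE IF NOT EXISTS data_license (
        dataset        TEXT,
        license_start  TEXT,   -- ISO dates stored as text
        license_end    TEXT,
        cost           REAL,
        requirements   TEXT,
        copyright_text TEXT,
        comment        TEXT
    )
""")

# Report anything expiring within the next 90 days so renewals can be raised early.
cutoff = (date.today() + timedelta(days=90)).isoformat()
expiring = conn.execute(
    "SELECT dataset, license_end FROM data_license WHERE license_end <= ? ORDER BY license_end",
    (cutoff,),
).fetchall()
for dataset, license_end in expiring:
    print(f"{dataset} expires on {license_end}")
conn.close()
```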
I also keep the data itself either in a consistent folder structure (before I started using PostGIS) or in PostGIS. Both provide ways of listing the data and the last time it was updated. Of course, this listing is modeled so that it can be run at a moment's notice, though it only really works when new items are added correctly and expired or deleted items are properly removed.
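For the folder-structure approach, a hedged sketch of such a listing might look like this (the root path and file extensions are assumptions; adjust them to your own layout):

```python
# Illustrative data listing for a folder-based store; root path and extensions are assumptions.
import os
from datetime import datetime

DATA_ROOT = "G:/gis_data"                      # hypothetical root of the GIS data store
EXTENSIONS = {".shp", ".gpkg", ".tif", ".geojson"}

def list_datasets(root):
    """Yield (path, last-modified) for every recognised dataset under the root folder."""
    for folder, _dirs, files in os.walk(root):
        for name in files:
            if os.path.splitext(name)[1].lower() in EXTENSIONS:
                path = os.path.join(folder, name)
                modified = datetime.fromtimestamp(os.path.getmtime(path))
                yield path, modified

for path, modified in sorted(list_datasets(DATA_ROOT), key=lambda item: item[1]):
    print(f"{modified:%Y-%m-%d}  {path}")
```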
The data catalogue and license database should ideally be linked and work as one, though until recently I had never managed to get it all working in the same database. A read-only method of presentation should be a priority, to reduce the time spent dealing with requests from management.
Editor tracking
As touched on above, when working in a bigger team, changes happen to data; you may often find a new copy of the dataset you need with "_final" appended, which leaves you questioning which is the right version.
If using Esri geodatabases or PostGIS, it is possible to enable editor tracking, which adds extra fields recording when data was changed and by whom. It has been a lifesaver a few times when data has been accidentally deleted or an update hasn't worked properly, as it affords the ability to roll the database back to before the corruption took place.
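Esri's editor tracking is a built-in option; in PostGIS a similar effect can be approximated with a trigger. A minimal sketch, assuming a psycopg2 connection and a hypothetical table named assets (connection string and table are placeholders):

```python
# Illustrative editor-tracking trigger for PostGIS via psycopg2; table and connection are assumptions.
import psycopg2

TRACKING_SQL = """
ALTER TABLE assets
    ADD COLUMN IF NOT EXISTS last_edited_by text,
    ADD COLUMN IF NOT EXISTS last_edited_at timestamptz;

CREATE OR REPLACE FUNCTION track_edit() RETURNS trigger AS $$
BEGIN
    NEW.last_edited_by := current_user;  -- database user making the change
    NEW.last_edited_at := now();         -- when the change happened
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

DROP TRIGGER IF EXISTS assets_track_edit ON assets;
CREATE TRIGGER assets_track_edit
    BEFORE INSERT OR UPDATE ON assets
    FOR EACH ROW EXECUTE FUNCTION track_edit();
"""

with psycopg2.connect("dbname=gis user=gis_admin") as conn:  # hypothetical connection string
    with conn.cursor() as cur:
        cur.execute(TRACKING_SQL)
```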
Metadata
Some geospatial users love it and some hate it, but from a management perspective there is no denying that it is useful. Being able to see a dataset's name, what it is about, its coverage, its currency, and its intended use not only helps management sell the data and its quality, but also helps at the proposal stage, as there is less need to bother the geospatial team or waste their time in lengthy meetings answering questions.
For larger and enterprise businesses, I would always recommend the INSPIRE metadata standard, with a company-accessible website or page. For smaller teams or individuals this is a large undertaking, and although on paper I would tell you it should be followed, between friends I recommend following the format but dropping a few of the less necessary fields. Products like Geoportal may help the beginner, though simply keeping a sheet with the fields and their values can go a long way.
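As a hedged example of that "sheet", a cut-down, INSPIRE-flavoured record might contain fields like those below; the field selection and values are illustrative only, not a compliant profile:

```python
# Illustrative cut-down metadata record; not a compliant INSPIRE profile.
metadata_record = {
    "title": "Proposed cable route corridor",      # hypothetical dataset
    "abstract": "Indicative corridor derived from the routing model, for feasibility reporting.",
    "spatial_coverage": "North Sea, UK sector",
    "coordinate_system": "EPSG:32631",
    "date_of_creation": "2024-05-01",
    "update_frequency": "as required",
    "lineage": "Derived from bathymetry and constraint layers using the saved routing model.",
    "use_constraints": "Internal and client reporting only; see license database.",
    "point_of_contact": "GIS team lead",
}

for field_name, value in metadata_record.items():
    print(f"{field_name}: {value}")
```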
Summary
One of the biggest problems in the geospatial industry is that many companies will tell you what data and standards you need, but none provide advice on efficiency, consistency, best practice, or working better in the workplace. As a manager or business executive, it can be confusing and frustrating not to get results promptly or not to receive the quality you expect.
The information provided in this article is a brief, far from exhaustive list of items that may be used to speed up geospatial mapping production. Depending on the work being created and the nature of the business, there are many other items that could be included, but I hope it provides some ideas for client-facing teams, managers, product teams, and geospatial teams to make efficiency gains in both production speed and understanding.