We are in the midst of a global climate crisis, with climatic disasters worsening in intensity and frequency with each passing year. According to the latest IPCC report, the fight to keep global heating under 1.5° Celsius has reached “now or never” territory.
Under the current trajectory, it is “almost inevitable” that humanity will surpass this critical temperature threshold. The report further states that global emissions must peak by 2025 to limit warming to 1.5°C above pre-industrial temperatures. Achieving this goal will require immediate and meaningful climate action from both public and private stakeholders, which in turn creates the need for robust monitoring infrastructure.
The trend of increasingly frequent and intense extreme climatic events has inevitably raised businesses’ exposure to material financial losses across the corporate value chain. By the end of the century, the expected value at risk to manageable assets is estimated to be as high as USD 43 trillion. Additionally, 215 of the world’s most prominent companies have reported almost USD 1 trillion at risk from short-term climate impacts, much of it likely to accrue by 2024.
Presently, the world is plagued by fragmented and expensive environmental data, leading to inefficient allocation of capital and resources for climate change mitigation and adaptation. Satellite data can play a pivotal role in bridging this gap: according to the Global Climate Observing System, 50% of Essential Climate Variables (ECVs) can only be tracked using satellites.
However, a variety of issues still hinder the ability to harness the true potential of satellite data.
These problems are evident in the fact that, although the upstream satellite market has grown exponentially over the past decade, utilisation of the data harnessed has been below par. For instance, it is estimated that more than 100 TB of satellite data is generated per day, yet less than 1% of it is ever analysed.
“The upstream space market, with its rocket launches and high-tech satellite payloads, may seem at first glance to be the most exciting segment of the space industry. But when it comes to innovation, job and revenue creation and the provision of services that change people’s lives for the better, the downstream market is where the action is.” – New Space Economy ExpoForum, Rome, December 12, 2019
At Blue Sky Analytics, we wanted to empower decision-makers across sectors with climate intelligence by leveraging satellite data, AI, and the cloud. The idea was to harness the wealth of data made available by the booming upstream space market and refine it into deeper, decision-ready environmental insights.
However, from an implementation standpoint, we realized early in the company’s journey that we were solving an engineering problem as much as a data problem. To process and deliver insights from satellite data, substantial work was required to build a technology infrastructure that could efficiently handle such large volumes of data and deploy and maintain complex machine learning algorithms, while offering the results to enterprises in an accessible way through APIs.
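To make the engineering challenge concrete, the sketch below shows one common pattern for processing large satellite rasters block by block, so that terabyte-scale imagery never has to fit in memory at once. It is a minimal illustration using the open-source rasterio library, not an excerpt from our pipeline, and the input file scene.tif is a hypothetical placeholder.

```python
# Minimal sketch: stream a large GeoTIFF window by window instead of loading
# it whole. Illustrative only; "scene.tif" is a hypothetical placeholder.
import rasterio

def band_mean(path: str) -> float:
    """Compute the mean of band 1 while reading one tile at a time."""
    total, count = 0.0, 0
    with rasterio.open(path) as src:
        for _, window in src.block_windows(1):     # native tile windows
            block = src.read(1, window=window).astype("float64")
            if src.nodata is not None:             # drop fill values
                block = block[block != src.nodata]
            total += block.sum()
            count += block.size
    return total / max(count, 1)

print(band_mean("scene.tif"))
```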
To this effect, we set our minds to building a revolutionary new technology infrastructure that would significantly cut the time needed to deploy each new dataset, while continuing to evolve the underlying models as new sources come online and improved research becomes available.
Instead of treating each dataset like an individual product, we decided to build a completely automated, scalable, and reusable infrastructure that could ingest data from various sources, including satellites, run it through our machine learning algorithms, and deliver the resulting datasets via APIs. This makes our tech infrastructure our core value proposition and differentiates us from other data companies.
This infrastructure is what we call our “Geospatial Data Refinery”. Like an oil refinery, it takes in and analyses terabytes of raw data from satellites, on-ground monitoring devices, and other credible ancillary public sources. As a first step, the data is cleaned, standardized, pre-processed, and stored in the cloud. We then run it through our proprietary algorithms and disseminate intelligent data to various stakeholders via APIs and our visualization platform, SpaceTime™.
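Conceptually, the refinery treats each dataset as a chain of reusable stages rather than a bespoke script. The toy sketch below illustrates that idea; the stage names and sample data are hypothetical stand-ins, not our actual implementation.

```python
# Toy sketch of a stage-based "refinery" pipeline. The stages here are
# hypothetical; real ones would wrap satellite readers and ML models.
from dataclasses import dataclass
from typing import Any, Callable

Stage = Callable[[Any], Any]

@dataclass
class RefineryPipeline:
    stages: list[Stage]

    def run(self, raw: Any) -> Any:
        data = raw
        for stage in self.stages:
            data = stage(data)      # each stage feeds the next
        return data

def clean(records):                 # drop missing readings
    return [r for r in records if r is not None]

def standardize(records):           # coerce to a common schema
    return [{"value": float(r)} for r in records]

def publish(records):               # shape the API payload
    return {"count": len(records), "items": records}

pipeline = RefineryPipeline([clean, standardize, publish])
print(pipeline.run([1, None, "2.5"]))   # {'count': 2, 'items': [...]}
```

Because every dataset reuses the same scaffolding, adding a new one mostly means writing new stages, not new infrastructure.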
The refinery helps tackle the climate data gap by automating every step from ingestion to dissemination, so that each new dataset builds on shared infrastructure rather than a one-off pipeline.
Given the accelerating climate crisis, the geospatial data refinery can generate valuable insights fast enough to shorten the response timeline, and it has proved to be an effective layer between the upstream ecosystem and decision-makers. For instance, our first dataset, BreeZo, for air quality monitoring, took us 12 months to develop, while our second, Zuri, for fires and GHG emissions tracking, took us only eight. Since then, our goal has been to develop new datasets every quarter, which will let us build a comprehensive catalogue of datasets efficiently.
On the user-facing side, SpaceTime™, our visualization platform, has also reaped the benefits of the scalable infrastructure we have built over the last two years. It took us more than a year to deploy the first two datasets on SpaceTime™; in just the last two months, we have deployed five new datasets across parameters like GHG emissions, electrification, and fire prediction, each varying in type, representation, temporal frequency, and spatial resolution. As the name suggests, SpaceTime™ can accommodate any dataset with a spatial and a temporal component.
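To illustrate what “a spatial and temporal component” means in practice, the toy example below models a dataset as a (time, lat, lon) cube with the open-source xarray library. The variable, coordinates, and values are hypothetical; this is not the SpaceTime™ schema.

```python
# Toy spatio-temporal cube: any dataset with time + location fits this shape.
# Values and coordinates are made up for illustration.
import numpy as np
import pandas as pd
import xarray as xr

times = pd.date_range("2022-01-01", periods=3, freq="D")
lats = np.linspace(10.0, 12.0, 3)
lons = np.linspace(76.0, 78.0, 3)

aqi = xr.DataArray(
    np.random.rand(3, 3, 3),
    coords={"time": times, "lat": lats, "lon": lons},
    dims=("time", "lat", "lon"),
    name="aqi",
)

print(aqi.sel(time="2022-01-02").mean().item())  # spatial mean for one day
print(aqi.mean(dim="time"))                      # time-averaged map
```

The same slicing and aggregation work regardless of what the variable measures, which is what lets one platform host many dataset types.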
The impact of our refinery can most notably be observed in its application to Climate TRACE, a global coalition led by Al Gore with a mission to accelerate climate action by providing independent, high-resolution, near-real-time greenhouse gas (GHG) emissions data. As a founding member of Climate TRACE, we were able, in just one week, to visualize on SpaceTime™ the global sector-wise and country-wise emissions data provided by the coalition’s many members.
The geospatial data refinery infrastructure will be instrumental in harnessing the true potential of satellite data and driving meaningful climate action. Moreover, the refinery will enable us to develop a myriad of climate datasets in 2022 across parameters such as fire prediction, electrification mapping, surface water quality and quantity monitoring, flood mapping, and sea level monitoring, contributing to 11 of the 17 Sustainable Development Goals (SDGs).
We encourage users to engage with the geospatial data refinery through SpaceTime™ and our Developer Portal to explore how they can make the most of the infrastructure.
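As a purely hypothetical illustration of what programmatic access to such datasets can look like, the snippet below queries a placeholder REST endpoint; the URL, parameters, and authentication scheme are illustrative assumptions, not our documented API. The Developer Portal describes the real interface.

```python
# Hypothetical API call; the endpoint, parameters, and token below are
# placeholders, not a documented interface.
import requests

BASE_URL = "https://api.example.com/v1/air-quality"      # placeholder URL
params = {"lat": 28.61, "lon": 77.20, "date": "2022-01-01"}
headers = {"Authorization": "Bearer YOUR_API_KEY"}       # placeholder token

resp = requests.get(BASE_URL, params=params, headers=headers, timeout=30)
resp.raise_for_status()
print(resp.json())
```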