
Understanding the First Independent Benchmark of Spatial and Time-Series Databases

Kinetica webinar on ‘First Independent Benchmark of Spatial and Time-Series Databases’

A first-of-its-kind webinar titled ‘First Independent Benchmark of Spatial and Time-Series Databases’ was held on October 6th, hosted by Kinetica and organized by Geospatial World.

The presenter of the webinar was John O’Brien from Radiant Advisors, an independent research and advisory firm. He shared detailed findings from an eBook covering the research.

John O’Brien is Principal Advisor and CEO of Radiant Advisors and has 35 years of experience delivering value through data strategy, architectures, and analytics. As a recognized thought leader, he has been publishing articles, teaching, and presenting at conferences in North America and Europe for more than 20 years.

Today, John provides independent research, strategic advisory services, and mentoring to guide companies in meeting the demands of next-generation data management, architecture, analytics, and emerging technologies.

The eBook presented in the webinar includes the results of the independent benchmark that the research team designed and performed in early 2022 and released in the third quarter of 2022.

In this report, Radiant Advisors has independently designed and performed the first spatial and time-series benchmark to help organizations evaluate database technologies suitable for their workloads. The report evaluates the performance and functionality of leading cloud databases with built-in geospatial, temporal, and graph capabilities.

The presentation shared in this learning session offered rich insights into the benchmark research, including geospatial analytics, graph analytics, and example use cases where graph analytics were applied.

One of the main premises of the research was that an increasing number of companies across varied industries, such as healthcare, manufacturing, defense, and transportation, are adopting large numbers of IoT devices in their business operations, which in turn generate huge amounts of data.

Discussions unfolded on how IoT data inherently carries both a location and a time dimension. Sensor devices capture data at a particular location and time, and then again at a different location and time, raising the challenge of how to analyze this streaming data as it moves through space and time.

The research team designed the assessment benchmark with this challenge in mind.

The researchers wanted a data set that would relate to a wide audience and chose the New York City bike-share data set. This popular, publicly available data set has many data-analytics examples associated with it. It includes daily bike-trip event data that can be tracked, along with all bike stations, their geolocations, and the relevant timestamps.

A lot of location data was thus acquired from New York City. The data set also took into account how natural phenomena, such as seasonal and temperature changes, may affect the nature of the data.

The research team carefully evaluated which databases on the market offered both geospatial and time-series capabilities, and focused on Kinetica’s distributed database.

One of the research team’s key takeaways was that the Kinetica database clearly outperformed in every PostGIS query and was the only database that passed all the feasibility tests across geospatial, time-series, graph, and streaming workloads.

The eBook describes the business goals envisioned for this benchmark assessment.

For example, a bike operations organization may want to optimize the cost of redistributing bikes and answer related questions, such as how many rides originated from a selected area, or how long the rides originating in that area lasted.
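Business questions like these can be sketched in a few lines of code. The sketch below is illustrative only: the `Trip` field names and the radius-based notion of a "selected area" are assumptions for this example, not the benchmark's actual schema or query definitions (the benchmark ran such questions as SQL against the databases under test).

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from math import radians, sin, cos, asin, sqrt

# Hypothetical trip record; field names are illustrative, not the
# benchmark's actual schema.
@dataclass
class Trip:
    start_time: datetime
    end_time: datetime
    start_lat: float
    start_lon: float

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * 6371.0 * asin(sqrt(a))  # Earth radius ~6371 km

def rides_from_area(trips, center_lat, center_lon, radius_km):
    """Count rides starting within radius_km of a point, plus their
    average duration in minutes."""
    nearby = [t for t in trips
              if haversine_km(t.start_lat, t.start_lon,
                              center_lat, center_lon) <= radius_km]
    if not nearby:
        return 0, 0.0
    avg_min = sum((t.end_time - t.start_time).total_seconds() / 60
                  for t in nearby) / len(nearby)
    return len(nearby), avg_min
```

At database scale, the same question would be answered with a spatial predicate and a temporal aggregation in SQL, which is exactly the class of query the benchmark measured.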

The research aimed to produce a broad set of results, and the main intent behind the exercise was to change how businesses think about analytics and to help them ask the right questions about the multimodal databases that support such analytics.

The team worked on the project independently and also delved into the relative strengths and weaknesses of the databases.

The presenter explained in detail the technical paths taken during the research and the key components at each stage of examining the data and creating the benchmark assessment.

The research team aimed to conduct the fairest, most independent benchmark possible, designing it specifically to demonstrate multimodal analytics: delving into real business questions and identifying a data architecture and technologies that can support them.

The learning session helped attendees understand:

  • Business questions and scenarios that require spatial and time-series data as the basis for the benchmark study
  • Database SQL feasibility for analyzing geospatial and temporal data
  • Working with relational and native graph databases, streaming data, and in-database visualizations
  • Performance results and factors that impact business analysis

The full findings may be read by downloading the eBook from the following link:

https://www.kinetica.com/resources/time-space-database-benchmark/

The session was wrapped up with an interactive question and answer session.

Today, organizations are increasingly incorporating geospatial data to surface trends that are otherwise difficult to see.

This webinar gave participants useful thinking points on how combining geospatial data with real-time and IoT data adds a layer of analytics capability, helping companies unlock deeper insights and, in the long run, improve operations such as marketing and customer service.