A disruptive change is transforming the electricity industry, and one of the most important shifts involves the role that geospatial data and technology play in it. In the past, geospatial has been a tactical tool for utilities; now it is poised to become a foundation technology for the smart grid.
For the first time in a hundred years, the electric power utility industry is undergoing a momentous change. Distributed renewable power generation, especially solar photovoltaics (PV), is introducing competition into an industry that has been managed as regulated monopolies. Consumers with solar PV panels on their roofs (and, in the not-too-distant future, with Tesla batteries in the basement) and companies like SolarCity (chaired by Tesla co-founder Elon Musk) are fundamentally changing the traditional utility business model. A recent report from the Edison Electric Institute (EEI) refers to disruptive challenges that threaten to force electric power utilities to change or adapt the business model that has been in place since the first half of the 20th century.
These disruptive challenges have arisen from a number of factors, including the falling cost of distributed generation such as solar PV, demand-side management (DSM) technologies, government programmes that incentivise solar PV and other technologies, and the very low price of natural gas. The financial risks these challenges create for utilities go to the very core of how the electric power grid is financed, and include declining utility revenues, increasing costs, and lower profitability.
Naturally, US regulators are very concerned about this trend. According to Jon Wellinghoff, outgoing chairman of the Federal Energy Regulatory Commission (FERC), “…we are seeing advances in technology and the desire at the consumer level to have control and the ability to know that they can ensure the reliability of their system within their home, business, microgrid or their community. People are going to continue to drive towards having these kinds of technologies available to them. And once that happens through the technologies and the entrepreneurial spirit we are seeing with these companies coming in, I just don’t see how we can continue with the same model we have had for the last 100 or 150 years.”
As a result, every aspect of the utility industry is changing. One of these changes involves the role that geospatial data and technology play in the electricity industry. In the past, geospatial has been a tactical tool, used (and still used) in a wide variety of applications: outage management, asset management, mobile workforce management, energy density modelling, vegetation management, demand modelling, transmission line siting, substation siting and design, energy performance modelling of buildings, disaster management, and mapping renewable resources, among others. However, with the changes the industry is now undergoing, geospatial is poised to become a foundation technology for the smart grid.
Two key areas of technical advance are subsurface utility engineering (SUE) and energy-performance optimisation of new and existing buildings. Both result largely from the convergence of geospatial and other technologies, while government mandates and incentives, together with a very favourable and well-documented return on investment (RoI), are creating significant new opportunities.
State of the world energy industry
The International Energy Outlook 2013 (IEO 2013) projects that global energy consumption will grow 56% between 2010 and 2040. Energy use in non-OECD countries is projected to increase 90%, compared to an increase of 17% in OECD countries. The primary contribution to increasing energy consumption comes from the BRICS countries (Brazil, Russia, India, China, and South Africa). Climate change has been recognised as a global problem by the major energy-consuming countries, all of which have adopted policies designed to reduce total emissions (the United States, the European Union and Japan) or energy intensity (China and India).
The major drivers of the fundamental transformation in the electric power industry, often referred to as Grid 2.0, are increasing demand, universal access, the decarbonisation of electric power, the reduction of revenue losses, and grid reliability and resilience. Among the technologies contributing to this transformation are intelligent devices integrated with a communications network, distributed renewable power generation, net-zero energy buildings, microgrids, and the new remote-sensing technologies of subsurface utility engineering.
[Figure: Planned transmission lines for Germany]
At the Second Annual Summit on Data Analytics for Utilities in Toronto, Brad Williams, Vice President of Utilities Industry Strategy at Oracle, made the case for spatial analytics becoming a key technology because everything a utility deals with (customers, assets, and operations) involves location. Williams outlined a number of specific areas where spatial analytics is being applied, including reducing non-technical losses, targeting demand response, distribution operations planning, transformer load management, data quality, voltage correlation (linking meters to transformers), energy modelling, voltage deviation monitoring, geographical outage frequency analysis, and predictive analytics for electric vehicle adoption.
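To make one of these applications concrete, the sketch below shows a minimal, hypothetical version of voltage correlation: a smart meter is linked to the transformer whose voltage profile it most closely tracks. The transformer profiles, meter reading and noise levels are invented for illustration; a real utility would use AMI interval data.

```python
# A minimal sketch of voltage correlation: link a smart meter to the
# transformer whose voltage profile it most closely tracks. All profiles,
# readings and noise levels below are hypothetical.
import numpy as np

hours = np.linspace(0, 2 * np.pi, 24)

# Hourly per-unit voltage profiles for two (invented) transformers.
transformers = {
    "TX-1": 1.00 + 0.02 * np.sin(hours),
    "TX-2": 1.00 + 0.02 * np.cos(hours),
}

# A meter served by TX-1 sees TX-1's profile plus measurement noise.
rng = np.random.default_rng(42)
meter_reading = transformers["TX-1"] + rng.normal(0.0, 0.002, 24)

# Assign the meter to the transformer with the highest Pearson correlation.
scores = {
    tx_id: np.corrcoef(meter_reading, profile)[0, 1]
    for tx_id, profile in transformers.items()
}
best = max(scores, key=scores.get)
print(f"Meter linked to {best} (r = {scores[best]:.3f})")
```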
On the software side, the convergence of geospatial and model-based design is transforming how we plan, design, build, operate and maintain electric power networks, including generation, transmission, substations, distribution, and home and office networks. For instance, Wolfgang Eyrich, creator of the substation design software primtech at Entegra, points out that because substation design is a highly iterative process, a core requirement is to link all available information, including geospatial data, and make it accessible to the design team. Primtech delivers a comprehensive integrated product model, with complete data on everything from the digital terrain model to the equipment, maintained throughout the lifecycle of the substation. In the graphical (CAD) environment, primtech, which is based on AutoCAD, can integrate with other design applications like Civil 3D, which incorporates geospatial data and technology. Linking to Civil 3D enables the geographic dimension for site design and preparation based on a digital terrain model, and provides the 3D visualisation that is essential for homeowner acceptance and regulatory approval.
Furthermore, “related technologies that had their origin in the utilities GIS world are also serving to further establish geospatial as a foundational technology,” says Cindy Smith, Senior Director, Applications Advantage, Bentley Systems. One example is the use of point-cloud data for reality modelling. First adopted at scale in the utilities industry in applications such as transmission corridor planning, transmission design, and vegetation management, point clouds are now being used in other areas of utilities infrastructure work, such as the retrofit of brownfield substations using hybrid designs that combine intelligent 3D objects for new equipment with point-cloud models of existing equipment.
Utility GIS market to grow
A recent report from Navigant Research estimates that the market for smart grid technologies will reach $73 billion in annual revenue by the end of 2020. The increasing penetration of GIS into smart grid workflow applications, such as mobile workforce management (MWFM), distribution management systems (DMS), energy management systems (EMS), outage management systems (OMS), customer information systems (CIS), and analytics, will be the primary driver of growth in electric utility GIS software and services. The utility GIS market is forecast to grow at a CAGR of 12.8%, increasing from $1.8 billion in 2011 to $3.7 billion in 2017. This is supported by an analysis from Research and Markets entitled Global GIS Market in the Utility Industry 2012–2016, which projects a CAGR of 10.37% over the period 2012–2016. Research and Markets sees the increasing need for knowledge infrastructure as one of the key factors driving this growth.
GIS has been widely used by utilities for years for automated mapping/facilities management, back-office records management, asset management, and transmission line siting, and more recently for design and construction, energy conservation, vegetation management, mobile workforce management (MWFM), and outage management (OMS). Now, utilities are integrating GIS with advanced metering infrastructure (AMI) and supervisory control and data acquisition (SCADA) systems. Intelligent design has crossed over from the office to the field in utilities, also enabled by the capabilities of GIS, says Smith.
Geospatial-related analytics (spatial analytics) is seen as one of the keys to success for electric utility operations in the smart grid era. Looking for patterns and correlations among land, weather, terrain, asset, and other types of geodata will be increasingly important for utilities; a simple example is sketched below. Power-related analytics with geospatial components include network fault tracing, load flow analysis, Volt/VAR analysis, real-time disaster situational awareness, condition-based maintenance, and vegetation management.
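For instance, geographical outage frequency analysis can start as simply as binning outage locations into grid cells and comparing counts against a per-cell vegetation index. The sketch below is a minimal, hypothetical illustration; the coordinates, cell size and index values are invented.

```python
# A minimal sketch of geographical outage frequency analysis: bin outage
# locations into grid cells and compare counts with a per-cell vegetation
# index. Coordinates, cell size and index values are invented.
from collections import Counter

CELL = 0.01  # cell size in degrees, roughly 1 km

def cell_of(lat, lon):
    return (int(lat / CELL), int(lon / CELL))

outages = [(36.171, -115.140), (36.172, -115.141), (36.190, -115.200)]
vegetation_index = {
    cell_of(36.171, -115.140): 0.8,  # heavily treed cell
    cell_of(36.190, -115.200): 0.3,  # sparse cell
}

outage_counts = Counter(cell_of(lat, lon) for lat, lon in outages)
for cell, n in outage_counts.most_common():
    veg = vegetation_index.get(cell, 0.0)
    print(f"cell {cell}: {n} outage(s), vegetation index {veg}")
```

In practice the same join would run over years of outage records and remote-sensing-derived vegetation layers, with the correlation feeding vegetation management priorities.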
Foundation for the smart grid
The smart grid is all about situational awareness: effective anticipation of, and response to, events that might disrupt the performance of the power grid. Since spatial data underlies everything an electric utility does, GIS is the only foundational view that can potentially link every operational activity of an electric utility, including design and construction, asset management, workforce management, and outage management, as well as supervisory control and data acquisition (SCADA), distribution management systems (DMS), renewables, and strategic planning.
In Smith’s opinion, geospatial technology is already a foundational component of electric power utilities’ IT/OT systems. “Smart grid simply brings more focus to the role it can play by virtue of the visibility of smart grid projects and processes in a utility and their need to exploit the vast amounts of data produced by the smart grid,” she adds. Since much of that data is inherently geospatial or location-based in nature, the increased awareness of and need for geospatial technology have motivated utilities to ensure their IT organisations can readily embrace it. As GIS has moved from being a “specialty” system into mainstream IT, many electric utilities have chosen to move away from older, proprietary, or vendor-specific data stores to open spatial databases such as Oracle Spatial and Microsoft SQL Server Spatial.
John McDonald, Chair of the Governing Board of the Smart Grid Interoperability Panel (SGIP), has long been a firm believer that geospatial information is part of the foundational platform for the smart grid. GE’s Grid IQ Insight, its software platform for the smart grid, includes geospatial technology. “We have been developing analytics on that platform and we found that geospatial information is a key component of utility analytics,” adds McDonald, who is also a director at GE. SGIP recently signed an MoU with the Open Geospatial Consortium, which is expected to provide inputs to SGIP’s domain expert work groups, priority action plans (PAPs), and the committees for architecture, cyber-security, implementation methods, and certification, to see where geospatial information could be incorporated and utilised.
Matt Zimmerman of Telvent/Schneider Electric also foresees geospatial technology playing an even greater role as the smart grid develops. One of Schneider Electric’s key technologies is “graphic work design”, which integrates geospatial with engineering design (CAD or BIM). Schneider Electric’s geospatial division focuses on developing integrated, location-aware enterprise solutions; for example, integrating enterprise systems such as outage management (OMS), customer information (CIS), and GIS with an external weather reporting and forecasting service to help plan crew deployment during a storm. “Right now, utilities are seeing the biggest benefit from location-aware work management, asset management and mobile solutions, which provide access to asset and work information to field staff and management using both ruggedised and Apple/Android tablets,” maintains Zimmerman.
A whole new world of predictive analysis
The volume of data generated by smart grid networks is estimated to be 10,000 times greater than that of existing electrical networks. The exploding number and variety of smart devices and sensors is generating exponentially increasing volumes of real-time data, most of it tagged with location, and turning this data deluge into actionable information requires real-time Big Data analytics. Geographic location is foundational for Grid 2.0 because location is a fundamental index for organising this data: things that are geographically proximal to each other tend to affect each other.
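The sketch below illustrates that idea in miniature: a hypothetical grid-hash index that groups incoming smart grid events by location cell, so that “what happened near here?” queries do not have to scan the whole stream. The cell size, event names and coordinates are invented.

```python
# A minimal sketch of location as the organising index: a grid-hash that
# groups smart grid events by cell so proximity queries avoid scanning the
# whole stream. Event names, coordinates and cell size are invented.
from collections import defaultdict

CELL = 0.005  # cell size in degrees

def cell_key(lat, lon):
    return (int(lat / CELL), int(lon / CELL))

event_index = defaultdict(list)

def ingest(event_id, lat, lon):
    event_index[cell_key(lat, lon)].append(event_id)

def nearby(lat, lon):
    """Events in the query cell and its eight neighbouring cells."""
    cx, cy = cell_key(lat, lon)
    return [e for dx in (-1, 0, 1) for dy in (-1, 0, 1)
            for e in event_index.get((cx + dx, cy + dy), [])]

ingest("fault-17", 40.7128, -74.0060)
ingest("voltage-sag-3", 40.7130, -74.0058)
print(nearby(40.7129, -74.0059))  # both events, since they fall in nearby cells
```

Production systems would use a proper spatial index or spatial database rather than a dictionary, but the principle, organising the data by where it happened, is the same.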
Geospatial data and technology, including spatial analytics, visualisation and simulation, have become critical for the modern data-driven smart utility. Predictive analytics, which analyses structured and unstructured data sources to uncover patterns that can identify issues before they occur, is a growing area, but applying it to its full potential will require high-performance computing.
Zimmerman foresees location-aware predictive analytics for electric networks emerging as one of the major development areas for utilities, where an integrated location-aware system will be able to estimate threat potential and forecast where, and what type of, outages can be expected during a storm.
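In its simplest form, such a forecast might combine the predicted weather at each feeder’s location with exposure factors such as tree density and the share of overhead line. The sketch below is a heavily simplified, hypothetical scoring heuristic, not any vendor’s model; the weights, feeder data and wind figures are invented.

```python
# A minimal, hypothetical storm-outage risk score per feeder: combine forecast
# wind at the feeder's location with exposure factors. The weights, feeder
# data and wind figures are invented, not any vendor's model.
feeders = [
    {"id": "F-101", "tree_density": 0.9, "overhead_fraction": 1.0},
    {"id": "F-102", "tree_density": 0.2, "overhead_fraction": 0.4},
]

def forecast_gust_kmh(feeder_id):
    # Stand-in for a weather-service lookup keyed by feeder location.
    return {"F-101": 85.0, "F-102": 60.0}[feeder_id]

def outage_risk(feeder):
    wind = min(forecast_gust_kmh(feeder["id"]) / 100.0, 1.0)  # normalise to [0, 1]
    return (0.5 * wind
            + 0.3 * feeder["tree_density"]
            + 0.2 * feeder["overhead_fraction"])

# Stage crews at the highest-risk feeders first.
for f in sorted(feeders, key=outage_risk, reverse=True):
    print(f'{f["id"]}: risk {outage_risk(f):.2f}')
```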
Geolocating underground utilities
According to national statistics, an underground utility line is hit in the United States on average every 60 seconds, at a total cost to the national economy estimated in the billions of dollars per year. In most North American municipalities, 2D as-builts of underground infrastructure are notoriously unreliable, with the result that the location of underground utilities is very poorly known.
New “remote-sensing” technologies are being developed that are helping municipalities and utilities create accurate 3D models of underground infrastructure. Steve Dibenedetto, Senior Geoscientist and Technology Manager at Underground Imaging Technologies (UIT), part of Caterpillar, points out that subsurface utility engineering (SUE) requires a combination of different technologies, what UIT calls a multi-technology approach: the best results are not obtained by using any one technology alone. If, for example, you are only interested in 2D mapping, you will do fairly well using standard radio-frequency (RF) locators or wand-type instruments. But RF locators have limitations. Most importantly, the objects you are looking for have to be metallic, and you need to know they are there, for example, from as-builts. That is why UIT complements RF locators with newer technologies such as ground penetrating radar (GPR) and electromagnetic induction (EMI). GPR can provide the depth to a target, in other words, 3D. Another major advantage of GPR is that it can find non-metallic utilities. EMI uses a device that induces an electromagnetic current in metallic objects; the conductive objects retain this induced current briefly and then show up on the detector as “bright” areas. A major advantage of GPR and EMI is that they can find underground utilities that are not recorded on as-builts, a very frequent occurrence.
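The depth estimate GPR provides comes from simple wave physics: the radar pulse travels at the speed of light divided by the square root of the soil’s relative permittivity, and depth is half the two-way travel time multiplied by that velocity. The sketch below illustrates the calculation; the permittivity and travel-time values are assumed example figures, and real surveys calibrate velocity on site.

```python
# A minimal sketch of the physics behind GPR depth estimates: radar velocity
# in soil is c / sqrt(relative permittivity), and target depth is half the
# two-way travel time times that velocity. The permittivity and travel time
# below are assumed example figures; real surveys calibrate velocity on site.
import math

C_M_PER_NS = 0.3  # speed of light, metres per nanosecond

def gpr_depth_m(two_way_time_ns, relative_permittivity):
    velocity = C_M_PER_NS / math.sqrt(relative_permittivity)  # m/ns in soil
    return velocity * two_way_time_ns / 2  # halve for the round trip

# A reflection arriving 20 ns after the pulse, in moist soil (eps_r ~ 9):
print(f"estimated depth: {gpr_depth_m(20, 9):.2f} m")  # -> 1.00 m
```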
Cities around the world are beginning to realise the value of knowing reliably where underground utilities are located. One example is the City of Las Vegas, which is putting in place policies and technologies to develop an accurate 3D model of all of its above-ground and below-ground utility infrastructure. Lombardy, the region around Milan, has mandated remote-sensing mapping of all underground infrastructure in the region.
It has been difficult to quantify the costs and benefits of improving location and other information about underground utilities, but in the last few years research has begun to put a dollar figure on the benefits of accurate location data. For example, the Pennsylvania Department of Transportation (PennDOT) commissioned the Pennsylvania State University to study the savings on 10 randomly selected PennDOT projects. The study found a return on investment of $21 saved for every $1 spent on improving the quality level of subsurface utility information. A pilot conducted by the region of Lombardy reported an RoI of €16 for every euro invested.
Energy-efficient buildings
The adoption of BIM processes and technologies is a major trend that has been gathering steam over the last decade, motivated by the need for better project outcomes. According to McGraw-Hill Construction, overall adoption of BIM by architects, engineers and contractors in the US increased from 17% in 2007 to 71% in 2012, a growth of 45% over the previous three years, or 400% over the last five years.
At the Royal Institution of Chartered Surveyors (RICS) BIM National Conference 2014, Christopher Gray of the Mollenhauer Group, Los Angeles, gave an overview of Mollenhauer’s approach to modelling existing buildings in BIM. Mollenhauer has been an early adopter of laser scanning for what is known in the UK as a “measured building survey”, using FARO and Leica laser scanners together with total stations. The deliverable is a Revit BIM model.
For example, Mollenhauer scanned the Beverly Center, a mall in Los Angeles, with laser scanners and total stations to create a BIM model of the entire structure as a basis for its redesign. A total of 750 separate scans produced about a terabyte of point-cloud data, from which a Revit BIM model of the entire structure was generated manually. In this case, the BIM model was the starting point for the redesign of the mall, but, to quote architect Carl Elefante, “the greenest building is the building that already exists”. Gray sees the major future business opportunity in energy performance modelling for existing buildings, the first step of which will be “scan to BIM”.
An energy performance analysis allows architects to estimate how much energy a building will consume in a year and to assess alternative types of insulation and glazing, and other aspects of building design to optimise energy usage. The starting point for an energy performance analysis is a BIM model of the building. For new buildings, the architect’s BIM model provides the information that is required. For existing buildings, laser scanning (“scan to BIM”) is increasingly being used by companies such as Mollenhauer to create a BIM model. The BIM model provides the key elements that are required for an energy performance analysis, including simplified walls and floors, room bounding elements, complete volumes, window frames and curtain walls.
Together with the geographic location of the building, the surrounding structures and local historical environmental conditions, an energy performance analysis can identify measures that reduce annual energy consumption and power bills by up to 40%. The analysis typically includes daylighting and airflow simulations as well as thermal modelling: solar heating, energy consumption, thermal comfort, CO2 emissions, renewable energy integration, and electric power load. A simplified example of the kind of calculation involved is sketched below.
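To give a flavour of one ingredient of such an analysis, the sketch below uses a simple degree-day model: annual heating energy is proportional to the envelope’s overall heat-loss coefficient (UA) and the local heating degree-days. The UA and degree-day figures are hypothetical; real tools work from the full BIM geometry and hourly weather data.

```python
# A minimal degree-day sketch of one ingredient of an energy performance
# analysis: annual heating energy scales with the envelope's heat-loss
# coefficient (UA) and the local heating degree-days. UA and degree-day
# figures are hypothetical; real tools model the full BIM geometry.
def annual_heating_kwh(ua_w_per_k, heating_degree_days):
    # UA [W/K] x HDD [K.day] x 24 h/day = Wh, then convert to kWh.
    return ua_w_per_k * heating_degree_days * 24 / 1000

baseline = annual_heating_kwh(ua_w_per_k=400, heating_degree_days=2800)
upgraded = annual_heating_kwh(ua_w_per_k=250, heating_degree_days=2800)
print(f"baseline {baseline:.0f} kWh, with better insulation/glazing "
      f"{upgraded:.0f} kWh ({1 - upgraded / baseline:.0%} saving)")
```

Even this toy comparison shows how insulation and glazing alternatives translate into percentage savings of the order the article cites.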
Currently, the primary motivation for energy performance modelling is the aggressive building codes that push energy efficiency, for example, the 2013 California Green Building Standards Code (Title 24). Other motivations are customer-driven certification such as LEED and other “green” certification (LEED v4 incorporates up to 18 credits for demand response) and financial incentives from local governments and power utilities to reduce energy consumption, peak load or both.
Measures aimed at improving the efficiency of buildings have been introduced in Europe, the US and Japan. Zero energy buildings (ZEB) are loosely defined as buildings that generate as much energy as they consume. A major area of focus in the EU is “nearly zero energy” buildings: the European Commission has mandated that all new buildings be designed to be “nearly zero energy” by 2020/2021, and public buildings even sooner, by 2018/2019. The Government of Japan put forward its “zero emissions buildings” target in April 2009, mandating that all new public buildings be “zero emissions” by 2030. The US Energy Independence and Security Act of 2007 (EISA 2007) requires that by 2030 all new Federal facilities be “zero net energy” (ZNE) buildings. In 2007, the California Public Utilities Commission (CPUC) adopted aggressive ZNE targets: all new residential construction in California is to be zero net energy by 2020, all new commercial construction by 2030, and 50% of existing commercial buildings are to be retrofitted to ZNE by 2030.
According to a report from Navigant Research, global zero energy buildings revenue is expected to grow from $629.3 million in 2014 to $1.4 trillion by 2035.
Summing up
As noted earlier, intelligent design has crossed over from the office to the field in utilities, enabled by the capabilities of GIS. Mobile GIS apps are used by field crews in support of installation, repair, inspection, and emergency restoration. Field service management, or workforce management, systems take advantage of geospatial information to dispatch and route field crews most efficiently. Web-based GIS is also being used for collaboration among government agencies, utilities, and the public to ensure awareness of planned work.
Geospatial and BIM are also enablers of energy performance modelling, a fundamental instrument for reducing the energy consumption and improving the energy performance of new and existing buildings. Cities are beginning to develop 3D models of underground infrastructure, motivated by new underground remote-sensing technologies and by RoIs of up to $21 saved for every $1 spent on improving the quality level of subsurface utility information. One of the biggest challenges utilities are experiencing is increasing volumes of structured and unstructured data that are overwhelming traditional enterprise systems, points out Zimmerman. The structured data comes from smart meters and intelligent electronic devices; the unstructured data from social networks and applications including Twitter, Google and Facebook. He foresees that the consumerisation of geospatial technology, with GPS-enabled sensors everywhere, will enable crowdsourcing of all sorts of information about electric power networks, most of it involving location.
The major remaining challenge is “siloed” GIS. GIS usage in smart grid applications will demand a high degree of accuracy and timely, synchronised updates, which will be difficult to orchestrate in a federated environment. Ultimately, utilities will have to implement a GIS repository of record that supports smart grid requirements. The stage is thus set for a transformation in how utilities use GIS, one that will make it a foundation for managing the smart grid and for fully realising the benefits of smart grid technology.
As Smith of Bentley points out, “The challenge would appear to lie not in convincing the industry to adopt geospatial technologies, but rather in how quickly these technologies can adapt to the evolving plethora of operating systems involved in smart grid and the standards and interfaces they require.” Capitalising on open spatial databases, technologies, and languages is also key to success.