Smart Grid: A View from the Inside “It’s All about Data”

I’ve invited Kevin Miller, who has many years of experience in the electric power utility industry and who now works for Autodesk advising utilities on design information management challenges and solutions, to give his perspective on what the smart grid will mean for electric power utilities.

I have worked in the utility industry for the past 24 years. I worked at a major electric utility in several capacities in the Transmission & Distribution organization, but for most of that time I was involved in implementing and supporting distribution systems, including GIS, an outage management system (OMS), work management, and design. During this time I have seen the concept of the smart grid develop and take shape.

How we design, operate, and maintain today’s electrical grid has, for the most part, not changed in the past 75 years. Over the years, devices and equipment have evolved slowly (the level of R&D investment in the electric power industry is among the lowest of any major industry), but there have been no major changes in how the grid functions. Of course, there was massive investment in building out the grid to provide universal electric power, under the impetus, for example, of the Rural Electrification Act, which provided federal funds for bringing electric power to rural areas of the United States.

But in the 1990s there was a major change, when deregulation and decoupling deflected capital investment away from the grid. The “gold plate” that had been lavished on the grid in the preceding decades was now used to finance preparations for freer markets, with the result that the technical evolution of the grid stagnated. In the last two years, stimulus monies, green initiatives, and energy conservation have combined to create significant pressure to catch up quickly: to make the grid smarter in order to address today’s pressing problems, including reliability, security, customer empowerment, and global climate change.

The systems used today to design, capture, maintain, and analyze the grid are partial automations of 75-year-old procedures. The business processes for the design, construction, operation, and maintenance of the grid are carried out by different teams within utilities: for example, engineering design, construction, records, outage management, operations, and billing. Over the years these functional teams have evolved procedures that support their own specific missions and their own informational and operational needs. What has been missing is a holistic view of the entire business process for managing the life cycle of network assets. Individual teams optimize their own sub-processes, but the optimized sub-processes do not take into consideration the overall “produce and capture” business process needed to collect and manage the digital data required to operate and maintain the grid. This focus on sub-processes, rather than on the bigger picture of overall business processes and information flows, has resulted in enterprise databases with stale, incomplete, and error-laden data and network models.

In the past, with the existing grid, we have been able to scrape by with out-of-date and inaccurate data. The smart grid changes all that. From a data perspective, the smart grid is ultimately just much, much more data, much of it real time. Getting the value out of smart grid investment (operational efficiencies and improved operating metrics such as fewer and shorter outages) depends on being able to use the digital data collected from smart devices to monitor, analyze, and simulate the electrical grid in real time. If your current digital model of your company’s electric power network is based on inaccurate, out-of-date, and incomplete data because your business processes for managing the information flow across the organization are archaic and inefficient, it is going to get much, much worse with the smart grid.

Systems optimized over the years at the sub-process level may have the appropriate technology to use and analyze network data, but I find that utilities fall down significantly in their ability to “produce and capture” accurate and timely information to feed these systems. I am continually surprised at how poor a job utilities do at maintaining their network facilities data. When companies are inundated by a sea of unreliable information and experience the difficulty of making operational decisions based on it, they will quickly realize that they have to fix the “produce and capture” problems in their business processes. But I expect that by the time that realization happens, it will be too late.

Utilities need to address these looming problems now. It is critical for forward-looking organizations to assess the quality of their network facilities data and to review their business processes from the perspective of operational efficiency, optimizing business processes and information flows for data quality, and making sure that they have the appropriate supporting technology. Redundant data and inefficient data and work hand-offs are prime symptoms of an organization focusing on sub-process optimization and ignoring the big picture. Reviewing your overall business process from soup to nuts, from a perspective above the sub-processes (the team level), is critical. It is also essential to develop a technology architecture that enables automating the overall business process, in addition to supporting and optimizing the productivity of each operational team.
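A data-quality assessment of the kind described above can start very simply: measure how much of your network facilities data is incomplete, stale, or duplicated. The sketch below is a minimal illustration of that idea in Python; the record fields, sample values, and the one-year staleness threshold are all hypothetical, not drawn from any particular utility system.

```python
from datetime import date

# Hypothetical network asset records; field names are illustrative only.
assets = [
    {"id": "TX-001", "type": "transformer", "phase": "ABC", "last_updated": date(2009, 3, 1)},
    {"id": "TX-002", "type": "transformer", "phase": None,  "last_updated": date(2002, 7, 15)},
    {"id": "SW-010", "type": "switch",      "phase": "A",   "last_updated": date(2010, 1, 20)},
    {"id": "TX-001", "type": "transformer", "phase": "ABC", "last_updated": date(2009, 3, 1)},  # duplicate record
]

def quality_metrics(records, today=date(2010, 6, 1), stale_after_days=365):
    """Return simple completeness, staleness, and duplication percentages."""
    n = len(records)
    # Incomplete: any attribute is missing (None).
    incomplete = sum(1 for r in records if any(v is None for v in r.values()))
    # Stale: not updated within the threshold window.
    stale = sum(1 for r in records if (today - r["last_updated"]).days > stale_after_days)
    # Duplicated: more records than distinct asset ids.
    duplicates = n - len({r["id"] for r in records})
    return {
        "incomplete_pct": round(100 * incomplete / n, 1),
        "stale_pct": round(100 * stale / n, 1),
        "duplicate_pct": round(100 * duplicates / n, 1),
    }

print(quality_metrics(assets))
# → {'incomplete_pct': 25.0, 'stale_pct': 75.0, 'duplicate_pct': 25.0}
```

Tracking even crude ratios like these over time gives a team-independent view of whether the overall “produce and capture” process is improving or degrading, which is exactly the perspective above the sub-processes that tends to be missing.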

Geoff Zeiss

Geoff Zeiss has more than 20 years of experience in the geospatial software industry and 15 years of experience developing enterprise geospatial solutions for the utilities, communications, and public works industries. His particular interests include the convergence of BIM, CAD, geospatial, and 3D. In recognition of his efforts to evangelize geospatial in vertical industries such as utilities and construction, Geoff received the Geospatial Ambassador Award at Geospatial World Forum 2014. Currently Geoff is Principal at Between the Poles, a thought leadership consulting firm. From 2001 to 2012 Geoff was Director of the Utility Industry Program at Autodesk Inc, where he was responsible for thought leadership for the utility industry program. From 1999 to 2001 he was Director of Enterprise Software Development at Autodesk. He received one of ten annual global technology awards in 2004 from Oracle Corporation for technical innovation and leadership in the use of Oracle. Prior to Autodesk, Geoff was Director of Product Development at VISION* Solutions. VISION* Solutions is credited with pioneering relational spatial data management, CAD/GIS integration, and long transactions (data versioning) in the utility, communications, and public works industries. Geoff is a frequent speaker at geospatial and utility events around the world, including Geospatial World Forum, Where 2.0, MundoGeo Connect (Brazil), Middle East Spatial Geospatial Forum, India Geospatial Forum, Location Intelligence, Asia Geospatial Forum, and GITA events in the US, Japan, and Australia. Geoff received Speaker Excellence Awards at GITA 2007-2009.
