Three Global Utility Metatrends

European Utility Week serves as an excellent venue to obtain a more global industry perspective than is typically found at North American-based industry events. The 2014 event reinforced the conclusion that there are really three global metatrends in play for electric utilities. Three recorded interviews that I conducted during this conference highlight these metatrends and some of their impacts on utilities.

First, the distributed generation (DG) genie is out of the bottle: centralized generation is no longer the only energy architecture. DG is one component of distributed energy resources (DER). The newly released Smart Grid Dictionary 6th Edition defines DER as grid-connected or standalone generation, energy storage, or negawatt assets that are deployed in the distribution grid. DER assets can substitute for or supplement grid-supplied power. In this interview, Jochen Kreuss, Head of Smart Grids Initiative at ABB, discussed the challenges that DG creates for utilities, whether it takes the form of renewables, assets aggregated within a microgrid or virtual power plant (VPP), or independent assets. He also noted that communications infrastructures must be capable of helping utilities manage large numbers of devices. This is a critically important metatrend impact. The Smart Grid requires dedicated bi-directional M2M networks that can deliver the necessary security, reliability, and speed to support bi-directional electricity transactions.

The second metatrend is the growth of big data and its impacts on utility operations. The four Vs of big data (volume, variety, velocity, and veracity) are stressing the siloed operations and legacy solutions common to utilities. At the same time, these asset-intensive businesses continue to add more equipment that pumps out more data, exacerbating the problem. In this interview, Peter Sigenstam, Vice President and Head of E.ON Innovation Centre Distribution, and Daryl Rolley, Executive Vice President, Global Sales for Ventyx, an ABB company, discuss how E.ON's proof-of-concept project, the Smart Grid Control Center, creates more flexible grid operations for both generation and demand. This project finds inspiration for data management optimization in industries that range from consumer goods and financial services to automotive manufacturing (particularly robotics) and oil and gas operations. As one example, E.ON recognizes that the retail sector has extensive experience in creating customized suggestions to cross-sell or upsell customers, which could help this utility tailor its service offerings.

The third and final metatrend is the rapidly accumulating impact that the first two trends exert on utilities. In this interview, Xavier Moreau, Strategic Marketing Director for Schneider Electric, reflected on these trends and other important drivers that are triggering fundamental transformations within utilities, as well as in how utilities view and value the grid edge. What is particularly interesting is the combinatorial nature of some DER: it can include different forms of energy (think thermal as well as electrical) and controllable loads that are the subject of demand response (DR) solutions and services. That points to additional complexity in managing microgrids and other DER, as well as in managing the data produced by these new assets. He mentioned an ongoing project with DONG Energy focused on island microgrids that integrate very high levels of renewables and incorporate data from sources such as weather to provide reliable power. Utilities will have to become smart too, re-engineering processes, reskilling employees, and revising corporate cultures to accommodate de-carbonized and DER-based electricity grids.

The good news is that while these metatrends are common to utilities around the world, some utilities are finding the opportunities, not just the challenges, created by these trends. These utilities are actively deploying pilots or full-scale implementations of DER and/or microgrids, exploring the use of sophisticated analytics to aid decision-making, and leveraging grid operations solutions for proactive, not reactive, responses.


Smart Grid Trends to Watch: ICT Innovations and New Entrants

The convergence of information and communications technologies (ICT) with traditional operational technologies (OT) is an ongoing Smart Grid trend. Within the USA and its 3,000+ electric utilities, Smart Grid investments have focused on optimizing transmission and distribution grid operations through machine-to-machine (M2M) communications and forays into data analytics for applications ranging from revenue assurance to conservation voltage reduction.

This ICT/OT convergence trend is encouraging new entrants into the vendor ecosystem that supports electric, gas, and water utilities. One of the latest entrants is Dell. Dell made two announcements in the past two months that illustrate how ICT companies are exploring Smart Grid market opportunities, and 2013 will be the year to watch its strategies and progress.

Dell recently unveiled its Smart Grid Data Management Solution, which combines high-performance computing, networking, and storage to manage data for review and action in utility operations. Leveraging domain expertise and the PI System™ from OSIsoft, Dell developed and tested a reference architecture in a simulation environment that modeled a utility's transmission grid operations. Transmission grids have been among the early beneficiaries of the Smart Grid through products called Phasor Measurement Units (PMUs), extremely high-speed monitors that sense changes in transmission conditions. Taking hundreds of measurements per second from multiple PMUs produces large quantities of data that challenge existing data storage practices in utilities. Dell's architecture, coupled with OSIsoft's software, provides faster updates and makes actionable data available to staff, applications, and business systems. It's an excellent example of how M2M communications and data management technologies can become ubiquitous in the Smart Grid.
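To get a feel for the scale involved, here is a back-of-the-envelope estimate. The sample rate, record size, and fleet size below are illustrative assumptions, not figures from any specific deployment:

```python
# Rough, illustrative estimate of PMU data volume; the sample rate,
# record size, and fleet size are assumptions, not measured figures.
SAMPLES_PER_SEC = 60        # one PMU reporting 60 phasor measurements per second
BYTES_PER_SAMPLE = 64       # assumed size of one phasor record with metadata
PMU_COUNT = 100             # assumed number of PMUs on a transmission grid

bytes_per_day = SAMPLES_PER_SEC * BYTES_PER_SAMPLE * PMU_COUNT * 86_400
print(f"{bytes_per_day / 1e9:.1f} GB/day")  # roughly 33.2 GB/day
```

Even with these modest assumptions, a single year of retention runs into double-digit terabytes, which is well beyond traditional utility historian practices.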

This is a noteworthy collaboration between a traditional ICT vendor (Dell) and a traditional OT vendor (OSIsoft) that is focused on grid operations. But Dell has also signaled its intent to get involved in the consumer side of the electricity value chain by joining the Pecan Street Inc. Advisory Board. Pecan Street is an energy and smart grid research and development organization that serves as a living laboratory: a community microgrid characterized by residence-based solar generation, electric vehicles (EVs), energy efficiency, and energy management solutions for homes. The project uses data analytics to research the brave new world of consumer/prosumer evolution and energy interactions.

While the term "big data" is used in this project, its volumes are dwarfed by the volumes generated by today's PMU deployments. Similarly, if smart meters ever provide data to utilities at 15-minute intervals, that would constitute really big data, at least as analytics providers in financial services or telecommunications would define it. It's more accurate to describe the Pecan Street project as one that offers horizontal complexity and scalability, as the many types of devices, with all their variations in hardware, firmware, and software, will need to be managed in addition to the networks that connect them. There aren't many analytics companies that can offer this expertise, and the best ones are proven performers in industry sectors outside of electric utilities.
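To put the relative volumes in perspective, a quick comparison of daily record counts; both populations are hypothetical and chosen only to illustrate scale:

```python
# Illustrative comparison of daily record counts; both populations are
# hypothetical and chosen only to show relative scale.
METERS = 1_000_000                       # assumed smart meter population
reads_per_meter = 24 * 60 // 15          # one interval read every 15 minutes
meter_records = METERS * reads_per_meter # records per day from all meters

PMUS = 100                               # assumed PMU fleet
pmu_records = PMUS * 60 * 86_400         # 60 measurements/second, per PMU, per day

print(meter_records, pmu_records)        # 96000000 518400000
```

Under these assumptions, a million meters reporting every 15 minutes still produce fewer daily records than a hundred PMUs, which is why meter data alone rarely qualifies as "big" by telecom or financial-services standards.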

However, Dell has proven abilities in data management, and it understands a thing or two about consumers after successfully building a competitive business that sells directly to them. So its moves into the Smart Grid sector portend more than a continuation of the ICT/OT convergence trend. They also highlight another trend: businesses experienced in consumer retail operations (others are Verizon and Comcast) engaging in exploratory activities to work directly with electricity and water consumers. Traditional utilities may discover that their business models are disrupted more by this second trend than the first. Of course, this second trend is a riskier play, and it is too early to tell if these new players will become intermediaries between consumers and utilities. It will be interesting to watch Dell in 2013 and see how these trends progress.


Smart Grids and Smart Cities – Same Problems, Same Solutions?

The world population is expected to soar to more than 9 billion people by 2050, and roughly 70% of the global population will live in cities, which today consume 70% of global energy supplies. That's a concern for electric and water utilities, but there are ways to address it. The best Smart Grid planning methodologies embody two related meta-concepts, the "system of systems" and "network of networks" approaches. Both terms are defined in the Smart Grid Dictionary, and both mean that planners must identify potential relationships between systems and/or networks and design solutions that leverage those synergies. These approaches encourage creative use and reuse of resources for multiple purposes instead of single-use applications, and they are especially important when dealing with complex systems and networks like Smart Grids. Advanced data analytics leverage synergies between data from different sources, and are already delivering value in a range of electric utility applications. We need to apply these same concepts and tools to build or renovate complex systems like electrical grids and city infrastructures.

I recently moderated a panel discussion at a Smart Cities event about the technological and policy implications of big data created as more devices are enabled with intelligence to sense and communicate to other machines (M2M) or humans (M2H).  I have three observations to share with you. 

First, we need to think differently about decision-making and time. Data analytics give us the opportunity to time-shift decisions, because sophisticated analyses can be used in predictive and proactive decision-making. In the Smart Grid world, we recognize that energy storage allows us to time-shift generation. We can also time-shift electricity consumption through demand response and dynamic pricing programs that encourage or reward use at off-peak times. A smart city working with data to predict traffic patterns could enable automated, real-time traffic congestion management instead of reactive activities. Smart Grids and smart cities can reap significant benefits from data analytics, but humans have to imagine the possibilities of how big data can be harnessed to really improve infrastructure management. And the most insightful information derived from analytics is worthless if humans fail to act on it.

The second takeaway is that the concept of privacy has the same plasticity we see in our concepts of personal space. Definitions of personal proximity vary with several factors, including culture and relationships. Privacy, and especially data privacy, has similar plasticity in terms of how much and what type of data we intentionally share with friends, family, acquaintances, businesses, and governmental entities. There is a real need for well-articulated roles, responsibilities, and benefits of data creation and use, as well as clear explanations of the perils of unintentional data sharing.

Finally, we need a new law that helps us frame expectations around data. Data can be created by machines or by humans. Data travels across networks to destinations, and may be transformed (anonymized), analyzed (correlated with other data), or stored in multiple locations. Moore's Law states that processing power approximately doubles every 24 months. Metcalfe's Law says that the value of a network grows as the square of the number of its users. We are missing a similar law that frames our expectations about data volumes and the privacy and security of that data.
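For reference, the two laws mentioned above can be written as simple functions; the base values here are illustrative, since both laws describe relative growth rather than absolute quantities:

```python
# The two growth laws, written as simple functions; base values are
# illustrative, as both laws describe relative rather than absolute growth.
def moore(years, base=1.0, doubling_period=2.0):
    """Relative processing power after `years`, doubling every `doubling_period` years."""
    return base * 2 ** (years / doubling_period)

def metcalfe(users):
    """Relative network value, proportional to the square of the user count."""
    return users ** 2

print(moore(4), metcalfe(10))  # 4.0 100
```

A hypothetical "law of data" would need to capture both kinds of growth at once: volumes that compound like Moore's Law, and privacy exposure that multiplies, like Metcalfe's value, with every new connection between data sets.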

We need big data from machines and humans, and holistic solution design methodologies to optimize our designs, developments, and management of smart cities and Smart Grids. But we also need to increase our understanding of the promises and perils of data use in both types of complex infrastructure. What are your thoughts about the equivalent of a Moore's Law for data?


Four Smart Grid Innovations that are Utility Painkillers

The electricity value chain of generation, transmission, distribution, and consumption has a number of challenges, or pain points, to overcome before all Smart Grid benefits can be delivered. Electric utility Smart Grid investment decisions are made on the basis of what reduces or eliminates pain. There is no one-size-fits-all answer in terms of innovations that are true painkillers, but here are a few that are most likely to be adopted by utilities.

The most significant business driver for investor-owned utilities (IOUs) is a very simple one: increasing revenues. That's particularly difficult amid an overall trend of declining electricity use, caused by energy efficiency gains and the lingering effects of the Great Recession. Electric vehicles (EVs) can be significant painkillers for utilities, creating increased reliance on electricity as the new transport fuel and thus increased revenues. EVs may also play a role in balancing load, helping match the amount of electricity supplied on the grid to demand, another challenge for utilities. Utilities are well served to enthusiastically support EV rollouts in their service territories.

IOUs are also very focused on the reliability of the electricity they deliver. Reliable delivery affects the amount of revenue collected and can influence regulatory decisions about rate increases or funding for specific projects. Rate increases are not likely when consumers and their regulatory agencies are upset about frequent or prolonged outages. Municipal and cooperative utilities are focused on keeping their citizens/members happy with inexpensive and continuous electricity supplies, so solutions that increase reliability are important to them as well.

One of the greatest benefits of the transformation to a Smart Grid is improved reliability. Replacing aging equipment with devices that can be remotely monitored and/or controlled has enormous implications for reducing the frequency and duration of outages. New transformer technologies under development will improve reliability and thus be painkillers for utilities. Just imagine what a smart transformer or two could have done for the December 19, 2011 football game between the San Francisco 49ers and the Pittsburgh Steelers: diagnostic messages would have indicated an imminent failure and spurred proactive maintenance procedures. While smart transformers are still a few years away from commercialization, they will be key painkillers by reducing the number of outages that occur in distribution grids.

These technologies rely, however, on extremely robust and reliable communications networks that overlay the power grid and deliver the bi-directional signals that make the grid a Smart Grid. These networks include combinations of wireline or fiber connections; microwave networks; public cellular carrier networks; licensed and unlicensed spectrum networks that carry smart meter communications; and even satellite links for a range of grid operational needs. Communications networks that experience congestion, delivery delays, or poor service quality could negate the value of many Smart Grid initiatives. In other words, communications networks are mission-critical components of the Smart Grid.

Two closely related technologies focus on maintaining the health of utility communications networks. Network management systems (NMS) are software applications that provide unified views of communications networks. NMS solutions eliminate siloed management of these networks and deliver levels of situational awareness that wouldn't otherwise exist across them. Improved visibility into network performance helps utilities diagnose problems and dispatch the right responses to eliminate or minimize their impacts. These holistic views help maintain communications network reliability and security, and reduce overall operations costs.

Closely tied to NMS, advanced data analytics solutions that correlate massive amounts of structured and unstructured data (big data) from NMS solutions and grid operations provide rich information to optimize network management. For instance, collecting, normalizing, and analyzing the big data involved with network routing, reliability, and performance can identify patterns and trends that serve as early indicators of pending problems. Analytics solutions enhance NMS solutions by putting precision into decisions, thereby reducing network management costs and improving network reliability for complex utility communications networks.
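As a minimal sketch of this kind of early-warning analytic, the snippet below flags latency samples that deviate sharply from the recent baseline. The window size, threshold, and sample data are illustrative assumptions, not parameters from any NMS product:

```python
# A minimal sketch of an early-warning analytic on network performance data:
# flag latency samples that deviate sharply from the recent baseline.
# Window size, threshold, and sample data are illustrative assumptions.
from statistics import mean, stdev

def flag_anomalies(latencies_ms, window=5, threshold=3.0):
    """Return indices whose value exceeds `threshold` standard deviations
    above the mean of the preceding `window` samples."""
    flagged = []
    for i in range(window, len(latencies_ms)):
        baseline = latencies_ms[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (latencies_ms[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged

samples = [20, 21, 19, 20, 22, 21, 20, 95, 21, 20]
print(flag_anomalies(samples))  # [7]
```

Production analytics correlate many such signals across routing, reliability, and performance data, but the principle is the same: establish a baseline, then surface deviations before they become outages.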

Unlike smart transformer technologies, proven NMS and advanced data analytics solutions are available today. These painkilling solutions are already deployed to manage and optimize carrier-class telecommunications networks, which share a number of characteristics with utility communications networks, including scalability, flexibility, and complexity. Utilities will be well served to look to experienced solution providers from the communications sector to address their communications network management challenges.



Can Utilities Address Organizational Challenges to Achieve Successful Data Analytics Deployments?

Data analytics solutions will be prominent tools for managing Smart Grid networks, both power and communications, at the distribution level and at the grid edge. This is one of the common conclusions from recent interviews with two companies, Aclara and PreClarity Utilities. Whether the data is analyzed to develop the most acceptable Demand Response (DR) programs or to identify chokepoints on a network, analytics solutions can help utilities reduce costs, optimize existing assets to postpone new asset investments, and design programs that appeal to consumer segments. Aclara builds networks to collect data for utilities and then manages that data through software applications. Some of the data feeds back-office functions like billing or engineering, but Aclara also offers a web portal for end users to view their individual consumption data. PreClarity Utilities delivers advanced data analytics solutions used by utilities and other companies to manage "big data," the large volumes of data from meters and other assets in a distribution network, for a range of uses.

In separate interviews, Aclara, represented by Andy Zetlan, VP of Product Management, and PreClarity, represented by Bob Becklund, Co-founder, agreed that utilities face a couple of related challenges in intelligently incorporating analytics into operations at the distribution and consumption points of the electricity supply chain. The first has to do with the composition of organizations, and the second with approaches to problem solving. How these challenges are addressed can have profound implications for the success of analytics solutions and any other Smart Grid initiative.

Utilities are commonly described as siloed organizations, meaning that departments like operations, engineering, marketing, or regulatory relations work very independently of each other. The introduction of communications networks to transmit the data that smart meters can deliver, also known as Advanced Metering Infrastructure* (AMI) networks, creates challenges for siloed organizations because different groups have different expectations and requirements, Andy Zetlan noted. For instance, the group responsible for billing needs a network that reliably delivers large volumes of meter data, although not necessarily at near real-time speeds. An operations center may need relatively small streams of data with little tolerance for latency or transmission delays. The challenge is building cost-effective communications networks in the distribution grid that can adequately satisfy both needs. Utilities that bring all groups together to document their use cases, what they need the communications network to do, are taking the first steps toward reducing, if not removing, those silos.
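One way to make such use cases concrete is to record each group's volume and latency needs explicitly and derive the network targets from the strictest of them. The structure and figures below are hypothetical, intended only to show the idea:

```python
# Hypothetical sketch of documenting use cases as explicit network
# requirements; the names and figures are illustrative, not from any utility.
from dataclasses import dataclass

@dataclass
class UseCase:
    name: str
    daily_records: int    # expected data volume per day
    max_latency_s: float  # tolerable delivery delay in seconds

use_cases = [
    UseCase("billing interval reads", daily_records=96_000_000, max_latency_s=3600.0),
    UseCase("outage notifications", daily_records=10_000, max_latency_s=2.0),
]

# A shared network must satisfy the tightest latency and the largest volume.
required_latency_s = min(uc.max_latency_s for uc in use_cases)
required_daily_records = max(uc.daily_records for uc in use_cases)
print(required_latency_s, required_daily_records)  # 2.0 96000000
```

Writing requirements down this way forces the silos to reconcile their needs in one place, which is exactly the first step toward shared infrastructure.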

Similarly, once the communications networks are in place, utilities need to determine the value of data in order to build the proper analytics tools, and use cases are again a great way to define requirements. Some data may have no value, and other data may be immensely useful. Marketing will find data that helps predict the profiles of early adopters of DR programs extremely valuable, but this data doesn't need to be delivered instantaneously. On the flip side, the operations group might identify a combination of asset data with Geographic Information Systems (GIS) or spatial relationship data that helps it respond to outages faster, which warrants a higher priority for that data in terms of how it is loaded and stored for analysis.

The related challenge lies in how utilities solve problems. Some aspects of utility operations are unique to electric, gas, or water utilities, but when it comes to communications networks and data analytics, there's a lot to be said for learning from the experiences of other business sectors. Telecom companies have experimented with everything from flat-rate pricing to application-specific pricing, and companies like PreClarity have experience designing analytics solutions that help regulatory and marketing groups determine the right pricing designs and programs. Putting a different spin on consumer segmentation, Bob Becklund pointed out that the cable industry is similar to utilities in that what was once a uni-directional service (entertainment rather than electricity) is now transforming into bi-directional flows of communications or electricity. That forces changes in how these companies relate to consumers, and in what they need to know about them via analytics.

Both PreClarity and Aclara also voiced similar themes about the need to normalize data, which means synchronizing data to a common reference so it can be used in the correct context. Time-skewed data creates false relationships and erroneous conclusions that can undermine the value that correctly normalized data provides to different utility users.
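A minimal sketch of this idea is to bucket readings from different sources onto a common time interval before correlating them. The interval and sample readings below are illustrative, not from either vendor's product:

```python
# Minimal sketch of time normalization: bucket readings from two sources
# onto a common interval so they can be compared in the same context.
# The interval and sample readings are illustrative.
from collections import defaultdict

INTERVAL_S = 900  # 15-minute buckets, in seconds

def normalize(readings):
    """Map (unix_time, value) pairs to interval-start timestamps, averaging
    values that land in the same bucket."""
    buckets = defaultdict(list)
    for t, value in readings:
        buckets[t - t % INTERVAL_S].append(value)
    return {t: sum(vs) / len(vs) for t, vs in buckets.items()}

meter = normalize([(1000, 5.0), (1400, 7.0)])  # both land in the 900 bucket
scada = normalize([(1100, 230.0)])
common = sorted(meter.keys() & scada.keys())   # timestamps safe to correlate
print(common)  # [900]
```

Only the bucketed timestamps that both sources share are safe to join; comparing raw, unaligned timestamps is exactly how the time-skewed false relationships described above arise.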

Two different companies, but the same message: utilities best serve themselves and their customers by finding new ways of doing business, starting with addressing siloed operations and looking to other industries for parallel examples of how data analytics can improve operations, reduce costs, and improve consumer relationships.

*You can find a definition of AMI in the Smart Grid Dictionary 3rd Edition.