There’s been a significant amount of hype about the Smart Grid’s four Vs of data – volume, velocity, variety, and veracity. But these discussions miss a fifth V – the value of data. In the Age of the Prosumer, the value of data has profound implications for the utility sector in general, and for defining consumer and prosumer value in particular. To excel in this new world of real and digital energy service choices, utilities will have to develop prosumer-centric operations and manage the data that is most valuable to them and to their consumers and prosumers.
That’s not an easy task. Utilities are challenged to manage terabytes and petabytes of data with processes, tools, skillsets, and metrics designed for megabytes of data. As more devices become smart – capable of sensing and communicating status and accepting commands – the challenges to maintaining productive and cost-effective operations will grow. Utilities can’t afford to engage in traditional siloed methods of learning – they will have to look to other sectors for knowledge, best practices, and tools.
As noted previously, other business sectors have knowledge that can be leveraged to good effect by utilities to avoid reinventing wheels of discovery and education. Some sectors are adept at tailoring promotions for both brick-and-mortar and online purchases (aka omni-channel strategies), pushing discounts, loyalty program awards, and purchase suggestions. Their data expertise and best practice experiences can help utilities develop B2B2X marketing programs and definitions of consumer and prosumer value. The telecom sector has extensive expertise in segmentation and churn analytics. This sector also has standardized processes to ensure interoperable transactions with partners – knowledge that could be particularly useful to utilities to support the successful development and management of seamless digital energy services targeted to consumers and prosumers.
The good news is that an organization called TM Forum collects and manages this repository of knowledge, and it is available to utilities through the Forum’s Smart Energy program. TM Forum provides a neutral, open, and structured venue for collaboration between service providers and their vendors. Its goals are to reduce costs and risks, ease system and process integrations, and improve business and information agility. Members utilize consensus-built tools, metrics, and best practices to accelerate their initiatives in network operations modeling, customer experience management (CEM), and data architecture strategies.
For example, a tool called the Business Process Framework is a proven blueprint for enabling successful business transformations – something that can be extremely helpful for utilities as they revamp and restructure their operations to accommodate all the new data generated by Smart Grid solutions. What’s more, TM Forum supports projects called Catalysts to explore how their tools can be applied to different business sectors and their unique challenges.
There’s an ongoing Smart Energy Catalyst that has already demonstrated large-scale integration points and digital handshakes necessary to connect utility grid and back-office operations and concomitant applications to support digital energy services for consumers. Participants in this low-risk proof of concept project include utilities (BC Hydro, Hydro-Quebec, and Salzburg AG) and solution providers (Esri, Infonova, and BaseN, among others).
Learning by doing in the collaborative environment enabled by a Smart Energy Catalyst is a great first step for utilities to build expertise in data that supports strategic objectives. Getting the five Vs of data right will be a critical success factor for utilities as they build prosumer-centric operations and properly define prosumer value.
Disclaimer: TM Forum is an SGL Partners client.
Can data kill your pain? The city of Los Angeles, California, is hoping it can, at least where some data sources are concerned. Back in May, the city launched a new DataLA site that features data downloads on topics such as crime statistics and budget information, as well as easy-to-understand visualizations of key metrics at a separate portal called PerformanceLAcity. A June hackathon encouraged developers to take these datasets and create solutions that improve city life. Projects focused on affordable housing, public transit, and – spurred by a devastating statewide drought – apps to report water waste.
Code for America has similar objectives to enhance the quality of civic life on a broader landscape, organizing hackathons in over 130 US cities so far. Its fourth annual Summit occurred this past week in San Francisco. The non-profit organization places software developers, user interface designers, and data enthusiasts into projects to re-imagine, re-think, and/or redesign existing processes to optimize productivity, experiences, and satisfaction.
For many cities around the world, one of the most intractable problems is traffic congestion. It’s certainly one of the biggest problems for LA, where 65% of commuters are solo travelers. This sprawling metropolis, which installed the world’s first traffic lights in 1924, has ambitious hopes for innovative solutions based on its traffic data.
The data is collected by ATSAC (Automated Traffic Surveillance and Control) and city parking management systems. ATSAC, first rolled out to manage signal timing on the streets surrounding venues used for the 1984 Olympic Games, is now implemented citywide at over 4400 intersections with traffic signals. Street sensors monitor vehicle passage, speed, and congestion in one-second increments. This realtime data delivers situational awareness to the ATSAC operations center, which adjusts traffic signal timings to reduce congestion. The ATSAC system has a number of measurable benefits, most notably in travel times, CO2 emissions, and fuel use. Any concomitant reductions in road rage haven’t been tracked – those are harder to measure. On September 22, the city published an RFI (Request for Information) focused on that realtime ATSAC data. The objective is to learn who is interested in this data and what new information and valuable services can be derived from it.
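To get a feel for the scale involved, here is a back-of-envelope estimate of ATSAC’s raw sensor feed. The intersection count comes from the article; the sensors-per-intersection and bytes-per-reading figures are assumptions chosen purely for illustration.

```python
# Rough estimate of ATSAC's daily raw sensor data volume.
INTERSECTIONS = 4400              # from the article: 4400+ signalized intersections
SENSORS_PER_INTERSECTION = 4      # assumption for illustration
BYTES_PER_READING = 16            # assumption: timestamp, sensor id, count, speed
READINGS_PER_DAY = 24 * 60 * 60   # one reading per second

daily_bytes = (INTERSECTIONS * SENSORS_PER_INTERSECTION
               * BYTES_PER_READING * READINGS_PER_DAY)
print(f"~{daily_bytes / 1e9:.1f} GB of raw readings per day")  # ~24.3 GB
```

Even with these modest assumptions, a single city’s signal network generates tens of gigabytes of raw readings daily – exactly the kind of stream the RFI hopes third parties can turn into services.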
Imagine if electric and water utilities operated this way. If meter data, properly anonymized and aggregated into data sets to protect privacy, were available for hackathons, more feasible solutions for residential rentals and multi-family housing might pop up – two markets sorely underserved by existing home energy management applications. The federal Green Button initiative has sponsored and participated in hackathons, most recently an event in August in San Francisco, and in September at the KTH Royal Institute of Technology in Stockholm, Sweden. Kudos to the organizers, sponsors, and participants of these hackathons that take existing energy data sets and create new applications to address the event challenges. It would be very interesting to see utilities get engaged in hackathons. One starting point would be to consider what types of data and data sets could be made available to answer a wide range of their challenges.
Leaders engaged in smart city initiatives acknowledge that they don’t have all the answers when it comes to data manipulation and analysis, and welcome outside help via hackathons to optimize infrastructure, enhance services, and improve civic life. Some of that data impacts what they would consider mission-critical operations. Could similar activities help utilities engaged in Smart Grid initiatives ensure that they are getting the most from their data? There’s no doubt that plenty of pain points could potentially be addressed with intelligent data visualizations and analytics. Perhaps expanding the pool of solution contributors, with appropriate controls for security and privacy considerations, could accelerate development and deployment of utility painkillers.
Common industry consensus holds that Smart Grid technologies and policies will enable utilities to deploy new products and services in the pursuit of safe, reliable, and cost-effective electricity. Aspects of utility operations will become easier – such as identifying outage locations using smart meters or preventing service disruptions with closer monitoring of equipment conditions. But as far as the customer service organization is concerned, the Smart Grid means business as usual. Utilities will adjust to include social media channels for inbound and outbound communications along with traditional voice, email, and webchat, but that will be the extent of change.
But this industry consensus lacks vision about the revolutionary rise of the prosumer, as noted in earlier posts. The traditional utility/customer relationship is changing as electricity consumers become electricity prosumers – producing kilowatts and/or negawatts through Smart Grid enabling technologies like distributed energy resources (DER), long-term energy efficiency decisions, or short-term demand response actions. Utility customer care organizations must change to accommodate these disruptive shifts from ratepayer relationships into interactions with consumers who have choices and offer new value to utilities.
What are these choices and what is the utility value? The coming changes are not limited to states that are fully deregulated and where consumers can switch retail energy providers at will. Consumers in regulated states have choices that have important consequences for utilities too. They may choose to participate or not in a demand response program or to install solar panels on their rooftops and switch to a net metering or feed-in tariff (FiT). Consumers become prosumers when they make these choices, and their decisions have value in the form of kilowatts or negawatts for utilities.
These changes are occurring as forward-thinking utilities and regulators acknowledge that the utility business model itself will change, as recently articulated in the New York Public Service Commission’s Reforming the Energy Vision staff report. Business model changes portend enormously important roles for contact centers as the strategic focal point for prosumer care, rather than traditional consumer care. Within the next five years, a growing number of utilities will be permitted to offer new services that produce new revenue streams beyond basic electricity sales. For instance, new services may allow utilities to leverage customer-side DER assets. Such services can create new lifetime consumer value for utilities that goes well beyond simple electricity sales.
The important role that prosumer care operations play in utilities is magnified as these operations transform into revenue centers rather than remaining cost centers. Prosumer care operations deliver the critical sales functions for utilities as their business models change. They help introduce new services and develop new revenue streams, and they most definitely will compete against new energy services market entrants. The utilities best positioned to build prosumer care operations are the ones that plan now to invest in and transition to consumer-centric operations.
Consumer-centric operations can help transform utilities into the trusted advisors on energy matters for consumers. Trust builds loyalty, and helps avoid intermediation or dissolution of existing utility/consumer relationships by third parties such as Comcast, Verizon, or other new entrants in residential energy services. But a consumer focus is just part of the strategy to achieve prosumer care operations. Consumer value is redefined for utilities and has to be considered on a lifetime basis of what a consumer means to a utility. It is a sophisticated sum total of a utility’s electricity transactions (bidirectional sales and purchases) as well as investment requirements and investment postponements.
Lifetime consumer value calculations require a data analytics convergence of utility operational grid data with meter, CRM, and other traditionally back-office data into the prosumer care operations center. It’s another important convergence brought about by Smart Grid technologies and policies. Unlike the typical utility IT/OT convergence that is evolutionary, this one is truly revolutionary because it enables a prosumer relationship that doesn’t exist in any other business. Among all the Smart Grid changes in the utility sector, this one will have the most direct impacts on consumers as their relationships transition into prosumer interactions. Utilities are well-advised to prepare for that.
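One hypothetical way to sketch the lifetime value idea described above – bidirectional electricity transactions plus the value of deferred investments, discounted over time. Every function name, field, and figure here is illustrative, not a utility’s actual methodology.

```python
# Hypothetical sketch of a lifetime prosumer value calculation: net of
# bidirectional electricity transactions plus deferred grid investment.
def lifetime_value(sales_to_prosumer, purchases_from_prosumer,
                   deferred_investment, annual_discount_rate=0.05):
    """Each list holds yearly dollar amounts; returns discounted total."""
    value = 0.0
    years = zip(sales_to_prosumer, purchases_from_prosumer, deferred_investment)
    for year, (sold, bought, deferred) in enumerate(years):
        discount = (1 + annual_discount_rate) ** year
        # Revenue from sales, minus payments for exported kilowatts, plus
        # capital spending the prosumer's assets let the utility postpone.
        value += (sold - bought + deferred) / discount
    return value

# Illustrative figures: a prosumer who buys less each year but whose
# rooftop solar helps defer a substation upgrade.
v = lifetime_value(sales_to_prosumer=[1200, 1100, 1000],
                   purchases_from_prosumer=[150, 200, 250],
                   deferred_investment=[0, 300, 300])
print(f"${v:,.2f}")
```

The point of the sketch is the data convergence it implies: the sales figures come from billing systems, the purchase figures from meter and DER data, and the deferred-investment figures from grid planning – three silos that rarely meet today.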
Rodney Dangerfield had a great line about how he went to the fights and a hockey game broke out. It was a concise and witty commentary on the frequency of bench-emptying fist fights between hockey teams. We’re going to witness a similar trend in energy storage conferences. Many will transform into data analytics events.
Energy storage has extraordinary potential to transform electric utility operations and business models. There are many moving parts to it – there are almost daily announcements about new breakthroughs in chemistries that improve energy density, safety, and the number of round-trip charge/discharge cycles.
Some moving parts of the energy storage market will mirror trends and characteristics of solar energy. Energy storage breakthroughs will drive rapid cost decreases like those witnessed in solar technology production and deployment. Privately-owned energy storage, like rooftop solar, will create whole new businesses that focus on deployment and maintenance of batteries, along with local jobs that require skilled workers. Energy storage solutions will also challenge utilities and regulators – another example of how technology outpaces policy. Just like net metering and feed-in tariffs were created for distributed solar photovoltaic (PV) installations, special tariffs will be created for energy storage assets that are not owned by utilities.
There is added complexity to energy storage tariffs insofar as batteries can deliver a variety of different services to utilities, ranging from ancillary services to keeping the lights on for specific periods of time during grid service disruptions. While solar panels constitute opportunities for their owners to reduce purchases of electricity from a utility or sell power back to it, energy storage has a greater variety of potential business cases. In addition to firming renewable sources of generation, owners of energy storage assets will have new opportunities to transact with utilities or with energy services aggregators. The basis of these transactions may be to place distributed energy storage assets alongside solar generation to enable greater participation in demand response programs. Today, demand response usually means a business must modify its operations to accommodate reductions in energy use. For some businesses, that may not be feasible because they lack flexibility in their use of energy, or don’t want to invest in management systems that support automated responses to DR events. But if a business can rely on energy storage to take up the slack for demand response or other utility requests for capacity, voltage, or frequency responses, then it can earn money leveraging its energy storage assets.
And that’s how a funny thing happened when I attended a recent Agrion event about energy storage. A data analytics event broke out. In order for companies to participate in these new types of transactions, there’s a significant amount of data that must be gathered and analyzed. One company, Bosch Energy Storage Solutions, has focused on development of complex software analytics and algorithms to define customer power needs. They are building data sets to forecast energy use patterns based on a number of historical and realtime data sources. They look 24 hours ahead to anticipate energy use as well as review the last week’s and the last year’s data to understand historical patterns of use in order to formulate the best energy storage management decisions. Another company, Stem, created a software platform that uses predictive algorithms to manage individual energy storage systems and aggregated storage systems to perform load reduction in conjunction with utilities’ operational needs. They collect granular usage data at a higher resolution than what is gathered by smart meters and combine this data with tariff information to build reliable financial models and determine the optimal times to discharge and charge batteries.
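At its core, the charge/discharge timing decision these platforms make is a price-arbitrage problem. Here is a deliberately toy sketch of that idea – charge in the cheapest forecast hours, discharge in the dearest – assuming only a day-ahead price vector. Real platforms like Stem’s weigh many more inputs (load forecasts, battery state of charge, degradation, DR events); nothing here reflects their actual algorithms.

```python
# Toy tariff-driven dispatch: given hourly price forecasts, charge the
# battery in the cheapest hours and discharge in the most expensive ones.
def plan_dispatch(hourly_prices, charge_hours, discharge_hours):
    """Return one action per hour: 'charge', 'discharge', or 'idle'."""
    order = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])
    cheapest = set(order[:charge_hours])        # lowest-price hours
    dearest = set(order[-discharge_hours:])     # highest-price hours
    return ['charge' if h in cheapest else
            'discharge' if h in dearest else 'idle'
            for h in range(len(hourly_prices))]

# Illustrative $/kWh prices for an 8-hour window:
prices = [0.08, 0.07, 0.09, 0.15, 0.22, 0.25, 0.18, 0.10]
plan = plan_dispatch(prices, charge_hours=2, discharge_hours=2)
print(plan)
# → ['charge', 'charge', 'idle', 'idle', 'discharge', 'discharge', 'idle', 'idle']
```

Even this toy version makes the data dependencies obvious: without granular usage data and tariff information, there is no price vector to optimize against – which is why a storage conference turns into a data analytics conference.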
These are two examples where data analytics applications help run energy storage as cost-effectively as possible, and help build the business cases for sales resources to sell storage solutions to customers. There will be many more applications to follow that help energy storage solution purchasers determine which technologies and what storage applications make the most sense based on unique requirements. So don’t be surprised if you go to an energy storage event and discover that it’s a data analytics event too.
The convergence of information and communications technologies (ICT) with traditional operations technologies (OT) is an ongoing Smart Grid trend. Within the USA and its 3000+ electric utilities, Smart Grid investments have focused on optimization of transmission and distribution grid operations through machine to machine (M2M) communications and forays into data analytics for applications ranging from revenue assurance to voltage conservation.
This ICT/OT convergence trend is encouraging new entrants into the vendor ecosystem that supports electric, gas, and water utilities. One of the latest entrants is Dell Computers. Dell made two announcements in the past two months that illustrate how ICT companies are exploring Smart Grid market opportunities. 2013 will be the year to watch their strategies and progress.
Dell recently unveiled their Smart Grid Data Management Solution, which combines high-performance computing, networking, and storage to manage data for review and action in utility operations. Leveraging domain expertise and the PI System™ from OSIsoft, they developed and tested a reference architecture in a simulation environment that modeled a utility’s transmission grid operations. Transmission grids have been one of the early beneficiaries of the Smart Grid through products called Phasor Measurement Units (PMUs), which are extremely high speed monitors that sense changes in transmission conditions. Taking hundreds of measurements per second from multiple PMUs leads to large quantities of data that challenge existing data storage practices in utilities. Dell’s infrastructure coupled with OSIsoft’s PI System provides faster updates and makes actionable data available to staff, applications, and business systems. It’s an excellent example of how M2M communications and data management technologies can become ubiquitous in the Smart Grid.
This is a noteworthy collaboration between a traditional ICT vendor (Dell) and a traditional OT vendor (OSIsoft) that is focused on grid operations. But Dell has also signaled its intent to get involved in the consumer side of the electricity value chain by joining the Pecan Street Inc. Advisory Board. Pecan Street is an energy and smart grid research and development organization, and serves as a living laboratory with a community microgrid characterized by residence-based solar generation, electric vehicles (EVs), energy efficiency and energy management solutions for homes. The project is conducting research in the brave new world of consumer/prosumer evolutions and their energy interactions through data analytics.
While the term “big data” is used in this project, its volumes are dwarfed by the volumes of data that are generated by today’s PMU deployments. Similarly, if smart meters ever provide data to utilities at 15-minute intervals, that still wouldn’t constitute really big data, at least as analytics providers in financial services or telecommunications would define it. It’s more accurate to describe the Pecan Street project as one that offers horizontal complexity and scalability, as the types of devices, with all their variations in hardware, firmware, and software, will need to be managed in addition to the networks that connect them. There aren’t too many analytics companies out there that can offer this expertise, and the best ones are proven performers in other industry sectors outside of electric utilities.
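A quick calculation shows why PMU volumes dwarf metering volumes. The PMU sampling rate reflects the “hundreds of measurements per second” mentioned above; the fleet sizes and record size are assumptions for illustration only.

```python
# Rough comparison of PMU versus smart-meter data volumes.
SAMPLE_BYTES = 64                          # assumed bytes per measurement record

# PMUs: assume 100 devices sampling 120 times per second (illustrative).
pmu_daily = 100 * 120 * 86400 * SAMPLE_BYTES

# Smart meters: one reading per 15 minutes; assume a 1-million-meter fleet.
meter_daily = 1_000_000 * (24 * 4) * SAMPLE_BYTES

print(f"PMU fleet:   ~{pmu_daily / 1e9:.1f} GB/day")
print(f"Meter fleet: ~{meter_daily / 1e9:.1f} GB/day")
print(f"Records per device per day: PMU {120 * 86400:,} vs meter {24 * 4}")
```

Under these assumptions, a hundred PMUs out-produce a million smart meters by roughly an order of magnitude in daily volume, and each PMU generates over ten million records a day versus 96 for a meter – velocity, not just volume, is what strains utility storage practices.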
However, Dell has proven abilities in the arena of data management, and they understand a thing or two about consumers after successfully building a competitive business that sells direct to them. So their moves into the Smart Grid sector portend more than a continuation of the ICT/OT convergence trend. They also highlight another trend – that of businesses (Verizon and Comcast are others) experienced in consumer retail operations engaging in exploratory activities to interact directly with electricity and water consumers. Traditional utilities may discover that their business models are disrupted more by this second trend than the first. Of course, this second trend is a riskier play, and it is too early to tell if these new players will become intermediaries between consumers and utilities. It will be interesting to watch Dell in 2013 and see how these trends progress.