Can data kill your pain? The city of Los Angeles, California is hoping it can, at least where some data sources are concerned. Back in May, the city launched a new DataLA site that features data downloads on topics such as crime statistics and budget information, as well as easy-to-understand visualizations of key metrics at a separate portal called PerformanceLAcity. A June hackathon encouraged developers to take these datasets and create solutions that improve city life. Projects focused on affordable housing, public transit, and, spurred by a devastating statewide drought, apps to report water waste.
Code for America has similar objectives to enhance the quality of civic life on a broader landscape, having organized hackathons in over 130 US cities so far. Its fourth annual Summit took place this past week in San Francisco. The non-profit organization places software developers, user interface designers, and data enthusiasts on projects to re-imagine, re-think, and redesign existing processes to optimize productivity, experiences, and satisfaction.
For many cities around the world, one of the most intractable problems is traffic congestion. It’s certainly one of the biggest problems for LA, where 65% of commuters are solo travelers. This sprawling metropolis, which installed the world’s first traffic lights in 1924, has ambitious hopes for innovative solutions based on its traffic data.
The data is collected by ATSAC (Automated Traffic Surveillance and Control) and city parking management systems. ATSAC, first rolled out to manage signal timing on the streets surrounding venues used for the 1984 Olympic Games, is now implemented citywide at over 4400 signalized intersections. Street sensors monitor vehicle passage, speed, and congestion in one-second increments. This realtime data delivers situational awareness to the ATSAC operations center, which adjusts traffic signal timings to reduce congestion. The ATSAC system has a number of measurable benefits, most notably reductions in travel times, CO2 emissions, and fuel use. Any concomitant reduction in road rage hasn’t been tracked, but that’s not as easy to measure. On September 22, the city published an RFI (Request for Information) focused on that realtime ATSAC data. The objective is to learn who is interested in this data and what new information and valuable services can be derived from it.
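As a rough illustration of how per-second sensor feeds can drive signal decisions, here is a minimal sketch; the field names and the free-flow speed are hypothetical assumptions, not the actual ATSAC schema.

```python
from statistics import mean

def congestion_score(readings, free_flow_mph=35.0):
    """Map per-second speed samples from a street sensor onto a 0 (free-flowing)
    to 1 (gridlock) scale by comparing average speed to an assumed free-flow speed."""
    speeds = [r["speed_mph"] for r in readings if r.get("speed_mph") is not None]
    if not speeds:
        return 0.0  # no vehicles detected: treat as uncongested
    avg_speed = mean(speeds)
    return max(0.0, min(1.0, 1.0 - avg_speed / free_flow_mph))

# Four one-second samples during heavy traffic
samples = [{"speed_mph": s} for s in (12, 9, 15, 11)]
score = congestion_score(samples)  # low speeds produce a high score
```

An operations center could use a score like this, aggregated per intersection, to decide which signal timing plans to adjust first.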
Imagine if electric and water utilities operated this way. If meter data, properly anonymized and aggregated into data sets to protect privacy, were available for hackathons, more feasible solutions for residential rentals and multi-family housing might pop up – two markets sorely underserved by existing home energy management applications. The federal Green Button initiative has sponsored and participated in hackathons, most recently an event in August in San Francisco, and in September at the KTH Royal Institute of Technology in Stockholm, Sweden. Kudos to the organizers, sponsors, and participants of these hackathons that take existing energy data sets and create new applications to address the event challenges. It would be very interesting to see utilities get engaged in hackathons. One starting point would be to consider what types of data and data sets could be made available to answer a wide range of their challenges.
Leaders engaged in smart city initiatives acknowledge that they don’t have all the answers when it comes to data manipulation and analysis, and welcome outside help via hackathons to optimize infrastructure, enhance services, and improve civic life. Some of that data impacts what they would consider mission-critical operations. Could similar activities help utilities engaged in Smart Grid initiatives ensure that they are getting the most from their data? There’s no doubt that plenty of pain points could potentially be addressed with intelligent data visualizations and analytics. Perhaps expanding the pool of solution contributors, with appropriate controls for security and privacy considerations, could accelerate development and deployment of utility painkillers.
Common industry consensus holds that Smart Grid technologies and policies will enable utilities to deploy new products and services in the pursuit of safe, reliable, and cost-effective electricity. Aspects of utility operations will become easier – such as identifying outage locations using smart meters or preventing service disruptions with closer monitoring of equipment conditions. But as far as the customer service organization is concerned, the Smart Grid means business as usual. Utilities will adjust to include social media channels for inbound and outbound communications along with traditional voice, email, and webchat, but that will be the extent of change.
But this industry consensus lacks vision about the revolutionary rise of the prosumer, as noted here and here. The traditional utility/customer relationship is changing as electricity consumers become electricity prosumers – producing kilowatts and/or negawatts through Smart Grid enabling technologies like distributed energy resources (DER), long-term energy efficiency decisions, or short-term demand response actions. Utility customer care organizations must change to accommodate this disruptive shift from ratepayer relationships to interactions with consumers who have choices – and who represent new value for utilities.
What are these choices and what is the utility value? The coming changes are not limited to fully deregulated states where consumers can switch retail energy providers at will. Consumers in regulated states have choices with important consequences for utilities too. They may choose whether or not to participate in a demand response program, or to install solar panels on their rooftops and switch to net metering or a feed-in tariff (FiT). Consumers become prosumers when they make these choices, and their decisions have value for utilities in the form of kilowatts or negawatts.
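The financial difference between those two rooftop-solar choices is easy to sketch. The comparison below is deliberately simplified – it ignores tiered rates, fixed charges, and true-up periods – and the rates are illustrative, not any utility's actual tariff.

```python
def bill_net_metering(consumed_kwh, generated_kwh, retail_rate):
    """Net metering: rooftop generation offsets consumption at the retail rate."""
    return (consumed_kwh - generated_kwh) * retail_rate

def bill_feed_in_tariff(consumed_kwh, generated_kwh, retail_rate, fit_rate):
    """Feed-in tariff: all consumption is billed at retail, and all generation
    is sold to the utility at the (typically different) FiT rate."""
    return consumed_kwh * retail_rate - generated_kwh * fit_rate

# A month with 600 kWh consumed and 400 kWh generated, with illustrative rates
nem = bill_net_metering(600, 400, retail_rate=0.18)
fit = bill_feed_in_tariff(600, 400, retail_rate=0.18, fit_rate=0.12)
```

With these assumed numbers, net metering yields the lower bill because generation is credited at the retail rate rather than the lower FiT rate – exactly the kind of choice that turns a ratepayer into a prosumer.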
These changes are occurring as forward-thinking utilities and regulators acknowledge that the utility business model itself will change, as recently published in the New York Public Service Commission’s Reforming the Energy Vision staff report. Business model changes portend enormously important roles for contact centers as the strategic focal point for prosumer care, rather than traditional consumer care. Within the next 5 years, a growing number of utilities will be permitted to offer new services that produce new revenue streams beyond basic electricity sales. For instance, new services may allow utilities to leverage customer-side DER assets. Such services can create new lifetime consumer value for utilities that goes well beyond simple electricity sales.
The important role that prosumer care operations play in utilities is magnified as these operations transform into revenue centers rather than remain cost centers. Prosumer care operations deliver the critical sales functions for utilities as their business models change. They help introduce new services and develop new revenue streams, and they most definitely will compete against new energy services market entrants. The utilities best positioned to build prosumer care operations are those that plan now to invest in and transition to consumer-centric operations.
Consumer-centric operations can help transform utilities into trusted advisors on energy matters for consumers. Trust builds loyalty, and helps avoid intermediation or dissolution of existing utility/consumer relationships by third parties such as Comcast, Verizon, or other new entrants in residential energy services. But a consumer focus is just part of the strategy to achieve prosumer care operations. Consumer value is also redefined for utilities, and has to be considered on a lifetime basis: the sophisticated sum total of a utility’s electricity transactions with a consumer (bidirectional sales and purchases) as well as investment requirements and investment postponements.
Lifetime consumer value calculations comprise a data analytics convergence of utility operational grid data with meter, CRM, and other traditionally backoffice data into the prosumer care operations center. It’s another important convergence brought about by Smart Grid technologies and policies. Unlike the typical utility IT/OT convergence that is evolutionary, this one is truly revolutionary because it enables a prosumer relationship that doesn’t exist in any other business. Among all the Smart Grid changes in the utility sector, this one will have the most direct impacts on consumers as their relationships transition into prosumer interactions. Utilities are well-advised to prepare for that.
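At its core, that lifetime calculation can be expressed very simply. The sketch below is a hypothetical formulation of the idea, not an industry-standard formula; a real model would discount cash flows and weight risk.

```python
def lifetime_consumer_value(sales_by_year, purchases_by_year, deferred_capex):
    """Hypothetical lifetime value of a prosumer to a utility: revenue from
    electricity sold to the consumer, minus payments for energy bought back
    (e.g. net-metered solar), plus the value of grid investments the consumer's
    resources allow the utility to postpone. All figures in dollars."""
    return sum(sales_by_year) - sum(purchases_by_year) + sum(deferred_capex)

# Ten years of $1,200 in annual sales, $300 in annual buy-backs,
# and one $500 deferred distribution upgrade
value = lifetime_consumer_value([1200] * 10, [300] * 10, [500])
```

The notable point is the third term: a prosumer whose behavior postpones a capital investment contributes value that never appears on a traditional bill.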
Rodney Dangerfield had a great line about how he went to the fights and a hockey game broke out. It was a concise and witty commentary on the frequency of bench-emptying fist fights between hockey teams. We’re going to witness a similar trend in energy storage conferences. Many will transform into data analytics events.
Energy storage has extraordinary potential to transform electric utility operations and business models. It has many moving parts – almost daily announcements tout new breakthroughs in chemistries that improve energy density, safety, and the number of roundtrip charge/discharge cycles.
Some moving parts of the energy storage market will mirror trends and characteristics of solar energy. Energy storage breakthroughs will drive rapid cost decreases like those witnessed in solar technology production and deployment. Privately-owned energy storage, like rooftop solar, will create whole new businesses focused on deployment and maintenance of batteries, along with local jobs that require skilled workers. Energy storage solutions will also challenge utilities and regulators – another example of how technology outpaces policy. Just as net metering and feed-in tariffs were created for distributed solar photovoltaic (PV) installations, special tariffs will be created for energy storage assets that are not owned by utilities.
There is added complexity to energy storage tariffs insofar as batteries can deliver a variety of different services to utilities, ranging from ancillary services to keeping the lights on for specific periods during grid service disruptions. While solar panels give their owners opportunities to reduce purchases of electricity from a utility or sell power back to it, energy storage has a greater variety of potential business cases. In addition to firming renewable sources of generation, owners of energy storage assets will have new opportunities to transact with utilities or with energy services aggregators. One basis for these transactions may be placing distributed energy storage assets alongside solar generation to enable greater participation in demand response programs. Today, demand response usually means a business must modify its operations to accommodate reductions in energy use. For some businesses that may not be feasible, because they lack flexibility in their use of energy or don’t want to invest in management systems that support automated responses to DR events. But if a business can rely on energy storage to take up the slack for demand response or other utility requests for capacity, voltage, or frequency responses, then it can earn money leveraging its energy storage assets.
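A back-of-the-envelope sketch shows why that trade can pencil out. The function and rates below are hypothetical; actual demand response compensation varies widely by program and market.

```python
def dr_event_revenue(committed_kw, event_hours, payment_per_kwh, battery_kwh):
    """Revenue from serving a demand response commitment out of on-site storage
    instead of curtailing operations. Raises if the battery can't cover it."""
    needed_kwh = committed_kw * event_hours
    if needed_kwh > battery_kwh:
        raise ValueError("battery cannot cover the committed reduction")
    return needed_kwh * payment_per_kwh

# A 50 kW reduction over a 2-hour event at an assumed $0.50/kWh,
# backed by a 200 kWh battery
revenue = dr_event_revenue(50, 2, 0.50, 200)
```

The business keeps operating normally; the battery, not the production line, delivers the committed reduction.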
And that’s how a funny thing happened when I attended a recent Agrion event about energy storage. A data analytics event broke out. In order for companies to participate in these new types of transactions, there’s a significant amount of data that must be gathered and analyzed. One company, Bosch Energy Storage Solutions, has focused on developing complex software analytics and algorithms to define customer power needs. They are building data sets to forecast energy use patterns based on a number of historical and realtime data sources. They look 24 hours ahead to anticipate energy use, and review the past week’s and the past year’s data to understand historical patterns of use, in order to formulate the best energy storage management decisions. Another company, Stem, created a software platform that uses predictive algorithms to manage individual and aggregated energy storage systems to perform load reduction in conjunction with utilities’ operational needs. They collect usage data at a higher resolution than what is gathered by smart meters and combine this data with tariff information to build reliable financial models and determine the optimal times to charge and discharge batteries.
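A greatly simplified version of the charge/discharge decision can be sketched as tariff arbitrage: charge in the cheapest hours, discharge in the priciest. To be clear, this is not Bosch's or Stem's actual algorithm (both rely on proprietary predictive models); it only illustrates the underlying idea.

```python
def schedule_battery(hourly_prices, capacity_kwh, power_kw):
    """Greedy tariff-arbitrage sketch: mark the cheapest hours for charging and
    the priciest for discharging, within the battery's capacity/power limits."""
    n = int(capacity_kwh // power_kw)  # hours for a full charge or discharge
    plan = ["idle"] * len(hourly_prices)
    if n == 0:
        return plan
    by_price = sorted(range(len(hourly_prices)), key=lambda h: hourly_prices[h])
    for h in by_price[:n]:
        plan[h] = "charge"
    for h in by_price[-n:]:
        plan[h] = "discharge"
    return plan

# Six hourly prices ($/kWh) for a 10 kWh battery that charges/discharges at 5 kW
plan = schedule_battery([0.08, 0.09, 0.20, 0.22, 0.10, 0.21],
                        capacity_kwh=10, power_kw=5)
```

Real platforms must also forecast tomorrow's load and prices rather than assume they are known, which is exactly why the data gathering described above matters so much.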
These are two examples where data analytics applications help run energy storage as cost-effectively as possible, and help build the business cases for sales resources to sell storage solutions to customers. There will be many more applications to follow that help energy storage solution purchasers determine which technologies and what storage applications make the most sense based on unique requirements. So don’t be surprised if you go to an energy storage event and discover that it’s a data analytics event too.
The convergence of information and communications technologies (ICT) with traditional operations technologies (OT) is an ongoing Smart Grid trend. Within the USA and its 3000+ electric utilities, Smart Grid investments have focused on optimizing transmission and distribution grid operations through machine-to-machine (M2M) communications and forays into data analytics for applications ranging from revenue assurance to voltage conservation.
This ICT/OT convergence trend is encouraging new entrants into the vendor ecosystem that supports electric, gas, and water utilities. One of the latest entrants is Dell Computers. Dell made two announcements in the past two months that illustrate how ICT companies are exploring Smart Grid market opportunities. 2013 will be the year to watch their strategies and progress.
Dell recently unveiled their Smart Grid Data Management Solution, which combines high-performance computing, networking, and storage to manage data for review and action in utility operations. Leveraging domain expertise and the PI System™ from OSIsoft, they developed and tested a reference architecture in a simulation environment that modeled a utility’s transmission grid operations. Transmission grids have been among the early beneficiaries of the Smart Grid through products called Phasor Measurement Units (PMUs), extremely high-speed monitors that sense changes in transmission conditions. Taking hundreds of measurements per second from multiple PMUs leads to quantities of data that challenge existing data storage practices in utilities. Dell’s solution, coupled with OSIsoft’s, provides faster updates and makes actionable data available to staff, applications, and business systems. It’s an excellent example of how M2M communications and data management technologies can become ubiquitous in the Smart Grid.
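Back-of-the-envelope arithmetic shows why PMU streams strain conventional storage. The sample rate below is a typical synchrophasor reporting rate; the fleet size and bytes per sample are illustrative assumptions, not figures from the Dell/OSIsoft announcement.

```python
samples_per_second = 120   # a common synchrophasor reporting rate
pmus = 500                 # hypothetical transmission-grid PMU fleet
bytes_per_sample = 64      # phasors, frequency, timestamp, quality flags

seconds_per_day = 86_400
daily_bytes = samples_per_second * pmus * bytes_per_sample * seconds_per_day
daily_gb = daily_bytes / 1e9  # hundreds of gigabytes per day, every day
```

Even with these modest assumptions the fleet produces over 300 GB per day, which is why reference architectures for fast ingest and retrieval matter.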
This is a noteworthy collaboration between a traditional ICT vendor (Dell) and a traditional OT vendor (OSIsoft) that is focused on grid operations. But Dell has also signaled its intent to get involved in the consumer side of the electricity value chain by joining the Pecan Street Inc. Advisory Board. Pecan Street is an energy and smart grid research and development organization, and serves as a living laboratory with a community microgrid characterized by residence-based solar generation, electric vehicles (EVs), energy efficiency and energy management solutions for homes. The project is conducting research in the brave new world of consumer/prosumer evolutions and their energy interactions through data analytics.
While the term “big data” is used in this project, its volumes are dwarfed by the volumes of data generated by today’s PMU deployments. Similarly, even if smart meters ever provide data to utilities at 15-minute intervals, that still would not constitute really big data, at least as analytics providers in financial services or telecommunications would define it. It’s more accurate to describe the Pecan Street project as one that offers horizontal complexity and scalability, as the many types of devices, with all their variations in hardware, firmware, and software, will need to be managed in addition to the networks that connect them. There aren’t too many analytics companies out there that can offer this expertise, and the best ones are proven performers in industry sectors outside of electric utilities.
However, Dell has proven abilities in data management, and they understand a thing or two about consumers after successfully building a competitive business that sells direct to them. So their moves into the Smart Grid sector portend more than a continuation of the ICT/OT convergence trend. They also highlight another trend: businesses experienced in consumer retail operations (others are Verizon and Comcast) engaging in exploratory activities to interact directly with electricity and water consumers. Traditional utilities may discover that their business models are disrupted more by this second trend than the first. Of course, the second trend is a riskier play, and it is too early to tell whether these new players will become intermediaries between consumers and utilities. It will be interesting to watch Dell in 2013 and see how these trends progress.
Installations of solar systems that generate electricity are coming down in price due to materials innovations and manufacturing efficiencies. But it’s been a challenge to wring costs from the part of the value chain focused on business origination – identifying and marketing to qualified opportunities, conducting site assessments and system design, and financing the projects. It is labor- and time-intensive work. According to David Levine, CEO of Geostellar, a solar installer can expend an average of $4000 per deal on these activities. For the average 4kW home, that works out to about $1/watt, or 25% of the total installed price. His company has an innovative answer that harnesses the power of data to improve operating efficiencies for business origination activities.
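Those figures are easy to check; the sketch below just reproduces the arithmetic behind the $1/watt and 25% claims.

```python
origination_cost = 4000   # dollars per deal, per Geostellar's CEO
system_size_w = 4000      # average 4 kW residential system, in watts

cost_per_watt = origination_cost / system_size_w        # $1.00 per watt
origination_share = 0.25                                # stated share of installed price
implied_installed_per_watt = cost_per_watt / origination_share  # implied $4/watt system
```

The implied ~$4/watt total installed price is consistent with residential solar pricing at the time, which lends credibility to the origination-cost claim.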
Residential and commercial solar installations at the sales proposal stage require data about the best rooftop or ground potential for harvesting solar energy, coupled with detailed knowledge about utility rate structures, load profiles, and incentives like renewable energy credits (RECs) and net-metering and/or Feed-In Tariffs (FiTs). These proposals need data about local zoning ordinances, and the impacts of vegetation and structures such as chimneys on overall solar productivity. This data is extremely siloed, meaning it exists in a variety of unrelated information sources and is difficult to aggregate and ensure overall consistency.
Geostellar uses advanced data analytics that include proprietary predictive algorithms to simulate the sun in 15 minute increments that account for atmospherics, slope and orientation of rooftops or ground, and shadowing from obstructions. Their geomatics (geographically-referenced data analytics) solution estimates the solar energy production potential for any site. The solution breaks down the silos between various sources of rates, building codes, and financial data to calculate internal rates of return and prioritize sites for marketing or deployment. And most importantly for solar installers, the solution delivers a 75% drop in the origination costs to about $1000 per home.
That’s good news for states like California, which just increased the cap for net metered solar generation and whose investor-owned utilities have aggressive renewables generation targets to meet by 2020. It’s also good news for property owners (individuals to REITs), utilities, and companies that install or finance solar equipment. Cost reductions make solar more feasible for a larger number of properties. And not just solar technologies for generation of electricity. Geostellar’s solution has equal applicability for determining the best insolation and ROI potential for solar water heaters, solar attic fans, and solar pool heaters – energy efficiency and storage plays.
Knowledge is power, and that’s never been a truer statement than now as we see more innovative ways to apply advanced data analytics in the Smart Grid. This detailed knowledge helps consumers become prosumers, and the proliferation of these technologies in the distribution grid will drive demand for other Smart Grid-enabling technologies and services, such as community and residential energy storage and companies that serve as energy négociants.
Knowledge is also money. The Geostellar solution produces the sort of data that is the equivalent of the oil companies’ crown jewels – their maps of oil and gas potential. That’s one reason why they recently closed a Series B investment round led by NRG Energy. As quoted in the June 5 press release, NRG’s Denise Wilson, Executive Vice President of NRG Energy and President, Alternative Energy Services, stated that “Educating and informing the public on the solar value of their properties is an important first step in creating a thriving market for clean, green and renewable energy solutions.” That’s a Smart Grid impact that will be profoundly influential in transforming consumers into prosumers.