This week’s guest author is Barry Haaser, Managing Director of the OpenADR Alliance. His article clarifies the role that this standard plays in a range of applications.
The OpenADR standard for automated demand response is often misunderstood as a narrow, single-purpose standard. In fact, it is a powerful standard capable of supporting a broad spectrum of applications that fall under the demand response umbrella. As the only global standard for demand response, OpenADR is uniquely positioned to address a multitude of load control and load management applications.
In an effort to help utilities and system operators create more demand response programs and further product development, the OpenADR Alliance created an OpenADR 2.0 Program Guide. This draft document defines typical automated demand response (ADR) programs and explains how they are implemented using OpenADR 2.0. The OpenADR Program Guide expands the range of demand response (DR) deployment scenarios available to energy providers, while giving equipment manufacturers additional information on typical DR Program usage models so they can support a full range of DR programs in their products.
The program guide provides utilities with examples of typical DR programs so that they can model their own DR program implementations, and equipment suppliers can understand typical DR Program usage models to help validate interoperability. The program guide provides templates for popular DR programs. These templates include:
- Critical Peak Pricing: This rate and/or price structure is designed to encourage reduced consumption during periods of high wholesale market prices or system contingencies by imposing a pre-set high price for a specific time period (such as 3pm – 6pm on a hot summer weekday).
- Capacity Bidding Program: This program is used by Independent System Operators (ISOs) and utilities to obtain pre-committed load shed capacity from aggregators or self-aggregated customers when they anticipate high wholesale market prices, power system emergency conditions, or as part of normal energy resource utilization by calling DR events during a specified time period.
- Residential Thermostat Program/Direct Load Control: In this demand response program, a utility or other energy service provider communicates with smart thermostats or directly controls enrolled customer loads, such as air conditioners. These programs are primarily offered to residential or light commercial customers.
- Fast DR Dispatch/Ancillary Services Program: Fast DR is used by ISOs and utilities to obtain pre-committed load response in real time. Resources are typically dispatched with a latency ranging from 10 minutes for resources used as reserves down to 2 seconds for resources used for regulation purposes.
- Electric Vehicle (EV) DR Program: This demand response activity adjusts the cost of charging electric vehicles to encourage consumers to shift their consumption patterns.
- Distributed Energy Resources (DER) DR Program: This demand response activity smooths the integration of distributed energy resources into the Smart Grid.
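The program templates above all ride on the same underlying event model: a signal with a program type, a time window, and a level or price. As a rough illustration only (real OpenADR 2.0 payloads are XML documents such as oadrDistributeEvent; the field names below are simplified stand-ins, not the actual schema), a critical peak pricing event like the 3pm – 6pm example might be modeled as:

```python
from datetime import datetime, timedelta

# Simplified, hypothetical representation of a demand response event.
# Real OpenADR 2.0 payloads are XML; these field names are illustrative
# stand-ins, not the literal OpenADR schema.
def make_dr_event(program, start, duration_hours, signal_level):
    """Build a minimal DR event for a given program template."""
    return {
        "program": program,
        "start": start.isoformat(),
        "end": (start + timedelta(hours=duration_hours)).isoformat(),
        # SIMPLE-style level: 0 = normal, 1 = moderate, 2 = high, 3 = special
        "signal_level": signal_level,
    }

# A critical peak pricing event for 3pm - 6pm on a hot summer weekday
event = make_dr_event(
    program="CRITICAL_PEAK_PRICING",
    start=datetime(2014, 7, 15, 15, 0),
    duration_hours=3,
    signal_level=2,
)
print(event["start"], "->", event["end"])  # 2014-07-15T15:00:00 -> 2014-07-15T18:00:00
```

The same structure covers the other templates by swapping the program name, timing, and signal, which is precisely why one standard can serve such a wide range of DR programs.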
This program guide just scratches the surface of the many programs that can be supported by the OpenADR standard. You can download the draft program guide and provide us with your feedback prior to publication this summer.
I made ten predictions in January 2014 about Smart Grid and Smart City trends and changes expected between 2014 and 2020. Here is an update on the final five predictions; the first five were reviewed last week. You can review the full predictions here and here, and judge for yourself the quality of my crystal ball.
6. Debates about the future of the social compact for electricity services and the socialization of electricity costs continue. New York's Reforming the Energy Vision initiative includes the objective to “enable and facilitate” new business models for utilities, customers, and energy service companies. This is just the first state activity that will generate significant discussion about how to equitably balance distribution grid investments that accommodate and integrate more distributed energy resources (DER). Since it will take time to implement new business models and then measure their results, this debate is sure to continue for the next decade.
7. EVs advance to 10% of the US car market. Electric vehicle (EV) penetration in 2013 was just over 0.5%. Falling gasoline prices are putting additional pressure on EV manufacturers to reduce prices of zero-emission vehicles to increase consumer adoption. However, utilities are now taking a more active role, as Edison Electric Institute members will start investing up to $50 million annually in EV service trucks and charging stations for consumers. The Department of Defense (DoD) is conducting pilots for vehicle-to-grid (V2G) applications. Its first smart charging demonstrations are exploring V2G performance, and it will also examine re-purposing used EV batteries for fixed energy storage.
8. Resiliency measures also become part of the definition of a smart building. A number of federal, state, and non-governmental initiatives address resiliency, and some critical infrastructure definitions include selected buildings. The National Institute of Standards and Technology (NIST) is developing standards guidance for community disaster resilience, but this is focused on building materials and codes. Microgrids, DER, and zero net energy codes and technologies can bridge the gap in existing resiliency initiatives for buildings. Microgrids already serve as resources to maintain power to critical infrastructure during emergencies – one of the goals of the Borrego Springs microgrid.
9. Nanotechnologies help propel solar harvesting efficiencies past the 50% mark, and by 2020 research scientists are aiming for 75% harvest efficiencies. The number of patents filed for innovations in nanotechnology using graphene has tripled in the past 10 years. The research pipeline contains single-molecule-thick sheets of graphene and molybdenum that can potentially provide 1,000 times more power per unit weight of material than current commercially available solar cells. The fabrication of flexible solar panels is on the horizon; panels that can be wrapped around curved or uneven surfaces, or reduced in scale, expand the possibilities for where solar can be deployed.
10. There’s sufficient electricity production from renewable energy sources that we no longer talk about “renewables.” American grid-connected wind turbines have a combined capacity of 60,000 MW, projected to double by 2020. Solar is enjoying explosive growth. Energy storage solutions will “firm up” the intermittency of wind and solar and thus eliminate the last objections to reliance on renewables. Renewable electricity will simply be a cheap and clean source of power without the price volatility of fossil fuels.
These final five predictions are well on their way to realization too, although the prediction about nanotechnology advances is admittedly a stretch goal. You’ll note that energy storage has a significant influence on the advancement of some of these predictions. We’ll keep tracking these predictions and bring you periodic updates.
How much can change in a year? When it comes to Smart Grid and Smart City topics, the answer is quite simply – a lot. Here’s a progress report on my ten predictions about Smart Grid and Smart Cities activity by 2020. The first five are featured this week. You can review the complete predictions here and here, and judge for yourself the quality of my crystal ball.
- California hits and exceeds its RPS objective of 33% renewable sources of electricity by 2020 – the most ambitious target of all states with this calendar deadline. As of October 2014, the state’s three investor-owned utilities (IOUs) obtained 22.7% of their electricity from renewables and are on track to meet the 25% milestone in 2016. The California Public Utilities Commission (CPUC) projects that solar alone will contribute 42% of the state’s total renewables generation. The state has about 245,000 rooftop solar PV systems installed now, and by 2017 the aggregated generation from these systems will approach 3,000 MW.
- Grid resiliency strategies take priority for investor-owned, municipal, and rural utilities. The Electric Power Research Institute (EPRI) has a number of initiatives in grid resiliency, and its clients are utilities. Governmental, commercial, and residential interests are building microgrids capable of delivering a limited degree of building self-sufficiency in energy. NYSERDA announced the first-in-the-nation NY Prize, a $40 million competition to build microgrids and other local energy grids. New Jersey launched the Energy Resilience Bank – the first public infrastructure bank in the country focused on DER for energy resiliency – capitalized with $200 million for projects that harden critical infrastructure. Utility support for microgrids is growing as utilities like Con Ed see that the Reforming the Energy Vision initiative presents an opportunity to redefine utility business models to accommodate new microgrid product and service offerings.
- As utilities consider grid hardening, cities redefine what being a smart city really means. Smart cities aren’t smart if their critical infrastructure relies on fragile transmission or distribution grids. Definitions abound for smart cities, but the lack of consistent, standardized frameworks is a serious obstacle to the development of smart cities. For some states, notably New York, Connecticut, and New Jersey (states hammered by Superstorm Sandy, among other weather events), a city is smart if it upgrades critical infrastructure and deploys distributed energy resources and microgrids for select community buildings and systems.
- Consumer intermediation threats abound for utilities. Investor guidance reports released earlier this year pointed out a number of threats to the existing regulated utility business model, and noted the potential for confrontations between tech giants (notably Google and Apple) and utilities over value-added services (specifically energy management services) for consumers. Consumers are becoming increasingly savvy about solar generation, and companies like SolarCity and Sungevity have capitalized on these trends to make it easy for consumers to build relationships with non-traditional energy companies.
- Standards that define how to integrate or grid-tie microgrids and other standalone generation and energy storage assets for bi-directional electricity flows to utility distribution grids are globally adopted. The existing IEEE 1547 standard currently used for DER such as solar PV requires that these assets be de-energized if they are tied to the grid and the grid loses power. While necessary as a safety measure, this requirement defeats the purpose of microgrids remaining energized to power critical infrastructure or to contribute meaningful power back to the grid. The Smart Grid Interoperability Panel (SGIP) started Priority Action Plan (PAP) 24 for microgrid operational interfaces. This PAP focuses on information models, interoperability, and consistency of the signals used by microgrid controllers. Another group, PAP 25, will encourage standards that harmonize financial data, and SGIP is also forming a new group focused on Transactive Energy. These are all critical steps toward developing the standards that will govern bi-directional electricity flows and realize the full promise of the Smart Grid, as well as power smart cities.
There’s been real progress for the first five predictions and they are well on their way to realization by 2020. Next week I will review progress on the final five predictions.
The Energy Storage North America (ESNA) conference in San Jose, CA last week can be summed up in one word – optimism. The sanguine outlooks on market opportunities and trends were unanimous. Several vendors can’t manufacture their equipment fast enough to meet demand.
California is making the market for energy storage. The ninth largest economy in the world recognized energy storage systems as important technologies in electricity value chains with the passage of AB 2514. CPUC decision 13-10-040 set the regulatory expectations for utility-interconnected and behind-the-meter energy storage. States like California view energy storage as a critical tool to firm up intermittent forms of renewable generation. State policies in the Northeast USA encourage energy storage systems that deliver resiliency for grids and critical infrastructure. Of course, a credible argument could be proffered that Tesla is making a market for energy storage with its gigafactory in Nevada; the company plans to produce 50 GWh of battery storage annually starting in 2017. These combined influences are driving the growth of new storage technologies, services, and financing mechanisms.
Comparisons to solar's trajectory are well-known. Energy storage technologies are expected to decrease rapidly in price in response to increased economies of scale and expertise. Deployment forecasts project fast growth – particularly in behind-the-meter solutions that focus on reducing electricity costs driven by high demand charges.
But the energy storage ecosystem has to overcome two challenges that could have negative impacts on adoption rates. First, energy storage technologies are diverse. There are chemical and non-chemical categories of storage. There are many subcategories based on different elements such as lithium, zinc, sodium, or iron, and non-chemical storage ranges from pumped hydro to compressed air to flywheels. There is significant variety in the number of charge cycles, stability in different environmental conditions, and form factors. An energy storage solution can ensure that mission-critical devices or operations are not disrupted by power outages – a resiliency function. Storage can help maintain stable grid operations – a reliability function. And storage can reduce electricity use at peak time periods or avoid the demand charges mentioned above – a cost-savings function. The market places very different values on these potential uses by function. There’s a lot of confusion that needs to be addressed with education to ensure buyers are making sound decisions that meet and exceed their expectations.
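The cost-savings function can be made concrete with simple arithmetic. This back-of-the-envelope sketch uses hypothetical tariff and load figures, purely for illustration:

```python
# Hypothetical numbers: a commercial customer pays a $15/kW monthly demand
# charge, and a battery shaves the monthly peak from 500 kW to 420 kW.
demand_charge_per_kw = 15.0   # $/kW-month (illustrative tariff, not a real rate)
peak_without_storage = 500.0  # kW
peak_with_storage = 420.0     # kW after the battery discharges at peak

# Demand charges are billed on the highest kW draw in the billing period,
# so savings = rate x reduction in that peak.
monthly_savings = demand_charge_per_kw * (peak_without_storage - peak_with_storage)
annual_savings = monthly_savings * 12

print(f"Monthly demand-charge savings: ${monthly_savings:,.0f}")  # $1,200
print(f"Annual savings: ${annual_savings:,.0f}")                  # $14,400
```

Because the value drivers differ so much by function, the same battery could be worth far more (or far less) in a resiliency or reliability role, which is exactly the source of buyer confusion noted above.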
The second challenge is that early-stage energy storage technologies and services are usually proprietary, customized engineering solutions. Deployments may include features that aren’t supported on a commercial scale, or that may not exist in the future. All of these qualities increase the balance-of-system costs beyond the storage equipment purchase itself. There is no equivalent to a USB standard for physical connections of different energy storage solutions to the grid. The Byzantine variety of permitting processes and fees is a problem that bedevils the solar industry too, but it’s a brand new learning curve for energy storage system integrators and installers. In essence, there’s too much complexity in the entire design, development, and deployment process for energy storage systems, and it’s an area that’s ripe for innovation.
The good news is that vendors are working collaboratively to solve some of these problems. There’s a new industry initiative called the Modular Energy Storage Architecture (MESA) standard initiative that can help promote more of a plug and play environment. It would be interesting to see similar collaborative efforts between utilities to standardize on interconnection processes. Likewise, the irrationalities of municipal permitting processes should be replaced with national standards – just as we use the NEC (National Electrical Code) to define the safe design and installation of electrical systems in a uniform way across the USA.
The energy storage ecosystem has to mature rapidly, or suffer the self-inflicted pain of inflexible, non-scalable, and proprietary solutions slowed by non-standard processes. These challenges could reduce overall investment paybacks for grid-scale and behind-the-meter deployments. Industry optimism must be tempered with pragmatism to create the right technology and policy frameworks that enable continued success for this important segment of Smart Grid solutions.
Last week I attended the White House Energy Datapalooza event in Washington, DC. It was an informative and inspirational gathering of thought leaders, policy-makers, and innovators focused on data standards and demonstrations of applications that use a variety of energy data sources, including Green Button data derived from smart meters. A clear and overarching theme of the Energy Datapalooza was that data, and the innovative technologies and applications that leverage it, will play critical roles in addressing and mitigating climate change – the critical environmental, economic, and security challenge of our times. From an energy perspective, climate change means more frequent and more severe disruptions to traditional electricity grids. Smart Grids must be resilient enough to resist, react to, and recover from service disruptions caused by climate change.
Here are three key takeaways from the invitation-only event:
Energy data is a national resource. Dr. Ernest Moniz, Secretary of the Department of Energy (DOE), noted that “freely available government data about energy is a national resource” to be leveraged to help mitigate climate change impacts and improve grid resiliency. A wide range of public and private initiatives aim to exploit this resource, including an energy data initiative, a building performance database, and an Open Data by Design contest targeted at DOE data that will be unveiled on June 4, 2014. More details on the full range of initiatives can be found here.
Data needs to be easily accessible. John Podesta, Counselor to the President, highlighted the value of interagency and public/private initiatives around open data. In March 2014, the Climate Data Initiative released its first data sets, focused on coastal flooding and sea level rise, drawing data from government agencies like the United States Geological Survey and the Department of Defense along with private companies to help communities assess and mitigate these risks. The next release of data sets will focus on climate change impacts on agriculture and food security.
Data needs to be standardized. The Green Button initiative is enabling many innovations in energy applications – many of them utilizing smart phones. It is based on the simple but powerful premise that electricity consumption data should be standardized to make data collection frictionless. A total of 55 US and Canadian-based utilities have announced support for the Green Button initiative. That translates into 100 million consumers who will enjoy easy and secure access to their data. There’s a Green Button certification effort underway, an open source implementation called OpenESPI, and growing adoption of the standard by job creators focused on delivering a range of innovative data applications to manage residential, commercial, industrial and agricultural electricity use.
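Part of what makes Green Button data frictionless is that every participating utility publishes interval consumption in the same structure. The fragment below is a simplified stand-in modeled on the ESPI IntervalBlock/IntervalReading elements, not a complete Green Button Atom feed, but it illustrates how one small parser can handle data from any participating utility:

```python
import xml.etree.ElementTree as ET

# Simplified stand-in modeled on the Green Button / ESPI IntervalReading
# structure; a real feed is an Atom document with much more metadata.
SAMPLE = """
<IntervalBlock xmlns="http://naesb.org/espi">
  <IntervalReading>
    <timePeriod><duration>3600</duration><start>1404165600</start></timePeriod>
    <value>1250</value>
  </IntervalReading>
  <IntervalReading>
    <timePeriod><duration>3600</duration><start>1404169200</start></timePeriod>
    <value>980</value>
  </IntervalReading>
</IntervalBlock>
"""

NS = {"espi": "http://naesb.org/espi"}
root = ET.fromstring(SAMPLE)

# Collect the hourly consumption values (units depend on the feed's metadata)
readings = [int(r.findtext("espi:value", namespaces=NS))
            for r in root.findall("espi:IntervalReading", NS)]
print(f"{len(readings)} intervals, total {sum(readings)} units")  # 2 intervals, total 2230 units
```

Because the element names and namespace are standardized, an application written against one utility's Green Button export works unchanged against another's, which is the "frictionless" premise in practice.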
A new data initiative was announced at the White House Energy Datapalooza that follows the success of the Green Button initiative. Electric utilities and technology companies announced support to develop and use a voluntary open standard for power outage and restoration information. Providing this structured data in an easy-to-use, standardized format and at a consistent location makes it easier for first responders, public health officials, utility mutual assistance efforts, and the public to use this information to manage their responses in outage situations.
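A sketch makes the benefit of such a standard tangible. The record below is purely hypothetical; the voluntary outage standard was still being developed at the time, so these field names are illustrative assumptions, not the specification:

```python
import json

# Hypothetical structured outage record -- illustrative only. The field
# names are assumptions, not the actual voluntary standard announced at
# the Energy Datapalooza, which was still in development.
outage = {
    "utility": "Example Power Co",
    "event_id": "2014-0612-0042",
    "status": "restoration_in_progress",
    "customers_affected": 1840,
    "area": {"county": "Example County", "state": "NY"},
    "estimated_restoration": "2014-06-12T21:30:00-04:00",
}

# A consistent, machine-readable format at a consistent location lets first
# responders and the public consume every utility's feed with the same code,
# instead of scraping each utility's outage map separately.
print(json.dumps(outage, indent=2))
```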
We’ve defined human history in categories such as the Stone Age or the Industrial Revolution. Data may indeed be the greatest tool humans have. What we are witnessing now may become known as the Data Age to future generations. There’s no doubt that new and existing energy data, enabled by a number of technologies, will revolutionize grid modernization and help build Smart Grids and Smart Infrastructures.
The convergence of information and communications technologies (ICT) with traditional operations technologies (OT) is an ongoing Smart Grid trend. Within the USA and its 3,000+ electric utilities, Smart Grid investments have focused on optimizing transmission and distribution grid operations through machine-to-machine (M2M) communications and forays into data analytics for applications ranging from revenue assurance to voltage conservation.
This ICT/OT convergence trend is encouraging new entrants into the vendor ecosystem that supports electric, gas, and water utilities. One of the latest entrants is Dell. Dell made two announcements in the past two months that illustrate how ICT companies are exploring Smart Grid market opportunities, and 2013 will be the year to watch its strategies and progress.
Dell recently unveiled their Smart Grid Data Management Solution which combines high-performance computing, networking and storage to manage data for review and action in utility operations. Leveraging domain expertise and the PI System™ from OSIsoft, they developed and tested a reference architecture in a simulation environment that modeled a utility’s transmission grid operations. Transmission grids have been one of the early beneficiaries of the Smart Grid through products called Phasor Measurement Units (PMUs), which are extremely high speed monitors that sense changes in transmission conditions. Taking hundreds of measurements per second from multiple PMUs leads to large quantities of data that challenge existing data storage practices in utilities. Dell’s solution coupled with OSIsoft’s solution provides faster updates and makes actionable data available to staff, applications and business systems. It’s an excellent example of how M2M communications and data management technologies can become ubiquitous in the Smart Grid.
This is a noteworthy collaboration between a traditional ICT vendor (Dell) and a traditional OT vendor (OSIsoft) that is focused on grid operations. But Dell has also signaled its intent to get involved in the consumer side of the electricity value chain by joining the Pecan Street Inc. Advisory Board. Pecan Street is an energy and smart grid research and development organization, and serves as a living laboratory with a community microgrid characterized by residence-based solar generation, electric vehicles (EVs), energy efficiency and energy management solutions for homes. The project is conducting research in the brave new world of consumer/prosumer evolutions and their energy interactions through data analytics.
While the term “big data” is used in this project, its volumes are dwarfed by the volumes of data generated by today’s PMU deployments. Similarly, if smart meters ever provide data to utilities at 15-minute intervals, that would constitute really big data, at least as analytics providers in financial services or telecommunications would define it. It’s more accurate to describe the Pecan Street project as one that offers horizontal complexity and scalability, as the many types of devices, with all their variations in hardware, firmware, and software, will need to be managed in addition to the networks that connect them. There aren’t many analytics companies that can offer this expertise, and the best ones are proven performers in industry sectors outside of electric utilities.
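The scale comparison is easy to quantify with back-of-the-envelope arithmetic. The sample rate and fleet sizes below are assumptions chosen for illustration, not figures from the Pecan Street project:

```python
# Back-of-envelope comparison of raw record counts (not bytes).
# The 30 frames/sec rate and fleet sizes are illustrative assumptions.
seconds_per_day = 86_400

# One PMU streaming 30 synchrophasor frames per second:
pmu_records_per_day = 30 * seconds_per_day        # 2,592,000 records per device

# One smart meter read every 15 minutes:
meter_reads_per_day = 24 * 4                      # 96 reads per device

# Even a million-meter fleet produces fewer raw records than ~100 PMUs:
fleet_reads = 1_000_000 * meter_reads_per_day     # 96,000,000
hundred_pmus = 100 * pmu_records_per_day          # 259,200,000

print(f"One PMU: {pmu_records_per_day:,} records/day")
print(f"One smart meter: {meter_reads_per_day} reads/day")
print(f"1M meters vs 100 PMUs: {fleet_reads:,} vs {hundred_pmus:,}")
```

Under these assumptions a single PMU generates roughly 27,000 times the records of a single 15-minute-interval meter, which is why PMU data strains storage practices long before meter data does.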
However, Dell has proven abilities in data management, and the company understands a thing or two about consumers after successfully building a competitive business that sells directly to them. So its moves into the Smart Grid sector portend more than a continuation of the ICT/OT convergence trend. They also highlight another trend: businesses experienced in consumer retail operations (others are Verizon and Comcast) engaging in exploratory activities to work directly with electricity and water consumers. Traditional utilities may discover that their business models are disrupted more by this second trend than the first. Of course, this second trend is a riskier play, and it is too early to tell if these new players will become intermediaries between consumers and utilities. It will be interesting to watch Dell in 2013 and see how these trends progress.