Can data kill your pain? The city of Los Angeles, California is hoping it can, at least where some data sources are concerned. Back in May, the city launched a new DataLA site that offers data downloads on topics such as crime statistics and budget information, as well as easy-to-understand visualizations of key metrics at a separate portal called PerformanceLAcity. A June hackathon encouraged developers to take these datasets and create solutions that improve city life. Projects focused on affordable housing, public transit, and, spurred by a devastating statewide drought, apps to report water waste.
Code for America pursues similar objectives to enhance the quality of civic life on a broader scale, having organized hackathons in over 130 US cities so far. Its fourth annual Summit took place this past week in San Francisco. The non-profit organization places software developers, user interface designers, and data enthusiasts on projects that re-imagine, re-think, and redesign existing processes to improve productivity, experiences, and satisfaction.
For many cities around the world, one of the most intractable problems is traffic congestion. It’s certainly one of the biggest problems for LA, where 65% of commuters are solo travelers. This sprawling metropolis, which installed the world’s first traffic lights in 1924, has ambitious hopes for innovative solutions based on its traffic data.
The data is collected by ATSAC (Automated Traffic Surveillance and Control) and city parking management systems. ATSAC, first rolled out to manage signal timing on the streets surrounding venues used for the 1984 Olympic Games, is now implemented citywide at more than 4,400 signalized intersections. Street sensors monitor vehicle passage, speed, and congestion in one-second increments. This real-time data gives the ATSAC operations center the situational awareness to adjust traffic signal timings and reduce congestion. The ATSAC system has delivered measurable benefits, most notably in reduced travel times, CO2 emissions, and fuel use. Any concomitant reductions in road rage haven’t been tracked, though those are harder to measure. On September 22, the city published an RFI (Request for Information) focused on that real-time ATSAC data. The objective is to learn who is interested in the data and what new information and valuable services could be derived from it.
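To make the RFI’s possibilities more concrete, here is a minimal sketch of how a developer might consume a real-time intersection feed of this kind. The feed URL, field names, and JSON structure are assumptions for illustration only; the actual format would be defined by whatever data release follows the RFI.

```python
import time
from collections import defaultdict

import requests

# Hypothetical endpoint -- the real feed format would be defined by the city's data release.
FEED_URL = "https://example.lacity.org/atsac/realtime.json"


def congestion_index(speed_mph: float, free_flow_mph: float = 35.0) -> float:
    """Crude illustrative metric: 0.0 = free flow, 1.0 = standstill."""
    return max(0.0, min(1.0, 1.0 - speed_mph / free_flow_mph))


def poll_once() -> dict:
    """Fetch one snapshot and return the worst congestion seen per intersection."""
    snapshot = requests.get(FEED_URL, timeout=5).json()
    worst = defaultdict(float)
    for reading in snapshot.get("readings", []):  # assumed structure
        intersection = reading["intersection_id"]
        worst[intersection] = max(worst[intersection], congestion_index(reading["speed_mph"]))
    return dict(worst)


if __name__ == "__main__":
    while True:
        hotspots = {ix: round(c, 2) for ix, c in poll_once().items() if c > 0.8}
        if hotspots:
            print("Congested intersections:", hotspots)
        time.sleep(1)  # sensors report in one-second increments
```

Even something this simple hints at the kinds of traveler-facing services, such as congestion alerts or routing and parking apps, that RFI respondents might propose.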
Imagine if electric and water utilities operated this way. If meter data, properly anonymized and aggregated into data sets that protect privacy, were available for hackathons, more feasible solutions for residential rentals and multi-family housing might emerge – two markets sorely underserved by existing home energy management applications. The federal Green Button initiative has sponsored and participated in hackathons, most recently an August event in San Francisco and a September event at the KTH Royal Institute of Technology in Stockholm, Sweden. Kudos to the organizers, sponsors, and participants of these hackathons, who take existing energy data sets and create new applications that address the event challenges. It would be very interesting to see utilities get engaged in hackathons. One starting point would be to consider what types of data and data sets could be made available to address a wide range of their challenges.
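As a sketch of what that preparation might look like, the fragment below rolls raw interval meter reads up into hourly neighborhood averages and suppresses small groups before release. The column names, CSV layout, and 15-meter threshold are assumptions for illustration, not a Green Button schema or a formal anonymization standard.

```python
import pandas as pd

# Suppress any group smaller than this to reduce re-identification risk (illustrative threshold).
MIN_METERS = 15


def aggregate_interval_data(path: str) -> pd.DataFrame:
    """Roll individual interval meter reads up to hourly averages per neighborhood."""
    # Expected columns (assumed): meter_id, neighborhood, timestamp, kwh
    reads = pd.read_csv(path, parse_dates=["timestamp"])
    reads["hour"] = reads["timestamp"].dt.floor("h")
    grouped = (
        reads.groupby(["neighborhood", "hour"])
        .agg(meters=("meter_id", "nunique"), avg_kwh=("kwh", "mean"), total_kwh=("kwh", "sum"))
        .reset_index()
    )
    # Drop small groups, then remove the meter count itself before publishing the data set.
    return grouped[grouped["meters"] >= MIN_METERS].drop(columns=["meters"])


if __name__ == "__main__":
    aggregate_interval_data("interval_reads.csv").to_csv("hackathon_dataset.csv", index=False)
```

A hackathon team could then slice these aggregates by hour or neighborhood without ever touching individual customer records.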
Leaders engaged in smart city initiatives acknowledge that they don’t have all the answers when it comes to data manipulation and analysis, and they welcome outside help via hackathons to optimize infrastructure, enhance services, and improve civic life. Some of that data affects what they would consider mission-critical operations. Could similar activities help utilities engaged in Smart Grid initiatives get the most from their data? Plenty of pain points could be addressed with intelligent data visualizations and analytics. Perhaps expanding the pool of solution contributors, with appropriate controls for security and privacy, could accelerate the development and deployment of utility painkillers.