Article contributed by Steve Liang, SensorUp
“If you can’t measure it, you can’t manage it.”
The quote originally comes from management consultant Peter Drucker and was later used by Al Gore to describe the challenge of climate change. It accurately encapsulates the theme of the Climate Change Special Session at the OGC (Open Geospatial Consortium) Climate Member Meeting for 2022.
SensorUp’s CTO, Dr. Steve Liang, was on a panel of data experts from NOAA, the United Nations’ IPCC, NRCan, and ECMWF, each of whom spoke about the current state, challenges, and opportunities of measuring climate data. Here, we’re highlighting seven key takeaways from the session.
1. We still have a lot of knowledge gaps when it comes to global climate data
Angelica Gutierrez, Lead Scientist for NOAA (the National Oceanic and Atmospheric Administration), talked about the struggles of obtaining accurate and timely data. “Well-developed countries have access to sophisticated software, specialized equipment and skills, computing power and other essential elements to address climate change,” said Gutierrez. “Developing countries are at a disadvantage.”
It’s a known problem, and one that OGC members are already working to address. That was another theme that emerged a number of times during the session: we are becoming more aware of our blind spots and working on solutions to mitigate them. “The 2021 OGC Disaster Pilot (which drew the largest response of any OGC pilot to date) is addressing many of the challenges, gaps and barriers that I previously identified,” said Gutierrez.
2. The current priority is getting good data to decision-makers
In 2022, OGC is launching another pilot, the Climate Change Services Initiative, which will run through 2026. The pilot will connect several global agencies and focus on sharing priority information. “We are rolling out the first focus area this year,” said Nils Hempelmann, an OGC Project Manager and the moderator of the climate session.
“Setting up the appropriate infrastructures to deliver information on demand to the decision makers, that’s what we are going to focus on in the beginning,” said Hempelmann of the new pilot. “And then afterwards, depending on what’s coming up and where the urgent pain points are, we are defining the next focus areas.”
3. We want to be able to more accurately measure and understand specific climate events
In recent years, several severe weather disasters have wreaked havoc in different parts of the world. Two sets of presenters addressed this issue, using examples of weather events that we need to do a better job of measuring, such as atmospheric rivers and “Medicanes” (hurricanes originating in the Mediterranean). “Recently in British Columbia, throughout the month of November, they received three storm events, each one was larger than their monthly precipitation rate,” said Cameron Wilson from Natural Resources Canada.
Wilson’s co-presenter, Simon Riopel, went on to explain the challenge of measuring and predicting an event like an atmospheric river. The difficulty lies in getting an accurate measure of vector quantities, which have both a magnitude and a direction.
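To make that concrete, here is a minimal sketch, with hypothetical numbers rather than figures from the presentation, of how a vector quantity such as an atmospheric river’s moisture transport combines a magnitude and a direction, both of which have to be estimated correctly:

```python
import math

# Hypothetical eastward (u) and northward (v) components of an atmospheric
# river's vertically integrated moisture transport, in kg per metre per second.
u, v = 250.0, 433.0

magnitude = math.hypot(u, v)                    # overall strength of the transport
bearing = math.degrees(math.atan2(u, v)) % 360  # degrees clockwise from north

print(f"magnitude: {magnitude:.0f} kg/m/s, direction: {bearing:.0f} degrees")
```

An error in either component throws off both the estimated strength of the event and where it is headed, which is why scalar measurements alone are not enough.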
One current initiative that could help here is the Arctic SDI (Spatial Data Infrastructure), which creates a “digital Arctic” from a combination of sensor data and satellite imagery.
4. (Political) decision making is based on trust
Political decision-makers can only make informed decisions if they are confident in the validity of the information they are given.
“Decision-making is based on trust,” said Dr. Martina Stockhause, Manager of the IPCC (Intergovernmental Panel on Climate Change) Data Distribution Centre. “Political decision-makers are not experts, so they rely on trust in data and the service providers. In my view, trust is built on two aspects. One is the quality of the data that is accessed. That means that the quality is documented, together with the peer review process. And the second is that the result is traceable back to its sources (with data citation and credit).”
One of the ways to achieve that is using the FAIR (Findability, Accessibility, Interoperability and Reusability) Digital Objects framework.
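As a rough illustration (the field names below are invented for this example, not taken from the actual FAIR Digital Object specifications), a FAIR-style metadata record would carry both of the trust aspects Stockhause described, documented quality and traceability back to sources:

```python
# Hypothetical FAIR-style metadata record; field names are illustrative only.
dataset_record = {
    "identifier": "hdl:21.14100/example-0000",  # persistent identifier (made up)
    "title": "Example global temperature anomaly dataset",
    # Trust aspect 1: documented quality, including the peer review process.
    "quality": {
        "peer_reviewed": True,
        "review_report": "https://example.org/review/0000",
    },
    # Trust aspect 2: traceability back to sources, with citation and credit.
    "provenance": {
        "derived_from": ["hdl:21.14100/source-0001"],
        "citation": "Example et al. (2021). Example Dataset v1. doi:10.0000/example",
    },
    "license": "CC-BY-4.0",
}

print(dataset_record["provenance"]["citation"])
```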
5. We continue to find new ways to use machine learning to make better weather predictions
In 2021, the WMO (World Meteorological Organization) launched a competition to use machine learning and AI (artificial intelligence) to better predict temperature and precipitation up to six weeks into the future.
The team currently leading that competition is from CRIM (the Computer Research Institute of Montreal). CRIM’s David Landry explained the team’s process of downloading, preprocessing, subsetting, and reshaping the data before running their AI models and presenting the resulting predictions back to the adjudicators.
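As a rough sketch of what those steps can look like (this is a generic example, not CRIM’s actual pipeline; the file and variable names are hypothetical), a preprocessing pass over gridded forecast data in Python might be:

```python
import xarray as xr

# Hypothetical input file; the competition's real data comes from forecast archives.
ds = xr.open_dataset("s2s_forecasts.nc")

# Subset: one variable (2 m temperature) over a region of interest.
# (Assumes latitude is stored in descending order, as is common.)
t2m = ds["t2m"].sel(latitude=slice(60, 40), longitude=slice(-80, -60))

# Preprocess: convert Kelvin to Celsius and aggregate to weekly means.
t2m = (t2m - 273.15).resample(time="7D").mean()

# Reshape: flatten the spatial grid so each week becomes one training sample.
X = t2m.values.reshape(t2m.sizes["time"], -1)
print(X.shape)  # (n_weeks, n_grid_points)
```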
Incentivizing these research teams to continue to experiment with new models, as the WMO has, will help us continue to expand our awareness of how to accurately measure and predict climate change events.
6. Estimating greenhouse gas emissions is really complex
Greenhouse gases like methane and CO2 remain difficult to measure. They can’t be seen by the human eye or typical cameras, and capturing data about them remains a challenge. To achieve more detailed and timely monitoring of emissions in support of climate mitigation actions, the countries of the world need access to more (and more accurate) information.
“The big issue is that we can’t measure emissions directly, so these emissions need to be estimated,” said Vincent-Henri Peuch from the European Centre for Medium-Range Weather Forecasts (ECMWF) and lead of the Copernicus satellite projects. “The problem is that it is really complex.”
Satellite images are able to show the presence of fugitive greenhouse emissions at a macro scale, but “the question is, can we use this information about the concentration in the atmosphere to infer some information about the fluxes of emissions at the surface?” noted Peuch. “For that, we need to combine lots of different observations, so of course interoperability is required.”
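A deliberately oversimplified example shows what inferring fluxes from concentrations means in principle. Real systems use full inverse modelling with winds, chemistry, and many observation types; this toy one-box mass balance, with entirely hypothetical numbers, just converts an observed concentration rise into an implied surface flux:

```python
# Toy one-box mass balance; all numbers are hypothetical.
AIR_DENSITY = 1.2    # kg of air per cubic metre, near the surface
BOX_AREA = 1.0e8     # square metres (a 10 km x 10 km box)
BOX_HEIGHT = 1000.0  # metres, assumed boundary-layer depth
DT = 3600.0          # seconds between two observations

delta_ppb = 50.0     # observed CH4 enhancement, parts per billion (mole fraction)
PPB_TO_MASS_RATIO = (16.04 / 28.97) * 1e-9  # molar-mass ratio: ppb -> kg CH4 per kg air

air_mass = AIR_DENSITY * BOX_AREA * BOX_HEIGHT        # kg of air in the box
added_ch4 = delta_ppb * PPB_TO_MASS_RATIO * air_mass  # kg of CH4 that appeared

flux = added_ch4 / DT  # implied surface emission rate in kg/s (winds ignored entirely)
print(f"implied flux: {flux:.2f} kg CH4/s")
```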
To help with these crucial measurements, CO2M, the Copernicus Carbon Dioxide Monitoring mission, is one of Europe’s new high-priority satellite missions and will be the first to measure how much carbon dioxide is released into the atmosphere specifically through human activity.
7. Accurately measuring greenhouse gas emissions requires multiple data sources
Dr. Steve Liang, CTO of SensorUp and Professor at the University of Calgary, spoke about the ways that disparate data sources can be combined to help craft a clearer picture of the severity and source of fugitive emissions. “Even though we know methane leaks are bad, how can we fix them if we can’t see them?” asked Liang. “We need methane sensors to find the locations and flow rates of the leaks. However, there’s not one sensor that is the best. Multiple types of sensors have to work together, to complement each other. They all have different spatial and temporal scales, at different levels of accuracy.”
Liang explained that a combination of data from sources like handheld instruments, fixed in-situ sensors, Terrestrial Mobile Methane Mapping Systems, airborne systems and satellite imagery can be used together, in an integrated methane sensor web, to more accurately measure, understand, and even predict harmful leaks and emissions.
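As one simple illustration of how complementary sensors with different accuracies could be combined (a generic statistical sketch, not SensorUp’s method), inverse-variance weighting lets the more accurate instruments carry more weight in a fused estimate of a leak’s flow rate:

```python
import numpy as np

# Hypothetical flow-rate estimates (kg/h) and 1-sigma uncertainties
# for the same leak, from four kinds of sensors.
readings = {
    "handheld":      (12.0, 4.0),
    "fixed_in_situ": (9.5, 2.0),
    "airborne":      (11.0, 3.0),
    "satellite":     (15.0, 8.0),
}

values = np.array([value for value, _ in readings.values()])
sigmas = np.array([sigma for _, sigma in readings.values()])

# Inverse-variance weighting: each reading counts in proportion to 1/sigma^2,
# so the more accurate fixed in-situ sensor has the most influence.
weights = 1.0 / sigmas**2
estimate = np.sum(weights * values) / np.sum(weights)
uncertainty = np.sqrt(1.0 / np.sum(weights))

print(f"fused estimate: {estimate:.1f} +/- {uncertainty:.1f} kg/h")
```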
If you would like a more complete explanation of how this methane sensor web works, you can read Dr. Liang’s blog recap of his presentation.