Climate Change

Weather alerts: Will everyone continue to have access?

Amid mushrooming private, for-profit enterprises, will old-fashioned public players be able to serve weather data for free?

By Akshit Sangomla
Published: Friday 25 October 2019

Part 2: The weather paywall: Will a free public service be corrupted?

The mushrooming weather forecasting ventures have opened a new front of conflict between public and private agencies. Experts, mostly associated with experienced public agencies, said most of these startups might be issuing inaccurate forecasts, since a changing climate has made weather prediction extremely difficult. Though weather agencies constantly revise their forecasts, thereby reducing the scope for error, it is not an easy task.

For instance, Mrutyunjay Mohapatra, director-general of the India Meteorological Department (IMD) and often referred to as the cyclone man of India for his expertise in the field, explains that on September 29, 2019, IMD predicted a “low probability” of the formation of a “low pressure area” in the northern Bay of Bengal, based on an assessment of seven different global and regional weather models.

The warning implied only a slim chance of excessive rainfall in Bihar. But the forecast fell flat: the region was already experiencing a delayed withdrawal of the southwest monsoon, which should have ended on September 30. Bihar experienced a deluge in which over 100 people died.

A month earlier, in mid-August, IMD had made a similar prediction of heavy rainfall for Himachal Pradesh. But it could predict neither the extent of the rainfall nor its exact location. On August 18, the state received 1,064 per cent more than the normal rainfall for the day, which triggered flash floods and landslides and killed 25 people. Shimla, for which IMD had downgraded the alert, received 20 times the normal rainfall.

Around the same time, Hurricane Dorian hogged primetime television in the US: despite using multiple weather forecasting models, neither the US National Oceanic and Atmospheric Administration nor private agencies could predict the path of the storm, which swelled to become the most intense tropical cyclone on record to strike the Bahamas.

“One should err on the side of caution in such matters,” said Mohapatra, emphasising the granularity and complexity involved in forecasting.


This is the reason the World Meteorological Organization (WMO) requires its data to be of the highest standards, to maintain consistency and accuracy. For instance, wind measurements have to be made at a certain height, and temperature-measuring instruments have to be housed in a certain way. But the quality of data coming from private-sector sources like personal weather stations (PWS) and smartphones remains doubtful.

IMD, for example, cannot incorporate data from personal weather stations because of quality issues and incomplete data. “There is no ancillary data (like the altitude of the station where measurements are being done) from PWS, which is required as an input into IMD’s weather models. If this ancillary data becomes available, then PWS data can be incorporated,” said VS Prasad, a scientist at the National Centre for Medium Range Weather Forecasting in Noida, Uttar Pradesh.

“Pressure data coming from smartphones is also a good source and research is ongoing to incorporate those into weather forecasting models. On the other hand, temperature data from smartphones is not of much use, as it changes when people are inside air-conditioned rooms,” he added.
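To make Prasad’s point concrete, here is a minimal, purely illustrative sketch of the kind of screening a met agency might apply to crowd-sourced reports before letting them near a model: reports missing ancillary metadata such as station altitude are simply rejected. The field names and limits below are hypothetical, not IMD’s actual quality-control rules.

```python
# Hypothetical screening of crowd-sourced weather reports before model ingestion.
# Field names and limits are illustrative, not IMD's real quality-control pipeline.

REQUIRED_FIELDS = ("station_id", "latitude", "longitude", "altitude_m",
                   "timestamp_utc", "pressure_hpa")

def usable(report: dict) -> bool:
    """Accept a report only if all ancillary metadata is present and plausible."""
    if any(report.get(field) is None for field in REQUIRED_FIELDS):
        return False                                      # missing ancillary data: reject
    if not (-90 <= report["latitude"] <= 90 and -180 <= report["longitude"] <= 180):
        return False                                      # impossible coordinates
    if not (300 <= report["pressure_hpa"] <= 1100):
        return False                                      # physically implausible pressure
    return True

reports = [
    {"station_id": "PWS-001", "latitude": 28.6, "longitude": 77.2, "altitude_m": 216,
     "timestamp_utc": "2019-09-29T06:00Z", "pressure_hpa": 1004.2},
    {"station_id": "PWS-002", "latitude": 19.1, "longitude": 72.9, "altitude_m": None,
     "timestamp_utc": "2019-09-29T06:00Z", "pressure_hpa": 1007.8},
]

accepted = [r for r in reports if usable(r)]
print(f"{len(accepted)} of {len(reports)} reports usable")   # only PWS-001 passes
```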

“Catch up with climate change” is becoming the buzzword the world over. But no forecast can always be correct. A forecasting model takes in weather data and calculates what the weather will be like in different regions of the world over the next six days. This information, along with that from other such models, is used by weather forecasters around the world to make their predictions.

So a weather model needs observations of the weather, as many as possible, from varied sources. It also needs physics: a set of equations that describe how the atmosphere evolves. The model puts the two together through computation, usually handled by a supercomputer. The success of the model depends upon the strength of each of these three ingredients.
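To make the three ingredients concrete, the toy sketch below steps a deliberately simplified set of “physics” equations (the Lorenz-63 system, a classic toy stand-in for chaotic atmospheric dynamics) forward from an observed starting state. Operational models solve vastly larger systems of fluid-dynamics equations on supercomputers; this is only an illustration of the observations-plus-equations-plus-computation recipe.

```python
# Toy illustration of "observations + physics + computation":
# step the Lorenz-63 system, a classic stand-in for chaotic atmospheric dynamics,
# forward in time from an observed initial state. Not an operational model.

def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """The 'physics': equations describing how the state evolves."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def forecast(state, dt=0.01, steps=600):
    """The 'computation': repeatedly apply the equations (simple Euler stepping)."""
    for _ in range(steps):
        dx, dy, dz = lorenz63(state)
        state = (state[0] + dt * dx, state[1] + dt * dy, state[2] + dt * dz)
    return state

observed_now = (1.0, 1.0, 1.0)     # the 'observations': today's measured state
print(forecast(observed_now))       # the model's guess at a future state
```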

The models are of two types: experimental and operational. Experimental models focus on problems like how clouds and rain form, while operational models do the everyday forecasting.

These models are not simple algorithms where you feed in data and expect an output. Between the input and the output lies the process of data assimilation: the weather of the real world is assimilated into the model, matching the model’s own version of the atmosphere with the one outside. In effect, actual observations constantly correct the model’s earlier forecasts. If data assimilation is good enough, it compensates for the model’s errors and for the lack of observations in many places.
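As a rough sketch of what that correcting step looks like, the snippet below blends a model’s prior forecast of a single quantity with a fresh observation, weighting each by how much it is trusted. Operational schemes such as 3D-Var, 4D-Var and ensemble Kalman filters do this for millions of variables at once, so treat this scalar version, with made-up numbers, purely as an illustration.

```python
# Minimal sketch of a data-assimilation update for one quantity (say, temperature):
# blend the model's prior forecast with a fresh observation, weighted by how much
# each is trusted. Operational systems do this for millions of variables at once.

def assimilate(prior, obs, prior_var, obs_var):
    """Return the analysis: the forecast nudged toward the observation."""
    gain = prior_var / (prior_var + obs_var)   # trust the observation more when the model is uncertain
    return prior + gain * (obs - prior)

prior_temp = 31.0       # model's forecast temperature in deg C (illustrative number)
observed_temp = 33.5    # what the station actually measured (illustrative number)

analysis = assimilate(prior_temp, observed_temp, prior_var=4.0, obs_var=1.0)
print(f"analysis: {analysis:.1f} deg C")   # 33.0: pulled strongly toward the observation
```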

In England, the European Centre for Medium-Range Weather Forecasts (ECMWF) has the world’s top-performing weather forecasting model. It has deployed two supercomputers, each the size of a volleyball court and among the fastest in the world, with 260,000 processor cores capable of making 90 trillion calculations per second (teraflops).

The computers are upgraded every two years so that processing capability keeps increasing. Together, these supercomputers ingest 40 million weather observations a day and run calculations at 90 teraflops to make the forecasts. Even with this computational capability and the most skilled meteorologists at its disposal, ECMWF sometimes gets its forecasts wrong.

As extreme events become more common due to global warming, they will also become more unpredictable. This means weather agencies, private or public, need to be even more careful about their predictions as the climate changes.

“One should also be willing to revoke the alerts in time, if such a pattern has not emerged. After all, weather forecasting is about predicting extreme events as well as normal events accurately,” said Balasubramanian of IIT-Bombay. At the same time, inaccurate data from private players should not be allowed to undermine the forecasts of government-run met agencies.

A successful collaboration between the public and private sector needs to be built on trust. This means asking ethical and regulatory questions that also plague global debates on data.

“We have a dilemma. It’s about intellectual property rights versus the need for data to be part of the commons,” said Oystein Hov, Secretary General of the Norwegian Academy of Science and Letters, who heads the WMO Commission on Atmospheric Science. He also warned of the risk of a “winner takes all” approach.

If that happens, powerful private sector players will become the only conduit of data produced by national weather services, and the age of altruism will be over. This is the reason WMO’s 1995 congress adopted a resolution warning that commercial meteorological activities should not be allowed to compromise the free and unrestricted exchange of weather data.

“This we must not give up,” says Gerhard Adrian, president of Deutscher Wetterdienst, Germany’s meteorological service. “This is a basic principle. That, without any discrimination, everyone has the same access to the same data sets.”

This was first published in Down To Earth print edition (dated 16-31 October, 2019)
