The second-best option, according to Dr. Palmer, is to randomize the sub-grid processes. Counterintuitively, this added randomness improves the models’ handling of extreme weather. Weather forecasts that take into account random (or “stochastic”) processes make more accurate predictions of the frequency of tropical cyclones, the duration of droughts and other weather phenomena, such as the long-lasting heat wave over Europe in the summer of 2018. It seems only reasonable, then, that long-term climate predictions should use this method too.
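To make the idea concrete, here is a minimal sketch, not drawn from Dr. Palmer’s models, of what “randomizing the sub-grid processes” can mean in code: a toy sub-grid tendency is perturbed with multiplicative noise, loosely in the spirit of the stochastic perturbation schemes used in weather forecasting. The tendency formula, noise level and grid size are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def deterministic_tendency(state):
    # Toy stand-in for a sub-grid parameterization: nudge each column
    # toward a reference value. Real schemes model convection, clouds, etc.
    return -0.1 * (state - 1.0)

def stochastic_tendency(state, noise_std=0.3):
    # Perturb the parameterized tendency with multiplicative noise,
    # loosely in the spirit of stochastic parameterization schemes.
    noise = rng.normal(0.0, noise_std, size=state.shape)
    return deterministic_tendency(state) * (1.0 + noise)

# Step a toy "grid" of 1,000 columns forward in time, with and without noise.
state_det = np.full(1000, 2.0)
state_sto = np.full(1000, 2.0)
dt = 1.0
for _ in range(20):
    state_det += dt * deterministic_tendency(state_det)
    state_sto += dt * stochastic_tendency(state_sto)

# The deterministic columns all end up identical; the stochastic ones keep a
# spread, which is what lets ensembles of runs put probabilities on extremes.
print("deterministic spread:", state_det.std())
print("stochastic spread:   ", state_sto.std())
```

In the toy run, the deterministic columns all collapse to the same value, while the randomized ones retain a spread; it is that spread, in real models, that allows forecasters to assign probabilities to rare extremes.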

Climate scientists have begun to take note of Dr. Palmer’s argument. The new British climate model, known as UKESM1, in use since 2018, already uses such stochastic methods, and others are sure to follow. Björn Stevens, director of the Max Planck Institute for Meteorology in Hamburg, Germany, agrees with Dr. Palmer’s assessment. For the next generation of models, he said, his institution “will be interested in exploring the role of stochastic treatments.”

But Dr. Palmer does not want to settle for second best, and he still hopes to bring the grid size of climate models down. Horizontal grid cells of about one square kilometer, or 0.4 square miles, he believes, would significantly improve the accuracy of our climate models and give us the information we need to properly gauge the risks posed by climate change.

To do this, we need supercomputers capable of performing these calculations. Exascale supercomputing centers, housing machines able to perform at least a billion billion calculations per second, would be up to the task. But these computing resources are more than any one institution or country can afford: Getting more accurate predictions would require an international initiative and an estimated $1.1 billion in funding.
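A rough, back-of-envelope scaling estimate (my own illustration, not a figure from Dr. Palmer) suggests why the jump to exascale is necessary, assuming the 100-kilometer and one-kilometer grids discussed here:

```python
# Back-of-envelope estimate, not an official figure: refining the horizontal
# grid from 100 km to 1 km multiplies the number of grid columns by 100 x 100,
# and numerical stability forces the time step to shrink roughly in step with
# the grid spacing, adding another factor of about 100.
spacing_now_km = 100.0
spacing_target_km = 1.0

refinement = spacing_now_km / spacing_target_km   # 100x finer in each horizontal direction
more_columns = refinement ** 2                    # ~10,000x more grid columns
more_time_steps = refinement                      # ~100x more (and shorter) time steps
cost_factor = more_columns * more_time_steps      # ~1,000,000x more arithmetic overall

print(f"Rough increase in computing cost: {cost_factor:,.0f}x")
```

A roughly millionfold increase in arithmetic is what pushes the requirement toward exascale machines rather than incremental upgrades of existing ones.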

We need an international collaboration, what Dr. Palmer calls “A CERN for climate modeling.” It’s an apt comparison: CERN — the European Organization for Nuclear Research — was founded to pool resources into a jointly used facility, thereby enabling megaprojects like the Large Hadron Collider that are beyond the budget of any one country. Importantly, this joint effort does not compete with research at national institutions, but instead builds on it. And if that worked for particle physics, Dr. Palmer thinks, it can work for climate science too.

In 2018, together with climate scientists from 18 European institutions, Dr. Palmer proposed such a computing initiative (called Extreme Earth) as a flagship project to the European Commission. The proposal passed from the first to the second stage of evaluation. But this year the Commission canceled the 2020 flagship initiatives altogether. No other funding body has stepped up to fund the climate initiative.

But hesitating to fund better climate models makes no sense, either scientifically or economically. Climate change puts us all at risk. To decide which course of action to take, we need to know just what the risks are and how likely they are to come to pass. Increasing the resolution of current models from 100 kilometers to one kilometer would not be an incremental improvement; it would make the predictions significantly more reliable.

The benefits of an international initiative for climate science would far outweigh the costs. We created this problem together; now we must solve it together.

Sabine Hossenfelder is a research fellow at the Frankfurt Institute for Advanced Studies and the author of “Lost in Math: How Beauty Leads Physics Astray.”
