California's expensive COVID-19 predictions were useless for rural areas. Here's why

©The Sacramento Bee

SACRAMENTO, Calif. — In mid-July, California’s pandemic forecast painted a bleak picture for El Dorado County.

The state’s so-called “model of models” predicted 45 people with COVID-19 would die within 30 days in the sparsely populated county. With cases surging statewide and more than half of counties on a monitoring list, it seemed all but certain the death toll would soar in the foothills.

But there was a problem. The county hadn’t yet even tallied a single COVID-19 death.

Ultimately, the prediction for El Dorado County was a total bust in the best way possible. Just two people in the county had died of COVID-19 by Aug. 15, according to state data.

California’s pandemic models this summer envisioned a catastrophic scenario in some of the state’s less-populated counties. But those forecasts proved to be wildly inaccurate, prompting several local health officers to dismiss the state’s forecasting website altogether — if they’d even heard of the effort at all, a Sacramento Bee review has found.

Carla Hass, an El Dorado County spokeswoman, said local public health officials did not use the state’s predictions “in any way” when deciding how to respond to the pandemic. The “wild swings” made the tool “not particularly useful.”

Sutter County, where only four deaths had been recorded in mid-July, would have 70 by mid-August, the model projected. Instead, only seven people died. Rachel Rosenbaum, a county spokeswoman, said the county’s senior-living centers fared well during the pandemic compared with other places, which might explain why deaths didn’t spike as forecast. The health team uses models only when they’re “properly validated,” she said.

And Solano County’s deaths were expected to nearly double from 31 to 59 between July 15 and Aug. 15, according to the forecast. Instead, the death count climbed by just 10.

Though the forecasts were much more accurate for statewide totals and for more populated counties, like Sacramento, the badly missed predictions cast doubt on whether the state can accurately estimate the course of a pandemic in rural counties.

Dr. Bela Matyas, Solano County’s health officer, said he was unaware of the state’s prediction website, largely because he doesn’t put much stock in modeling forecasts. Disease modeling rarely pans out, he said, because researchers cannot capture the dynamics inside one county, let alone between them. “Garbage in, garbage out,” he said.

“I have come to cynically believe that the only people who value models are modelers and politicians,” Matyas said. “People who work with disease on the streets just know how impossible it is to model what we see.”

Modeling experts say overzealous predictions from the spring were lost in translation. They proved to be inaccurate because people took unprecedented action by sheltering in place.

California’s Department of Public Health said it stands by the tools. Significant errors in smaller counties should be expected because projections do not account for changes in policy — like the distancing rules or mask mandates.

Plus, a state spokesperson said, prediction models regularly fail to chart the course of the disease in smaller counties because there’s simply less data to work with.

“There is utility to providing estimates for smaller counties even if the estimates are less reliable,” the state said. “CDPH has been in active communication with local health jurisdictions as well as several smaller local health jurisdictions on the reliability of county-level estimates.”

To be sure, California’s “model of models” — an ensemble that uses a blend of forecasts instead of one single tool — has fared much better in recent months at predicting aggregate deaths in more populated places, The Bee review found.
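The ensemble idea described above can be sketched in a few lines of Python. This is only an illustration of how blending forecasts damps a single model’s wild swing; the model names and numbers are hypothetical, not CalCAT’s actual inputs or method.

```python
# Minimal sketch of an ensemble ("model of models") forecast:
# combine several models' 30-day death projections for one county
# by taking the median, so one outlier model can't dominate.
# Model names and values below are hypothetical.
from statistics import median

forecasts = {
    "model_a": 45,  # an aggressive outlier projection
    "model_b": 9,
    "model_c": 5,
}

ensemble = median(forecasts.values())
print(ensemble)  # 9 -- the middle value, not the outlier
```

A median is just one way to blend; other ensembles weight models by their recent accuracy instead.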

The one-month forecast of deaths statewide was off by only about 1% in mid-August. And with a prediction of 181 deaths in Sacramento County, the model came remarkably close to the reality on the ground of 199.

California’s combination of models now forecasts nearly 1,000 people with COVID-19 will die each week through the end of September, pushing the state’s death toll toward 16,000 by the end of the month.

As for that wildly inaccurate projection for El Dorado County? The model has since adjusted its forecast. It now predicts four COVID-19 deaths by Oct. 2.

“It’s incredibly frustrating,” said Michael Saragosa, the mayor of Placerville in El Dorado County. “We get it. I want to trust the science. But the numbers are showing we just don’t have the rates or the deaths in our area.”

“The modeling was always going to be modeling that could be wrong.”

$813,000 for two modeling contracts

In the spring, individual models conveyed disturbing projections about outbreaks, hospital shortages and death tolls.

In a letter to President Donald Trump, Gov. Gavin Newsom said models indicated half of the state’s population would be infected within two months. Newsom’s office quickly clarified that the forecast did not take into account social distancing and economic shutdowns.

Models from leading research institutions like Harvard and the University of Washington showed how hospitals would be overrun, ventilators would run out, and bodies would stack up in major cities.

While some of that happened, as in New York City, where freezer trucks parked outside overwhelmed hospitals to hold bodies, few of those predictions bore any resemblance to the reality on the ground across the U.S.

That’s largely because the models were based on assumptions that did not — and could not accurately — account for widespread and unprecedented stay-at-home orders and the ripple effect precautions like that would have, experts say. The models often lacked context, and in a fast-paced rush for information, became grossly oversimplified.

To better fit what was occurring in the West, California pledged at least $813,000 in modeling contracts to Johns Hopkins University and Stanford, records show. It added another half-dozen modeling efforts to its contingent.

After months of requests from journalists and the public for the information, Newsom unveiled in late June a public website called the California COVID-19 Assessment Tool, or CalCAT, an open repository for anyone to download data about cases and projections and look for data-focused solutions.

“California is home to some of the world’s most accomplished researchers, technologists, scientists, acclaimed universities, and leading technology companies,” Newsom said at the time. “Today, I am opening more California data for them to help inform our efforts in combating this disease.”

The data portal marked the first real look under the hood of how California’s health officials used forecasts to predict the spread of COVID-19.

But since launching the data website, officials in recent months have cited models in public comments far less frequently than in the spring. Turned off by the oversimplified and incorrect spring predictions, the public also seems less interested in the forecasts.

Many people looked at death and infection forecasts like weather forecasts tracking a storm. When those didn’t pan out, people tuned the projections out, said Inga Holmdahl, a researcher at the Harvard T.H. Chan School of Public Health.

It’s a fundamental problem with public health forecasting, said Holmdahl, whose piece in the New England Journal of Medicine spelled out why modeling cannot be a panacea in public health. People can dramatically alter how disease spreads, leaving researchers scrambling to account for changes to social behaviors stemming from shutdowns or mask mandates.

“We can see an epidemiological model and respond,” Holmdahl said in an interview. “Our actions will change the course of the epidemic.”

Experts now are grappling with a scattershot approach to reopening schools and deciding the fate of indoor dining. “I think it’s just really hard to know what the impact of those is going to be,” Holmdahl said.

Early models oversimplified the situation, looked too far into the future and were miscommunicated to the public. Newer models built on more data stand a better chance of getting it right, said Nicholas Reich, a pandemic forecaster and associate professor of biostatistics and epidemiology at the University of Massachusetts Amherst.

“As cases surge again, we have been given another opportunity to let models inform our understanding about the future of this outbreak,” Reich wrote on Twitter. “No model is a perfect crystal ball, but their consensus (or lack of it) can tell us something.”

———

©2020 The Sacramento Bee (Sacramento, Calif.)