
Four problems shaping the future of fact finding

We should all agree that good decisions are based on solid facts. But in a world where decisionmakers cite their own versions of the facts—or none at all—how can researchers and fact finders ensure their work doesn’t get ignored?

At the annual conference of the Association for Public Policy Analysis and Management, leaders of national research organizations discussed the evolution of their sector and the role of evidence in decisionmaking. For years, their organizations have used scientific methods to produce important insights that inform public policy.

Their discussion highlighted four emerging challenges to using evidence in policy debates, as well as areas for innovation and reasons for optimism.

We live in a time of distrust

Paul Decker, president of Mathematica Policy Research, noted a “clear, long-term trend” of increasing reliance on evidence in government agencies. “The government agencies that we work with have been generating evidence for years now,” he said.

The creation of the Institute of Education Sciences (IES) in the US Department of Education provides a great example of this trend. “Not only is IES creating evidence,” explained Decker, “they are collecting evidence that already exists, applying standards to that evidence, and drawing an interpretation of that evidence so it can be used by those crafting policy.”

On the other hand, Urban Institute president Sarah Rosen Wartell remarked that “sometimes the national conversation increasingly feels less and less attached to fact-based decisionmaking. We have to live with that and figure out where the opportunities are for evidence to make a difference.”

“We still have a long way to go,” Decker agreed. “Even at the federal level, there are agencies that have devoted very little resources toward relying on evidence to better support their mission.”

To ensure facts don’t get left behind, the panel agreed that the research community must continue to explain why evidence matters, especially as debates, audiences, and decisionmakers evolve.

Research audiences and policy influencers are more diverse and diffuse

Discussing how Urban’s work has changed, Wartell said, “In years past, our audience was the federal agency or foundation who assigned us the project. But increasingly, we are thinking about our primary audience as the people who are doing the work on the ground and the communities in which they live.”

“That’s part of a bigger change,” she added. “Informing ‘decisionmakers’ used to mean simply communicating with policymakers and staff in Washington. But changemakers today are a much broader set of people across the public, private, and social sectors.”

Decker described a similar transformation at Mathematica. “We wanted to make sure that our work had an impact on public policy, so we began to think of ourselves as more than detached scientists,” he said. “Let’s not let someone else do all the translation. Let’s do some of that interpretation ourselves.”

As audiences have transformed, so have debates. “We now have a better understanding at the front end of policymaking of the potential implications,” said Wartell. “We use microsimulation models to turn evidence into ‘decision-assist tools,’ which provide real-time information as debates unfold and predict the consequences of who wins, who loses, and geographical impacts.”

The role of Urban Institute models and analysis in debates on the repeal of the Affordable Care Act and tax reform shows how national audiences are eager to learn the potential ramifications of decisions before they are made.

Data and methods of analysis are evolving

New data sources are challenging researchers as they explore new tools like artificial intelligence, machine learning, and coding languages like R and Python. “It’s very hard to move into this new world and predict where we’ll wind up,” said Gordon Berlin, president of MDRC. “Not only do we have to decide how to use new types of data to do what we want to do, but we also have to see our way through things that are now possible that weren’t possible before.”

“It’s incumbent on us to modernize how we do our work, because our clients want information more quickly and in a way that’s digestible for everyone,” said Kathleen Flanagan, president of Abt Associates.

Flanagan added, “Evaluations are bringing implementation science to the table, along with the rigorous quantitative methods. Lots of new and interesting methods are coming together to figure out how to solve complex problems and figure out what works.”

These technological leaps require research organizations to find people with new skills and knowledge of these tools. Avi Benus, CEO of IMPAQ, noted that his organization often competes with Amazon for the same job candidates. “Perhaps we are both data science companies, rather than a retail company and a policy research company,” he said. “Change is permeating everything that we do, not only in our industry but in every industry right now.”

Technological change threatens to leave some behind

New computing tools like artificial intelligence and predictive analytics are being used to produce lists of likely consequences based on policy proposals or other factors. But what if, instead, Wartell asked, the process started by optimizing for specific outcomes?

“People are using these new tools in ways that have major implications that we don’t yet understand,” Wartell explained. “We’re starting to see that many tools don’t incorporate societally important factors, like race, into their dimensions, even though race is so profoundly linked to other variables and inequalities in our society.

“That’s why some civil rights leaders are concerned that technological advancement in some fields threatens to further harden disparities and disadvantages. So one critical challenge facing us is to better understand exactly how others are using these tools and the impacts they may have on our society.”