
Measuring administrative burden to promote equity in government services

Don Moynihan, the McCourt Chair at the McCourt School of Public Policy and co-director of the Better Government Lab at Georgetown University, joins Nava CEO and co-founder Rohan Bhobe in a conversation about reducing administrative burden.

Everyone has experienced it: waiting for hours on hold with customer service, traveling miles to an office for something that could be done remotely, or filling out an unnecessarily complicated form. These are a few examples of administrative burden, or the many obstacles set in place by bureaucratic processes. When it comes to government programs, administrative burden can be frustrating at best and life-threatening at worst. 

The tricky thing about administrative burden is that there were no standard ways to measure it, so governments didn't focus on addressing it until relatively recently. Don Moynihan and Pam Herd, professors at Georgetown University's McCourt School of Public Policy and authors of Administrative Burden: Policymaking by Other Means, have spent years researching, defining, and developing ways to better measure administrative burden. As Moynihan and Herd say, "what gets measured gets managed" in government.

At Nava, we’ve been deeply inspired by Moynihan and Herd’s work. That’s why we invited Moynihan to join our CEO and co-founder Rohan Bhobe in a discussion on how we can more effectively measure, monitor, and reduce administrative burden. An abridged and edited version of their conversation, which was facilitated by Nava’s Partnerships and Evaluation Lead, Martelle Esposito, is below. 

This conversation was made possible through a partnership between Nava and the Better Government Lab at Georgetown University's McCourt School of Public Policy. Read more about our partnership here.

Over the past year, how has awareness of administrative burdens in government changed?

Don: The awareness has dramatically increased. When Pam [Herd] and I started out, there wasn't a language to describe measuring administrative burden. The first phase of our work was developing a language for talking about aspects of administrative burden like learning costs, compliance costs, and psychological costs. That exercise continued until we published our book.

Right around the time the book came out, the concept of administrative burden began to find resonance in the civic tech community. I heard from so many civic technologists that our research helped to explain their impact to government and to the public. 

Then, the Biden Administration implemented policies, such as the executive order on customer experience, that aimed to reduce administrative burden. The administration saw a connection between reducing administrative burden and making government more accessible for everyone, thereby promoting equity. And so now, the federal government is quite interested in reducing administrative burden. We also see that interest trickling down to state and local government. It's very exciting.

Rohan, when did you start considering administrative burdens as an important measure of how services reach people?

Rohan: When Nava started in 2013, we were concerned with helping fix HealthCare.gov, and even though that was ultimately an effort to reduce administrative burden, we didn't have the framework for it at the time. But at a higher level, we knew administrative burden was something to be concerned about. Now that we're not responding to a crisis, we can talk about the more long-term goals we want to achieve to minimize administrative burden.

Don, can you tell us about the scale you developed for measuring administrative burden?

Don: If you want to achieve any goal, it helps to measure your progress toward it. There's an old adage in government that what gets measured gets managed. That's why we tried to come up with a scale that you could attach to user surveys. The scale, which currently has two versions, is very short and accessible. The first version only has one question and asks about the difficulty of the overall user experience. The other version has three questions and digs into learning costs, compliance costs, paperwork, and psychological costs. Those things are really tricky to measure, and they're also often overlooked. Now that we have the scale, we're working with organizations like Nava and government agencies to deploy it.
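As a rough illustration of how such a scale could be attached to a user survey, here is a minimal sketch in Python. The question wording, the 1–5 response range, and the simple averaging are assumptions for illustration only, not the actual instrument developed by the Better Government Lab:

```python
# Hypothetical sketch of scoring a short administrative burden scale,
# assuming a 1-5 Likert response range. The item wording and scoring
# rule below are illustrative assumptions, not the real instrument.

from statistics import mean

# Three-question version: one item each for learning costs,
# compliance costs, and psychological costs (assumed structure).
QUESTIONS = {
    "learning": "How hard was it to find out what you needed to do?",
    "compliance": "How hard was it to complete the required paperwork?",
    "psychological": "How stressful was the process overall?",
}

def burden_score(responses: dict[str, int]) -> float:
    """Average the 1-5 item responses into a single burden score."""
    for item, value in responses.items():
        if item not in QUESTIONS:
            raise ValueError(f"Unknown item: {item}")
        if not 1 <= value <= 5:
            raise ValueError(f"Response out of range: {value}")
    return mean(responses[item] for item in QUESTIONS)

score = burden_score({"learning": 4, "compliance": 3, "psychological": 5})
print(score)  # 4.0
```

Attaching even a simple score like this to every survey response is what enables the comparisons Moynihan describes next: the same number can be tracked over time or across programs.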

Our Nava Labs WIC team used the scale as a quantitative measure and in their usability testing to prompt qualitative feedback. And we're really excited to use this scale in more projects.

How does measuring administrative burden contribute to reducing it?

Don: The first thing it does is facilitate comparison. When you measure burden, you can learn what works to reduce it in one instance. Then, you can take that knowledge and try to implement a similar solution in another instance. Having cold hard data also helps if you're trying to persuade someone in government of the benefits of implementing a program or solution.

Rohan: Measuring administrative burden has the potential to make the invisible visible. A lot of issues with administrative burden are not in the headlines. Here are some examples: 

A family loses Medicaid coverage just weeks before their four-year-old's critical surgery due to new Medicaid renewal filing requirements they did not know about. 

A person with a chronic illness is denied food stamps because the state agency omitted their apartment number on important letters. It takes ten weeks of calling, documenting, and appealing to reverse the denial. 

A low-income mother drops out of WIC assistance for her infant because she fears catching COVID-19 on public transportation, which she must take to reload her EBT card in person. 

A man has his unemployment benefits withheld by the state because of an accidental overpayment nine years earlier. Those are real examples.

Measuring administrative burden has the potential to make these issues seen. It also has the potential to change the terms we use to talk about these issues from a policy perspective. Often, these conversations are framed in terms of fraud, waste, and abuse. If the framing doesn't change, the burden experienced by people won’t get addressed.

What value do we get by focusing on administrative burden outcomes versus software outputs, like working software, or even program outcomes?

Rohan: They’re related. Administrative burden outcomes sit at a higher level than software outputs, which are pretty non-controversial. In other words, we know what good and bad software looks like, and we can measure things like reliability, availability, scalability, and security. 

However, people may still have challenges with using well-designed software for a variety of reasons. That’s where measuring administrative burden comes in. There’s an opportunity to use technology as a tool to address administrative burden issues that stem from non-technology problems. If we think beyond software outputs and consider how people experience software, we’ll be more effective at achieving our mission. 

Don: That reminds me of the sociologist Robert Merton, who wrote about goal displacement in government back in the 1940s. Goal displacement is the idea that the nature of government directs bureaucrats’ attention away from mission and outcomes and toward rules. 

Government tried to deal with that in the ’80s and ’90s by developing performance measurement systems that aimed to direct people's attention to performance indicators. I think that worked to a degree, but it also led to a lot of measurement for the sake of measurement. Now, we’re trying to break the tendency of systems to direct attention away from the people that matter. We’re doing this by centering on people’s experiences when we measure burden.

How does reducing administrative burdens fit into Nava’s vision of being a human-centered systems integrator?

Rohan: Regardless of how technology changes or how Nava changes, people and their experiences will always be at the center of what we do. Ensuring that people have positive experiences with government doesn’t just entail honing a single technological tool or program. It requires understanding how the public experiences the government as a whole, and measuring administrative burden can get us closer to that understanding.

What are some ways technologists can think about measuring the impact of their work on administrative burdens? 

Don: My ideal future does not look like past innovation in government. Traditionally, an outside think tank or research institution might transfer an idea to government, but very slowly. Instead, it would be incredibly helpful for researchers like myself, for people working in technology, and for government to see a culture of innovation and sharing that occurs with shorter and quicker feedback loops. It should be rigorous and centered on the same set of values around trying to reduce burdens and improve government services. That will be key to sharing practices. Something that worked in Massachusetts may or may not work in Colorado, but at least there'd be evidence for trying to implement something different as opposed to sticking to a status quo that often leaves people behind.

That applies to climate change solutions as well. One of the biggest challenges is that governments have varying levels of will and varying levels of resources to implement climate solutions. Doing so requires overcoming a lot of red tape. Reducing administrative burden could help those governments implement climate solutions. 

What does success look like to you?

Rohan: Long-term cultural change is definitely part of success. We have to keep pace with a world that's rapidly evolving, and that's not possible without a cultural shift. Another aspect of success is measuring administrative burden to minimize it. And we don't have forever to do this—government needs to be capable of operating in the modern age, and quickly, because the public doesn't have infinite patience.

What are you most looking forward to about the Nava and Better Government Lab partnership?

Rohan: There's the potential for a strong feedback loop between implementation and evaluation in a rigorous and sophisticated way. I think this can help us achieve our vision of success. Our teams at Nava are great at building human-centered services and technology, and Don and his team bring a lot of perspective and expertise that we don't have. That's a great foundation on which to build a complementary partnership. Once that feedback loop gets going, we can improve how we measure administrative burden based on our implementation experience. In other words, we can enable a virtuous cycle between implementation and measurement.

Don: When people with different skill sets share a goal, it’s possible to achieve a lot very quickly.

Published April 30, 2024
