Integrating eligibility and enrollment systems is a big vision. It requires enormous structural change and years of work to reorient an entire health and human services agency around the people it serves — instead of the policy, regulations, and systems that have been in place for decades. It’s a challenging, high-risk endeavor with an annual expense of $6.5 billion, according to a McKinsey analysis. But it also has the potential to help more than one-third of the US population better access critical benefits like healthcare, unemployment insurance, childcare, and other daily necessities. It’s exactly the kind of work that Nava was founded to do.
Nava has partnered with Vermont, California, Massachusetts, Nebraska, and other state and federal agencies responsible for administering critical public benefits. Agencies working toward simpler administration of these benefits often share the same goals:
Make it easier for people to apply for, enroll in, and stay enrolled in benefits.
Increase efficiency for staff and improve the quality of service.
Our work in Vermont included designing and building three components that are essential to any online benefits portal:
Consumer authentication: allowing beneficiaries to easily create an account, log in, and reset their passwords
Document uploader: allowing beneficiaries to submit documents (like tax returns or pay stubs) that are needed to determine eligibility
Multi-benefits application: allowing beneficiaries to apply for multiple benefits programs in one place
As we further our integrated eligibility and enrollment efforts, we aim to use our learnings — some of which are outlined below — to help other health and human services agencies hit their goals of improving access, outcomes, user experience, accountability, and quality of services for the people of their states. This work has already been recognized by the Aspen Institute in its report Building the tech-enabled safety net and in the book Power to the Public by Tara Dawson McGuinness and Hana Schank.
Start small and avoid “big bang” launches
Based on our experience working alongside the CMS team to help fix HealthCare.gov, we know that releasing too many things to too many people all at once often results in failure. So, we take the opposite approach, launching the smallest feature increments possible to the smallest number of people possible.
In Vermont, only three Vermonters used the document uploader tool on its first day live. Similarly, the benefits application was available to just a single call center worker on the day we released it, and we slowly scaled up access from there. This approach eliminated the chaotic launch days the Vermont team wanted to avoid. It also made it easier for the development team to quickly find and address the source of any bugs, because multiple people weren’t flooding support channels to report the same issue. Before launching the document uploader and other products to thousands of healthcare users, we worked with healthcare navigators and assistants as beta users to catch issues and report bugs.
We took this same approach to launching benefits portals in California and Massachusetts. In our partnerships with other states, we recommend that teams take a multi-pilot approach to avoid a “big bang” launch: release a small amount of useful functionality to a small group of people (just large enough to catch any issues), then incrementally expand both the functionality and the audience. This minimizes risk by limiting each release’s impact on an agency’s staff, the people using the service, and the delivery team launching it.
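One lightweight way to gate this kind of incremental release is an allowlist of pilot users that grows over time. The sketch below is illustrative TypeScript under that assumption; it isn’t a description of how the Vermont rollout was actually gated.

```ts
// Hypothetical allowlist gate for an incremental release. The emails and the
// gating mechanism are illustrative only.
const pilotUsers = new Set<string>([
  'call.center.worker@example.gov', // day one: a single call center worker
  // Later releases add healthcare navigators, then broader groups of users.
]);

// Show the new application only to people in the current pilot group.
function canSeeNewApplication(userEmail: string): boolean {
  return pilotUsers.has(userEmail);
}

// As confidence grows, expand the pilot instead of flipping it on for everyone.
function expandPilot(newEmails: string[]): void {
  newEmails.forEach((email) => pilotUsers.add(email));
}
```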
Rigorously prioritize to avoid scope creep
Vermont was operating on a tight time frame. To ensure that the web application was released on time, we aligned with the Vermont team on both the project scope and the plan for launch. We mitigated scope creep by agreeing, as a state and vendor team, to build the leanest, simplest version of every feature needed for the web application launch, and to defer anything not needed for the initial launch so it could be prioritized for later releases.
We often see other vendors and government teams adopt this approach only in the weeks or months leading up to a release; we strongly recommend taking it from day one of the project.
Build on proven design patterns and principles
The user experience was of the utmost importance to the state of Vermont, just as it is to many other state and federal partners Nava works with. To accelerate development while maintaining quality, we’ve found that the best approach is to reuse human-centered design practices, proven patterns, and modern development frameworks (such as React) to build progressive web apps for users. In Vermont, this approach enabled our team to build quickly and flexibly, adapting the solution to the changing needs of our partners.
The Vermont and Nava teams recognized early on that the benefits application is a highly policy-driven form, and that we should focus on learning from existing research and reusing as many existing collection patterns as possible. This approach maximized the time our team could spend understanding and meeting the state’s specific business and policy requirements. Two core resources we benefited from were the expertise of our partners in the Integrated Benefits Initiative (including Code for America) and the CMS Design System.
The CMS Design System provided a strong set of foundational, coded UI components and patterns that gave us the flexibility to build the application quickly. We partnered with Code for America on this project, and leveraged their expertise from designing a healthcare and economic assistance application for the state of Michigan. That experience, plus Nava’s previous work redesigning the benefits application on HealthCare.gov, provided the foundation for the design of the Vermont benefits application.
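As a rough illustration of what building on those coded components can look like, here is a short React/TypeScript sketch that assembles one application page from the design system’s published package. The component and prop names reflect our reading of @cmsgov/design-system and may differ by version; this is not the actual Vermont code.

```tsx
import React from 'react';
// Coded, accessible UI components from the CMS Design System.
// Exact component and prop names may vary by design system version.
import { Button, TextField } from '@cmsgov/design-system';

// One application page composed from reusable design system pieces.
export function ApplicantNamePage() {
  return (
    <form>
      <TextField label="What is your first name?" name="firstName" />
      <TextField label="What is your last name?" name="lastName" />
      <Button type="submit">Save and continue</Button>
    </form>
  );
}
```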
Three key design patterns in benefits applications help users complete long forms more quickly and easily:
Include yes/no questions, with skip logic
The majority of benefits applications ask about resources, income, and expenses. Usability testing and secondary research have shown that we should aim for “tapping over typing.” This means asking yes/no questions with conditional logic so people are only asked follow-up questions where applicable. For example, instead of asking people to list their vehicles, we first ask “Do you have a vehicle?” and only show follow-up questions like “Tell us about your vehicle” if they answer “yes.”
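Here’s a minimal React/TypeScript sketch of that pattern, with the follow-up question rendered only when the yes/no answer makes it applicable. It’s illustrative rather than the Vermont application’s actual code.

```tsx
import React, { useState } from 'react';

// "Tapping over typing": a yes/no question that only reveals follow-up fields
// when they apply to the person filling out the application.
export function VehicleQuestion() {
  const [hasVehicle, setHasVehicle] = useState<'yes' | 'no' | null>(null);

  return (
    <fieldset>
      <legend>Do you have a vehicle?</legend>
      <label>
        <input
          type="radio"
          name="hasVehicle"
          checked={hasVehicle === 'yes'}
          onChange={() => setHasVehicle('yes')}
        />
        Yes
      </label>
      <label>
        <input
          type="radio"
          name="hasVehicle"
          checked={hasVehicle === 'no'}
          onChange={() => setHasVehicle('no')}
        />
        No
      </label>

      {/* The follow-up question only appears when the answer makes it applicable. */}
      {hasVehicle === 'yes' && (
        <label>
          Tell us about your vehicle
          <input type="text" name="vehicleDescription" />
        </label>
      )}
    </fieldset>
  );
}
```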
Limit questions to one per page, where possible
Through usability testing and secondary research, we’ve learned that taking a one-question-per-page approach increases people’s comfort and confidence and reduces their cognitive burden while moving through an application. This approach does increase the total number of application pages, but we have not seen this hinder people because they’re able to absorb information in small chunks and complete the application more quickly.
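One way to model this, sketched below in TypeScript, is a flat list of single-question pages whose applicability is driven by earlier answers, so the one-question-per-page flow and the skip logic above work together. The page definitions here are hypothetical.

```ts
type Answers = Record<string, string | undefined>;

interface Page {
  id: string;
  question: string;
  // Return false to skip this page based on earlier answers.
  applies?: (answers: Answers) => boolean;
}

// Each page asks exactly one question.
const pages: Page[] = [
  { id: 'has-vehicle', question: 'Do you have a vehicle?' },
  {
    id: 'vehicle-details',
    question: 'Tell us about your vehicle',
    applies: (answers) => answers['has-vehicle'] === 'yes',
  },
  { id: 'has-income', question: 'Does anyone in your household have income from a job?' },
];

// Find the next applicable page after the current one.
function nextPage(currentId: string, answers: Answers): Page | undefined {
  const start = pages.findIndex((p) => p.id === currentId) + 1;
  return pages.slice(start).find((p) => p.applies?.(answers) ?? true);
}

// Example: answering "no" to the vehicle question skips the details page.
console.log(nextPage('has-vehicle', { 'has-vehicle': 'no' })?.id); // "has-income"
```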
Show people everything they need to complete and indicate their progression through the application
Before starting an application, people want to have an organized list of all the information needed from them. After starting an application, people also want to know which steps are completed and which are incomplete. As an example, in our work on HealthCare.gov and in Massachusetts, people have had success with the application “Checklist page.” The Checklist page organizes application questions into logical sections so that people know upfront the categories of information they’ll be asked about. The Checklist page also contains statuses to communicate when a section has not been started, when it’s in progress, and when it’s completed. Find more details on guiding people through a complex process in content strategy for user onboarding.
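The sketch below shows one way a Checklist page might derive those section statuses from a person’s answers. The section names and derivation logic are illustrative, not the actual HealthCare.gov or Massachusetts implementation.

```ts
type SectionStatus = 'not started' | 'in progress' | 'completed';

interface ChecklistSection {
  title: string;
  questionIds: string[];
}

// Application questions organized into logical sections, as on a Checklist page.
const sections: ChecklistSection[] = [
  { title: 'Your household', questionIds: ['names', 'dates-of-birth'] },
  { title: 'Income', questionIds: ['job-income', 'other-income'] },
  { title: 'Expenses', questionIds: ['housing-costs', 'childcare-costs'] },
];

// Derive a section's status from how many of its questions have been answered.
function sectionStatus(section: ChecklistSection, answered: Set<string>): SectionStatus {
  const done = section.questionIds.filter((id) => answered.has(id)).length;
  if (done === 0) return 'not started';
  if (done === section.questionIds.length) return 'completed';
  return 'in progress';
}

// Example: one household question answered, nothing else started yet.
const answered = new Set(['names']);
sections.forEach((s) => console.log(`${s.title}: ${sectionStatus(s, answered)}`));
```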
Define baselines for key metrics to enable objective decision-making
Vermont had a clear understanding of the metrics that were important to their team and the people using their service. But after digging through multiple data sets and automated outputs from their legacy economic assistance system, we quickly realized that it did not provide the reliable data and reporting needed to define baselines for each metric.
In the absence of the automated reporting we sought, our team realized that the best way to determine the baseline was to follow the methods used by federal partners to conduct quality control: to review a set of sample cases and pull out key metrics.
The state’s most important goal was decreasing the number of days it took to provide an eligibility determination to a Vermonter who applied for benefits. After reviewing the current-state business process, we identified the stages in the process that required actions by caseworkers and Vermonters, then reviewed a sample set of 50 cases, pulling out the date each action occurred. Adding this information to a spreadsheet gave us data we could be confident in and a stage duration analysis (which measures how long an eligibility application spends in each stage of the determination process) that we could baseline against.
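Here’s a simplified sketch of that kind of stage duration analysis. The stage names and record shape are hypothetical, but the arithmetic is the same: take the date each action occurred, compute the days between consecutive actions, and average each stage’s duration across the sample of reviewed cases.

```ts
interface CaseRecord {
  applicationReceived: string;   // ISO dates pulled from the manual case review
  documentsReceived: string;
  interviewCompleted: string;
  determinationSent: string;
}

// Each stage runs from one recorded action to the next. Labels are illustrative.
const stages: [keyof CaseRecord, keyof CaseRecord, string][] = [
  ['applicationReceived', 'documentsReceived', 'Waiting on documents'],
  ['documentsReceived', 'interviewCompleted', 'Waiting on interview'],
  ['interviewCompleted', 'determinationSent', 'Waiting on determination'],
];

const daysBetween = (start: string, end: string): number =>
  (new Date(end).getTime() - new Date(start).getTime()) / (1000 * 60 * 60 * 24);

// Average each stage's duration across the sample of reviewed cases.
function stageDurations(sample: CaseRecord[]): Record<string, number> {
  const result: Record<string, number> = {};
  for (const [start, end, label] of stages) {
    const total = sample.reduce((sum, c) => sum + daysBetween(c[start], c[end]), 0);
    result[label] = total / sample.length;
  }
  return result;
}
```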
We then had our Vermont partners repeat this effort for people who used the web application, enabling us to make an accurate, apples-to-apples comparison of data and feel confident in what we were learning and reporting up to executives.
We highly recommend that other states and agencies focus on what they know matters to their users (both beneficiaries and staff) and do whatever is necessary to quantify baselines, conduct experiments, compare the outputs, and iterate from there.
Ensure integration of new and legacy systems
Successfully launching a new benefits portal — or any new consumer product — hinges on its integrations with legacy systems and existing business processes. Since this is such a critical component of a new web application, launch strategies should prioritize the development, testing, and operational planning for the new release as early as possible.
We’ve found that the path with the lowest risk in releasing any new consumer products is to:
limit changes to staff workflow as much as possible,
build on existing integrations with legacy systems,
and release incrementally.
For example, the new Vermont benefits portal sent documents and benefits applications to the legacy case management system using the same API endpoint as their legacy benefits portal. This reuse limited the amount of work required of the state team that maintained the legacy case management system, and it limited the operational impact on staff.
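In code terms, the pattern looks something like the sketch below: the new portal submits through the endpoint the legacy portal already calls rather than introducing a new integration. The URL and payload shape here are hypothetical.

```ts
interface BenefitsApplicationPayload {
  applicantId: string;
  programCodes: string[];   // programs the person is applying for
  submittedAt: string;      // ISO timestamp
}

// Submit through the same endpoint the legacy portal already uses, so the
// legacy case management system and its maintainers need no changes.
async function submitToLegacyCaseSystem(payload: BenefitsApplicationPayload): Promise<void> {
  // Hypothetical URL; in practice this is whatever the legacy portal calls today.
  const response = await fetch('https://legacy.example.gov/api/applications', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),
  });
  if (!response.ok) {
    throw new Error(`Legacy case system rejected the submission: ${response.status}`);
  }
}
```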
Working alongside partners like Vermont and other states, we’ve seen these strategies help agencies reduce risk, respond more quickly to people’s needs in the near term, and continue to do so efficiently and cost-effectively, even as policies and needs change. We hope they’ll be useful for you, too.