
Why the American Community Survey matters, especially for rural Minnesota

By Marnie Werner, Vice President, Research & Operations

Much of the data used to understand life in rural Minnesota comes from one source: the American Community Survey, an annual random-sample survey of Americans conducted by the U.S. Census Bureau. The ACS is important because the data it collects play a central role in how funds for federal, state, and even local services are distributed around the country, including in rural areas. Many of these programs rely on ACS data to determine who's eligible for services, which services get funding, and how those services are delivered.

In Minnesota, these services include rural housing programs, transportation funding, workforce development efforts, community development grants, and services for people with disabilities and the elderly. For small towns and rural counties, the ACS is often the only source of local data on these topics. 

The ACS is sent to a random sample of households and asks questions about housing, income, employment, disability, transportation, and more. Unlike most surveys, participation in the ACS is currently required by law, but recently, there has been discussion in Congress about making the ACS voluntary. Supporters of this idea raise understandable concerns about privacy, individual choice, and the burden of responding to a detailed survey. These concerns deserve to be taken seriously. 

At the same time, though, it’s important for rural communities and local governments to understand what would change if ACS participation were no longer required—and what the tradeoffs could mean for Greater Minnesota.

If the ACS became voluntary, research and experience both show that participation rates would almost certainly decline, and that is a problem in itself. The lower the participation rate, the less confident you can be in the overall accuracy of the picture the data paint.

The bigger issue here, though, is that response rates do not drop evenly. When it comes to voluntary participation in a survey, rural households, lower-income households, older adults, renters, and people with disabilities are all less likely to respond. When fewer people respond, especially in smaller communities, the data become less precise and less representative.

The important thing to remember is that unlike the once-a-decade census, the ACS doesn't count everyone. It takes a sample of the population, sending forms to about 1 in 40 households chosen at random across the U.S. The problem is that nonresponse isn't random: specific groups of people are consistently more likely to skip the survey, which skews the sample away from the population it is meant to represent.
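
To make that concrete, here is a minimal simulation sketch (in Python) of what uneven nonresponse can do to an estimate. The town size, income mix, and response rates are all assumptions invented for this example, not ACS figures:

```python
import random

random.seed(0)

# Hypothetical town of 2,000 households, 30% of them lower-income.
# (All figures here are invented purely for illustration.)
population = ["lower_income"] * 600 + ["other"] * 1400

def average_estimate(voluntary, trials=1000):
    """Average estimated lower-income share across many simulated surveys."""
    estimates = []
    for _ in range(trials):
        sample = random.sample(population, 50)  # roughly 1 in 40 households
        if voluntary:
            # Assumed response rates: 40% for lower-income households, 80% for others.
            sample = [h for h in sample
                      if random.random() < (0.4 if h == "lower_income" else 0.8)]
        if sample:
            estimates.append(sample.count("lower_income") / len(sample))
    return sum(estimates) / len(estimates)

print(f"True lower-income share:     {600 / 2000:.0%}")
print(f"Everyone responds:           {average_estimate(voluntary=False):.0%}")
print(f"Uneven voluntary response:   {average_estimate(voluntary=True):.0%}")
```

Run repeatedly, the full-response survey hovers around the true 30 percent share, while the voluntary version with uneven response rates settles well below it; no amount of averaging fixes a sample that systematically misses certain households.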

For rural Minnesota, this matters in very practical ways. Many grant programs require communities to provide data on their characteristics—such as income levels, housing cost burden, or disability prevalence—to qualify for funding. When ACS data become less reliable, some communities may no longer be able to meet those documentation requirements using standard data sources. In response, they may be asked to conduct their own local surveys, which can be expensive and difficult for small jurisdictions with limited staff. In other cases, communities may be passed over simply because the data are too unreliable to support their applications.

Weaker ACS data can also affect how funds are distributed statewide. Even when data quality declines, agencies still have to make decisions. Less reliable rural data increase the likelihood that funding formulas will rely on broader regional or metropolitan averages that do not reflect rural realities. Over time, this can make rural needs appear smaller or less urgent than they really are. That can result in funds going to more densely populated cities, where the survey results are much more reliable.

Disability data offer a real-world example. Programs rely on ACS data to plan transportation services, workforce supports, housing accessibility, and independent living programs. Rural areas often have higher disability rates, but the small samples drawn from these small populations mean those estimates already come with wide margins of error—the smaller the sample from a particular area, the less precise an estimate becomes. That in turn increases the risk that disability-related needs in rural communities become statistically invisible.
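
For a rough sense of how sample size drives precision, the textbook margin-of-error formula for a proportion tells the story. (The Census Bureau's actual estimation methods are more involved, and the 15 percent disability rate below is simply an assumed figure for illustration.)

```python
import math

def margin_of_error(p, n, z=1.645):
    """Approximate margin of error for a proportion p estimated from a
    simple random sample of n responses (roughly a 90% confidence level)."""
    return z * math.sqrt(p * (1 - p) / n)

p = 0.15  # assume 15% of residents report a disability
for n in (2000, 200, 50):  # shrinking numbers of responses
    print(f"{n:>4} responses: estimate {p:.0%} plus or minus {margin_of_error(p, n):.1%}")
```

With 2,000 responses, the estimate is good to within about 1.3 percentage points; with only 50 responses, it can swing by more than 8 points in either direction, which is the difference between a documented need and statistical noise.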

It’s also worth noting that making the ACS voluntary would not necessarily reduce costs or government involvement. In fact, the Census Bureau would likely need to spend more on follow-up efforts and statistical adjustments to compensate for lower response rates, all while still producing less reliable data.

The debate over the ACS is not about whether privacy and individual freedom matter. They do. The real question is how to balance those values with the need for accurate, locally relevant information. For rural Minnesota, where each data point carries more weight, participation in the ACS helps ensure that communities are counted as they really are—and that decisions affecting them are grounded in reality.

 
