An Environmental Impact Statement was written for this liquefied natural gas terminal, with a section covering impacts on local fishermen.
How would you find a copy of it?

Project Metadata
Tagline: Fulfilling NEPA’s promise through the power of data science
Purpose: Connecting science and people to environmental decisions by making a trove of scattered or lost government documents searchable.
Company: University of Arizona, Udall Center for Studies in Public Policy
Time frame: October 2019 to January 2023
Responsibilities: User testing, UI design
Frameworks: React (developer), WordPress (UX designer)
Tools: Pencil, Figma, Photoshop
Key metrics: Search success rate. Wide adoption by 6 persona groups.
Collaborators: An interdisciplinary team of computer scientists, developers, environmental and social scientists, public policy experts, lawyers, and students

Introduction: the background

Congress passed the National Environmental Policy Act (NEPA) almost unanimously in 1969, marking a significant milestone in environmental policy. Over the decades, NEPA has generated tens of thousands of environmental reviews, spanning diverse infrastructure projects such as highways, mines, energy production and transmission lines, and land management. These reviews encompass a wide array of topics that address the human impact on the environment.

Environmental impacts are complex. Assessing the environmental impacts of infrastructure projects is a complex, cross-disciplinary challenge with multiple causes and significant uncertainty. Data from past environmental impact statements can provide valuable insights into solving these problems.

Historical data management challenges. However, in 1969, the practicality of storing and making this invaluable data available for contemporary problem-solving was severely limited. The information accumulated in physical libraries, filed away in cabinets, and was dispersed among various government agencies, rendering it inaccessible and unwieldy.

Opportunities of the digital age. With the advent of the internet, digital media, and, most recently, machine learning, there exists the opportunity to collect, categorize, organize, and create an online searchable repository for this treasure trove of scientific data. Such a transformation could make this wealth of knowledge available to anyone interested in environmental policy, be they decision-makers or the general public.

1. Problem definition

NEPA environmental reviews are currently difficult or impossible to find, yet they are a valuable source for solving contemporary problems involving the impacts of development on the human and natural environments.

2. Proposed Solution

Employ modern data science tools to locate and systematically organize these scattered documents. Create a user-friendly search system and user interface that enables individuals, including government officials and private citizens, to access the data easily for improved decision-making in environmental policy.

My role was UX researcher and UI designer. I collaborated with a team of around 20 people: a developer, an editor, data scientists, environmental and social scientists, public policy experts, lawyers, and students.

Discovery phase

1. Stakeholder interviews

I began by talking with project managers. Listening to them, I gained insights into how they perceived the problems arising from the difficulty in locating NEPA documents:

Accessibility Matters: As one project leader put it, “If you can’t find something, it might as well not exist.” Another explained that, “Being available and easily available are two different things. That’s what NEPAccess will do. The public, which means everybody, has the right to understand what’s going on in each process and have a say.”

Navigating Complex Documents: A significant pain point stems from the sheer size and complexity of these documents. “If there’s a pain point, it is likely to be about finding similar documents, or just the fact that the documents are so huge that you don’t know exactly what’s going on in them. It can be really challenging to navigate them and get to the information that you need. NEPA users may not even know that they have those pain points because people can’t really imagine it being any different.”

Understanding User Needs: One of the challenges lies in anticipating how people will utilize this data effectively. Once we understand these needs, their job to be done, our focus will shift to ensuring that NEPAccess can meet those requirements.

These insights emphasized the importance of simplifying the search process, based on the needs and goals of a diverse range of users.


2. To understand: try it yourself

To understand the current state of NEPA documents, I made some searches on Google for projects in my area.

I was optimistic. How hard could this be? I found the website hosted by the government agency tasked with archiving environmental impact statements. I decided to look for the EIS for a controversial copper mine that has been in the local news for the past decade.

PAUL: typed Rosemont mine into the search box because I had heard that phrase on the news. 
System: No records met the search criteria

How can that be? I know the document exists and NEPA requires it to be made public.

I clicked a link to a library site that had a large collection of EISs. I got the same (lack of) results. Did I do something wrong? I felt confused and anxious. Surely a famous mining project like this would at least return a catalog record.

I wrote to the reference librarian listed in the sidebar. She wrote back, explaining that the document was on CD-ROM. She did give me the full title to see if that helped my search: “Final environmental impact statement for the Rosemont copper project: a proposed mining operation, Coronado National Forest, Pima County, Arizona.” Most EIS titles are long like that.

PAUL: now armed with the title, pasted it into the search box labeled “title.”
System: No records met the search criteria.

PAUL: tried two words that were in the title: Rosemont mining
System: No records met the search criteria.

PAUL: tried Rosemont copper.

Why did Rosemont mining return no results while Rosemont copper was successful? All those words were within the title.

Apparently, the two search words had to appear as a consecutive phrase in the title, not just somewhere within it, or the document would not be found. I downloaded both the draft and final EIS.
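The difference between those two searches can be sketched in a few lines of code. This is a hypothetical illustration of the two matching behaviors, not the catalog’s actual implementation: exact-phrase matching only finds a query that appears as a consecutive substring of the title, while keyword matching, which Google has trained users to expect, matches each word independently.

```python
# Hypothetical sketch; the real catalog's matching logic is unknown.

TITLE = ("Final environmental impact statement for the Rosemont copper "
         "project: a proposed mining operation, Coronado National Forest, "
         "Pima County, Arizona")

def phrase_match(query: str, text: str) -> bool:
    """Legacy catalog behavior: the query must appear as one consecutive phrase."""
    return query.lower() in text.lower()

def keyword_match(query: str, text: str) -> bool:
    """Google-style expectation: every query word appears somewhere in the text."""
    lowered = text.lower()
    return all(word in lowered for word in query.lower().split())

print(phrase_match("Rosemont copper", TITLE))   # True: consecutive in the title
print(phrase_match("Rosemont mining", TITLE))   # False: the words are not adjacent
print(keyword_match("Rosemont mining", TITLE))  # True: both words occur in the title
```

Under phrase matching, a user has to guess which adjacent word pairs happen to appear in a long bureaucratic title; under keyword matching, any memorable words from the title will do.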


3. Usability testing

Search is difficult to design. A search user interface (UI) must be simple: it has to translate computer logic into elements that are either intuitive to humans or easily learned through plain language. Because people are so used to Google searches and take that speed and invisible power for granted, we needed a high success rate. If people could not find relevant search results, or got no results at all, the site would lose credibility.

I set up a series of usability studies to learn how our audience searched and used NEPA documents. These were my early research questions:

      1. What are the users’ Jobs to be Done? (Gather goals and context)
      2. How do they currently do this? (Analyze workflow)
      3. What could be better about how they currently do this? (Find opportunities)
      4. Does their level of NEPA domain knowledge affect the usability of a search interface?
      5. What common search psychology do users share? (Identify shared patterns)



From system models to mental models

This project was a unique opportunity to work on an innovative product from the ground up: a tool that would benefit the larger world in a pragmatic way, making science available to a complex legal process to make it more efficient and less costly, and potentially producing better social and environmental policy decisions.

When I came on, I started from scratch with the user interface, beginning with the developer’s first screen. As the developers invented new features, I tested each iteration with users through usability walk-throughs, a long incremental process that paid off. The users I interviewed became part of a growing community that challenged and supported us, suggested new ideas, and eventually spread the word to their colleagues.

The yard-sale effect

The most challenging part of this project was explaining to funders and administrators why we built this and how making lost and scattered documents available is a game changer for decision-making. Yet, during user interviews, people immediately grasped the usefulness of the system over the hacks and tedious work-arounds they were used to. As in a yard sale, laying a complex set of items out where it’s easy to see everything allows a synergy to occur where new knowledge is created.

Cognitive load

People don’t want or need to think about how the system works; they have research questions to answer. By simplifying a user’s critical path through an interface, and knowing what they need, a designer frees up a user’s mental energy for additional creative thinking about the problem they want to solve.

Tech savvy?

The question “How tech-savvy are our users?” often comes up from developers in team meetings. Domain knowledge is often lumped together with technical knowledge. To me, this assumption is not as useful as understanding basic human psychology. I found that making software simple and usable for everyone makes it work better for highly experienced professionals as well. We all share a common set of human sense-making capacities.

I listened to a legal research professor describe her students’ universal difficulty understanding a basic advanced-search interface. A partner at a top law firm had difficulty with keyboard search modifiers simply because the specialized databases he was accustomed to used different keystrokes. I tried to accommodate all of these insights.


Usability testing has a subtle magic. Observing people carry out a realistic task on a system generates insights that melt through our assumptions and opinions. User behavior is often surprising. While watching people move through a scenario, I often think to myself, “I never would have thought of that until I saw them do it.” Human behavior forms the “most likely truth” that guides design choices and even generates new directions.


Participants. I interviewed around 30 people from 5 different personas or user groups.

Conversions. A year and a half after the site went public, and still in the beta testing phase, we had 429 registered users who logged 1,158 searches. Downloading is considered a “conversion,” and people downloaded 2,129 environmental reviews.

Search success metrics. This is a next step, interrupted by a funding pause.

A testimonial from a team member

Laura shared with me your findings based on the three interviews conducted thus far. Excellent overview and analysis of priority fixes! … First, we learned SO much from these interviews. Thank you so much for organizing them and thinking through the resulting changes that need to be made. I also wanted to support the recommendation that the programmers work on some of these high priority fixes before we schedule additional interviews in September… With some of these basic items fixed now, we can then learn more about how users are likely to dig more deeply into more complex searches.

–Kirk Emerson, Professor of Practice in Collaborative Governance at the University of Arizona School of Government and Public Policy






