Automating a critical internal process

I migrated a manual, error-prone Excel-based workflow into the company's legacy system, redesigning the flow to bring more control and reliability. The solution was tested with 4 key users and achieved 100% accuracy on its first live use.

Image 1: question input interface

Context

The process involved publishing an unofficial answer key for a major exam, under intense time pressure and with a need for absolute accuracy.

Before the solution, multiple team members were entering question data simultaneously into a single shared Excel file.

This led to:

  • Accidental data deletion

  • Confusion about task ownership

  • Delays in answer validation

  • High risk of human error at a critical moment

The problem

The main pain point was brought up by the team coordinator:

“We’re losing time and quality trying to work all at once on the same spreadsheet.”

Additional issues included:

  • The team was racing against the clock to be the first to publish the answer key.

  • Constant stress and fear of mistakes.

  • No clear control over who was entering what, and when.


Image 2: previous workflow

Research and discovery process

Method used: Direct interviews with the process coordinator and the actual users involved in the workflow.

Our goals:

  • Map the existing flow

  • Identify bottlenecks

  • Pinpoint common failure points

Tools:
Whiteboard sessions and quick flow simulations with POs, Developers, and Design.

Image 3: usability test for the new flow

Proposed solution

I designed a new flow that broke the process into controlled stages, ensuring:

  • Each person could only work within their specific step—eliminating overlap.

  • Sequential data entry—reducing the risk of overwriting someone else’s work.

  • A system that guided the user with simple validations and clear visual feedback (see the sketch after this list).
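
To make the stage-gating idea concrete, here is a minimal TypeScript sketch of how per-user, per-stage permissions and a simple validation could work. It is purely illustrative: every name in it (Stage, Question, canEdit, the example users) is an assumption of mine, and the real system was implemented by the development team within the legacy platform.

```ts
// Illustrative sketch of the stage-gating concept (names and rules are
// assumptions, not the production implementation).

type Stage = "entry" | "review" | "publication";

interface Question {
  number: number;          // e.g. 1..80 on exam day
  answer: string | null;   // e.g. "A".."E"
}

interface Assignment {
  user: string;
  stage: Stage;
  questionNumbers: number[]; // the only questions this user may touch
}

// Simple validation: a user can only write inside their own stage and range,
// and a question must be entered before it can be reviewed.
function canEdit(
  q: Question,
  user: string,
  stage: Stage,
  assignments: Assignment[]
): string | null {
  const a = assignments.find(x => x.user === user);
  if (!a) return "User has no assignment";
  if (a.stage !== stage) return `User is assigned to the "${a.stage}" stage`;
  if (!a.questionNumbers.includes(q.number)) return "Question is outside the user's range";
  if (stage === "review" && q.answer === null) return "Question has not been entered yet";
  return null; // no error: edit allowed, show positive visual feedback
}

// Usage example with hypothetical users
const assignments: Assignment[] = [
  { user: "ana",   stage: "entry",  questionNumbers: [1, 2, 3] },
  { user: "bruno", stage: "review", questionNumbers: [1, 2, 3] },
];

const q1: Question = { number: 1, answer: null };

console.log(canEdit(q1, "bruno", "review", assignments)); // blocked: not entered yet
console.log(canEdit(q1, "ana", "entry", assignments));    // null → edit allowed
```

The design intent this captures is that the interface never lets two people write to the same question at the same stage, which is what removed the overwriting problem of the shared spreadsheet.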

UX Deliverables:

  • Ideal flow mapping

  • Mid and high-fidelity wireframes

  • Interactive prototyping

  • UI componentization

  • Basic accessibility testing (keyboard navigation focus)

Image 4: mapping the new flow with questions and uncertainties

User validation (usability testing)

We tested with the 4 main users responsible for data entry and validation.

Test approach:
A lightweight, moderated usability test—without a strict script.
We gave a quick explanation of how to start and observed their natural navigation.

Results:

  • 3 out of 4 users were able to use the system intuitively from the start.

  • 1 user needed brief guidance but completed the task easily after that.

Overall feedback:
"Much easier and safer than before."

Success rate: 100% task completion and accuracy on the first live use.

Image 5: new interface modal for standard test color selection

Handoff and development

Throughout the project, I maintained close collaboration with POs and Developers, ensuring a smooth transition from design to build.

My contributions during this phase:

  • Design System standards: Ensured consistency by following established Design System guidelines

  • Page Anatomy: Documented the structural patterns for each page type

  • Detailed Screen Flows: Provided developers with end-to-end navigation flows and screen-to-screen interactions

Image 6: illustration of the design system and homepage anatomy

Post-launch results

  • First platform to publish the complete answer key on exam day

  • 80 questions entered with 100% accuracy

  • Zero rework

  • Zero errors

  • Positive feedback from all stakeholders involved in the process


Image 7: new workflow

Image 8: institutional Instagram post that implicitly validates the new work process and showcases its effectiveness; the post does not refer to the project directly, but it highlights the success of the answer key correction

Lessons learned

As a designer, this project reinforced the value of active listening—the real problem only became clear after interviewing the business team.

I also learned how early collaboration with POs and Developers can fast-track flow definitions that are feasible from both the design and technical perspectives.

Finally, it validated that not all usability tests need to be complex—a simple, behavior-focused observation test can provide highly actionable insights.

If I were to do it again…

If I could go back, I’d structure ways to track execution time before and after the solution to build stronger comparative metrics for the portfolio.

But even without that, the qualitative data and the final outcome made the efficiency gains clear.
