Optimized field mobile programming tool

Redesigning a cross-device tool that helps field technicians program more efficiently on-site.

Time

My Role

Team

Tools

May - Jul 2025 (10 weeks)

User Research
Product Design

1 Product Manager
3 Software Engineers
1 Product Designer (me)

Figma
FigJam (Technical workflow mapping)
Zeroheight (Design System management)

Overview

The context

Mobile Programmer is a field programming application used by technicians to install Meter Transmission Units and ensure proper network connections and data transmission. However, users reported that it was frustrating and difficult to use in the field.


Could a simple color change really fix an efficiency problem?

I didn’t think so.

Instead of jumping straight into visual fixes, I paused
and stepped back to understand the user’s real working context.

I dug deeper into the user context by reviewing previous on-site documentation, examining user feedback dialogs, and talking with as many people as possible from the market and training teams.

Since I couldn’t visit clients on-site within the project timeline, I focused on collecting and synthesizing available internal resources instead.

What field work really looks like

1. Challenging environment

Technicians frequently shift between bright outdoor sunlight and dark underground basements, requiring a display that remains visible and legible under extreme lighting contrasts.

2. Gloves and messy hands

Technicians often wear protective gloves or work with dirty hands, requiring touch interfaces that accommodate gloved interaction.

3. Limited screen space

Technicians frequently use Mobile Programmer alongside other platforms, which constrains the available screen real estate and demands an efficient information hierarchy.

4. Evaluated by work efficiency

Technicians are paid by the hour and judged by how quickly they can complete their jobs, which makes every delay or confusion directly impact their income.

Where the current design falls short in real field use

I then audited the current interface against this usage context and identified 3 usability gaps. In addition to the visual contrast issue raised by the product manager, I discovered 2 more critical barriers affecting workflow efficiency and accuracy.

Visual Accessibility Issues

The interface's low color contrast and single light theme fail under the extreme lighting shifts between bright outdoor sunlight and dark underground basements.

Touch Accessibility Issues

Touch targets are small and tightly spaced, making the interface difficult to operate with protective gloves or dirty hands.

Workflow Inefficiencies

Action items are buried in a single scrollable menu without clear hierarchy, wasting the limited screen space left when technicians run other platforms alongside the tool.

These issues don’t just affect technicians (my primary user),
they impact our business (the people who hired me).

I proactively communicated with my manager and internal stakeholders to understand how the Mobile Programmer fits into the larger business ecosystem, from field operations to client contracts and training processes. Through these conversations, I was able to connect the usability issues I identified to their real business consequences.

1. Slow work = Unhappy clients = Lost contracts
Field techs are paid hourly and are judged by how quickly they can complete jobs. If the software is confusing or slows them down, it hurts both their experience and our company's reputation; utility companies may think twice before renewing contracts.

2. High training cost from low intuitiveness
The system’s lack of intuitiveness leads to repeated training and support, increasing operational costs. In a high-turnover environment, the tool must be intuitive enough to stand on its own, not dependent on training to be usable.

Strategic alignment before redesign

To ensure alignment, I prepared a concise report that connected each usability issue to its broader business impact and illustrated why a simple color change wouldn’t solve the underlying problems.

By framing the findings through both a user and business lens, I facilitated a productive discussion with the product manager, aligned on the need for a deeper redesign, and gained buy-in before moving into the design phase.

The solution & impact

All interfaces were redesigned to optimize field operations with two core goals: improved programming efficiency, and enhanced visual and touch accessibility for all users in challenging field conditions.

Measurable impact: The redesign delivered significant improvements across all key metrics.
(The results come from usability testing conducted after the redesign, with 7 participants completing standardized tasks in simulated field environments.)

Key improvement #1

Enhanced Visual & Touch Accessibility and Navigation Efficiency

BEFORE
  • Low color contrast ratios failed to meet WCAG standards
  • Single light theme forced users to strain their eyes in dynamic environments
  • All action items were buried within a single scrollable menu without clear visual hierarchy
AFTER
Optimize for visibility

Dual-mode contrast system; both modes meet WCAG AA compliance.
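For context, WCAG 2.x defines contrast ratio from the relative luminance of the two colors, and AA requires at least 4.5:1 for normal text (3:1 for large text). A minimal Python sketch of that check, using the spec's published formula (the sample colors are illustrative, not the actual palette):

```python
def _linearize(channel_8bit: int) -> float:
    """Convert an 8-bit sRGB channel to linear light per WCAG 2.x."""
    c = channel_8bit / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb: tuple[int, int, int]) -> float:
    """Relative luminance L = 0.2126 R + 0.7152 G + 0.0722 B."""
    r, g, b = (_linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg: tuple[int, int, int], bg: tuple[int, int, int]) -> float:
    """(L_lighter + 0.05) / (L_darker + 0.05), in the range 1..21."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

def meets_aa(fg, bg, large_text: bool = False) -> bool:
    """AA threshold: 4.5:1 for normal text, 3:1 for large text."""
    return contrast_ratio(fg, bg) >= (3.0 if large_text else 4.5)

# Black text on a white background is the maximum possible ratio.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

A check like this can be run over every text/background pair in both theme modes to verify the dual-mode palette before handing it to engineering.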

Optimize for efficiency

Restructured the interface with a tile-based layout, organizing functions into clear, scannable groups.

Surfaced action items upon landing, eliminating the need for scrolling.

Increased touch target sizes and breathing room between elements.

Key improvement #2

Enhanced Task Guidance and Data Accuracy

BEFORE
  • Unclear navigation: Installers didn't know what to do next because the label was not clearly presented as the next step.
AFTER
Optimize for efficiency

Clear context within the multi-step workflow.

Added digit counts beside each field, enabling installers to verify correct entries even when terminology might vary.

Optimize for touch accessibility

Separate scan button that is easier to tap for installers wearing gloves.

Key improvement #3

Optimized Selection Pattern with Smart Defaults

BEFORE
  • Extra interaction cost: Users had to click to expand dropdowns even for fields with only 2-3 options.
  • Contextual inefficiency: For gas equipment installed only outdoors, users had to repeat the same "Outside" location selection every time.
AFTER
Optimize for efficiency

Replaced dropdowns with button groups for fields with limited options.

Smart defaults: for gas equipment, "Outside" is pre-selected to reduce repetitive selections.

Key improvement #4

Clear Operational Feedback and Structured Data Display

BEFORE
  • Operational results were buried in a small message bar, and all data appeared in one continuous list.
AFTER
Optimize for efficiency

Grouped related data into modular cards with a clear visual hierarchy.

Users can verify results at a glance without reading technical details.

Prioritized critical information at the top so users can quickly troubleshoot errors.

My Approach

Research

Design

Evaluate

Implement

  • Internal site visit documentation
  • Stakeholder interviews (customer support team, PM, internal engineer)
  • Daily dev team stand-ups
  • Identified core efficiency issues
  • Created high-fidelity prototypes
  • Internal rapid testing
  • Prototype iteration
  • Dev-ready assets (prototype, specification, design system, responsive guideline)

Process highlight: My approach when things didn't go as expected

Highlight #1

Couldn’t get access to the field techs — so I reconstructed context from multiple sources.

Ideally, I wanted to visit the field, conduct contextual inquiry, and observe workers using the tool in their real environment.

But in reality, I couldn't schedule site visits during this timeframe, so I pivoted to alternative research methods:

  • Dug into previous site visit documentation to understand existing pain points

  • Initiated conversations with training/marketing teams and sat in on training sessions to identify recurring struggles

  • Created visual diagram maps to synthesize findings, organize unknowns, and formulate targeted questions for stakeholders

By connecting the dots, I reconstructed the field context and built a clear, reliable picture of user frustrations.

Highlight #2

Not everyone understood design — so I translated UX into business impact to get buy-in.

Ideally, all stakeholders would see the value of user-centered improvements and support the design changes.

But in reality, the organization didn't yet have a mature design culture, and not all teams had strong design intuition. So I did internal persuasion work:

  • Translated UX proposals into business outcomes, showing how improvements would reduce support costs, improve customer retention, and win RFPs.

  • Aligned recommendations with PM and leadership priorities — connecting user pain points to their KPIs and business goals.

Before implementing, I first had to earn buy-in by speaking their language.

Highlight #3

Limited engineering resources — so I prioritized impact vs effort.

Even with stakeholder support, I faced implementation constraints: Engineering bandwidth was limited, and not every good idea could ship within the timeline.

So I prioritized strategically:

  • Worked closely with engineers to understand code structure and estimate implementation effort for each design change.

  • Identified quick wins, prioritizing fixes that required low effort but delivered high value.

  • Focused on frequency and impact — targeting pain points that caused the most user errors and business friction.

By making data-informed trade-offs, I maximized user impact within constraints.

Interested in the full case study?
Please feel free to reach out to learn more about this work and my process in detail.

Curious? Feedback? Collaboration?

I'd love to meet you :)

Feel free to grab a virtual coffee with me via Calendly!

Copyright © 2024 Yuhan Ke