The Challenge
Supercharge operator instincts
Chemring Group’s handheld ground-penetrating radar (GPR) allows operators to “see” explosives that have been buried underground. Naturally, confidence and clarity are non-negotiable for a tool like this.
When we were brought in, the engineering team behind the product had hit a wall:
The UI felt cluttered, the hardware was rigid, and operators were overwhelmed, especially in mission-critical environments. Our initial work simplifying the on-screen experience helped, but it became clear that a screen redesign alone wouldn’t be enough.
The real opportunity? Going beyond what operators can “see”: moving past pixels to build a truly human-centered experience that unites sight, sound, and touch. Only by engaging users’ full senses could we ensure rapid, instinct-level responses to potential threats.

The Work
Here’s how we helped Chemring
We began by mapping out every user action the tool supported—21 in total. For each, we cataloged the visual, aural, and haptic feedback users encountered.
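One way to picture that audit is as a simple catalog keyed by action, with every cue recorded per modality. The sketch below is purely illustrative (TypeScript, with hypothetical action names; not the tooling we used): it walks the catalog and flags any action that lacks feedback in one or more senses.

```typescript
// Illustrative only: hypothetical action names, not the tool's real function list.

type Modality = "visual" | "aural" | "haptic";

interface FeedbackCue {
  modality: Modality;
  description: string; // what the operator sees, hears, or feels
}

interface UserAction {
  id: string;              // e.g. "start-scan" (hypothetical)
  feedback: FeedbackCue[]; // every cue the action currently triggers
}

// Walk the catalog and report which actions are missing feedback
// in one or more modalities.
function findCoverageGaps(catalog: UserAction[]): Map<string, Modality[]> {
  const allModalities: Modality[] = ["visual", "aural", "haptic"];
  const gaps = new Map<string, Modality[]>();
  for (const action of catalog) {
    const present = new Set(action.feedback.map((cue) => cue.modality));
    const missing = allModalities.filter((m) => !present.has(m));
    if (missing.length > 0) {
      gaps.set(action.id, missing);
    }
  }
  return gaps;
}
```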
From there we rebuilt the experience from the ground up, both digitally and physically.
Highlights:
Multisensory Design: In low-light or covert conditions, visual and audio cues may not be an option. By studying tone ranges and vibration detection thresholds, we designed parallel haptic, audio, and visual signals unique to each function of the tool (see the sketch after this list). Because the signal sets harmonize and reinforce one another, operators can interpret critical feedback with greater confidence without relying on any single signal type.
Ergonomic Input Overhaul: The original control system used four interdependent rocker switches, producing up to 81 possible combinations. Drawing on ergonomic research into optimal thumb reach, we simplified this to an intuitive 8-button layout with just 17 defined sequences.
Adaptive Layouts for Handedness: Because the device is operated with one hand, we also created a mirrored display system that adapts to left- or right-handed use, improving accessibility and ease of use.
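To make the parallel-signal idea concrete, here is a minimal sketch of how a per-function signal set could be represented; the function names, tones, and vibration patterns are invented for the example, and the real cue definitions came from the tone-range and threshold studies described above. It also shows how meaning survives when some channels are switched off, as in covert use.

```typescript
// Illustrative only: function names, tones, and vibration patterns are invented.

interface SignalSet {
  visual: string; // e.g. an on-screen icon or colour state
  audio: string;  // e.g. a tone within the studied range
  haptic: string; // e.g. a vibration pattern above the detection threshold
}

// Each tool function gets its own distinct trio of cues.
const signalSets: Record<string, SignalSet> = {
  "threat-detected": { visual: "red-ring",    audio: "rising-tone", haptic: "triple-pulse" },
  "scan-complete":   { visual: "green-check", audio: "soft-chime",  haptic: "single-pulse" },
  "low-battery":     { visual: "amber-badge", audio: "double-beep", haptic: "long-buzz" },
};

// When a channel is disabled (low light, covert sweep), the remaining cues
// still carry the same meaning because the sets are parallel.
function activeCues(fn: string, enabled: Set<string>): Partial<SignalSet> {
  const set = signalSets[fn];
  if (!set) return {};
  const out: Partial<SignalSet> = {};
  if (enabled.has("visual")) out.visual = set.visual;
  if (enabled.has("audio"))  out.audio = set.audio;
  if (enabled.has("haptic")) out.haptic = set.haptic;
  return out;
}

// Covert sweep: audio and visuals suppressed, haptics alone still signal a threat.
activeCues("threat-detected", new Set(["haptic"]));
```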
Why multisensory design matters
In mission-critical environments, users must rely on more than just screens. In covert operations, tools have to remain usable with audio and visual cues completely disabled. By designing feedback that synchronizes visual, auditory, and haptic cues, we created redundancy that supports faster responses, higher confidence, and safer outcomes in all conditions.

The Impact
Sharper senses, safer on-the-move missions
By aligning user input, feedback signals, screen design, and ergonomics:
Information overload was reduced, allowing users to focus on the task at hand.
Critical actions were prioritized, placing potential threats front and center.
Multi-sensory cues improved awareness and safety, even in high-stakes environments.
Fatigue and cognitive load were minimized, thanks to more natural input and feedback patterns.
By stepping back, deconstructing how the tool communicated, and rebuilding it with humans at the center, we delivered a product experience that harmonized sight, sound, and touch—where it matters most.