The Challenge
Command & (Remote) Control
The United States Army depends on Unmanned Ground Vehicles (UGVs) to operate in dangerous mission spaces. Operating these vehicles depends on a cutting-edge combination of hardware and software, but what makes it all work is the human at the controls.
Visual Logic had been brought in to deliver a functioning operator control unit (OCU) for an experimental UGV in time for a live exercise event. When testing revealed a significant mismatch between the system model and the user model, we knew it was time to get to work.

The Work
Here’s how we helped the Army
We arrived ready to test. This UGV was no simple machine: the operator viewed a complex array of mounted camera sensors, streamed over Wi-Fi to a head-mounted display, and steered the vehicle with a mouse.
What we quickly discovered: the operating experience being tested wasn’t field-ready. While the system was functionally complete, our testing showed that it simply wasn’t usable in a real-life operational context. The way the system was designed (the system model) didn’t match the way users thought about operations (the user model). We knew this meant it was time to learn, iterate, and bridge the gaps.
Within days, we completely reimagined the OCU based on user observation and hands-on, iterative testing.
We replaced the remote mouse with something instantly more intuitive for the young operators: an Xbox controller. We also overhauled the UI into a three-dimensional heads-up display, like something out of an Iron Man movie, which let us layer visual cues directly onto the video feed and deliver a familiar, immersive control experience. Finally, we introduced a waypoint-based navigation system. While waypoint navigation is common in unmanned aerial systems today, at the time it was seen as new and experimental for remote ground vehicles.
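To make the waypoint concept concrete, here is a minimal, hypothetical sketch of how controller input and a waypoint queue might translate into drive commands. It is illustrative only; DriveCommand, Waypoint, WaypointNavigator, and the stick-to-throttle mapping are assumptions for this sketch, not the actual OCU software.

```python
# Illustrative sketch only: a minimal mapping from gamepad sticks to drive
# commands, plus a simple waypoint queue. DriveCommand, Waypoint, and
# WaypointNavigator are hypothetical stand-ins, not the actual OCU software.
from collections import deque
from dataclasses import dataclass
from math import atan2, hypot, pi


def clamp(value: float) -> float:
    """Limit a command value to the range [-1.0, 1.0]."""
    return max(-1.0, min(1.0, value))


@dataclass
class DriveCommand:
    throttle: float  # -1.0 (full reverse) .. 1.0 (full forward)
    steering: float  # -1.0 (full left) .. 1.0 (full right)


@dataclass
class Waypoint:
    x: float  # meters east of a local origin
    y: float  # meters north of a local origin


def manual_drive(left_stick_y: float, right_stick_x: float) -> DriveCommand:
    """Direct teleoperation: controller sticks map straight to throttle and steering."""
    return DriveCommand(throttle=clamp(left_stick_y), steering=clamp(right_stick_x))


class WaypointNavigator:
    """Drives toward queued waypoints so the operator only drops points on the HUD."""

    def __init__(self, arrive_radius_m: float = 1.5):
        self.queue = deque()  # pending Waypoint objects, in order
        self.arrive_radius_m = arrive_radius_m

    def add(self, waypoint: Waypoint) -> None:
        self.queue.append(waypoint)

    def step(self, x: float, y: float, heading_rad: float) -> DriveCommand:
        """Compute one drive command from the vehicle's current pose."""
        if not self.queue:
            return DriveCommand(throttle=0.0, steering=0.0)
        target = self.queue[0]
        dx, dy = target.x - x, target.y - y
        if hypot(dx, dy) < self.arrive_radius_m:
            self.queue.popleft()  # reached this waypoint; move to the next
            return self.step(x, y, heading_rad)
        # Steer proportionally to the heading error, easing off the throttle in turns.
        error = atan2(dy, dx) - heading_rad
        error = (error + pi) % (2 * pi) - pi  # wrap to [-pi, pi]
        steering = clamp(error)
        throttle = max(0.2, 1.0 - abs(steering))
        return DriveCommand(throttle=throttle, steering=steering)


# Example: queue two waypoints and ask for the next drive command.
nav = WaypointNavigator()
nav.add(Waypoint(x=10.0, y=0.0))
nav.add(Waypoint(x=10.0, y=10.0))
print(nav.step(x=0.0, y=0.0, heading_rad=0.0))
```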

The Impact
Walk-up easy control for mission critical vehicles
Operators instantly understood how to drive the vehicle thanks to the familiar gaming controls and the intuitive augmented-reality interface. Our waypoint-based AR control system remains ahead of its time: more than a decade later, we have yet to encounter another UGV that operates with a waypoint system as intuitive as what we designed and tested at Fort Benning.
By identifying—then bridging—the gaps between user and system model, we forged a design model that was uniquely intuitive, empowering users to operate safely and effectively.