The Role of UX in Robotics
Gini Keating, Director of Product Design at Brain Corp, discusses the role of UX in robotics, explains what UX design entails, and elaborates on why it’s important to keep humans at the center and make robots easy to use

It’s no secret: Robotics helps countless businesses across a spectrum of industries achieve greater efficiency and productivity. This is especially true for businesses struggling to keep up with everyday operations during the ongoing pandemic. Much of the impact of robots – like our BrainOS®-powered autonomous floor scrubbers, shelf scanners, and inventory delivery tugs – hinges on robot operators being able to use them quickly, easily, and accurately. After all, how can any technology designed to increase efficiency and productivity do so if it is too complex for end-users to understand? It is the role of user experience (UX) in robotics to make robots easy to use.

For any technology (an app, website, robot, etc.), UX’s primary concern is making the experience intuitive, unintimidating, and downright delightful. The payoff of achieving user-friendly status is immense: faster and more widespread adoption, higher product impact, and stronger user loyalty. For robotics, increased comfort in operating robots translates into speedier deployment, better and more consistent quality of work, measurable results, and improved ROI. Additionally, user-friendly robotics results in higher acceptance of robots by the existing employees who become robot operators, the other associates who work alongside the robots, and the general public who shop, travel, and move through areas where the robots operate.

What is UX design, exactly?

User experience is the human-centered process of designing something, usually technological and innately complex, with the goal of hiding that complexity and exposing only the benefit to the user.

GOOD UX ANTICIPATES THE NEEDS, EXPERIENCES, BEHAVIORS, LANGUAGE, COGNITIVE ABILITIES, AND OTHER FACTORS OF EACH USER GROUP. IT THEN LEVERAGES THOSE INSIGHTS TO DELIVER A USEFUL AND USABLE PRODUCT OR SOLUTION.

When it comes to robots, UX starts with developing an understanding of a robot’s intended task and environment, while taking into consideration any possible social impacts the robot could have on humans operating and interacting with it. BrainOS®-powered robots are designed to share space with the general public: We have robots in airports, malls, retail stores, public transportation terminals, and other heavily trafficked environments. Our robots must enhance those spaces, not only by getting their tasks done, but by safely navigating through those shared spaces.

UX designers work mindfully and empathetically (with a hefty dose of psychology) to ensure each aspect of the human-robot interaction is seamless. This is no small task when it comes to robotics, as there are often multiple user groups (more on that later) and multiple layers of user experience that must be addressed for each group. These layers comprise the physical, visual, and aural elements of the robotic machine that UX designers must consider for seamless, frictionless operation. They include a robot’s:

  • Form Factor: This is defined by the robotic machine’s physical design, meaning its size, shape, configuration, color, and human accommodations like seats and handles. As a whole, these affordances (in UX lingo) set user expectations and clearly signal what the device does and how it should be used.
  • Operating System (OS): The OS is the system software that drives the physical components of the robot to execute the intended tasks (aka its brain). It controls how a robot moves, shows its intent, and negotiates with people when navigating through a shared space.
  • User Interface: Hardware buttons, on-screen menus, pedals and steering wheels provide physical touchpoints that help end-users operate the autonomous robot. Other elements like visual and audible cues, voice tone and manner, design system, graphic style, icons, readability, and internationalization also contribute to an end-user being able to simply and quickly deploy, operate, and interact with a robot.

Who are the users in UX?

Each autonomous robot that is designed to take on a unique task will have different user groups, though there can be some overlap. For the autonomous mobile robots (AMRs) powered by BrainOS, there are a variety of user groups who interact with our technology in different ways.

It all starts with OEM factory workers on the manufacturing line. Brain Corp technology leads these users through the robot manufacturing process for a smooth rollout experience, helping them transform from mechanical technicians into robot calibration experts.

Our next – and primary – user group includes staff from maintenance, facilities, operations, retail, and other teams who become our robot operators. They are tasked with teaching and running our autonomous floor scrubbers and other robotic machines that take on tasks the busy staff would otherwise have to handle. These are the people operating our robots every day. We have a very wide range of robot operators, including mechanical experts, differently-abled adults, and an array of individuals who are relatively inexperienced with technology. For example, we’ve found that many of our end-users still carry flip phones. Therefore, the user interaction with our robots must be simple and intuitive for anyone, regardless of their comfort level with new tech. By designing a platform that is easy to adopt and use, we can watch these users quickly become proud robot operators.

Our last user group includes bystanders, in other words, people sharing a public space with our self-driving robots. This can include shoppers in a grocery store or other retail environment; travelers at an airport; patients, visitors, and medical staff in a hospital; or the general public in other environments where our autonomous technology is hard at work. Often, a BrainOS-powered floor scrubber or inventory delivery tug is the first autonomous robot these people encounter in the real world. It is the role of UX to make sure this first experience – and every experience – with our robots is as unobtrusive as possible. Our goal is for these users to notice the clean floor or the stocked shelves the robot provides, but not the robot itself. If they do encounter the robot, UX ensures it will be polite and clearly show its intent when passing a shopper or negotiating to share an aisle.

Because our user groups differ so greatly, our UX process must distinguish between and incorporate their varying needs, abilities, priorities, and perceptions in every aspect of robot design, from our cloud-based BrainOS platform to the robotic end product. This is anything but simple.

Humans at the Center of Robotics

Self-driving robots have the power to transform lives, businesses, and entire industries. Yes, they bring greater efficiency, cleanliness, and safety to the environments in which they are deployed, but even more importantly, robots provide tremendous relief to already overwhelmed staff, especially during COVID-19. Pivotal to autonomous robots reaching their fullest potential is their ease of use, and that comes down to a design that keeps people at its core.

If you’d like a deeper dive into user experience for robotics, watch The Role of UX in Robotics and Artificial Intelligence from the 2019 Robotics Summit.

This article has been republished with permission from the Brain Corp blog.