User Experience in the High Tech Era

By MJ Johns, M.E.T.

Humans evolved with some amazing adaptations—a few notable ones being our brains, our ability to stand upright, and our opposable thumbs. We built technology to complement our skills and fill in the missing pieces we needed to conquer the world. Now technology is evolving, sometimes faster than we can keep up.

When we use technology we interact with a user interface (UI), which allows us to give information to and receive information from the computer. This can be a physical interface like a keyboard, monitor, or virtual reality (VR) headset, or it can be a digital interface like a menu or heads-up display (HUD). When these all come together—the physical interface, the digital interface, and the user—we call the result the user experience (UX): the experience the user has when interacting with the technology. What does the user touch and interact with? What does the user see, hear, and feel? As technology evolves, it becomes ever more important that we carefully craft that user experience. Technology is exploding in every direction, and in some ways it has become more intuitive, but in other ways its capabilities seem almost unfathomable to the average user.

We’ve all used software that has far more capabilities than one individual could hope to discover—Maya comes to mind, with its various toolsets for modeling, texturing, rigging, animating, lighting, physics, rendering, simulation, and more. How do you teach a new user all of the capabilities your software has to offer?

In software development, we have what’s called a first-time user experience (FTUE), which is the experience the user has the very first time they interact with the software. In game development, we typically want the FTUE to include a tutorial so we can teach the user how to play the game.
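To make the FTUE idea concrete, here is a minimal sketch in Python of how a game might detect a first launch and route the player into a tutorial before normal play. The file name, settings keys, and helper functions are hypothetical, not taken from any particular engine.

```python
import json
from pathlib import Path

# Hypothetical location for saved player settings; a real engine would use
# its own save system or platform-specific storage.
SETTINGS_FILE = Path("player_settings.json")


def load_settings() -> dict:
    """Load saved settings, or return defaults on a first launch."""
    if SETTINGS_FILE.exists():
        return json.loads(SETTINGS_FILE.read_text())
    return {"has_completed_ftue": False}


def save_settings(settings: dict) -> None:
    SETTINGS_FILE.write_text(json.dumps(settings))


def run_tutorial() -> None:
    print("Tutorial: use WASD to move, click to interact.")


def enter_main_game() -> None:
    print("Entering the main game.")


def start_game() -> None:
    settings = load_settings()
    if not settings["has_completed_ftue"]:
        # First-time user experience: teach the basics before normal play.
        run_tutorial()
        settings["has_completed_ftue"] = True
        save_settings(settings)
    enter_main_game()


if __name__ == "__main__":
    start_game()
```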

For a physical product like hardware, there is an out-of-box experience (OOBE), which is the experience the user has when opening the product and turning it on for the first time. Let’s take phones as an example.

A long time ago, back before smartphones, the first thing you had to do with a new phone was charge it, often for several hours. As a hardware developer, that meant the user’s first interaction with your product was to open the box, plug it in, and wait. In the field of UX design, the term “friction” describes any point in the interaction that slows the user down or makes it difficult or unpleasant for them to achieve their goal.

When the first iPhone came along, Apple shipped the device pre-charged as a way to reduce that first friction point, allowing the user to jump directly to interacting with the touchscreen as soon as they opened the box.

The touchscreen itself revolutionized the user experience. In a typical interaction with a computer, you move a mouse around on a flat surface, and that movement corresponds to the cursor you see on the screen, which allows you to “click” on things in the digital interface. There is a disconnect between what your hand is doing to move the mouse and the effect you see on the screen, which takes some getting used to and is not completely intuitive.

With a touchscreen, the place you tap with your finger is the actual point of interaction with the digital display. There is no disconnect, which is why young toddlers (and even some pets!) are able to play games on a tablet.

Virtual reality is another example of an intuitive interface with no disconnect. To look around in the virtual world, one physically turns one’s head rather than sliding a mouse side to side, as one would in a typical first-person shooter video game.

VR also allows for physical gestures with hand-tracked controllers, taking advantage of proprioception—the body’s sense of where its own parts are, even without looking at them. Consider an inventory system in which items are stored in your virtual backpack, and you reach over your shoulder to pull items out. Proprioception allows you to reach over your shoulder even while wearing a headset, because you do not need to see your hand to know where it is.
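As a rough illustration of how such an over-the-shoulder grab might be detected, here is a sketch in plain Python rather than any particular VR SDK. The vector math, function names, and distance thresholds are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass


@dataclass
class Vec3:
    x: float
    y: float
    z: float

    def sub(self, other: "Vec3") -> "Vec3":
        return Vec3(self.x - other.x, self.y - other.y, self.z - other.z)

    def dot(self, other: "Vec3") -> float:
        return self.x * other.x + self.y * other.y + self.z * other.z


def is_reaching_over_shoulder(hand_pos: Vec3, head_pos: Vec3,
                              head_forward: Vec3) -> bool:
    """Return True if the hand is behind and roughly level with the head.

    Hypothetical thresholds: "behind" means the hand lies on the opposite
    side of the head from the direction the user is facing, and "roughly
    level" means no more than 0.15 m below the head.
    """
    offset = hand_pos.sub(head_pos)
    behind_head = offset.dot(head_forward) < 0.0   # negative = behind the face
    near_shoulder_height = offset.y > -0.15        # metres
    return behind_head and near_shoulder_height


def on_grab_pressed(hand_pos: Vec3, head_pos: Vec3, head_forward: Vec3) -> None:
    if is_reaching_over_shoulder(hand_pos, head_pos, head_forward):
        print("Opening backpack inventory.")
    else:
        print("Grabbing whatever is in front of the hand.")


# Example: the user faces +Z and reaches up behind their head.
on_grab_pressed(Vec3(0.1, -0.05, -0.2), Vec3(0.0, 0.0, 0.0), Vec3(0.0, 0.0, 1.0))
```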

People often confuse user interface design with user experience design, or use the terms interchangeably. While UX design focuses on the user’s experience (what they do and how they interact), UI design is concerned with the design and layout of the interface itself: where buttons and icons go on the screen, and what different icons represent.

Although the two fields have clear distinctions, it is important to recognize where they overlap and how much the design of one depends on the design of the other. The design of VR apps and games offers further insight into the relationship between UI and UX.

Unlike with traditional software, in VR users wear the interface on their face and physically turn their head to look at things. One of the worst things you can do for a user’s comfort is to paste 2D UI elements, like a health bar or timer, directly onto the display. If you wear glasses, imagine having a sticker stuck to your glasses all day. It is distracting, difficult to focus on, and can also increase the risk of VR motion sickness. A better option is to design UI elements into the virtual world, such as physical 3D buttons the user can walk up to, or a clock on the wall as a timer.
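To illustrate the difference, here is a small framework-agnostic sketch contrasting a head-locked timer, which follows every head movement like that sticker on your glasses, with a clock anchored to a wall in the virtual world. The types, function names, and simplified transforms are hypothetical.

```python
from dataclasses import dataclass


@dataclass
class Transform:
    position: tuple  # (x, y, z) in metres
    rotation: tuple  # simplified; a real engine would use quaternions


def offset_in_front_of(t: Transform, distance: float) -> tuple:
    # Simplified: assumes the headset is facing +Z in world space.
    x, y, z = t.position
    return (x, y, z + distance)


def update_head_locked_timer(timer_ui: Transform, headset: Transform) -> None:
    """Anti-pattern: the timer is glued to the display and moves with
    every head turn, so the user can never look away from it."""
    timer_ui.position = offset_in_front_of(headset, distance=0.5)
    timer_ui.rotation = headset.rotation


def place_world_clock(clock_ui: Transform, wall_anchor: Transform) -> None:
    """Preferred: the timer is a clock on the wall. It is placed once and
    the user simply glances at it, as they would in the real world."""
    clock_ui.position = wall_anchor.position
    clock_ui.rotation = wall_anchor.rotation


# Example: anchor the clock to a wall two metres in front of the play space.
clock = Transform(position=(0.0, 0.0, 0.0), rotation=(0.0, 0.0, 0.0))
wall = Transform(position=(0.0, 1.5, 2.0), rotation=(0.0, 0.0, 0.0))
place_world_clock(clock, wall)
```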

From this example it is clear that designing the UI is heavily dependent on the intended experience for the user. This can also be seen in the design of everyday things we interact with such as cars.

Driving a car is something that nearly all of us have experienced at some point. Cars have been shaped by incredible advances in technology; however, a few things remain completely consistent across makes and models, old and new.

The core functions of a car (moving forward, stopping, turning right, turning left) are controlled by the steering wheel, the gas pedal, and the brake pedal. When you get into a new car, you already know how to start, stop, and turn. This consistent design is integral to the core experience of driving a car. If you’ve ever changed vehicles, you might have found it difficult to remember where the stereo button is or how to turn on the windshield wipers, but it should never be difficult to find the brake pedal or remember which way to turn the steering wheel.

The interface is the thing that makes the experience possible, but with bad UI design, the interface can be the thing that gets in the way of the experience. A good way to gauge the quality of the user’s experience is to ask the user to describe what they are doing. Ideally you want them to talk as if they are immersed in the experience, as opposed to explaining how they are using the interface.

If a user is driving a car, you might expect them to say, “I stopped at the stop sign and then turned left.” You wouldn’t expect them to say, “I gradually pressed the brake pedal until the car came to a stop, then I rotated the steering wheel to the left while pressing the gas pedal.” Even though that is literally what they are doing, if they are immersed they won’t be thinking about the interface.

Similarly, with a video game, you’d like them to say something like, “I ran up to a zombie and shot her,” rather than “I pressed ‘W’ on the keyboard to move my character forward, then clicked the mouse to fire a projectile at the enemy character.” If the user has to think about which button to press, that is a point of friction in their enjoyment of the experience.
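One way designers keep players thinking in actions rather than buttons is an input-mapping layer: raw key and mouse inputs are translated into named gameplay actions, so the rest of the game (and any tutorial prompts) can speak in terms of “move forward” and “shoot.” A minimal sketch in Python, with hypothetical bindings and function names:

```python
# Map raw inputs to named gameplay actions, so game logic and tutorial
# prompts can talk about actions ("move forward", "shoot") rather than keys.
DEFAULT_BINDINGS = {
    "w": "move_forward",
    "a": "move_left",
    "s": "move_backward",
    "d": "move_right",
    "mouse_left": "shoot",
}


def perform_action(action: str) -> None:
    # In a real game this would drive the character controller;
    # here we just log the high-level action.
    print(f"Player action: {action}")


def handle_input(raw_input: str, bindings: dict = DEFAULT_BINDINGS) -> None:
    action = bindings.get(raw_input)
    if action is None:
        return  # unbound input: ignore it
    perform_action(action)


handle_input("w")           # -> Player action: move_forward
handle_input("mouse_left")  # -> Player action: shoot
```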

This is where tutorials come in handy for games. Many gamers say they dislike tutorials, or even hate them, but that is probably because so many tutorials involve reading walls of text and slowly working through repetitive tasks. There are elegant ways to design a tutorial that is integrated into the experience, allows the player to explore the UI elements, and rewards them for their discoveries.
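One rough way to sketch that kind of integrated tutorial is a discovery tracker: rather than forcing the player through a scripted sequence, the game notices which interface elements the player tries on their own and rewards each discovery. The class and element names below are hypothetical.

```python
class DiscoveryTutorial:
    """Track which UI elements the player has found and reward discovery,
    rather than walking them through a scripted checklist."""

    def __init__(self, elements_to_discover: set):
        self.remaining = set(elements_to_discover)
        self.discovered = set()

    def on_player_used(self, element: str) -> None:
        if element in self.remaining:
            self.remaining.remove(element)
            self.discovered.add(element)
            print(f"Nice find! You discovered the {element}.")
        if not self.remaining:
            print("Tutorial complete. You found everything on your own.")


tutorial = DiscoveryTutorial({"map", "inventory", "crafting menu"})
tutorial.on_player_used("inventory")
tutorial.on_player_used("map")
tutorial.on_player_used("crafting menu")
```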

It may be tempting to expect users to have some basic understanding of gameplay conventions (like using the WASD keys to move), but it’s important to remember the first-time user experience: for some users, this may be the first game they’ve played using WASD, or perhaps their first game ever.

As technology continues to evolve and expand what is possible in the digital world, the role of the UX designer will only become more necessary. Whether you are designing software, hardware, or games, creating a frictionless experience for your user is paramount, and that requires careful consideration of the interactions you expect your user to perform.