“You’re on 33”

Confusing elevator buttons

I went to the 2016 UXPA conference in Seattle and stayed in a tall hotel downtown. I took a shuttle from the airport, and when we arrived I got out, thanked the driver, and made my way to the front desk for the usual check-in. When I was just about done, the woman behind the counter told me, “You’re on 33.” I smiled and said thank you. I like high floors because you usually get some sort of good view.

I walked through the lobby toward the elevator, looking forward to getting to my room so I could relax for a while. I’m not a big fan of flying, so the five-hour flight from Atlanta was a bit stressful. I waited a minute or so for the elevator doors to open and got in. When I turned to push the button for my floor, I was confronted with the picture in this post.

I froze. What in the …? Where am I? As I tried to think about where to even start, I could sense my stress gauge working its way back up to flying level. I found what seemed to be the ‘30-something’ area but still wasn’t sure where my finger should go. 35? No. 31? No. The hole between 28 and 29? No! Where is 33? There are two of them! I pushed what seemed to be the correct button and the elevator doors closed. Where was I going? I had no way to be sure, but I was pretty certain my room wasn’t on the floor I started on, so I had to do something. Up I went with that horrible confusion of numbers until the doors finally opened onto what seemed to be my floor. I looked around for my room number, found it, and slid in my key card. I made it!

We have to think beyond the belief that this is okay because the cognitive self will fix the problem. Our first encounter with our environment runs on what Daniel Kahneman calls System 1: fast, automatic thinking built on patterns, shortcuts, and mental models. Why are you making me do all this work?

Not what I was expecting

Colorful and stylized hard plastic animals

UX was never just about screens. If you’re designing to drive behavior, you’re doing UX. And I would say it’s about more than helping users achieve their goals; it’s also about designing an environment for intent.

These toys have been in my doctor’s office for years. I’ve always liked them, but I had never touched them. Their shapes and colors always made me think of them as solid but soft, structurally sound yet playful.

I was recently in the office and a mother was letting her small son play with them. He picked one off the couch and dropped it on the wooden floor. It made a hard, sharp sound that surprised me. Just like that, they were different. What I had once considered comforting and engaging became loud and disruptive. This single experience in my journey changed my expectations of something I once looked forward to.

Details like this matter and must be considered when you’re creating a context for an experience. I’m still a little sad about it.

The power of touch

Tactile feedback gives us a rich and visceral way to experience our environment. It allows us to discern things such as shape, texture, volume, density, and relationships between objects.

Tactile stimuli on touch screen devices are currently delivered primarily through vibration. Research has also produced surfaces that can physically distort a screen. It’s not unreasonable to think that touch screen devices might one day be able to mimic more complex and subtle surface textures.
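As a rough illustration of the vibration approach, here is a minimal sketch using the Web Vibration API (navigator.vibrate), which lets a web page trigger simple haptic patterns on supported devices. The specific durations, the hapticFeedback helper, and the "#delete-button" element are my own illustrative assumptions, not anything from the original post.

```typescript
// Minimal sketch of vibration-based tactile feedback in a web app,
// using the Web Vibration API (navigator.vibrate).
// Pattern values and element names below are illustrative assumptions.

type HapticPattern = number | number[];

// A short tap vs. a stronger "warning" buzz, in milliseconds.
const TAP: HapticPattern = 20;
const WARNING: HapticPattern = [100, 50, 100]; // vibrate, pause, vibrate

function hapticFeedback(pattern: HapticPattern): boolean {
  // Not every device or browser supports vibration; fail quietly if absent.
  if (typeof navigator === "undefined" || !("vibrate" in navigator)) {
    return false;
  }
  return navigator.vibrate(pattern);
}

// Example: a subtle tap on ordinary taps, a stronger pattern for a
// destructive action, so the hand learns the difference before the eye does.
document.querySelector("#delete-button")?.addEventListener("click", () => {
  hapticFeedback(WARNING);
});
```

Even this crude single-channel feedback hints at the richer, texture-like signals the rest of this section imagines.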

Meaning is ultimately determined by context: the meaning of one thing comes from its relationship to other things. This is the same principle we employ when we organize visual information on a screen. We use elements like white space, color, alignment, grouping, and hierarchy to draw attention and give meaning to elements and sets of information. The use of texture on a touch screen device could give users another facet of understanding and context.

Touch could make information more accessible on surface devices. A sight-impaired user might be able to complete a task through touch alone. Tactile qualities could also be used to support visual elements, skeuomorphic or not.

The possibilities for tactile feedback on touch screen devices are likely as varied and nuanced as the tasks themselves. Of course, each implementation would have to be considered within its own needs and design constraints, just like any other smart design element.