What’s next for Design and User Experience (UX) in 2017

(updated Jan 2017 with data visualization and a comment on VR.)

Looking for my next big thing means talking to a lot of people. I’m sometimes asked what I see next for design and user experience. I also run into students who ask for advice on their careers. The two answers are connected.

I see these trends over the next few years:

  • No great innovation in visible UI
  • The strengthening of invisible UIs
    • Think voice activated devices like Amazon Echo
  • Big data
  • Machine Learning and AI (and bots)
  • More data visualization

Students should look for companies that are set up to succeed in these areas.

A Recent History of UI Design

First a bit of history. We need to understand how we got here to see where we will go next.

The “Aqua” look and feel in Mac OS X

iOS 6.x: the last skeuomorphic iOS

Android 4.x “Ice Cream Sandwich”

Skeuomorphism and “Lickable UIs”

Skeuomorphic UIs, driven by Steve Jobs’ insistence that a UI should be “lickable,” gave the Mac and iPhone round, glistening designs (the Aqua look and feel).

Aqua-style button: there were hundreds of tutorials on the web showing how these were made

Android (Ice Cream Sandwich) and OpenSolaris (the Nimbus look and feel, which I worked on) did the same. Graphic design had a heyday, and we all spent many hours learning to create Aqua buttons (I know I did) and textures like “eggshell.”

While I labored diligently over Aqua buttons and similar UI components, I was never a fan of skeuomorphism. If skeuomorphism was bad in Microsoft Bob, it was bad elsewhere too. I prefer matching the UI to the device: the UI is a glass screen, and it should not look like wood grain. (Never mind that I took many, many pictures of wood grain…)

Enter Touch and Gestures

When Apple introduced the iPhone, it disrupted many businesses. In the UX space, the biggest contribution of iOS was gestures. Touch (and lickable interfaces) existed before, but the capacitive touch screen really pushed the envelope on gestures. Pinch, pinch-out, swipe, swipe from an edge: all of these came into vogue. For a few years (roughly 2010-2013), the big improvements in UX were in gestures and aesthetics / visual design.

Two standout apps that really leveraged the technology at the time were Clear, the todo list app, and Roambi Analytics, the data visualization app.

Some of the gestures in Clear.

Clear is a todo app with a delightfully consistent set of gestures for navigating up and down and for adding new lists and items. Many of Clear’s gestures (like swipe to complete or delete) made their way into other apps; in iOS Mail, for example, you can now swipe to delete an email.

Pie chart views in Roambi Analytics on the iPad

Roambi Analytics took boring old data and made it sexy with both vibrant visuals and the rules of physics. For example, its pie charts have inertia, and when you spin them you hear a clicking sound. The combination makes it delightful to play with your data, which in turn should make you better at understanding it.

The end of skeuomorphism, the start of flat

Windows 8 came along and introduced edge gestures. But its greatest contribution to the UI world was the flat look and feel (Metro). Tiles, buttons, icons, everything was flat. And it worked.

ios_7-1_homescreen

iOS 7, the first iOS to move to a flat UI.

android-material-design-lists

List views in Android using the Material Design language

Soon after, both iOS and Android switched to flat designs. Google’s Material Design is considered by many designers to be a great design language. I like flat designs because they are true to the material (no pun intended).

The flat design languages focus on typography and color palettes; designers spend a lot less time on buttons, curves, and the like. This shift pushed designers to think more about flow and layout and less about visuals.

The current (2016) state

We now have really, really good designs for smartphones, the web, tablets, convertibles, and desktops. They use flat designs, touch, and gestures, and they are getting more sophisticated. For a while, we’ll see only small changes. Force Touch, watches, activity trackers: none of these has made huge changes to UI design.

UX 2020

(Apologies to my old team, where we spent a lot of time talking about UX 2020.) The big shift in UX in the next few years will be not at the front end but at the back end. Our devices are all connected to servers in the cloud (just as the computer I’m writing this blog post on is connected to servers hosted by Dreamhost). There are gobs of data sitting on those cloud servers. The companies that find innovative ways to use those gobs, and the interfaces that work seamlessly with them, will create the next big move in UX.

UX designs will be driven by:

  • Cloud design. Software will assume data lives in the cloud, which is quite different from UX that assumes data is on the device. As an example of the difference: when I type a document in Microsoft Word, the data is local; for this blog, the data is in the cloud. Gestures that users make on their devices must be supported by cloud data. For example, when a user types into a field, that input must go back to the cloud, and the cloud must return suggestions (see the sketch after this list). UXs that understand this architecture and leverage it will be successful. Of course, this transition is well under way, and “cloud first” design is the way to go.
  • Big data: Closely related to cloud design is big data: specifically, the companies that run cloud services will figure out ways to leverage aggregate data to help individual users. For example, a search field driven by big data can suggest trending topics.
  • ML and AI: As organizations learn from their gobs of data using machine learning and AI, UI designs will need to take this into account. Some of what this means is still unclear, but it is safe to assume that the UI will morph based on the results coming back from the cloud.
  • Data visualization: We are collecting ever more data about people, devices, interactions, and behavior; the list goes on. Human beings need help understanding that data. I expect lots more work on data visualization, making visualizations far more common. Dashboards are one example.
  • Machine learning and AI will enable invisible UIs. Specifically, voice recognition (like Siri, Cortana, and Google Assistant) and the ability to generate sentences and spoken responses will make possible devices whose only UI is voice. Amazon Echo is a front-runner, as is the voice recognition UI in cars. Sadly, we have the latter in two of our cars, and both are hard to use.
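
To make the cloud-design bullet concrete, here is a minimal sketch of a cloud-backed suggestion field in TypeScript. The endpoint, its response shape, and the timing values are assumptions for illustration, not any particular product’s API.

```typescript
// A minimal sketch of "cloud first" typeahead: keystrokes are debounced,
// sent to a (hypothetical) suggestion endpoint, and stale responses dropped.

function debounce<T extends (...args: any[]) => void>(fn: T, waitMs: number) {
  let timer: ReturnType<typeof setTimeout> | undefined;
  return (...args: Parameters<T>) => {
    if (timer !== undefined) clearTimeout(timer);
    timer = setTimeout(() => fn(...args), waitMs);
  };
}

let latestRequest = 0;

async function fetchSuggestions(query: string): Promise<string[]> {
  const requestId = ++latestRequest;
  // Hypothetical endpoint: the backend ranks suggestions using aggregate
  // (big) data, e.g. topics trending across all users.
  const res = await fetch(`/api/suggest?q=${encodeURIComponent(query)}`);
  const suggestions: string[] = await res.json();
  // Drop out-of-order responses so a slow network never shows stale results.
  return requestId === latestRequest ? suggestions : [];
}

const onType = debounce(async (query: string) => {
  if (query.length < 2) return; // avoid hitting the cloud on every character
  const suggestions = await fetchSuggestions(query);
  console.log("show to user:", suggestions);
}, 200);
```

The key design point is that the gesture (typing) is local but the response is generated in the cloud, so the front end has to tolerate latency and out-of-order replies.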

In order to make these experiences great (take car voice commands, for example), UX designers need to aim for close to 95% success. This likely means reducing the set of voice recognition features, focusing on the most common use cases, and making those successful.
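
As a sketch of that “fewer commands, done well” idea, here is a hypothetical handler that supports only a small set of intents and confirms rather than guesses when recognition confidence is low. The intent names, confidence field, and threshold are assumptions, not any specific voice SDK.

```typescript
// Restrict the voice UI to a few well-tested intents and reject or confirm
// anything uncertain, trading breadth for a high success rate.

interface Recognition {
  intent: string;     // e.g. "call_contact", "play_music", "navigate_home"
  confidence: number; // 0.0 - 1.0, as reported by the recognizer
}

// Support only the most common use cases, and make those successful.
const SUPPORTED_INTENTS = new Set(["call_contact", "play_music", "navigate_home"]);
const CONFIDENCE_THRESHOLD = 0.85;

function handleVoiceCommand(result: Recognition): string {
  // Anything outside the small command set is refused with guidance.
  if (!SUPPORTED_INTENTS.has(result.intent)) {
    return "Sorry, I can't do that yet. Try 'call', 'play', or 'navigate'.";
  }
  // Below the threshold, confirm with the user rather than guess wrong.
  if (result.confidence < CONFIDENCE_THRESHOLD) {
    return `Did you mean "${result.intent.replace("_", " ")}"?`;
  }
  return `OK, executing: ${result.intent}`;
}
```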

Advice to UX designers

To succeed as UX designers, we need to excel at cloud-based design and learn big data, ML, and AI. We need to understand what these technologies can do and how to apply them to our designs.

I asked a couple of folks to review this post, and MV asked: “I am curious to know, and would like you to expand on, how you think UX will be more focused on the backend rather than the frontend. I see your examples, but I would also like to know how you imagine the day of a UX designer focused on the backend would look.”

Designers will need to design the front end while “looking through to the back end.” At the front end, users will see the UI and will have goals based on what they have done and what they expect. The responses will be generated in the back end. Designers will need to tune the front end so that the UI shows the intersection of what users expect and what the back end can accomplish. In many cases, this will mean fewer capabilities visible at the front end until the back end’s capabilities improve.
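
One way to picture that intersection in practice: the front end discovers what the back end can currently do and renders only the overlap with what the UI knows how to present. A minimal sketch, assuming a hypothetical capability-discovery endpoint and made-up capability names:

```typescript
// Capability-gated UI: render only features that both the front end
// and the back end support, hiding the rest until the back end improves.

const UI_KNOWN_ACTIONS = new Set(["search", "suggest", "summarize", "translate"]);

async function visibleActions(): Promise<string[]> {
  // Hypothetical endpoint listing what the back end currently supports.
  const res = await fetch("/api/capabilities");
  const backend: string[] = await res.json(); // e.g. ["search", "suggest"]
  // The UI shows only the intersection of expectations and capabilities.
  return backend.filter((action) => UI_KNOWN_ACTIONS.has(action));
}

visibleActions().then((actions) => console.log("render buttons for:", actions));
```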

My thoughts on VR

Some folks have suggested that VR will make a breakthrough in 2017 and become commonplace. I disagree. I think VR relies too much on headsets. Until either VR headsets become commonplace, or VR no longer needs headsets and can somehow render on phones, tablets, and desktop screens, VR will remain a niche.

Advice to students

I mentioned that I’ve recently met students in design and HCI programs looking for advice on what to do and where to look for jobs. My advice: look for companies that are in the cloud and can leverage big data and apply ML and AI to it. They will innovate more rapidly than the others and set the direction for design. You need all of these; just being in the cloud, or just having big data, will not be enough.

😀

Thanks to MayaV for her feedback on this post.

How to Confuse ATM Users

Closeup of the Envelope Insertion slot in an ATM

ATMs are great. You’re no longer restricted by where your bank is and when it is open. But I wish they’d spend a little more time getting the designs right. The image above is from an ATM I used. The ATM has Braille, which is wonderful; you can see it in the strip above the envelope insertion slot. I can’t read Braille, but I assume it says that the envelope slot is below. This is a very nice touch and shows great attention to detail. In addition, all the edges are gently rounded and there are no sharp corners. Overall, a very well executed design.

Given the attention to detail, what I don’t get is the image showing you how to insert envelopes into the slot. It’s bad:

  • The biggest problem is that the orientation of the envelopes+slot icon and the arrow is wrong. It looks like you’re supposed to pull the envelopes out of the slot.
  • The second is that the envelopes look like they go into the slot along their width rather than along their edge.
  • It also looks like you’re supposed to put in a stack of envelopes.
  • The arrow may have been intended to indicate where the slot is. However, that is confusing: it is so close to the slot+envelope graphic that it looks like it is telling you what action to perform.

Better ATM Slot Graphic

ATM with the insertion graphic moved to the slot

A better design would move the envelope insertion graphic to the deposit slot (and fix its orientation), as in the image above.

Companies are always looking for ways to cut costs. When I managed Sun’s industrial design team, I learned about keeping costs down. The graphics are printed onto the metal using silk screening, and printing images on two separate pieces of metal costs twice as much. The two pieces of metal (silver and gray in the picture above) might also come from different sources. The silk screen should not bother trying to tell you where the slot is: the slot is so large compared to the graphic that it is hard to miss. Also, trying to show the envelopes or an envelope stack is too much detail, and it adds to the confusion by suggesting the envelopes go in face down. Simplifying the graphic gives a better result.

Better Silk Screen Design

Simplified Silk Screen icon

I’m guessing the team that designed the ATM had a decent design. Most likely, someone responsible for cost cutting made design changes without consulting the design team. Whenever Sun’s teams needed to cut costs, they worked with the ID team, which modified its designs to fit the constraints.