January 2022  •  10 min read

Great products feel great because they are the right solutions. They meet our needs, support our goals, and sometimes exceed our expectations. To arrive at the right solutions, product teams need a way to step back, and to do that they need robust methods for empathizing with humans. This brings us to the Story –

A narrative, either true or fictitious, designed to interest, amuse, or instruct the hearer or reader.

Human Stories can inspire our understanding and unlock the empathy required to build great products. Stories offer a way for us to create empathy by experiencing our solutions first-hand within the context of daily life. They challenge our point of view and help us re-evaluate where to invest time and resources to get the right solution.

There are three primary tools to define the story – Goal, Journey and Lenses. Defining the Human Goal is key to ensuring the product experience meets needs, and it acts as the ground truth for outlining the Human Journey. The Journey can take several permutations through a system, but it always shows the sequence of tasks a user performs to accomplish their goal.

Let’s break down the lenses of the story worth considering. Depending on the desired outcome, it may be best to emphasize certain lenses or combinations thereof. 

Story Lenses. Influenced by How to Future by Scott Smith.

Human Stories come in many forms – writing, verbal narration, illustration and film, to name a few. Let’s take film as an example. By undergoing the process of writing the narrative, thinking through the interactions and acting them out, the creator is forced to live the experience, empathizing directly with the human using their product.

Film can also act as a way to invent, cheaply. Here is a series of films that helped us quickly sketch Google AI experiences based on our understanding of current technologies. Paired with concept-evaluation research, the films helped the product team understand which concepts would best support human goals. Ultimately, they were critical in deciding which experiences to build and what types of technology investments to make. These centered on the human and context lenses.

Similarly, stories can be used to articulate possible future scenarios for macro-trends. We sponsored a project with Interaction Design master’s students at Estonia’s Academy of the Arts to better understand the implications of trends like the decline of homeownership among younger generations. The students completed film prototypes within 72 hours of project kickoff. The output was engaging, fun, and helped us have better conversations about how technology can augment these futures.

As we broaden our stories to include multiple experience touchpoints, we find ourselves needing to think deeply about the seams and how to get them right in the product. Here is a case study that leveraged this approach to reimagine how our devices could augment two people’s daily lives. Each film showcased a new design system across several incubated experiences that addressed unmet needs. In this scenario, Sara is a busy young professional whose goal is to feel connected with family, so trip planning is at the top of her to-do list.

Example Human Story film; Microsoft Windows Visioning, c. 2012 

In this short film, technology recedes into the background to support Sara’s goal. The products support her journey and share context as she moves across devices, smartly recognizing her current task and helping her reach her goal faster.

The creative process enables us to gain an understanding of what feels natural and what doesn’t, whether we’re evaluating an interaction behavior, an interface element, or an entire product feature. As a prototyping method, film can save significant engineering investment. Here is another scenario with Sam, whose goal is to ensure his partner has a good birthday. This film helped win support for dark-mode interfaces and helped define a direction for Microsoft’s Metro design language.

Example Human Story film; Microsoft Windows Visioning, c. 2012

Film jumps to the end, enabling the creator to render a future that feels like it already exists in the world. It puts a stake in the ground: a future the intended audience can discuss and make decisions around. This can often save significant time and budget, and it aligns product teams toward a unified set of features that deliver on the original human goal. It can ensure we get the right solution.

As creators, we can use methods like film, narrative writing, audio or illustration as prototypes to get buy-in from stakeholders and to evaluate with humans well before we implement anything. The output enables us to have better conversations and steers us toward a better future.


Thanks for reading. Big thanks as well to my friends and colleagues Amid Moradganjeh, Don Barnett, Matt Jones, Alison Lentz and Kyle Pedersen for the example stories shown above.

One of my mentors was professor Richard Branham while I studied Interaction Design & Human Factors. He was a pioneer in design systems, human-centered interaction design and large-scale wayfinding projects (Sears Tower, Chicago; Kingdom Tower in Saudi Arabia; St. Luke's Health Systems) and branding projects (Ford; JC Penney; Target). Richard was a passionate, deeply caring individual who worked through his retirement to teach young design thinkers how to evolve their mindset into one that is human-first. He once told me that it’s critical to always be evolving your design philosophy. Looking back, I think he was teaching us more – how to develop conviction, and he made a lasting impression on me. This is for you, Richard.

Tim lives in Pacifica, CA with his family where they explore the Northern California coastline. You can read more about Tim on LinkedIn or feel free to Contact him directly. Also, check out past work in his Feed.

Copyright © 2022 Tim Wantland. All Rights Reserved. Images from this site may not be reproduced without prior written permission. Featured brands are the intellectual property of their respective owners. Select works may be shown in concept form and may not represent final published work.