Chapter 5 - Open

Happy Monday Uplifting Crew,

Do you know how "Hey Google" was first created? Read on, and learn why great design comes from opening up your process and letting everyday people into the heart of it.

Opening up,

// dan

Dan Makoski // Chief Design Advisor



Case Study: Hey Google

Let me share an example from my time in Silicon Valley that shows the power of opening up your design process to co-creation with the people you design for.



The year was 2012. Motorola had just been acquired by Google, which was locked in an innovation and intellectual property battle with Apple for dominance of the smartphone market. Apple’s iPhone had been out in the world for about five years, and Google’s smartphone operating system, Android, held roughly 50% market share. Like many other handset manufacturers, Motorola had heavily modified Android with dozens of its own apps, resulting in an inconsistent and often confusing experience for users. With the acquisition, Google wanted to launch a smartphone that ran pure Android, with only one or two new, differentiating features. As global head of Design Research, I was asked what one or two things my team should design that would make the most impact.



With that challenge in mind, my team launched a design research program that started with a photo diary study, deepened with ethnographic home visits, and concluded with a co-creation exercise. Let me briefly break down these three methods. A photo diary study is a qualitative research method used to gain insights into people's experiences, behaviors, and perceptions related to their lives. In a photo diary study, participants are asked to document aspects of their daily lives or specific activities by taking photos and providing accompanying captions or explanations. This method allows researchers to understand the participants' perspectives, emotions, and routines in a more in-depth and visual way.

Ethnographic research is another qualitative research method that focuses on understanding people's behaviors, beliefs, attitudes, and experiences within their natural and cultural context. It is an observational and participatory approach that involves immersing researchers in the everyday lives of people to gain deep insights and empathize with their needs, challenges, and aspirations.

Both of those methods provide rich insight into people, but neither is focused on the future. Remember Liz’s model of making. You need to create with people to generate ideas from their hopes and dreams. So we developed what is called a generative toolkit: a set of props people can use in their homes to act out the future.


Our generative toolkit of whiteboard-wrapped foam core.


In this picture, you can see a pile of white squares of all sizes on the chair on the left. These are bits of foam core wrapped in a writable, erasable surface. They are like little whiteboards that can be used and combined to create any type of screen or device. The woman in the photo has created a large smart TV using four of the props locked together. When she acted out the five-year future in her home, she imagined coming home after work to a TV that would welcome her and ask if she wanted to turn on the news, as she usually does. She described the frustration of performing this little ritual every day and sometimes not being able to find the TV remote.


Talking to a smart watch prop.


On the smaller side, this couple is acting out a scene in which the wife, cycling home, can easily and safely communicate with her husband hands-free through her smartwatch.

The next photo is of another participant acting out a future scene in his kitchen with a smart tablet on the refrigerator. He can write on the screen to update the household shopping list, or he can simply talk to his fridge. He talked about how the list would be digital and shared, so that if his wife were driving near a grocery store, she would receive a real-time alert that milk had just been added to their shopping list. And if she bought something, it would sync, updating the list on the fridge and on her husband’s phone.


Talking to a smart fridge prop.


These scenarios may not seem far-fetched as of this writing in 2023, but in 2012 none of these things had yet been designed. While things like this had appeared in science fiction since the days of Star Trek, none had been turned into reality. In the past week I’ve been able to do every single one of these things, and I imagine many of you have as well.

The key insight from this co-creation research was that people were naturally talking to their screens, without fiddling with buttons and apps. The design team started brainstorming different ways we might be able to make this work, but there was one problem: hardware wasn’t designed to always listen for your voice. And even if it could be adapted, we thought the battery drain would be too high to make this feature worth it.

All that changed when we started bringing others into our design brainstorming. One of the hardware engineers said that they were already working on something like this: the X8 microprocessor. It was low-energy and could run continually in the background. They imagined it could be used for checking emails and waking the phone with alerts, but when they saw the storyboards from our co-creation, they got excited about this new scenario.


Brainstorming natural spoken interactions.



Storyboards of people naturally speaking to screens.



So this became the signature new feature of the Moto X: the ability to naturally talk to your phone.


The Moto X industrial design.



The Moto X configurator.


It was the most loved feature of the phone, complementing the owner’s ability to personalize the device in an entirely new way. Both of these elements aimed to take the most personal technology in most people’s lives and make it even more helpful and reflective of their needs.


Touchless Control on the Moto X


On August 23rd, 2013, Google and Motorola launched the Moto X. It was the first time you could naturally talk to a screen. Owners could say “Hey Moto, what time is it?” or “Hey Moto, set tomorrow’s alarm for 6:00 AM.” Owners (and Google) liked it so much that the trigger was quickly changed to “Hey Google…” and tied into Google’s plans for making its Assistant more capable. More than a year later, Apple launched “Hey Siri” with iOS 8, and shortly after, Amazon launched its Echo smart speaker with the same always-listening functionality through Alexa.

What I love about this project is that some of the most advanced design and technology at the time didn’t come from the minds of geniuses working in the heart of Silicon Valley. Instead, it came from the minds and hearts of ordinary people imagining how they wanted the future to unfold in their own homes.