Design Thinking for the Greater Good


by Jeanne Liedtka


  Sometimes, you have good existing data and can conduct thought experiments (similar to traditional analysis). We always start here, because these experiments are the least expensive, least intrusive kind. If we don’t have relevant data already in hand to test our assumptions, we need to go out and get it through field experiments. Starting with low-fidelity prototypes, we go into the field again to interact with a few real stakeholders and simulate reality by having conversations with them. The most accurate experiments are live, 4-D experiments conducted in the real world with higher-fidelity prototypes. We call these experiments “learning launches.”

  To assess What wows, the GCCA team reviewed their napkin pitches and identified the assumptions underlying each one—what would need to be true to make each a good idea. They sorted them into assumptions about why each pitch created value for the students, how each pitch would be implemented in practice, and how each could be scaled and sustained.

  For each assumption, they considered how best to test it. They could:

  • conduct a thought experiment using data they already had in hand;

  • conduct a simulation by beginning a dialogue with the key stakeholders involved, using a storyboard or other kind of prototype; or

  • test it via a live experience.

  Thought Experiment
  • Learn through analysis of existing data
  • Typical time frame: one or two days
  • No exposure to third parties required

  2D & 3D Simulation Experiment (one-on-one)
  • Learn through dialogue with stakeholders using storyboards or prototypes
  • Typical time frame: one or two weeks
  • May require us to expose our intentions to selected stakeholders

  Live (4D) Experiment (one-on-one)
  • Test via a live experience of the offering (e.g., a 30-day live trial)
  • Typical time frame: 30 to 90 days
  • Requires us to expose our offering to many stakeholders

  Experiments to test assumptions.

  For the “Welcome Meetings” concept, the assumptions included that students would value one-on-one meetings and would experience reduced anxiety and improved performance as a result of such meetings.

  The key assumptions, with the experiment type for testing.

  As the team deliberated which approach—thought experiment, simulation, or live test—would be best for testing each assumption about welcome meetings, one execution assumption stood out: that counseling faculty would make themselves available to students during what was normally vacation and prep time. In the past, counselors had not met students until the first week of school and had then cobbled everything together. The new concept required counselors to meet with students as they registered and then to continue meeting with them as needed throughout the semester.

  On the leadership team, director Miguel Contreras had come up through the counseling ranks, and Robin Acosta, the dean of students, was GCCA’s lead counselor. When the team tested their assumptions with thought experiments, their experience and rapport with other counselors suggested that GCCA counselors would be willing to make such a commitment. But the team believed that this key assumption needed more than a thought experiment; it needed to be tested directly with counselors.

  Step 12: Make Prototypes

  Throughout the stories in part 2, we saw the importance of creating visual representations of ideas, which we call prototypes. When many of us hear the word prototype, we think of fully featured, almost-ready-for-prime-time versions. Design thinking prototypes start off far cruder, with details and specificity mattering only if they create a preview of the experience in stakeholders’ minds. These kinds of prototypes are, as Michael Schrage described them, playgrounds, not dress rehearsals. The prototyping goal in design thinking is not perfection; it is bringing concepts to life for potential users. Where architects create blueprints and models, designers generally use visual or narrative approaches: images and stories. Prototypes can even include role-playing and skits. We surface assumptions before we prototype because, early on, we want to test the most critical assumptions first. These prototypes do not need to capture a concept in its entirety; they only need to represent individual elements so that each can be tested separately.

  While it is easy to prototype a new toothbrush, prototyping in the social sector usually means designing experiences, which requires not physical objects but storyboards, user scenarios, experience journeys, animations, and concept illustrations.

  Prototyping elicits more accurate feedback by creating a more vivid experience of any new future. Psychologists report that helping people “pre-experience” something novel is an effective proxy for the real thing and significantly improves the accuracy of human behavioral forecasting. Indeed, evidence emerging from neuropsychological research suggests that imagining a specific future activates the same neurological pathways as experiencing the actual event does.

  Whether in the form of storyboards, journey maps, user scenarios, or business concept illustrations, low-fidelity and often two-dimensional prototypes offer specific tools to make new ideas tangible and allow us to solicit more accurate feedback during What wows and What works. In the most successful innovation projects, designers prototype early and often, leaving “emptiness”—room for others to contribute—in their early iterations. A prototype that leaves room for input invites stakeholders to complete it and helps them become invested in the idea. Prototypes that appear perfect may encourage users to say what they think we want to hear, and the last thing we want to encourage is the false positive as we sort out the nonstarters from the good ideas (remember the venture capitalists and the critical importance of figuring out which eight projects to stop investing in). It is the false positives that raise the costs—and risks—of innovation.

  Prototyping can sound intimidating, but it needn’t be. Just figure out the story you want to tell and visualize the concept in pictures, using as few words as possible. Always visualize multiple options. Create choices for the stakeholder. This can be as simple as asking users to check the option they like best. In one of our favorite examples, a health care clinic prototyped potential office layouts by hanging bedsheets from the ceiling to act as walls and using cardboard boxes as sinks and desks. Asking doctors and nurses to move around the space while performing their normal activities, designers shifted the sheets and boxes so that the medical staff could pre-experience different options and determine which layout best supported efficiency.

  Back at GCCA, the team’s prototypes came together quickly. At the July 16 meeting, counselors began crafting prototypes. GCCA already had experience with Remind.com and Edmodo (which is similar to NovoEd or Facebook but is designed with stronger privacy for teachers and students). Both were already in use by some faculty, so the team did not need to prototype these platforms. They only needed to craft common messages and add an assignment for the incoming cohorts. For Remind.com, they developed a message, and for Edmodo, they created an assignment that students would bring to their first one-on-one meeting with a counselor after registration day, which would also be tweaked to be a more welcoming experience. After the July 16 meeting, the extended counseling team continued to share prototypes of messages during planning meetings and via e-mail to iteratively settle on the series of Remind.com messages and a first Edmodo assignment. Members also tweaked their first counseling meeting protocols and prepared to meet with students immediately after registration.

  Having created journey map prototypes, and armed with their assumptions to be tested, GCCA team members were ready to step outside of themselves and seek input. They were ready to consider What works.

  What works? Overview

  In the What works stage, we take our ideas to real stakeholders, first in one-on-one conversations and then in field experiments. The challenges in What works are designing experiments and then listening to feedback nondefensively. Team members who struggle with the first two design questions, the fuzzy front end of idea generation, are often more comfortable in the What works stage because it requires different expertise. Whereas Geoffreys generally love What is and What if, Georges tend to excel at What wows and What works. An ideal team will include both sets of skills, as we talked about in chapter 2.

  Because designers are generally taught in studio settings, where critiquing is key, they learn to detach their egos from their creations and to hear criticism nondefensively, as part of their training. The opposite is true for most of us. When we ask teams to seek feedback and present learning launches, one major principle is that they not defend their choices. What matters, instead, is whether the teams understand others’ criticisms. As long as team members understand the assessment, they have the choice to accept it and change or to decide that any disparaging analysis is not important and ignore it. First, however, their job is to listen carefully.

  For many, especially Geoffreys, design thinking gets difficult in What works. In many ways, design thinking is about emotion, empathy, human-centeredness, and understanding what someone else is thinking or feeling—areas where Geoffreys flourish. But the experimental phase is about focusing on data. It is OK to “fall in love” with our stakeholders but not our solutions. Here, we are conducting tests on our hypotheses, and we need to think like scientists.

  Step 13: Get Feedback from Stakeholders

  Prototype in hand, you now return to your key stakeholders to seek their feedback. Seek, and hear—especially the bad news. If your prototype is crude, they will fill the “emptiness” with ideas on how to improve it. Again, observe their reactions at least as much as you listen to their words, and truly invite them to co-create with you a better product, concept, creation, or experience.

  Remember, your primary reason for assumption testing is to learn. Let them teach you. Take photos of whatever your stakeholders wrote on your storyboard prototype; capture when they laughed, when they sighed, and when they threw up their hands. If they crushed your concept for reasons that can’t be fixed in the next iteration, throw in the towel (but inventory that napkin pitch!). Then, go back with your iterated prototype (less crude, adapted to their feedback) and co-create with your stakeholders again. You might continue for three or four rounds until they are as invested in the napkin-pitch concept as you are.

  As the GCCA team moved closer to testing, getting feedback from students was not possible because the summer term had ended and students were no longer on campus. On July 16, Robin met with the rest of the counseling staff to seek their input and, in particular, to test the assumption that counselors would be willing to implement the early-advising approach. Robin experienced a mix of anxiety and excitement at this step:

  I was so anxious. I must have looked like I needed medication when I was asking the counselors to take that on and change their days off, knowing what I was asking. But it was also a high point because it’s exciting to try something new and to implement something.

  As the design team had hoped, the counselors were on board with the concepts the team had developed and were willing to sacrifice vacation time for testing.

  Step 14: Run Your Learning Launches

  Having run an initial set of thought and simulation experiments, you are now ready to conduct experiments to determine whether your napkin pitches will fly in the real world. You know who your partners will be, you’ve got a solid grasp on the users’ needs, and your prototypes have become more sophisticated. But the future is still uncertain. We call experiments conducted in the real world “learning launches.” Though similar to a pilot or a rollout, they are not the same. Forming a bridge between the one-on-one stakeholder co-creation sessions in step 13 and the ramp-up in step 15, a learning launch is about moving into the real world and discovering how much more you can still learn.

  Remember that say-do gap? Until this point, you haven’t asked any stakeholders to truly put skin in the game. No money, no resources except a bit of time. The launch, however, needs to feel real to a number of stakeholders and to everyone on your team. The true test of any new concept includes stakeholders demonstrating their enthusiasm through their actual behavior, ideally over a period of time. Do they value it? Do they use it the way they said they would?

  But before you launch, go back and compare the design brief and criteria to your now-developed concept to verify that it addresses what you wanted to accomplish, that it meets the “Our ideal solution would …” requirements, that the constraints are addressed, and that you have some way of measuring success. You may love the idea—and we hope you do—but does it do what you set out to do?

  Designing the launch itself is straightforward. First, you need a working prototype that focuses on the key assumption you’re testing and goes beyond the low-fidelity models you created in step 12. Then you need to set tight boundaries and plan for the launch to end. Set concrete limits on variables such as length of time, number of stakeholders, features, and geography. Though you want to work in fast feedback cycles, you should expect surprises when your napkin-pitch idea meets reality. Consequently, processes for handling dissent, resolving conflicts constructively, and adapting on the fly are great assets.

  You will do a series of learning launches as you iterate your offering to match the new learning that the launches are producing. Each launch, based on what you learned in the prior one, narrows the search and focuses the desired outcome until you’ve addressed the metrics you set in your design brief. Ultimately, learning launches result in decisions. If you decide to move ahead with additional development, the learning launch should tell you how. Continue your learning launches until all issues and assumptions have been addressed (and, you hope, solved) and you’re ready to move into implementation.

  The GCCA team felt ready to move their ideas into reality. Around that time, the leadership team (Miguel, Robin, and Kathleen) headed to the Gateway Network’s annual conference in Boston. This event gave them time together, and they seized on it to plan their learning launch. They realized that they didn’t need to select just one napkin pitch; they could combine four of the five into an integrated solution. As Robin explained, “When it all came together, we realized, ‘Wait a minute. We don’t have four or five counselor pitches; we actually could integrate them into one.’”

  The team hunkered down, brought everything together, and created a Welcome Month: a new, nearly monthlong welcoming phase within the GCCA recruitment process prior to the first day of school. Upon their return to campus, the GCCA leadership team launched the concept almost immediately, simply because the semester was starting. As Joan described it:

  We did a quick thought experiment: How would this look? What might we do? Key assumptions had already been tested with the counseling staff: Were counselors available? Would the technology work? So we went live because the students were coming in the door.

  Learning from the Learning Launches

  Initial feedback from the learning launch was encouraging. GCCA surveyed the students in classroom assessments during the first week of school. Students were asked, for example, how much contact they had had with their counselor prior to day one. With almost the full class responding, 38 percent had been in contact with a counselor four or more times—a higher percentage than in the past. Another item asked about the effectiveness of the welcome meetings. Nearly 60 percent of students responded that their counselor contacts were effective in helping them plan to manage attendance barriers. Seventy-nine percent responded that the Welcome Month undertakings helped them feel less anxious about school, and 82 percent reported that counseling and activities helped build their awareness of program expectations. Notice that the queries directed at students did not ask whether they liked the Welcome Month or wanted it continued. Instead, they tested for the specific outcomes around attendance, anxiety reduction, and increased awareness of program expectations.

  Both counseling staff anecdotes and students’ responses indicated that students did indeed understand both GCCA’s and other students’ expectations better than in the past. The team used GCCA enrollment reports to monitor retention and attendance and discovered that the new student cohort had much better September attendance in weeks two and three than past cohorts had demonstrated. By mid-October, 92 percent of the cohort was still enrolled, an improvement over the 88 percent fall 2013 retention rate reported by the Gates Foundation, which sponsored the Gateway program. The changes appeared to be producing positive differences.

  As the team met in January 2015 to review progress, a few next steps were already on their agenda. One was to create a second learning launch for the spring term. This task proved more challenging than expected because circumstances in January differed markedly from those in August. January students did not have as much time to think about the challenging program as the fall beginners did. They often enrolled with the feeling that GCCA was their “last chance,” but without much thought as to the commitment required to succeed. Counselors, too, faced different situations. They were busier in December and January than in the summer and could not provide similar availability.

 
