Employing Differences, Episode 94: How do we know what people think?

March 01, 2022 · Karen Gimnig & Paul Tevis

"If you're not acting on the things that people are worried about, and you're not gonna act on things that people are worried about, why are you doing a survey?"


Karen:

Welcome to Employing Differences, the conversation about exploring the collaborative space between individuals.

Paul:

I'm Paul Tevis.

Karen:

And I'm Karen Gimnig.

Paul:

Each episode, we start with a question and we see where it takes us. This week's question is, "How do we know what people think?"

Karen:

So I think what we want to talk about mostly today is one of the quick answers to that, a super efficient thing that I find a lot of my clients like and use a lot, which is, "We'll just send out a survey." Google Forms has made this super easy, and there are other options out there. There's all sorts of great technology that makes it easy to send a survey out and collect data. So that seems like a great way to find out what people are thinking. And what we really want to explore are the strengths and weaknesses, the effective uses and ineffective uses, of that.

We started talking about it because I've been seeing examples where surveys are sent out when we have a question, but we haven't talked about it, and we haven't educated anybody about it. So there are some people involved who know a lot about the thing and some people who know basically nothing about it. There are people who've thought about the thing a lot, and people who haven't thought about it at all. And there may even be people who've talked to a few other folks about it, but pretty much nobody in the group has heard anything from anybody else about it. And we send out a survey and say, "What do you think we collectively as a group should do?"

What I see happen that's problematic with that is, first, we're asking people to assess what would be good for the group when they haven't heard anything from the rest of the group. So they can't possibly do that. They also haven't heard whatever education or information or expertise there might be. So not only have they not heard other people's values and opinions, they don't even have all the data that would support a strictly cognitive assessment. And I have a sense that when somebody puts that down, "I answered the question," there is a sense of, "I put my stake in the ground, and I'd better stay on it." I think we invite a kind of rigidity. So at worst, I think a survey can have the effect of saying, "Before you know anything, before you have any capacity to make a good decision about this, and certainly before you have any capacity to assess what the broader group would need around this, would you please put a stake in the ground of what you want and think we should do?" And then people hold on to that, because we've invited that rigidity. People say, "Well, I came up with my answer, and now I'm defending my answer," because it's so much in our culture to defend the answer. So that's the thing I see happen that worries me. And it got me curious about, okay, when do surveys work? And when do surveys really not give us what we need?

Paul:

Karen and I were talking about this before the show, and she kind of brought it up, and she said, "I mean, I don't know if this is applicable to some of the work you do." And I was like, "I have some history and baggage around this particular topic, so..." Because it's actually even worse than that. Good survey design is a skill, and I have worked with some excellent designers and researchers over the years who know how to do it. Anytime you're putting something in front of somebody and asking them questions, you're introducing bias into their thinking process. The results you get, the answers to your survey, are largely shaped by what your survey says. Oftentimes we don't realize what types of bias the way we ask questions and the way we construct things will introduce. So I'll start by saying that survey design is a discipline. It is hard. The problem with "it's super easy to just put out a Google Form" is that we get a lot of bad surveys that don't actually get the information you really need. I'm not an expert on that. I know enough about it to notice when it's happening. But there are a whole bunch of other resources on the internet that people can find about actual good survey design.

What I want to point to is the process piece, which is that when we're going to make a decision as a group, there are two times when we want to know what people are thinking, what the group actually thinks. One is near the beginning, where we just want to get a sense of what's in the water. What are people concerned about? Particularly if you've got a small group that's been doing some initial work on this: are their thinking processes and the ideas they're bringing representative of the larger group, or are they overlooking stuff? Because the people who get involved with little committees like that are generally the people who care deeply, but they may care way more deeply about certain things than the rest of the group does. So I think it's useful to gather a sense of what people think as an input to the process. And it's also useful to gather information about what everybody thinks once we've got some options, once we're starting to move towards a decision. The problem is when we conflate those two and collapse them together.

So in the situation you're talking about, where we ask people, "What do you think we should do?" with, as you said, no education about what's going on, no bringing people up to speed, and also no clear framing of how their responses will be used, that's where we really get into trouble. Because we think we're asking them to make a decision right now, and that's not appropriate at that point. As you point out, we haven't explored the space yet. We're not in a ready-to-decide space. But this is where we put people into that frame of, "Well, I have to make a decision, and now I'm tied to it, and I need to defend it." So the thing that I try to do when we're using surveys for that sort of information gathering is to frame it that way: "Hey, we're looking for some ideas of things we might do." You're putting people into more of a brainstorming space, where you're not asking them for decisions, you're asking them for ideas. "What are some things that are concerning you?
What are some things that might be concerning other people, that maybe you've heard other people talk about?" But then also clarify, "What we will be using the results of this survey for is to inform the discussion process." Because what's really useful there is getting all of those ideas out before people have had a chance to talk to each other, because that means they're not anchoring on the first person to talk. We've talked before about how who talks when, who talks first, will absolutely anchor what you talk about in an in-person, live discussion. If you're able to get all these ideas from all these people without them knowing what each other is saying, that's actually really useful. It can remove a lot of that bias. But you have to frame it in such a way that, "Hey, we just want to get a sense of what people are thinking about, ideas and things like that; we're not going to use this to make a decision. In fact, the decision is going to come way later. And we're going to share the results of what we gather with all of you, so you can get a sense of what other people are thinking about." When you start to frame it that way, that deals with a bunch of the stuff you're talking about. I'm curious what you're thinking with that.

Karen:

Yeah, I think all of that makes perfect sense. And I think that efficiency generally comes at a cost, and this is one of those cases. So yes, if you're going to do a survey, everything you said, for sure. But let's also be thoughtful about where surveys work well. For example, if you're thinking about getting a new medical benefits program for your company, the impact of that is very individual. It's across everybody, but it lands on each individual. It's not affecting shared workspaces. What works for me, what's good for me, is what's good for me, and knowing what's good for you isn't gonna change what's good for me, because it's not this collaborative space that we're talking about. That's an example of when, yeah, let's do a survey. Let's see what each individual person needs. We've got some numbers. That's gonna give us something useful.

But if you're talking about something like rearranging how we're using shared space, that affects all of us collectively. What's going to work for me actually depends a lot on whether or not it works for you, because we're within a system where we're all bouncing off of each other, and if it doesn't work for you, then that in and of itself doesn't work for me. There's lots of causation bouncing all around this space. Then I think it's less useful to know what each of us is thinking in isolation. For something like that, I would much prefer to start with a conversation, even if you're breaking into small groups, but a conversation that helps us remember that there are other people in the decision. Because even though theoretically we know that, when I'm sitting alone in front of my computer clicking little buttons, it's really tough to remember that I'm part of a whole system, a whole network of people. So one thing is: what kind of decision is it?

Then the other thing I'll add to your thought about survey structure is an example that I've seen, that I'm told works beautifully, and I believe it; it seems like it would. A pair of architects in New England, Laura Fitch and Mary Kraus, who work with cohousing groups, created a design process. They said, okay, there's a bunch of stuff that community members don't know about, that we need to teach them about. And then we need to find out what they're thinking and get a sense of: do people in the community want different things, or do they all want the same thing? And they thought, well, we could do this really efficiently with a survey. In that survey, and it's a long survey that they've put together, I think they have one for each of several stages, each question has something like three paragraphs of text that says, "This is best practice. This is what we typically do. These are the sorts of impacts. If we went this way, we get these benefits. If we go that way, we get those benefits." There's an education piece right there. And then they're getting first impressions. If everybody lands the same way, then when they meet with the group for the workshop, it's, "It sounded like everybody wanted this, so we're headed this way unless we hear something different." And in other cases it's, "Wow, we really have something to talk about, because we landed in different places," and they can dig into that and use it to inform the workshop.

But I think the education piece, and just that awareness piece built into it, makes it actually quite an effective tool, in a way that if they stripped out the education and just asked the questions, it would be disastrous. Totally the other way.
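The education-embedded question Karen describes can be approximated in Google Forms itself. Below is a minimal Google Apps Script sketch (plain JavaScript that also passes as TypeScript) of one such question; the form title, question wording, and help text are illustrative inventions, not taken from the actual Fitch/Kraus surveys:

```typescript
// Google Apps Script: an "educate, then ask" survey question.
function buildEducatedQuestion() {
  const form = FormApp.create('Common House Design: First Impressions'); // hypothetical title

  form.addMultipleChoiceItem()
    .setTitle('Where should the common kitchen sit relative to the dining room?')
    // The education travels with the question: best practices, trade-offs,
    // and typical impacts, so respondents answer from an informed place.
    .setHelpText(
      'Best practice in most common houses is to open the kitchen directly ' +
      'onto dining, which supports casual interaction but carries noise. ' +
      'A separated kitchen is quieter and easier to keep clean, but cooks ' +
      'are cut off from the gathering. Either can work; we want your first ' +
      'impression here, not a final vote.')
    .setChoiceValues([
      'Open to the dining room',
      'Separated from the dining room',
      'No strong preference',
    ]);

  Logger.log(form.getPublishedUrl());
}
```

The design choice is that the trade-off education rides on the question itself, so nobody can answer without at least reading past it.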

Paul:

Anytime a group is putting together a survey to gather information from a larger group, you really have to step back and ask, "What do we know that they don't? And how can we on-ramp them to this process?" What do they actually need to know so that we're getting an informed opinion, so that the thing they're telling us is actually meaningful? And I think that's what that three-paragraph setup does really well. We actually need them to know this stuff in order for their answer to be at all meaningful. That's the thing to really think about: "What would it take for someone filling out the survey to give a meaningful answer?"

I'll also say this, because I've seen it go wrong a couple of times, and it's not so much about survey design: you shouldn't send out a survey if you're not actually interested in what people think, if you're not gonna do anything with the results. Because I've totally seen the well-intentioned "Well, we should send out a survey," and then the results come back, and they weren't what the people who sent it out were expecting, and they don't know what to do about that. So they just keep going down the path they were already on anyway. So you really do need to ask, "Where are we prepared to pivot? What do we actually need to know? What decision are we looking for this to inform? Why do we need this information? What might it tell us, and how would we act on it?" That's useful because, from the education side, you can then tell people in the survey, in the setup for it, "Here's how we're going to be using what you're giving us."

That's actually a big thing for me, because I've seen it a lot in groups where a survey goes out and one of two bad things happens. One is that the group sending it out doesn't actually know what they're going to do with the results. Or two, they know what they're going to do with the results, but they don't tell the people who are taking the survey. And that can create anxiety and uncertainty: "What are they going to do with this information?" So that's also an important part of it: being clear about what you're doing the survey for, what questions you're trying to answer, and what you're going to do with the information, and then sharing that with the people. Because I think you get better information out of it, more useful information. So it's educating about the context, about what it is they're being asked, but also about how it will be used. You're educating about the process as well.

Karen:

Another piece, and maybe this is survey design, is encouraging people to be flexible. How can we structure things in a way that invites flexibility of answer? For one thing, checkboxes are great, as opposed to single-choice multiple choice: "Which of the following would work for you? Which of the following would be comfortable for you?" Or suppose the first question on the survey was, "How important is this to you?" with answers like, "This is something I care a lot about," "This is something I actually don't care very much about," and, "This is something I have no interest in at all; skip to the end of the survey." Give people permission to say, "There are other people to whom this matters. Let them have the influence." Look for framings like, "These are the ideas we've heard so far, listed here. Which of them would work for you? And feel free to add others." Now, of course, the problem with that is that the added ideas don't appear on the next people's surveys, although if you watched the inputs very closely you could, in theory, add them as you went, and at least later people filling it out could see those too. But just look for ways to invite flexibility of thinking and invite people to consider. Because I think when we ask the question, "Do you want this or that?" people begin to get attached. If we ask, "Do you care whether it's red or whether it's blue?" most people will say, "Doesn't matter to me." So get clarity about that too, so that you're not accidentally inviting people to get rigid about a thing that someone's going to need to flex on. I'm going to assume that this is only interesting if people have different opinions; if everybody's completely aligned, then it won't matter how you use your survey, you'll be fine. But assuming there are some differences of opinion, be really thoughtful about making sure that your survey invites flexibility rather than discouraging it.
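A minimal sketch of the structure Karen outlines here, again in Google Apps Script: an importance question whose lowest-interest choice submits the form immediately, followed by a checkbox question with an "Other" field. All question wording and choice values are hypothetical:

```typescript
// Google Apps Script: an importance gate, then checkboxes
// rather than a forced single choice.
function buildFlexibleSurvey() {
  const form = FormApp.create('Shared Space Ideas'); // hypothetical title

  const gate = form.addMultipleChoiceItem()
    .setTitle('How important is this decision to you?');
  gate.setChoices([
    gate.createChoice('I care a lot about this', FormApp.PageNavigationType.CONTINUE),
    gate.createChoice("I don't care very much either way", FormApp.PageNavigationType.CONTINUE),
    // Choosing this option submits the form right away: the
    // "skip to the end of the survey" permission Karen describes.
    gate.createChoice('No interest at all; let others decide', FormApp.PageNavigationType.SUBMIT),
  ]);

  form.addPageBreakItem(); // navigation choices take effect at the page boundary

  form.addCheckboxItem()
    .setTitle('Which of the ideas we have heard so far would work for you?')
    .setChoiceValues(['Quiet room', 'Shared workshop', 'Kids play area']) // placeholder ideas
    .showOtherOption(true); // "feel free to add others"

  Logger.log(form.getPublishedUrl());
}
```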

Paul:

Yeah, yeah. Because it is about exploring the space of possible solutions that could work for this group. And you're pointing to how we can design things so that they help us explore that space, rather than saying, "These are your options; pick," or even worse, "Come up with an option, and then advocate for it." That can happen later in the process, in your decisional phase, where you're eliminating options and moving towards a decision. But surveys are often really useful in that expanding phase, early on, when you're just trying to figure out: what are the important considerations? What are the things that actually matter to people? What is the space of possible solutions that could work for everybody? And so designing a survey that's about making a decision doesn't serve you in that part.

Karen:

Yep. And I think asking questions about what values you bring to this, as opposed to which of these three choices you like best. What's your thinking behind that? What are the values? How does that align with what you're hoping for?

Then the other thing I want to delve into a little further, and you've mentioned it, is the transparency piece: that you share out the results. A lot of surveys that I've seen go out, people hold those results tightly until they've edited them or made sense of them or done something with them. Maybe they never get shared, or maybe they get shared a lot later, so there's this other processing. One of the worst things that I think happens is that they get shared with some people: some people see them and other people don't, so you've got this disparity of knowledge in the group. And I'd be curious what would happen if, well, Google Forms is even set up so it can automatically feed a spreadsheet. You could put the link to the spreadsheet right next to the link to the form, and anybody who wanted to could read everybody else's answers before they fill out their own. Obviously the first person to fill it out can't, so you get that first-speaker effect again. But if you feel strongly about this, just go ahead and fill out your survey. That's great. If you haven't thought a lot about it, maybe wait till later in the survey period and review what other people have said, and then you can start to form some opinions. I mean, that's a sort of extreme transparency. But I think there's some value in extreme transparency where everybody has the same data at the same time. They can watch it come in if they want to. That can spur other conversations happening outside of the survey that can be really useful.
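For what it's worth, that workflow is scriptable. A minimal Google Apps Script sketch, assuming a placeholder form ID, that routes responses into a spreadsheet and makes that spreadsheet readable by anyone with the link:

```typescript
// Google Apps Script: feed responses into a spreadsheet and share
// that spreadsheet read-only by link, alongside the form link itself.
function publishLiveResults() {
  const form = FormApp.openById('YOUR_FORM_ID'); // placeholder ID
  const sheet = SpreadsheetApp.create('Survey responses (live)');

  // Google Forms can write responses straight into a spreadsheet.
  form.setDestination(FormApp.DestinationType.SPREADSHEET, sheet.getId());

  // Anyone with the link can read the raw responses, but not edit them.
  DriveApp.getFileById(sheet.getId())
    .setSharing(DriveApp.Access.ANYONE_WITH_LINK, DriveApp.Permission.VIEW);

  // Send both links out together, as Karen suggests.
  Logger.log('Form: ' + form.getPublishedUrl());
  Logger.log('Live results: ' + sheet.getUrl());
}
```

Sharing view-only access keeps the data common to everyone without letting anyone edit the record.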

Paul:

I worked in an organization once where they did a yearly employee engagement survey that was all free response. It was not multiple choice; there were one or two numerical things, but it was mostly the free-response stuff that was interesting. And that fed into a Google sheet. They kept it closed while the survey was still open, but basically as soon as the survey closed, they just shared out the link to the spreadsheet, so everybody in the org had access to it. What I ended up doing with that group for a while was we had people who volunteered to sort through that data: what patterns do you see in it? What things are you noticing? And that was the thing that got presented to management, because largely this was supposed to be about shaping the environment to work with people's concerns, right. What was very interesting is that it wasn't management that said, "Here are the important things we saw in the survey." It was actually the group, and I facilitated the process of finding those patterns, saying, "Here's what we think are the most important things in here. Obviously there are some outliers, there are some things that only matter to a few people, but the big things we see are this." Now, the challenge of that was that if management didn't address those things, they needed to have really good answers for why. And I think that's actually one of the reasons transparency doesn't happen: because of concerns like that. It's like, "Well, what if we don't act on the things that people say?" And I'm like, "If you're not acting on the things that people are worried about, and you're not gonna act on things that people are worried about, why are you doing the survey?" It requires a different kind of stamina to work in that type of transparent environment. But it does mean that the important things actually get talked about, because they're there. So I do think that can be really useful.

One of the things that makes me nervous about the notion of having that data available while the survey is still open is that there are times when it's just one or two angry people who are putting stuff in there. That isn't to say that they're wrong. Their feelings are valid; what they're working through is real. And also, that can be infectious. Negativity in particular can be super infectious in groups. So if I haven't yet filled out the survey, and I don't feel particularly strongly about this, and I go and read it, and there are a couple of people who felt very strongly and very negatively about a few things, I may start shading towards that, where that's not what you would have gotten from me otherwise. So the transparency stuff is important, and we need to figure out how to work with it in a way that we're really getting the real sense of what's going on in the group. And I think that's a challenge to work with sometimes.

Karen:

Yeah, I think that makes sense. And it's one of those "look at the pros and cons" things for the decision you're working on. If you're pretty sure that there aren't strong negative feelings in the group, then it may make more sense to have increased transparency during the survey. If you know that you've got some of those voices that could really throw the whole thing into a funk, it may make sense to hold that transparency until the survey closes, as you say. But I do think I'd have to have a really good reason to recommend holding that data back. Basically, as soon as anybody is looking at it, everybody should be able to see it. And as a survey organizer with the ability to see the data early, I'd encourage you not to. Don't look at it yourself until you're ready for everybody to look at it. Information is power, and it really changes power dynamics if one person has it and others don't, or two people, or whatever it is. And as you said, often the people who have it are the ones who are most passionate, or passionate in one direction, where the thing it's about is, "I want to see a change happen," and other people are, "Yeah, I don't really care, because I'm happy where we are." And then at the end of the process they're like, "Whoa, whoa! I didn't have any idea you were talking about that kind of change. Well, now I care." So just really being thoughtful about what it is we're trying to do, where we are in our process, what stage we're in, and what kinds of data we're looking for. It's really important.

Paul:

Yeah, it's a big and complicated topic. But also, when done well, I think it can really improve your processes, because there is an efficiency to it. If we send out a well-designed survey, and we know why we're using it, and we know what it's going to be used for, and we do these things around it, it can get a lot of information to a lot of people in a relatively short period of time, without having to have a meeting, which can be super valuable.

So to just sort of track where we've been on that: one of the things we really want to avoid is sending out a survey that's going to lock people into positions before they really understand the space we're exploring. We want to create flexibility rather than rigidity as a result of what we're doing. So, ways you can do that. One is to think about the level of education you're going to need to do with the people taking the survey. If you're designing a survey, you probably know a bunch about the topic at hand, so figure out what it is the people taking the survey need to know in order to give you an informed opinion, to actually give you useful information. Think about what information you actually need and what you're planning on doing based on the information you're getting, and design it so that you're actually going to get that information. And make clear to people how that information is going to be used and what the next steps in the process are going to be, because the education, both about the content and the process, will get you better results and also reduce that anxiety and that locking in of positions that can happen.

And then be as transparent as you can with the data that's gathered. I really liked the thing you said, Karen, about how as soon as one person has it, everybody should have it. You may need to do a little bit of, "Hey, we're just going to go over this once and make sure there isn't any personally identifying information in here," or things like that. There's sometimes a tiny bit of data cleanup you might need to do. But in general, I've found that the more often you share all the data with everybody, the easier it is for people to actually work with it, and then figure out, "Okay, great. What are the next steps from here? How are we actually going to go about that?" I think that really allows you to take advantage of the asynchronous data collection that surveys allow you to do, in a way that still moves your process forward with a minimal amount of bad side effects.

Karen:

And one piece I'll add is be very cautious of the idea that a survey takes the place of a meeting. It may be able to do things that you might otherwise try to do in a meeting. But what a survey is really good for is very different than what a meeting is really good for. So make sure that you're doing a thing that can be done well with a survey and not trying to do something that needs a meeting and thinking a survey will give you that.

Paul:

Yeah, it's not a replacement. It's a complement.

Karen:

I think that's gonna do it for us today. Until next time, I'm Karen Gimnig.

Paul:

And I'm Paul Tevis. And this has been Employing Differences.