Listen to The Small Nonprofit podcast on your favourite platform:

testing, testing, 123... with Cherian Koshy



Also listen at: iTunes, Google Music, Stitcher



“People describe themselves often in the best possible terms. We want to appear professional and smart, and moral and ethical to people, so that we're not social outcasts. [But] if we took that information, then put it pro forma against giving data, it clearly is not the case. So where is that disconnect occurring? It occurs because we have a higher opinion of ourselves and want to believe certain things about ourselves that actually aren't true.” — Cherian Koshy


You might have heard about the benefits of testing in your fundraising: tweaking copy and sending it to two different groups to see which performs better, or testing different web pages to see which gets more conversions. Maybe you’ve also tried surveying your donors to see what they like. This can be daunting, but it doesn’t have to be! On today’s podcast, join Cherian Koshy for a deep dive into all things testing, in a way that’s accessible to small organizations with limited resources.


Highlights:

  1. Why you should be testing and getting to understand your donors’ behaviours

  2. How to use market research and other tools to get a clear picture of how your donors actually behave (vs. what they say they will do)

  3. Testing behaviour through conjoint analysis (and what conjoint analysis even is)

  4. Framing questions to avoid bias, including social desirability bias

Resources Mentioned In This Episode:

  • Endowment Partners

  • Robert Cialdini’s Pre-Suasion

Connect with Cherian:

  • http://www.cheriankoshy.com/

  • LinkedIn: linkedin.com/in/cheriankoshy/

  • Twitter: @cherian_koshy

Our friends and sponsors at Keela asked me to put together a guide of fundraising tips I wished every small nonprofit knew. Go ahead and download the guide here: https://www.keela.co/consultant/the-good-partnership (and don't forget that Keela offers our listeners 40% off their first year's subscription).


Transcript:

[00:00:00] Cindy: Welcome back to the podcast. So you might have heard me say things before like we don't really know who our donors are, or, you know, we don't have a really good handle on what we know about our supporters. And today we're gonna dive into that a lot more. Now, in full transparency, one of my favorite ways to get to know your donors is to meet with them, and I stand by that. But there are lots of other ways that I'm still learning, and so I'm really excited for you to join me on that learning journey, where we're gonna dive deeper into some really cool market research opportunities to get to know your donors and test things with them in a way that helps you do your work better.


[00:00:58] I'm your host, Cindy Wagman, and you are listening to The Small Nonprofit podcast, where we bring you practical, down to earth advice on how to get more done in your small organization. You are going to change the world, and we are here to help.


[00:01:14] With that, it is my pleasure to welcome Cherian Koshy to the podcast. He is the Chief Development Officer at Endowment Partners, which is an investment firm that only works with nonprofits. He has three kids, one of whom I just met while talking about Lego, and he's a former debate coach, which I think is so cool. So Cherian, welcome to the podcast.


[00:01:56] Cherian: Thanks so much. Thanks for having me, thanks for entertaining my daughter for a hot second.


[00:02:02] Cindy: It's a pleasure.


[00:02:04] Cherian: I'm thrilled to be here. So excited for our conversation.


[00:02:08] Cindy: I'm so excited. And you and I totally geeked out last time we spoke, 'cuz we have a lot of love for business books and thought leadership, and applying that to our sector. And today we're gonna be talking about market research. Because I think that, as I mentioned in the introduction, I love donor meetings; I find them so valuable. And we often will also use things like donor surveys, but one of the things that both of us know to be true is that what donors say and how they behave are not always the same. And sometimes they will say things and do the opposite. And that's where testing comes in. So I'd love for you to sort of get us started with this idea of, you know, testing and learning by doing, and just sort of wading into the deep end.


[00:03:09] Cherian: Yeah, absolutely. So, I mean, I loved our previous conversation where we were just talking about... Yeah, right? And of course, I mean, yeah, that's the problem. So I think the first thing that I'll say is that there is no substitute for meeting face to face with a donor and having that conversation with them. Just like we're talking here, it would be even better if we were meeting in person, for body language and, you know, intonation reasons. But when it comes to surveying and testing, I feel like we've set the bar really low for even small nonprofits, and just said, you know, throw out a survey or test two options. And where I think we've done a disservice, or where, you know, thought leaders have done a disservice, is not really unpacking what that means from a decision-making perspective.


[00:03:58] Sure, you can survey donors and you can ask them, 'Do you prefer email or print material?' And I promise you, you don't have to do that survey, 'cuz they're all gonna say email. Now, are they gonna open the email? No, they aren't. Are they going to read the print piece that you send them? Yes, they are. So they're indicating a preference that actually isn't true. And if you don't believe me, there's a reason why Amazon sends a gift catalog to your home. There's a reason-


[00:04:29] Cindy: Dude, I'm in Canada. I don't get that.


[00:04:31] Cherian: Oh, you don't?


[00:04:32] Cindy: No! Amazon, send me a gift catalog.


[00:04:35] Cherian: No, no kidding. We'll tag them in the social media after this, but, there are lots of other stores, like I think Target sends one, right? Their catalogs-


[00:04:46] Cindy: So American.


[00:04:47] Cherian: Oh, sorry.


[00:04:49] Cindy: No, we get catalogs. Canadian Tire, for all my Canadian folks, still sends one. Yeah.


[00:04:56] Cherian: Yeah, I'm like, it's been years since I've been in Canada, so I'm like, what are my Canadian references? But you get the point.


[00:05:06] Cindy: I tell you, I tell you, Canadian Tire, for sure, like a catalog a week or a flyer a week. Yeah. A flyer. Yeah.


[00:05:13] Cherian: Yeah, absolutely. And I mean, check your mailbox. Right? What do these marketers know that apparently is not common wisdom for a lot of other organizations? I mean, that has to be working or they wouldn't continue to do it. It has to be working at some level, right, in order for them to continue to be doing it. So, you know, we throw out this sort of generic advice to survey your donors, and to your point, Cindy, what was so spot on about what you said is: people describe themselves often in the best possible terms. And, you know, from a behavioral science perspective, that's called social desirability bias, right? We want to appear professional and smart, and moral and ethical to people, you know, so that we're not social outcasts. So one example of that is when you ask a question about how much people donate per year, the answers are always, you know, an order of magnitude greater than what the data actually indicates. Right? If we took that information, then put it pro forma against giving data, it clearly is not the case. So where is that disconnect occurring? It occurs because we have a higher opinion of ourselves and want to believe certain things about ourselves that actually aren't true. And there are, you know, some really neat examples. I think we were talking about Robert Cialdini's book, Pre-Suasion, last time we talked, and he has an example in his book of someone who went around a party and did a party trick. It was palmistry. They would read your palm.


[00:06:57] And so, Cindy, let me read your palm, and, you know, you'd look at it, and I would say something like, 'Oh, Cindy, you know, you're a very open person. You're very flexible. You're open-minded, you have ideas that you're willing to explore, and you're adventurous', and whatnot. And then a few drinks later, they would go up to the same person and do the same type of reading again, when, you know, you haven't really remembered what was said. And I'd say, 'Cindy, it looks like you're, you know, you're not really open to change. You're kind of set in your ways'. And each time the person would say, 'Yes, that describes me'. Right? And it's just a function of us not fully understanding who we are as people and why we make decisions.


[00:07:39] I was in a course last fall, and our professor gave the example of this public study. They asked, you know, 'Do you consider this person moral or ethical?' And so when I do ethics talks around fundraising, I'll use this example, 'cuz they'll say, you know, Bill Clinton, sorry, US example, Bill Clinton: 57% of people say he's ethical; Mother Teresa, 89%; yourself, 98%. And we're like, 'Come on, you are not Mother Teresa, like, come on. No one is'. And we have such a high opinion of ourselves that it's just, it's sort of absurd. So when it comes back to a practical application, particularly when it comes to small nonprofits, you need to be mindful of how you're framing questions.


[00:08:35] Does your question introduce a bias? Does it presume a certain set of answers that would incline the donor to social desirability bias? Even asking a question of a group of donors, remember that that's a selection bias, right? These are folks who have already stuck up their hand and said, 'We love your nonprofit. We care about your nonprofit'. So do we want to hear from you? Sure. Do we think what you're doing is great? Sure. I've already contributed my money, right? Like, I've got to believe that you're doing good work.


[00:09:10] Cindy: It's a sunk cost. You're gonna justify that donation. Yeah.


[00:09:14] Cherian: Precisely. Right. The Sunk Cost Fallacy.


[00:09:16] So in that respect, we have to be super mindful of not just what we're asking, but who we're asking. And actually, when I'm designing donor surveys for clients, one of the things that I'll do, rather than open-ended questions, is ask behavioral questions. Have you visited? Have you taken a tour of an organization before? Right? That's a factual statement that is, you know, either true or false. And now we know that they've done that before; they're inclined to do that. So now we can, as fundraisers, interpret that data. In the US, one of the questions that we ask about capacity is, 'Are you concerned about the estate tax?'


[00:10:02] The reason why we ask that question is because we're interested in their thinking around their own wealth. The only reason you're concerned about the estate tax is if you are above 25 million, or you anticipate being above 25 million, right? And there's nuance to that question on its own as well, but those are some ideas that I think would help nonprofits: if you're self-deploying a survey, think through the ways in which you want to go back and look at those questions and assess that piece.


[00:10:32] Cindy: That's so cool and really helpful. And of course, the best way to assess donors' behavior is to test it, to actually get them doing things and then tweak. Before we talk about what to test, let's talk about what testing is and how we can apply it. Because sometimes we set it up in a way that's not gonna give us the best insight. So tell us, how do we test behavior?


[00:11:10] Cherian: Yeah, absolutely. So, a lot of times the advice from folks is, 'Oh, you should test that. You should try it out'. And I'm here to tell you something probably rather contrarian: that's really bad advice. You, especially as a small nonprofit, should be very skeptical of that advice. And so I wanna start by saying the challenge when it comes to testing is this concept of statistical significance, right? So, when we test two things, we wanna know if there was a meaningful difference between Choice A and Choice B, if you're gonna test that way. So say your total database is a hundred donors, and you segment 50/50 on an email appeal versus a print appeal. There are free calculators all over the place that tell you what statistical significance is, and it would have to be a pretty big number with that small sample size. You would have to see a pretty big shift in order to make a decision about whether to do an ask string this way, or, you know, a wording that way, whatever you were testing. Right? So if one group comes in at 55 and the other at 45, I haven't run the, you know, calculator, but I'm pretty sure it's not statistically significant at that small a volume.
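
To put a number on that, here is a minimal sketch of the kind of check those free calculators run: a two-proportion z-test on hypothetical response counts (28 of 50 versus 22 of 50, roughly the 55-versus-45 split Cherian describes). The counts, and the choice of test, are assumptions for illustration; an online A/B-test calculator will give you the same answer without any code.

```python
import math

def two_proportion_z_test(success_a, n_a, success_b, n_b):
    """Return (z statistic, two-sided p-value) for a difference in response rates."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical result: 28 of 50 email recipients respond vs 22 of 50 print recipients.
z, p = two_proportion_z_test(28, 50, 22, 50)
print(f"z = {z:.2f}, p = {p:.2f}")  # about z = 1.20, p = 0.23: far from significant
```

With only 50 donors per arm, even a ten-point gap in response rate produces a p-value well above the usual 0.05 cut-off, which is exactly the point Cherian is making.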


[00:12:39] So that's the first thing to consider. The second thing: I'm glad you mentioned that I coach debate, because in the debate world, we use this concept called forced choice, meaning you choose A or you choose B, you choose the affirmative or the negative. The reason why that works in academic debate is because the Canadian team has to either win or lose versus the United States team, right? Somebody has to end up with the W, somebody has to end up with the L. In the real world-


[00:13:09] Cindy: That sounds harsh, but-


[00:13:10] Cherian: I mean, that is true, right? Like you can't have a tie in debate. But in the real world, no one ever makes a decision that way. No one ever, ever makes a decision that way. Right? So, do you go to the vending machine and say 'Coke or Pepsi?' The only reason why you say 'Coke or Pepsi' is if all the things are equal, right? The cost is equal, right? Your preference is for one or the other. It's, you know, a certain temperature in the day, like every other contextual factor is equal.


[00:13:47] So the thing that, if you're listening to this, you need to understand, is that when you test two elements, those elements don't address the myriad other factors that are going into that donor's decision. Right? So, if you test two ask strings in an appeal, right, do you give a hundred dollars one time or $10 a month? That's a common thing that folks tell you to test. Well, you're ignoring the fact that this donor may often give by check, or by, you know, bank EFT, or however they may. And so the idea of doing a monthly gift is not in their head. Things like the day of the week, you know, other things that are happening in their life, all of these things come into play in making that decision. So, one piece that other marketers in the for-profit world know, that nonprofits tend not to focus on, is that when you're testing these things, beyond statistical significance, you wanna be mindful of being able to contextually test ideas. So putting multiple pieces together.
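As a concrete illustration of "putting multiple pieces together", here is a toy sketch of the conjoint-style approach Cherian describes next. Every attribute name, level, and rating below is invented purely for illustration; a real conjoint study would use a purpose-built survey tool and a carefully designed set of profiles rather than a full factorial and plain least squares.

```python
import itertools
import numpy as np

# Hypothetical attributes of a donor appeal; names and levels are invented.
attributes = {
    "amount":      ["$10/month", "$100 one-time"],
    "channel":     ["email", "print"],
    "recognition": ["public thank-you", "private thank-you"],
}

# Build every combination of attribute levels (a 2 x 2 x 2 full-factorial design).
levels = list(attributes.values())
profiles = list(itertools.product(*levels))

# Made-up preference ratings (0-10) a panel of donors might give each profile.
ratings = np.array([7, 4, 6, 3, 8, 5, 6, 4], dtype=float)

# Dummy-code each attribute (1 if its first level is shown, else 0), add an
# intercept, and fit ordinary least squares; the coefficients approximate
# part-worth utilities for each attribute.
X = np.array([[1.0] + [1.0 if profile[i] == levels[i][0] else 0.0
                       for i in range(len(levels))]
              for profile in profiles])
coefs, *_ = np.linalg.lstsq(X, ratings, rcond=None)

for name, coef in zip(["baseline"] + list(attributes), coefs):
    print(f"{name:12s} {coef:+.2f}")
```

The fitted coefficients indicate how much each attribute level shifts a donor's stated preference, which is what lets you compare whole packages of choices rather than testing one element at a time.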


[00:15:00] So for those of you that are listening, just do a quick Google search on a concept called Conjoint Analysis. Conjoint Analysis has been around for decades. Decades. Every Fortune 500 company, when they're testing a new market or a new product, does a conjoint analysis. To say, if we were to produce a thing that has never been produced before, let's call it a computer that doesn't have a keyboard, but it's a screen and you could carry it around with you. Right? Nobody said that they needed an iPad, but when it-


[00:15:37] Cindy: They're probably gonna say they don't want one.


[00:15:39] Cherian: Right. That's right.


[00:15:40] Cindy: Yeah.


[00:15:41] Cherian: If you asked people in the 1800s if they needed an electric-powered or gas-powered transportation device, they would've been like, 'No, we're fine with horses'. Right? 'Why do we need this?' Now, ironically, the thing that I talk about in leadership talks all the time is that we still measure the power of cars by horsepower. And NASA said this ridiculous com- I'm sorry, I'm going off track. NASA made this ridiculous comment about the rocket power of their new shuttle being 37 million horsepower. And here's the fun fact: there are only about 50 million horses on the planet. 37 million horsepower just doesn't-


[00:16:27] Cindy: It's inconceivable.


[00:16:29] Cherian: Right? It just doesn't make any sense. But our language doesn't make sense in terms of some of those things, and the reality is we don't know what we need. Right? So conjoint analysis says, if you had this vehicle with these color options, and this horsepower, and this kind of gas mileage, and it's at this price point, and you take those four variables and iterate them four times, so you have 16 combinations, then you can map out what people would actually buy. There's a regressio