73. Dig's Founders on The DIY vs Custom Research Debate

00;00;06;23 - 00;00;33;06
Meagan
Hi. Welcome to Dig In, the podcast brought to you by Dig Insights. Every week we interview founders, marketers and researchers from innovative brands to learn how they're approaching their role and their category in a clever way. Hello. Welcome back to this week's episode of Dig In, the podcast brought to you by Dig Insights. My name's Meagan and I'm lucky enough to be joined by the founders today.

00;00;33;07 - 00;00;40;21
Meagan
We've got Michael, Dom, Ian and Paul. This happens every time. How are you doing? I never know.

00;00;41;25 - 00;00;44;15
Meagan
Which person is going to start.

00;00;47;04 - 00;00;54;17
Paul
Well, Michael and I are wearing very similar clothes today, so if it goes on video, you might not be able to tell us apart. I'm the good looking one.

00;00;54;23 - 00;00;55;20
Paul
Oh, I kid.

00;00;56;12 - 00;00;58;27
Paul
People might see it and just think it's you.

00;01;00;04 - 00;01;06;15
Dom
And I was rocking my standard hoodie until an hour ago, but I had a client call, so I put this on and got all fancy.

00;01;07;03 - 00;01;12;28
Meagan
Yeah, Ian, you got fancy. I was just on a call with you, like, 4 minutes ago. I think you're wearing something different.

00;01;13;12 - 00;01;15;10
Paul
He's wearing pants. Is that why you put on pants?

00;01;15;13 - 00;01;16;13
Meagan
I don't know. It's funny.

00;01;18;09 - 00;01;24;02
Ian
This harkens back to the nineties. I'm channeling, you know, some Kurt Cobain plaid. So.

00;01;24;02 - 00;01;53;18
Meagan
Yeah, I absolutely love it. We're going to talk today about which types of research definitely require sort of an expert to hold your hand throughout, so those types of research that really require a consultant to help you figure out how you achieve the outcome that you need. There's such a focus right now on making as many things as possible DIY, and sometimes that's just not doable, and it's something that we sort of run into a lot with our clients.

00;01;53;19 - 00;02;07;16
Meagan
Obviously, we want to make sure that things are as scalable as possible and are always working hard for our clients. But yeah, I think more and more this question is coming up. I'll start with Paul. I'm wondering why you think this question...

00;02;07;16 - 00;02;09;08
Paul
No, I'm joking about.

00;02;11;02 - 00;02;17;15
Meagan
Why do you think this question is so relevant based on the landscape of, you know, the data analytics space right now?

00;02;19;07 - 00;02;24;05
Paul
Why the question of needing or understanding whether or not you need an expert to help you with the research is relevant?

00;02;26;18 - 00;02;47;25
Paul
Well, it kind of goes to the fact that there is so much technology that's crept into this space, and the industry has really changed over the last little while, actually. You know, this question almost brings me back to a presentation I did at a summit last year. Dom, you'll remember, I had an image of Spider-Man wearing a tight outfit that I...

00;02;48;27 - 00;02;49;22
Dom
Learned on my.

00;02;49;23 - 00;03;21;25
Paul
Burned on. It was a kids' party. I can't get it out of my head. And it was really focused on the fact that the industry is changing so much that firms need to almost somewhat differentiate because of the, you know, emerging research tech availability, focused around automation. And so you're seeing firms actually move into a space where there are specializations that they're focused on, whether it's healthcare or, you know, vertical expertise, so that they can actually add additional value because they don't necessarily have the technology.

00;03;22;07 - 00;03;46;11
Paul
And you'll see other ones that are moving into even more management-consulting-type places, elevating the offering that they're providing their clients, because, again, they can't necessarily leverage or compete in a technology-automated space. And as they started doing that, I think there's a lot more emphasis on providing higher-value, higher-quality type of work that you can't get in an automated environment.

00;03;46;18 - 00;04;16;17
Paul
And, you know, for the most part, clients, I think, understand that there are things that you can do that are sophisticated enough using technology that gets you the answer you need. But if you do require some sort of, you know, whether it's a certain methodology that requires specific rigor or whether it's more elevated consulting associated with the data that you're collecting, I think that's why you're seeing some firms actually move into those adjacent spaces where that's the specialization or that elevation.

00;04;16;24 - 00;04;23;19
Paul
And I think part of the question of why it keeps coming up is because firms, honestly, they're trying just to differentiate in a sea of emerging technology.

00;04;25;13 - 00;04;55;15
Meagan
Yeah, it also makes me think about, and I mean, we will come back to it, but that step in between. So like you mentioned, consulting or help that's sophisticated, or sophisticated enough. And there's this idea of enablement, to help you use those DIY platforms or, you know, DIY tools or solutions. And then there are those really strategic projects where sophisticated enough isn't going to cut it.

00;04;55;27 - 00;05;12;10
Meagan
You really need the people who truly know their stuff or truly understand the category to actually get the project done. Does that make sense, what I'm saying, like that middle ground?

00;05;12;10 - 00;05;41;24
Paul
It does. But, you know, I think it's also because there is a certain level where clients do need to have some external validation on some of the decisions that they're making. And usually you're going to get that from somebody that really understands consumers, really understands the data you're collecting and can interpret that information. And while people are sophisticated and the technology is sophisticated enough to guide you through some of it, having that person who's the expert and can help you interpret that information and help you make the right decision is, I think, always going to be a need.

00;05;41;24 - 00;05;51;19
Paul
And I still think there's a place for that middle ground, where it's sophisticated, but not enough compared to what you might get with an external, you know, advisor or expert.

00;05;53;25 - 00;06;18;13
Meagan
Yeah, that makes sense. This makes me think of, for any of you actually: say you're the head of insights for, I don't know, a big CPG company. What types of research would you absolutely know that you would outsource? Like what types of research on a regular basis? So, we talked about segmentation before we hopped on.

00;06;19;06 - 00;06;34;07
Meagan
It sounds like that's absolutely something that you would want help with from a company like Dig. Anything else that you would really be looking to sort of outsource, or to get help with from an agency like ourselves, if you were in that sort of client-side space?

00;06;35;08 - 00;06;53;03
Michael
I can jump in on this one. So at Dig Insights, the way that we sort of segment the work that we do is into four buckets, and the first one is foundational research. So that's your segmentations, your A&Us, all of those sort of big, meaty studies on which you might then build your business over the next few years.

00;06;53;13 - 00;07;23;07
Michael
Then there's building and refining ideas, assessing ideas, and then tracking market performance. So if you use that model, I would say it's that first one, that foundational research, where you often do need that external consultant, simply because they're really complex; they're a huge amount of work. If nothing else, it's not even a skill gap. There is a huge amount of work to build that: to build the questionnaire, to make sure that's correct, to translate it if necessary, to program it, to test the programming, to field it, manage the field, to get that data back, to analyze it, to report on it.

00;07;23;16 - 00;07;40;09
Michael
It's just way too much work. Even if you have the most brilliant people within your internal team, it's often a capacity issue. So I think you need to outsource that. Then building ideas, that's where we often do things like conjoint. Depending on your skill level, some clients might want to do that internally, some might prefer to outsource it.

00;07;40;29 - 00;08;05;16
Michael
Then you get into the testing ideas, and there you typically want a very standardized methodology, because you want norms or you want to be able to democratize the data or something simple like that. So that's pretty DIY friendly. And then with tracking, again, you're often getting into a need for an external consultant, unless the tracking is simple; if it's very standardized, then maybe you can have that be DIY friendly.

00;08;05;16 - 00;08;10;09
Michael
So again, it's that foundational stuff where I think you absolutely need an external consultant.

00;08;12;24 - 00;08;13;25
Meagan
Anyone want to add to that?

00;08;16;17 - 00;08;35;25
Paul
You know, I kind of almost want to throw a question to you guys, too, because I totally agree with you, Mads, in terms of needing some sort of expertise on that meaty foundational stuff. And also because they're huge, it's complex. It's not something you can easily do on a, you know, on a DIY platform.

00;08;35;25 - 00;08;57;18
Paul
You need interpretation. But there's so much now that people just do on their own. Like, remember back in the day when you actually needed, or you had, a designation or a certification for being a market research expert? You know, does anyone care if you have that anymore? Do clients ever wonder whether or not that's a thing that you have?

00;08;57;18 - 00;08;58;10
Paul
I don't think so.

00;08;59;09 - 00;09;00;11
Michael
Did anyone ever care?

00;09;00;23 - 00;09;14;06
Paul
That's a good point. Yeah, I don't know. I do think back in the day people did actually care, because you cared about whether or not a question was asked the right way to avoid bias and make sure that the data you were getting was correct. I just don't think it's as big a deal anymore.

00;09;14;18 - 00;09;26;02
Meagan
And are you saying that rightfully so, it's not as big of a deal? Are you saying that, you know, we've moved on and that's not something that we necessarily should care about?

00;09;26;02 - 00;09;43;14
Paul
Well, we've moved on, yeah. So I think the purist would actually say that you do need to have that level of knowledge and sophistication to be able to ask the question in the right way. But now you can just say, okay, I want to select this question from my library of questions, and you can ask the question the right way.

00;09;43;23 - 00;10;02;19
Paul
So, you know, it's not like you need to know, and it's not like you need to hire somebody that really needs to know. I just don't think that matters. I think the experts, the specialists, the consultants that you're going to hire are not the ones that say, oh yeah, I have this certification in market research that I got back in 1998.

00;10;02;19 - 00;10;25;00
Paul
I think it's the people who are saying, yes, I understand the category, I understand the trends, I understand how consumers interact, the change in behaviors. And they're an expert because they're in tune with the consumer, not necessarily because they know how to ask the question in the right way. And I think there are companies who are a little bit worried about the profession of market research just completely evaporating.

00;10;26;10 - 00;10;30;00
Paul
But that's honestly where the industry is kind of going, in my opinion.

00;10;31;12 - 00;10;31;22
Meagan
Yeah.

00;10;32;00 - 00;10;56;14
Ian
Yeah. I think there are question libraries with certain standard question types, like appeal and purchase, and then stuff that people don't really need to change very much. I think it's once it gets into custom, and I often see double-barreled questions or questions that are just so clearly biased or leading, I think in those cases they really would have benefited from some professional support.

00;10;57;18 - 00;10;59;23
Ian
Because I see that all the time.

00;10;59;23 - 00;11;01;21
Paul
So yeah. How do you.

00;11;01;21 - 00;11;04;27
Ian
Bad information can be more dangerous than no information.

00;11;04;27 - 00;11;23;29
Paul
So how do you communicate whether or not you should, you know? It's like saying somebody should do it, or you could do a conjoint. Michael, you talked about this before; maybe you could do a conjoint online. But there are lots of things around conjoint. Like, to design a really good conjoint, you know, it takes some level of knowledge or experience to know what the output's going to be.

00;11;23;29 - 00;11;43;04
Paul
And when you get that information, how to interpret it, and why something might be more important than other attributes because of the way you've designed it. Clients don't know that. I mean, is that something that we should, you know, avoid letting clients do on their own, or is there a way to guide them into a process that's more automated, that lets them be confident in the way they're doing it?

00;11;43;05 - 00;11;43;22
Paul
I dunno.

00;11;45;06 - 00;11;46;17
Ian
I think if it's simple things.

00;11;48;06 - 00;11;50;12
Paul
You said you disagree. Let's hear it.

00;11;50;21 - 00;12;11;12
Michael
Yeah, I hate to say things are going to get spicy, but I think I disagree a little bit with both, so it's like a little bit spicy. Inasmuch as I always think research isn't that hard, it's not rocket science, but I've also seen it done so shockingly badly throughout my career, from a range of people.

00;12;12;00 - 00;12;45;11
Michael
Even senior people I've seen do research terribly badly. So I think DIY research can be good if you're using a template. And you know, we obviously have templates on the Upsiide platform, but lots of platforms have templates. So I think it can be good, because someone has presumably, hopefully, spent a lot of time thinking about question wording, question order, maybe an analysis template, all of these sorts of things. Because, as Ian sort of said there, a little or bad information could be worse than no information at all.

00;12;45;21 - 00;13;05;11
Michael
And one of the great dangers of market research is that you can ask the most ridiculous, almost impossible-to-comprehend question and people will answer it. Those who don't give up on the survey will answer it just to get to the next question. Maybe, like, oh, 80% of people said whatever, and people will take that and run with it as if it's a real finding, and it isn't.

00;13;05;24 - 00;13;23;00
Michael
And so I think that we have to be careful here, where if you're just going to go blue sky and start doing DIY research, it does leave the door open to some pretty poor-quality things. And I don't know who's going to jump in in a second, but I don't think the designation ever mattered. But I do think quality matters, and I think that it's very easy to get it wrong.

00;13;23;18 - 00;13;29;04
Paul
Well, I mean, I agree with you. So I don't you know, I don't know which part we disagreed on, but I totally agree with you on that.

00;13;31;09 - 00;13;33;19
Michael
I think you were a little more open about it than I was. Sorry.

00;13;33;28 - 00;13;54;00
Dom
Just maybe a final thought on this topic. I think all types of research, at some point, benefit from bringing in a research expert. And like you guys mentioned, there are three examples I can think of that I've learned over my career. So I remember years ago I used to be a research manager.

00;13;54;00 - 00;14;13;17
Dom
I was on the client side. I worked for Canadian Tire. So this was about 15 years ago, and I think I was the research manager, the buyer, for about four or five different segmentations. And this was really early in my career, I was just kind of learning this industry. And I hired a very reputable supplier to do these segmentations.

00;14;13;28 - 00;14;24;03
Dom
And in every single segmentation, there was always this enthusiast group. So we did a lawn and garden segmentation; those were the gardening enthusiasts. And then we did a décor one, and the car enthusiasts, and then there was...

00;14;24;11 - 00;14;28;25
Meagan
I wish everyone could see the way we're all moving right now. It's amazing.

00;14;29;29 - 00;14;31;13
Paul
You have very enthusiastic like.

00;14;32;12 - 00;14;46;29
Dom
Yeah, but what I didn't know back then, and because this supplier didn't know any better, there was this enthusiast segment because there was a group of people who just answered high on all the scales. And now I know, we know at Dig, because now we've got this really great analytics team, you do something called ipsatizing the data.

00;14;47;05 - 00;15;13;29
Dom
I'm not going to try to explain what it is here, we can get Jordan to do a whole session on that, but that removes that whole problem, right? There are other things like conjoint, where you can do a very simple conjoint on your own with some of these online platforms. But as soon as you get into a situation where you have prohibitions, like this price can't go with this feature, or this brand can't offer this thing, or whatever you have, all of a sudden it becomes much more complicated and these off-the-shelf methodologies just simply can't do it, you know?
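
(For context: ipsatizing means re-expressing each respondent's ratings relative to their own average, so people who rate everything high don't form an artificial enthusiast segment. A minimal Python sketch, with made-up data and column names rather than Dig's actual pipeline, might look like this.)

```python
# Minimal sketch of ipsatizing ratings data, assuming a pandas DataFrame
# where each row is a respondent and each column is a 1-10 rating item.
# (Illustrative only; column names and values are invented.)
import pandas as pd

ratings = pd.DataFrame({
    "resp_id": [1, 2, 3],
    "q_gardening": [9, 4, 10],
    "q_automotive": [9, 7, 10],
    "q_decor": [8, 2, 10],
}).set_index("resp_id")

# Respondent 3 answers 10 on everything (an "enthusiast" straight-liner).
# Ipsatizing centers each row on that respondent's own mean, so segments
# are driven by relative preferences rather than overall scale use.
row_means = ratings.mean(axis=1)
ipsatized = ratings.sub(row_means, axis=0)

print(ipsatized)
# Respondent 3's row becomes all zeros: no fake "enthusiast" signal left.
```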

00;15;14;00 - 00;15;29;17
Dom
And I think the third one that comes to mind is qualitative research. If you just want to talk to someone for a really quick five minutes, you know, what do you think about something, that's fine, anyone can do that. But, you know, I was fortunate enough, way back again, they put me on this RIVA moderator training course.

00;15;29;17 - 00;15;51;15
Dom
I never moderated, but I just wanted to assess the moderators we were hiring, and I learned so much. Like, so many qual people, they ask quantitative questions. They'll just basically go around the table, this is back when focus groups were in person, and they would just sort of tally what people were saying. And I think the key to good qualitative is to be able to use projective techniques and really dive in and probe on things.

00;15;51;15 - 00;16;08;21
Dom
So I think with almost any research methodology, at some point... You can do a basic segmentation, let's segment based on the products they buy, or a demographic segmentation, but as soon as it gets a bit more sophisticated, you need an expert. Same with conjoint, with anything really. If it is a really simple thing and you need a quick answer,

00;16;09;10 - 00;16;20;27
Dom
Use an off-the-shelf, DIY product. But as soon as it gets complicated, I think that's where you need an expert who, frankly, has years of experience to know what's right and what's wrong.

00;16;23;08 - 00;16;34;18
Meagan
Yeah, I think that's really helpful, just the examples that you've given, as a way of framing up what might be complex and what might be a little bit easier. Paul, I interrupted you.

00;16;34;18 - 00;16;49;14
Paul
Well, I was just going to say, Dom's totally right. And I think that's kind of the reason why, when we built our own software, we had a lot of our experience go into how it should be designed and how it should be displayed and what the deliverables should be and how the analytics we built tied into it.

00;16;49;28 - 00;17;15;13
Paul
But to actually communicate some of these things to, like, a new person in research, where all they see is a DIY platform, you know, all of Dom's examples are precisely why you need an expert. And I feel like that level of information is somewhat lost, or not communicated as much as it should be. And maybe that's our role, to continue to communicate that as experts to the broader industry as well.

00;17;15;13 - 00;17;41;06
Meagan
And I think, like... So by the way, I'm just totally throwing out all the questions that I sent you prior to this starting, which is super helpful for you guys. You're welcome. But yeah, I'm really interested in this idea of, so what is it that a consultancy like Dig, or, you know, research experts in the field, what is it that they are bringing to the table as we start to take more of the sort of manual or tedious tasks away?

00;17;41;27 - 00;18;15;26
Meagan
So when we talk about DIY approaches, even automated approaches that leverage technology, what is it, for those strategic projects or those foundational projects? It isn't just what Dig brings to the table, but where do we see the benefit, other than understanding a category or understanding research best practice? Like the first thing that I think of, for instance, is doing things in a different way than maybe people have done before, like being able to approach understanding consumers in sort of a unique way, leveraging different methodologies.

00;18;15;26 - 00;18;19;25
Meagan
What else does a consultancy like ours sort of offer?

00;18;21;13 - 00;18;29;23
Paul
I mean, I'm actually going to pass this to Ian, so you can get ready. Yeah, move the cat, I think that's...

00;18;29;24 - 00;18;30;15
Meagan
The cat.

00;18;31;18 - 00;18;47;10
Paul
Because I think, you know, I almost feel like sometimes we undersell what we do, and I'm sure other firms do very, very similar things. In terms of, you know, you talk about doing things differently. Well, the reason why we know how to do things differently is because we've researched whether doing it the old way matches a different way of doing it.

00;18;47;10 - 00;19;08;26
Paul
And the predictability of the answer is the same, because it's still grounded in robust research, and there's a lot of time spent by our analytics team, which Ian oversees, on doing research on research, on making sure and understanding the outcome of asking a question in a certain way, of making sure that the answer is still predictive of what we're trying to achieve.

00;19;09;05 - 00;19;30;16
Paul
And Ian can provide a ton of examples where we've gone up against the boxes of the world who are using a certain type of approach or a question method, where we can point to it and say, that's actually not the best way to ask this question. Your data is going to come out in a way that's going to be harder for you to use, and doesn't give you the answer you need, and doesn't create the variability you need, or whatever it is.

00;19;30;16 - 00;19;44;23
Paul
And I think, you know, my personal experience working with Ian and these guys at this company has been that there's been a lot of focus on the robustness of our analytics and interpretation of data and the approaches, and that is maybe second to none.

00;19;48;04 - 00;20;05;27
Ian
Yeah, I mean, you know, Dom will talk about this as well, but it's funny. One way to think about this is the projects that you get pulled into by clients because a previous, either research company or an internal function, really...

00;20;05;27 - 00;20;06;23
Paul
Straight or strategic.

00;20;06;23 - 00;20;44;07
Ian
Consultancy, or, well, yeah, definitely, or strategic consulting. And I'd say segmentation is one of those ones that comes up a lot. Like companies will spend $1,000,000 on a segmentation and they are just so frustrated, and by the time they call us, it's like, can you fix this? And we'll look at even just the way the questions were asked, or look at their typing tool, and there will be all kinds of really obvious errors, like they included questions that not everyone answered and just topped them up with the means, or they didn't flip the scale on questions that were positive versus negative, stuff like that.
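
(To make Ian's examples concrete: a rough sketch of two hygiene steps before clustering, reverse-coding negatively worded items and deliberately handling questions not everyone answered. The data, column names, and use of scikit-learn's KMeans are illustrative assumptions, not anyone's actual typing tool.)

```python
# Illustrative pre-clustering hygiene: flip negatively worded scales and
# handle partially answered items explicitly before running K-means.
# (Made-up survey items and data.)
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "likes_shopping":     [9, 2, 8, 3],          # positively worded, 1-10
    "hates_crowds":       [2, 9, 1, 8],          # negatively worded, 1-10
    "optional_follow_up": [7, None, 6, None],    # only some respondents saw it
})

# Reverse-code the negatively worded item so all scales point the same way.
df["hates_crowds"] = 11 - df["hates_crowds"]

# Don't quietly mix items with different bases; either drop them or model
# the missingness deliberately. Here we simply drop the partial item.
features = df.drop(columns=["optional_follow_up"])

# Standardize so no item dominates just because of its scale, then cluster.
X = StandardScaler().fit_transform(features)
segments = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(segments)
```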

00;20;44;07 - 00;20;58;15
Ian
And it's just like, wow, they just gave you a wrong answer and you spent a million bucks on it. So that would be one good example. I think another one where we get pulled in is where a company will do a conjoint. Now in those cases, you can't really fix it with the data; you've got to go back to field.

00;20;58;16 - 00;21;19;23
Ian
But they'll totally screw up things like the design and the prohibitions, but also, you know, conjoints have attributes and levels. Instead of sometimes doing one attribute with lots of levels, you may want to split that up into a couple of different attributes that might even be binary, on or off. And people don't think of that. Not even other research companies think of that sometimes.

00;21;19;23 - 00;21;22;13
Ian
So why would an end user be able to think of that?
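
(A toy illustration of the conjoint design points: hypothetical attributes and levels, a catch-all feature list split into binary on/off attributes, and one prohibition filtered out of the design. A real conjoint design would also need balance and efficiency checks.)

```python
# Rough sketch of a conjoint design spec with a prohibition such as
# "the budget brand can't be shown at the premium price."
# (Hypothetical categories and rules, just to make the idea concrete.)
from itertools import product

attributes = {
    "brand": ["BudgetCo", "PremiumCo"],
    "price": ["$5", "$8", "$12"],
    # Instead of one "features" attribute with many levels, model each
    # feature as its own binary attribute.
    "resealable": ["no", "yes"],
    "organic": ["no", "yes"],
}

def allowed(profile: dict) -> bool:
    # Prohibition: the budget brand is never offered at the top price point.
    if profile["brand"] == "BudgetCo" and profile["price"] == "$12":
        return False
    return True

profiles = [
    dict(zip(attributes, combo))
    for combo in product(*attributes.values())
    if allowed(dict(zip(attributes, combo)))
]
print(len(profiles), "allowed profiles out of", 2 * 3 * 2 * 2)
```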

00;21;22;19 - 00;21;30;03
Paul
There are like four people listening to this who are just nodding enthusiastically to what you're saying, like, yes, K-means for sure.

00;21;30;07 - 00;21;31;22
Paul
Wow. Wow.

00;21;34;19 - 00;21;35;23
Meagan
I am personally.

00;21;35;24 - 00;21;51;14
Ian
And then also people who care about this stuff, right? Like, you can do anything yourself, but, you know, Michael said it's not rocket science. Well, first of all, I don't think we should even call it rocket science. Is rocket science even a thing anymore?

00;21;52;01 - 00;21;53;07
Ian
But that just exploded.

00;21;54;13 - 00;22;01;07
Ian
Yeah, I still think rockets are popular, but I just don't feel like it's the cutting edge of science. I'm just saying.

00;22;02;29 - 00;22;04;11
Meagan
That's going to be this snippet.

00;22;04;26 - 00;22;06;13
Meagan
Rockets are popular.

00;22;06;26 - 00;22;07;18
Paul
Rockets. I stop.

00;22;08;02 - 00;22;08;15
Ian
You know.

00;22;10;28 - 00;22;12;21
Ian
I can't even remember what I was saying.

00;22;12;21 - 00;22;16;22
Paul
The rockets, the rockets. You were just saying Michael said it's not rocket science.

00;22;18;03 - 00;22;43;28
Ian
Right. But there is a certain amount of knowledge that you learn, either from reading about the topic and/or doing it for years and years and years, and for some really important questions, you probably don't want to do it yourself. Like Paul said, if it's an overnight, we-need-a-quick-go-or-no-go read, or in our case, with Upsiide, if it's just ranking your ideas,

00;22;44;19 - 00;23;06;06
Ian
Yeah, those are things you can do, and you can have automated reporting and automated inputs and analytics and stuff, because they're relatively simple. But if it's going to be nuanced, if you're going to be looking for, you know, deep recommendations based on looking at multiple cuts of the data, maybe different analytical techniques, some bivariate stuff, then you're going to need a consultant.

00;23;06;06 - 00;23;16;07
Ian
And I would say if it's a big enough decision, it's probably worth the spend. It's like selling your own house: you could do it, but most people hire a real estate agent.

00;23;17;19 - 00;23;23;26
Michael
I've sort of got a metaphor that might or might not be interesting, but it's like...

00;23;24;17 - 00;23;25;14
Michael
I don't know what's funny.

00;23;25;21 - 00;23;29;27
Paul
You're just prefacing it with, I'm going to say something, it may not be interesting.

00;23;30;19 - 00;23;32;02
Michael
You might, but I you know.

00;23;32;06 - 00;23;40;15
Michael
I didn't want to point it out. But Paul, you did start by referencing an inspiring conference presentation that you saw that you delivered.

00;23;42;20 - 00;23;44;10
Michael
That you said was amazing.

00;23;44;16 - 00;23;48;29
Paul
Although I didn't say amazing, but thank you for pointing out that you thought it was amazing. I appreciate that.

00;23;51;12 - 00;23;52;02
Meagan
Michael didn't.

00;23;52;02 - 00;23;54;05
Meagan
See that presentation finish.

00;23;55;29 - 00;23;58;27
Ian
And maybe a metaphor.

00;23;58;27 - 00;24;15;25
Michael
It's like building a chair, where if you want to do DIY, like IKEA, you're still making it, but you're making it from a kit. You're very guided. The chance of it turning out is very high; the chance of it being a complete disaster is very low. And then doing full-on DIY,

00;24;15;25 - 00;24;31;28
Michael
The full bespoke research on your own, is like buying a bunch of lumber and trying to make a chair. There's a very high chance that it's going to turn out like a complete disaster and collapse as soon as you sit on it. But if you get a carpenter, someone who's spent 20 years making chairs, there's a very good chance it'll turn out very nice.

00;24;31;28 - 00;24;39;05
Michael
It's going to cost more, but it's going to be very custom. It's going to be exactly what you want. And I think that's sort of the metaphor.

00;24;39;29 - 00;24;45;07
Paul
Just so you know, I still fuck up all my IKEA builds. So even DIY can still be messed up.

00;24;45;20 - 00;24;47;06
Michael
But if you have parts, right?

00;24;49;17 - 00;24;58;21
Meagan
Yeah, you should see my dresser. It's not good. So I feel you. But I think that metaphor is definitely spot on. I mean, like...

00;24;59;03 - 00;25;01;29
Ian
That's, like, it's a similar one.

00;25;03;14 - 00;25;10;04
Meagan
Ian, I didn't catch that.

00;25;10;08 - 00;25;11;23
Ian
It's important it's going to it.

00;25;11;23 - 00;25;53;01
Meagan
It is very important. It's very important. I feel like I've taken us around and around and around a little bit on this topic of what research requires an expert. I guess to end off, I wanted to talk a little bit about what bad research looks like, which doesn't sound like a super optimistic way of finishing a podcast, but you referenced it, Michael, that you've seen a lot of bad research in the past, and a few of you have mentioned segmentations that you've had to sort of revive or make better.

00;25;53;10 - 00;26;05;04
Meagan
I'm just wondering, as someone who's not a researcher, like when we think about those more custom sort of strategic projects or foundational research, what makes them, quote unquote bad?

00;26;08;23 - 00;26;09;28
Paul
Oh, who wants to go first?

00;26;10;02 - 00;26;30;12
Ian
I think, simplistically, when it doesn't... not only does it not answer all the questions they have, but it doesn't even answer the main questions sufficiently, right? And I see it all the time, like they dance around it, or they ask double-barreled questions, which are the kiss of death, because you don't even know what they're really answering anymore.

00;26;31;03 - 00;26;58;09
Ian
And I see it all the time in DIY particularly. Yeah. So unless you do this for a living, it's hard to know how you have to ask the question, how the data has to be structured, to do the type of analysis you want to do on the back end. If you're going to do a drivers analysis, you'd better ask the right questions in the first place.
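
(A bare-bones sketch of what structuring data for a drivers analysis implies: you need an overall outcome and the candidate driver ratings from the same respondents on usable scales. The columns and data are invented, and the standardized regression here is only a crude importance read; real drivers work involves more care around collinearity, model choice, and sample size.)

```python
# Toy drivers analysis: relate an overall outcome to candidate drivers
# collected from the same respondents. (Invented data and column names.)
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

df = pd.DataFrame({
    "overall_satisfaction": [8, 6, 9, 4, 7, 5],
    "taste":                [9, 6, 9, 5, 8, 5],
    "price_perception":     [6, 7, 8, 3, 6, 4],
    "packaging":            [7, 5, 9, 6, 7, 6],
})

X = StandardScaler().fit_transform(df.drop(columns=["overall_satisfaction"]))
y = df["overall_satisfaction"]

model = LinearRegression().fit(X, y)
# Standardized coefficients as a rough read on relative importance.
for name, coef in zip(df.columns[1:], model.coef_):
    print(f"{name}: {coef:.2f}")
```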

00;26;58;09 - 00;27;22;08
Dom
Right, that reminds me of something from 15 years ago. Yeah, can I jump in on this one? Because the idea of what makes research bad, I think, is when it fails to result in a change in behavior by your client, where they don't take action and actually do something with it. Like, if it's just research for the sake of learning something, that to me is a waste of money. It needs to generate more revenue, do something to help the business.

00;27;22;19 - 00;27;42;15
Dom
And going back to the segmentation example, because I think that's one where we see other companies maybe not do it as well, like Ian pointed out, sometimes it's just a garbage model; it just doesn't work and it doesn't make any sense. And in those cases we've redone them completely. There are other cases where another company does a segmentation, but for whatever reason they just can't bring it to life.

00;27;42;15 - 00;28;16;05
Dom
So we've been asked to come in and sometimes do qualitative research on an existing segmentation, create videos, create ways that they can socialize it within the company so people will actually use it. And another one, one thing that we've done a lot with segmentation lately, is merging primary survey data with secondary transactional sales data. In fact, we presented at IPX a year ago with one of the big banks where we'd done that. And we've seen players come to us where they've done a segmentation, but they don't know how to apply it to their customer database.

00;28;16;05 - 00;28;31;25
Dom
So that's an instance where we've come in. And again, it's not necessarily that the segmentation was bad, just that the client doesn't know how to make it actionable and how to actually drive their business forward. And to me that's the worst type of research, when there's no positive outcome that comes from it.

00;28;32;01 - 00;28;52;15
Paul
I just want to add to both Ian's and Dom's points. The one kind of research that I see as bad is where the answer comes out, and in this case I'm talking more about quantitative data or survey data, when it comes out, it just screams bullshit. Like you're just like, okay, we know that that's not actually right.

00;28;52;21 - 00;29;09;15
Paul
And listen, it can happen for a variety of reasons: the panel that you used, the people who did it were just not engaged in the questions, they weren't paying attention. How you design the survey is critically important, because you want to get the right answer. You want people to actually provide you the answer, not just skip through a survey.

00;29;09;15 - 00;29;31;02
Paul
That's why we make our stuff as engaging as possible. But when, say, you're trying to find people who have bought a luxury item in the past, whatever, and 30% of the respondents say they have a helicopter, you know that's not right. It's not right. And so you have to be able to look at something with a critical lens and call bullshit on it, because, you know, ultimately that's not right.

00;29;31;02 - 00;29;51;07
Paul
So something's wrong. And I think sometimes that something's-wrong aspect goes too far; it doesn't get seen, it's not pointed out early enough, to the point where it's gone all the way to the end and you're like, okay, yeah. And listen, sometimes it happens even internally too, and we will catch something.

00;29;51;07 - 00;30;09;28
Paul
We have to redo something because you end up realizing, wait, is that actually right? But when you see research from other people that was provided to our clients, you can tell something's not right, and they will question that. And when you question the data, when someone has a question about the data, it calls into question the entire thing.

00;30;10;05 - 00;30;24;28
Paul
And so it's so important to figure out if the data is right, right at the beginning, because otherwise everyone gets opened up to that scrutiny. If the first slide says that 30% of people own a helicopter, I don't know if I'm going to believe the rest of this report.
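
(The kind of sanity check Paul describes can be as simple as comparing a claimed incidence against a rough external benchmark before anything reaches slide one. The function, thresholds, and numbers below are made up for illustration.)

```python
# Tiny plausibility check: flag survey incidences that wildly exceed what
# is realistically possible. (Thresholds and benchmarks are invented.)

def flag_implausible(claimed_rate: float, benchmark_rate: float,
                     max_ratio: float = 5.0) -> bool:
    """Return True if the claimed incidence is implausibly higher than the
    benchmark (e.g., 30% of respondents claiming to own a helicopter)."""
    if benchmark_rate <= 0:
        return claimed_rate > 0.05  # arbitrary guardrail for rare behaviours
    return claimed_rate / benchmark_rate > max_ratio

# 30% of respondents say they own a helicopter; realistically well under 1%.
print(flag_implausible(claimed_rate=0.30, benchmark_rate=0.001))  # True
```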

00;30;27;06 - 00;30;57;22
Ian
The other one is screeners. This won't even happen in full-service research, but they'll turn the screener into such an insane obstacle course that you get down to like 2% of the population. And then when you talk about the results, you can't project it to any known population. You're like, well, people who went shopping for dog food on Wednesday and also saw the new John Wick movie... you know what I mean? It just gets crazy.

00;30;58;15 - 00;31;01;13
Ian
It's just crazy.

00;31;02;18 - 00;31;03;20
Paul
It's a target market.

00;31;03;21 - 00;31;05;22
Meagan
We actually only found one person.

00;31;07;01 - 00;31;07;10
Ian
Yeah.

00;31;08;05 - 00;31;09;26
Paul
This is the most loyal customer.

00;31;13;26 - 00;31;37;18
Meagan
Thank you so much, guys. This was great. I think hopefully a lot of people who love to nerd out about research are going to get a chance to listen to this, and as you said, Paul, they'll be nodding their heads yes, enthusiastically. And we'll be back in a week with another episode. Please like and subscribe; we're available everywhere.

00;31;37;18 - 00;31;41;04
Meagan
Apple, Spotify, all the good ones. Yeah. Talk to you soon, guys.

00;31;41;12 - 00;31;42;17
Ian
Rockets are popular.

Dig Insights