00;00;06;23 - 00;00;32;23
Speaker 1
Hi. Welcome to Dig In, the podcast brought to you by Dig Insights. Every week we interview founders, marketers, and researchers from innovative brands to learn how they're approaching their role and their category in a clever way. Welcome back to Dig In this week. So happy to have you guys here, and so happy to have Patrick Lambert, who is our VP of Customer Experience at Upsiide.

00;00;32;24 - 00;00;34;08
Speaker 1
Patrick, how are you doing?

00;00;34;21 - 00;00;35;25
Speaker 2
I'm great, thanks, Meagan.

00;00;36;11 - 00;01;03;27
Speaker 1
Yeah, very excited to have you here. Actually, this was a topic that one of our founders, Ian, was really interested in because of some of the work that your team has been doing, getting large, sort of global companies onto the Upsiide platform and really enabling them to do this well, democratizing access to it in a way that's sustainable.

00;01;04;20 - 00;01;30;01
Speaker 1
That's something that's a lot harder than it sounds. You can't just give away access to your platform and expect everything to go seamlessly. So we wanted to dig into what we've essentially learned as an organization, to make this easier for other people who are trying to do the same, whether that's, you know, clients that are looking to do this with a supplier, or other agencies who are looking to make their customer experience better.

00;01;30;19 - 00;01;39;18
Speaker 1
So I wanted to ask you, just as a starting point: what does customer experience actually mean for us and Upsiide?

00;01;40;01 - 00;02;09;19
Speaker 2
Great. So since I joined Dig, I've probably changed my title two or three times, and I'm expecting to change it again soon. Right now we're using customer experience as a way to define all customer teams for the Upsiide platform. For us, that includes customer success. You can see those folks as the account managers, people really knowing the business of their clients, fully dedicated to making sure they have a good experience so that they meet their objectives. Then there's customer support.

00;02;09;19 - 00;02;29;16
Speaker 2
Support tends to be seen as something a bit reactive: when there's an issue or a problem, they jump in. But we take a very proactive approach to support, which is more around creating resources, a lot of materials, so that whenever someone has a question or thinks they have an issue, they find their answer right away. Right?

00;02;29;16 - 00;02;54;08
Speaker 2
A well-supported group of users, you know, runs into far fewer issues. So that's our proactive approach. And finally, customer enablement, which is mainly what we're here to talk about. Customer enablement is our group of researchers who are there to help our users become more independent, in a way more empowered, so they can fully leverage the platform.

00;02;54;28 - 00;03;04;06
Speaker 2
So that could mean just, you know, more self-serve work, but really making the platform more like the client's platform, not just a generic tool.

00;03;06;05 - 00;03;24;02
Speaker 1
Okay. I mean, I see customer enablement as a title at a lot of different SaaS research companies. Do you think there's anything different about the way we approach customer enablement compared to, I don't know, any other research tech platform?

00;03;25;04 - 00;03;55;13
Speaker 2
Yeah. Well, I think it starts from the fact that our product is extremely simple to use. So there's nothing in our enablement that's about, you know, enabling them to use the product. It's all about enabling them to do better research. And I like how you explained it earlier: how did we learn it, how did we develop it? Because that really was the process when I joined the company a bit over three years ago.

00;03;56;15 - 00;04;19;20
Speaker 2
Nobody hired me to do this, and I wasn't planning to do this. We just realized that's what our clients need, right? So it's all about providing them that research experience and expertise through self-serve projects, because in many cases, that's why we do this. Our clients want to use it, and they want to use it quickly.

00;04;19;20 - 00;04;42;22
Speaker 2
They don't want to spend too much. And they want the benefits of self-serve, DIY, automated, while at the same time getting the benefits of working with a, you know, top-tier research agency. And the way you do this is by building that expertise, locking into their platform, into their account, the work that we co-create with our clients.

00;04;43;22 - 00;05;05;24
Speaker 2
So every time they do a self-serve project, they do it with research or survey templates that are specific to their research needs and how they do things in their organization. And the research program they accomplish through the platform in the end really becomes a competitive advantage for them, because they feel like: this is how I innovate, this is how I do research.

00;05;06;07 - 00;05;12;05
Speaker 2
It's not just canned methodologies from, you know, some company out there.

00;05;13;14 - 00;05;22;13
Speaker 1
And when we think about the way that we've structured, or we've built, or you've built, sorry, customer enablement, I always say "we" as if I'm doing it with you.

00;05;22;20 - 00;05;25;07
Speaker 2
You're helping.

00;05;25;07 - 00;05;56;16
Speaker 1
But yeah, when you think about the way that you've built out the customer experience team, and specifically enablement, talk to me about the pain points or the specific mistakes that you're trying to address with enablement. You've obviously mentioned making sure everyone understands, not how to use the platform, but best practices from a research perspective. What are some other things that are uniquely challenging when it comes to getting a client onto a research technology platform?

00;05;57;22 - 00;06;23;24
Speaker 2
Yeah. Well, the notion of doing research on a technology platform implies some kind of standardization, and the whole notion of a research program is specific research that will be done to support a specific step in a process. A lot of our clients would use the platform for idea screening or early concept testing, maybe, you know, validating concepts, things like that.

00;06;23;24 - 00;06;48;23
Speaker 2
Right? And so one of the challenges, and it's a challenge for us too, but it's a challenge for the organizations, and that's why they're leveraging our expertise, is that different stakeholders within the organization have different views. Someone's been using X methodology from a vendor, someone's been using something else. And typically the people that come to us would be from a corporate team, if you want.

00;06;49;02 - 00;07;10;07
Speaker 2
And what they're trying to do is get people to do things in a more similar way across the organization, and the aim is to do it in a better way. So a typical challenge would come from the fact that there are some stakeholders that have their own ways of doing things. And it's change management at that point.

00;07;10;08 - 00;07;43;13
Speaker 2
Right? So a lot of the work we do really is helping that person in that corporate role be successful at implementing change within the organization. And we do that with lots of dedicated support, resources available to them, training sessions, and really giving a lot of visibility to those members of that corporate team so that they can easily report internally and show all of the wins and all the impact that this program has.

00;07;43;13 - 00;07;49;23
Speaker 2
Because if that person is successful, then we are successful, right? Yeah, we've got to have that internal contact.

00;07;51;07 - 00;08;19;12
Speaker 1
I mean, what level of customization do we think is required for some of this? Like, I know we're not going to speak about specific examples today, but we've worked with many big global companies implementing Upsiide specifically for sort of early-stage innovation work or testing. But yeah, how much do you need to customize, and are there specific things that you can keep standard?

00;08;19;12 - 00;08;33;23
Speaker 1
So maybe, for instance, you know that you need to implement survey templates or blueprints within the platform, but certain other things need to be customized. How do you decide what needs to be standard and what needs to be custom?

00;08;34;02 - 00;09;13;24
Speaker 2
Yeah, overall, it's not that custom. A lot of it is standard or very similar if you were to compare how different clients do it, because there are just best practices around how innovation should be done and tested. The key differences come down to more procedural differences: how this specific client wants to pay for research, or how democratic they want things to be, in the sense of, do certain things have to be done by the insights person, or is the marketer allowed to go in and launch without an insights person with them?

00;09;13;27 - 00;09;39;25
Speaker 2
Right. So it's typically more around the processes. But in terms of what an idea screen is, or a concept test, it's similar. Obviously from one category to the other, you want to evaluate different product attributes, things like this. But at Dig we have what we call Dig Solutions, which are surveys that are, you know, solid and robust ways of testing.

00;09;39;25 - 00;09;59;11
Speaker 2
And what we end up doing with our clients is highly inspired by this. It's more that, in place of giving clients a standard template that each user would have to customize for their test, we build in, you know, the few attributes, or maybe, if they really care about occasions in their category, an occasion question, or a screener that needs to be specific.

00;09;59;11 - 00;10;10;10
Speaker 2
I guess, and I always make sandwich analogies, by the way, it's the toppings that we adjust, but the meat is the same.

00;10;11;01 - 00;10;12;25
Speaker 1
The meat and the cheese are the same.

00;10;13;02 - 00;10;19;29
Speaker 2
Yes, cheese. Cheese is very important. The meat is actually optional, like in a cheese sandwich, in a sense.

00;10;19;29 - 00;10;41;23
Speaker 1
Love it. Yeah. Okay, that makes sense. I mean, as you're chatting, I keep thinking about how this can be super useful, because obviously, the whole point of having Patrick on today, I don't want to make it sound like Upsiide is amazing and we're doing everything perfectly. I want to make sure that people really feel like they're getting something, I guess, tangible.

00;10;42;00 - 00;11;01;15
Speaker 1
Not that they haven't been up to this point, but getting something they can take into their work if they're trying to do this themselves. So imagine I'm a client, and they've decided to work with us, or whoever it might be, they've decided to work with a provider. What would you say they really need to bear in mind?

00;11;01;15 - 00;11;14;02
Speaker 1
I don't know if it's the top two or three things, but a few key things that you need to make sure you're aware of or taking care of when you're implementing something like this across multiple stakeholders.

00;11;14;21 - 00;11;39;24
Speaker 2
Yeah, yeah. And I think these are going to be very applicable to other kinds of research needs, and not just our platform, any platform. A lot of the trickier part is not so much the methodology but the audience. Audience is maybe more our terminology, but I guess your sample plan, who you're going to survey, right?

00;11;41;06 - 00;12;06;05
Speaker 2
Because we do a lot of work that would be, let's say, across categories or across regions and so forth. And in the context of a research program, there is a lot of need around comparing results and making sense of things: how does this compare to a previous test, all these kinds of needs. And variability in how you build your sample

00;12;06;05 - 00;12;36;04
Speaker 2
will just make things, you know, not easy to compare, and that's where we see a lot of difference from maybe one user versus another. So that would be, I guess, number one: we really need to make sure we're aligned on who we should be surveying when we test ideas. And again, it's very similar to what I was saying earlier; there is a lot of best practice here. If you're doing research supporting your innovation process, the point is to predict how well your product would do if you were to launch it.

00;12;36;27 - 00;12;47;11
Speaker 2
So you should be surveying respondents that are representative of who would be buying your product, not the target that your marketing agency told you you were supposed to go after.

00;12;48;06 - 00;13;06;24
Speaker 1
Oh, that's an interesting little caveat. Yeah, I'm intrigued by that as the marketer on this call. When you say don't go after the people that your marketing agency might have told you to, how do you define the difference between those two audiences?

00;13;07;16 - 00;13;40;06
Speaker 2
Okay, so a representative audience enables you to understand how well a product would do. So let's say we're launching a beverage and, because I am one, I'm going after young French men living in Toronto, right? That's my market. If it's important to you to do well with that specific demographic, you need to make sure in your survey, or in your test, that you have the ability to filter or to look at that specific group.

00;13;40;20 - 00;14;02;04
Speaker 2
But the most important thing is to understand: is that product going to do well? Is it going to stay on shelf? Does it have the right velocity? It's basically understanding the kind of profile of how well it would do. And so as a start, you always need to make sure that you understand overall how it would do.

00;14;02;14 - 00;14;20;23
Speaker 2
And then you confirm that there's a strategic fit, that it is actually doing well with that group. If you only test with that group and they tell you how they feel about this product, what do you do with those numbers? What do you compare them to? Right. And so it starts by having something to compare to.

00;14;22;01 - 00;14;58;06
Speaker 2
And that typically is having an overall understanding: look, this product would be quite niche within the general consumers of, say, that beverage category. But when I look more specifically at my target demo, now it's, say, a winner, now it's doing very well amongst that group because they really care about this. So then you know how to execute that launch. And obviously your target demo is a group that you can execute against, a group that you can actually, you know, reach as a target.

00;14;58;06 - 00;15;05;28
Speaker 2
A demo that is unreachable is not so much a target demo; you can't execute on it. So yeah, that's the marketing part, and that's where I trust that you guys know what you're doing.

00;15;07;10 - 00;15;27;06
Speaker 1
No, honestly, thanks for going into that. That's really important, and not what I thought you were going to say in terms of the thing that people really need to bear in mind: setting up their audience in the right way and making sure that they're executing it to best practice. Is there anything else that you think people should be aware of?

00;15;27;06 - 00;15;32;14
Speaker 1
You know, if you were talking to a client who's about to set up this program, "make sure that you do X, Y, Z"?

00;15;33;06 - 00;16;10;02
Speaker 2
Yeah, well, this is not so much about the setup of the program, but a benefit of the program that even we ourselves initially weren't thinking much about at the beginning. Fast-forwarding to today, the way we set up a program is with the full understanding that at year end we want to start doing some meta-analysis, using some of our advanced analytics abilities to evaluate lift and shift potential or different things like this, because, you know, that's what all clients in a way ask us a year in. And then we'd realize, oh, really?

00;16;10;02 - 00;16;28;18
Speaker 2
We should have been doing things differently from the beginning. So yeah, maybe there's a client listening to this right now that's like, yeah, that's me. Well, we learned based on you, and thank you so much. And now we do it in a way where it's part of the program that we get into those meta-analyses and those meta-learnings a year in.

00;16;29;06 - 00;16;29;21
Speaker 1
Right.

00;16;30;16 - 00;16;31;00
Speaker 2
After that.

00;16;31;18 - 00;16;50;23
Speaker 1
Yeah. So thinking about it, it's not just about being able to execute studies on a platform, it's also about what that large-scale, as you said, meta-analysis can mean based on those individual surveys, which is really cool.

00;16;51;11 - 00;17;32;23
Speaker 2
Yeah. And since you asked for three, I guess I'll get to my number three now. The value we provide is not just in the fact that we build a solid, robust research program that supports their specific needs, and in how we make every individual project successful, but also in the more project-management aspect: having a partner that you can connect with on a regular basis to help you optimize your program, as well as provide visibility internally to your stakeholders on how things are doing, how many tests have run, which users are active.

00;17;33;17 - 00;18;01;26
Speaker 2
We're seeing this specific group having a bit of a challenge, having a lot of questions; how do we address this? Do we need to set up a webinar to talk about best practices, and maybe have a person from this group come in and share their stories? All of that proactive work is part of what my team does, and that's really, I think, a key component of how we set up CX. Very specifically for us, it's like I mentioned: support, success, and enablement.

00;18;02;13 - 00;18;19;28
Speaker 2
And what's very important for us is that those aren't three silos. Everyone on the team has a primary responsibility, but everyone in CX is responsible for making sure their clients are supported, enabled, and successful, right? And so everybody helps. It's one thing, really.

00;18;20;25 - 00;18;37;22
Speaker 1
Okay, I'm going to ask you an annoying question now. Have you guys made any mistakes that you would never make again? So, you know, when you're doing a setup. And I don't mean, you know, data quality mistakes. I mean setup.

00;18;37;22 - 00;18;39;12
Speaker 2
We don't make mistakes, ever.

00;18;40;12 - 00;18;58;05
Speaker 1
And what I mean is, yeah, from a setup perspective you've approached it in a certain way, or from a project management perspective in a certain way, and you've learned a lot because it wasn't necessarily the right way of doing it in the first place. I'm just wondering if there have been any of those learnings.

00;18;59;07 - 00;19;25;15
Speaker 2
Yeah, well, I guess I'll stick to the example that I gave a bit earlier: the lack of standardization around audiences, or the sample recruitment profile. It's really something. When we started developing more and better analysis capabilities, we realized, okay, now every research program we implement and roll out with clients requires us to really set it up with that in mind, right?

00;19;26;15 - 00;19;30;22
Speaker 2
That would be the biggest hurdle I can think of that we've gone through.

00;19;31;27 - 00;20;07;01
Speaker 1
Makes sense. Okay. The last part of this I wanted to dig into: we're uniquely made up as an organization. Dig Insights is obviously the consultancy, and then Upsiide is a product of Dig. We are growing so quickly, and we have these large clients that do sort of custom work with us on the Dig side, and those same clients are doing a lot of innovation testing through Upsiide and working with your team.

00;20;07;11 - 00;20;28;25
Speaker 1
Talk to me about how we break up the work across Dig and Upsiide. So, for instance, with one of the clients that we've got an innovation testing program running with, how do we know when they need to work with Dig, or a Dig consultant, versus someone on your team?

00;20;29;15 - 00;21;02;14
Speaker 2
Yeah, yeah. So obviously managing things like this is sometimes a little challenging, and something that, you know, we keep looking at improving: how we provide this kind of one-company, one-partner experience for our clients. So I guess I'll address the first part of your question, which is where we split it, where the line is in the sense of who does what.

00;21;03;07 - 00;21;24;06
Speaker 2
Any kind of engagement with our client that is about servicing or strategic consulting, basically, you know, servicing an individual project where we're going to do the work for you and provide you our insights and recommendations, that's always our client service team, right? So we don't have two brands or two ways of doing the same thing.

00;21;25;07 - 00;21;45;17
Speaker 2
So you can really see my team as the team that oversees the whole thing: you know, you've licensed the platform and you've got a program, so we're overseeing the program. We're there to implement it, we're there to manage it, to make sure it's successful. I think I've said the word self-serve maybe a few too many times during this conversation.

00;21;45;17 - 00;22;06;14
Speaker 2
But all of our programs are a combination of self-serve and levels of servicing, right? So it's very common that our clients will say: okay, look, when I've got a brand manager that wants to test 15 flavors, and all they've got to put into your platform is 15 flavors, well, they should be able to do that self-serve. So we really make sure that we empower them to do that self-serve.

00;22;06;24 - 00;22;32;13
Speaker 2
But some of the research they do, they may need a little bit more help with, so we have maybe a lighter service option for certain kinds of things. And it always happens that clients have a bigger, more strategic initiative, or they have something that's a bit more custom, something that's not a true fit with the program. And that always becomes a full-service type of engagement, which has, you know, always been the bread and butter of Dig:

00;22;32;27 - 00;22;54;26
Speaker 2
our ability to, you know, take on complex consumer questions for our clients and provide clear answers. So yeah, we work in partnership. When we get a new client, they get an assigned, dedicated customer success manager, and they have an insights VP assigned to their account as their principal, and we serve as consultants.

00;22;55;06 - 00;23;04;06
Speaker 2
Those two people will work hand in hand making sure our clients are successful in both service and self-serve endeavors.

00;23;05;13 - 00;23;27;12
Speaker 1
Okay. Yeah, that makes sense. And I'm wondering, do you think this is going to become more common? This is kind of a leading question, but we're seeing a lot within the industry: a lot of agencies or consultancies starting to develop their own technology, a lot of mergers and acquisitions in the space.

00;23;27;12 - 00;23;52;08
Speaker 1
Do you think this way of approaching insights work, or consumer research, of having a consultancy with a technology arm, do you think we're going to continue to see more of that? And as a follow-up, if we do, how do you think we will need to evolve to make that work?

00;23;53;24 - 00;24;17;22
Speaker 2
Yeah, big questions. I spend a lot of time, you know, focusing more on the day-to-day and not so much on this, but I think yes, right. So for me it's not so much about how the companies will be set up. Will they be tech companies, research companies, or will it be more like partnerships?

00;24;17;22 - 00;24;39;29
Speaker 2
And how exactly that's going to look, I'm not sure I want to risk forecasting. But the use of technology will definitely increase, right? When you think of how a research project is done, from the moment you collaborate with a client and come up with a design, you should just be able to click a button that says launch, and then the results should be available in a super easy way to interpret.

00;24;40;00 - 00;25;08;21
Speaker 2
Why would it not be like that? That should always be the case, right? And that's kind of what we're trying to build here, and it makes sense that a lot of companies are also trying to build the same thing. So technology is going to increase. We also have very cool things coming up, and I can't wait for us to be able to announce some of those things, but AI is also going to play a big role in how you even report.

00;25;09;06 - 00;25;40;14
Speaker 2
So technology is going to keep increasing. Those that don't develop their own technology will have to license it, or, you know, be part of a partner program of sorts where they can access technology that was built by someone else. And if I were a client and I had the option, I mean, in the end, if you're looking at one vendor that has no technology and one that has technology, you'll have one that delivers you insights quickly and is probably more efficient at it, and one that's not going to be efficient at it.

00;25;40;14 - 00;26;06;18
Speaker 2
So I feel like it's going to be mandatory. So yes, for sure it's going to get bigger, because it has such a big impact on just the operational aspect of doing the survey. And I feel like in the question, too, there's also this notion of: will there be more programs, or will companies internalize more of how they do things?

00;26;06;22 - 00;26;33;11
Speaker 2
Because I think that's a big, big difference. The clients we work with, and here I'm really talking innovation specifically, that's my background and I think our focus here, a lot of companies don't do their own thing. They don't have a research process specific to their category, how they want to do things, their vision of the category, their vision of how innovation should be made.

00;26;33;19 - 00;26;59;28
Speaker 2
They just use the way company or vendor X does it, right? I think that's going to grow, the fact that more companies are going to want to make it their own. I've seen companies try to make it their own, solely on their own, and that doesn't tend to go that well. So what I really like is what we're doing here now: basically supporting and enabling organizations to build their own specific way of supporting their innovation testing.

00;27;00;09 - 00;27;14;17
Speaker 2
I think that's going to grow, because every person in a leadership role in marketing is looking for something that's more efficient for their specific needs.

00;27;14;17 - 00;27;28;19
Speaker 1
I think that's a really nice way to close out. Patrick, this has been very interesting. Thank you for joining me today. I mean, I'll probably see you in another meeting later today, but thanks for chatting about this.

00;27;29;18 - 00;27;31;01
Speaker 2
All right. Have a good one. Thank you.

00;27;31;02 - 00;27;33;02
Speaker 1
You too. Bye.

00;27;35;18 - 00;27;42;16
Speaker 1
Thanks for tuning in this week. Find us on LinkedIn at Dig Insights. And don't forget to hit subscribe for a weekly dose of fresh content.
