August 17, 2020 | Season 2, Bonus Episode 33

How to make creativity a priority in CRO

With AJ Davis (Experiment Zone)

Why you will benefit enormously when you make creativity a priority in your CRO process and how to level up your CRO user research.

Episode guest

Episode host

Guido X Jansen

Guido has a background as a cognitive psychologist and worked as a CRO specialist on both the agency and client side. In 2020, Guido won the individual Experimentation Culture Award.

Shownotes

Book(s) recommended in this episode

Transcript

Please note that the transcript below is generated automatically and has not been checked for accuracy. As a result, the transcript might not reflect the exact words spoken by the people in the interview.

Guido X Jansen: [00:00:00] Today we're going to talk about how to make creativity a priority in your CRO process, and how to level up your CRO user research. And we'll be doing that together with AJ Davis. AJ is an industry expert in user experience with a proven track record of delivering measurable value to clients. She's the founder of optimization agency Experiment Zone.

And before that she led optimization strategy for Fortune 500 companies during her tenure at Clearhead, and she was also a UX researcher for a product that we've all at least played around with: Google Optimize. My name is Guido Jansen, and welcome to CRO.CAFE, the podcast where I show you the behind the scenes of optimization teams and talk with their specialists

about data- and human-driven optimization and implementing a culture of validation. In case you missed it: in a previous episode I spoke with my guest about why empathy is the most important skill for any professional. You can listen to that episode on the CRO.CAFE website or in the podcast app you're listening in now.

This episode of CRO.CAFE is made possible by our partners SiteSpect, Online Influence, Convert and Online Dialogue. Welcome to season two, episode 33. So AJ, a warm welcome to the CRO.CAFE. And yeah, let's start off with how you got started in UX.

AJ Davis: [00:01:36] Great, thanks for having me on. What I do is conversion optimization crossed with user experience research.

So I started my career as a user experience researcher, and then I moved into conversion rate optimization. I was working at Google for a while and got the chance to work on the Google Optimize product, to see how people do CRO and what AB testing is all about, and decided to go and do that. So I worked for a company called Clearhead in Austin, Texas for a while.

And then I started my own agency, called Experiment Zone, three years ago.

Guido X Jansen: [00:02:08] If you think back on how people used Optimizely, or sorry, Google Optimize: what stands out the most? What are the weird things that you remember people doing?

AJ Davis: [00:02:19] I was on that team from the very beginning. We were in a hotel in California. Not an actual hotel room; it was more like a conference center, and we had a chance to group into teams of seven to ten people for a Google sprint week.

So it was following the standard, textbook Google sprint: five days of going from an idea to having a product. I was coming to that group with the lens of a user experience researcher, asking a lot of questions about who the target users were. And at the time we were just starting from scratch, starting from zero.

And it was a chance to really learn about all the varieties of ways people use testing. I think what surprised me most was how mature some testing programs were in places that you might not expect, like news agencies, and how immature it was in other places where you would expect it to be really mature.

They might have one person spending a little bit of time thinking about it, but not really have a full system. What I walked away with, and why I wanted to go and do this, was that it was so cool that we got a chance to do user experience work and then have an opportunity to validate whether it in fact truly was the right decision or not.

And that's why I wanted to take the lens of crossing and intersecting user experience research with experimentation: you're thinking about what the problem is and how we might solve it, and then validating whether or not that's true.

Guido X Jansen: [00:03:39] So your background is in human factors. How does this connect to CRO? How did you become someone who works in CRO?

Because up until recently, CRO was not something you could study at any school or college; there are a lot of people in CRO with different backgrounds. What did your journey look like? How did that come to be?

AJ Davis: [00:03:58] So I studied economics in undergrad, and in my last semester I encountered Dan Ariely and got to interview him about his behavioral economics book. It was in this economics journalism class; I got a chance to spend an hour with him, and that's when I first got exposed to what behavioral economics was. We had operated in a system where people were predictable and we could chart out how people would behave based on market forces.

And it just planted the seed for me, though I wasn't really sure how it could be applied in the real world. I went and worked in banking for a few years, didn't really like that, so I decided to switch career paths and get a master's in human factors and information design. So I was applying the concepts of behavioral economics in the context of qualitative data.

And then, when I finally encountered AB testing at Google, I had an aha moment: everything came together. I went from being an analyst really getting into the data, questioning if those assumptions were true, to being able to use real-world experimentation to understand what's really happening.

So it all came together in CRO.

Guido X Jansen: [00:05:05] Yeah, it's basically a match made in heaven, right, combining those two fields of work, and then using that to optimize whatever you want to optimize; as long as there are customers, you can do that. And today we wanted to talk about bringing more creativity into CRO, instead of being overly analytical.

I think a lot of people in CRO have that analyst background, and also lots of people coming from SEO have a more analytical, more data-driven mindset. So first of all, how would we define creativity in this scene, and how would you look at that?

AJ Davis: [00:05:42] What I've often seen when I talk to other people who do testing is that we're really solution-focused, and we're trying to justify that solution using data.

And I think if we all took a step back from that and said: let's use data to define where we're heading and what the problems we know about are, then we can plug in creativity to explore different types of solutions. Solutions don't have to be constrained to "I see this particular problem and there's only one solution"; instead we give ourselves the possibility that there's one problem and there might be a hundred solutions.

And so giving time and space to explore that. So you can ultimately put the best thing out there for testing.

Guido X Jansen: [00:06:23] How would one develop this? And how would you recognize it if you already have some creatives, people where you'd say: hey, that's very creative? Because I don't see myself as a creative person at all, but apparently I sometimes have a light bulb moment, and sometimes I just recognize one. So how would you find out?

AJ Davis: [00:06:42] The key is to not shoot down ideas right away. What I often see is: if you're in a conference room and you have to present your case for something, you're presenting one case, for one solution.

And there's not really this opportunity to say: but I might be wrong, or there might be better ways to do this, let's open up that conversation. So when I think about something like a sprint week: you take a week, and in the first few days you're not saying no to anything yet. You spend time literally saying: write down every idea.

There's no bad idea; get them all out there, and don't worry about judgment yet. So I think it comes down to postponing judgment and separating the space between "this is creativity and brainstorming" and "this is when we're evaluating which things we should move forward with".

Guido X Jansen: [00:07:26] So it's not just postponing judgment yourself. That sounds like something that needs to be a cultural thing that you need to have in your team, right? It's not necessarily something for just one person; it needs to be born and bred by the company.

AJ Davis: [00:07:40] Absolutely. It has to be something where you define the ground rules before the conversation and then keep reinforcing them. That will create a safe space, an environment where people can put those vulnerable ideas out there. They don't have to worry about whether that idea makes them seem smart or informed; instead, you explicitly state that there are no judgments.

We will delay any conversation that negates these ideas, and instead take something I would borrow from improv, which I do a little bit of on the side. In improv, what we do is say "yes, and": somebody has an idea and you build on it, as opposed to saying "no, that's not true, this world doesn't have that thing".

This world doesn't have that thing to be true. And so it's even just laying ground rules, like yes, sanding in the conversation as opposed to let's shoot down every reason why this is a bad idea. I know when you do that, it really just creates negativity and doesn't allow people with the myth of Richie ideas.

Yesterday's brainstorm was so good. I really liked Stef's idea of running that test on the call to action buttons. Making them orange will

Guido X Jansen: [00:08:43] really make them stand out. Don't you think? Yeah. Do you want to design a real AB test winners and achieve enormous conversion of lifts, then stop, brainstorming and take a scientific approach.

If you can read Dutch, follow the steps of Online Influence, the bestselling management book, and enroll in the online course to become an expert in applying proven behavioral science yourself. Go to the Online Influence website for more information and free downloads.

What I wanted to ask you: if you start working with new clients and you notice that they don't really have this culture, they just want solutions basically, "we just want you to find the improvements and implement them", how would you go about trying to convince them of adding that creativity?

AJ Davis: [00:09:33] Yeah, I think there are two tactics. One is you can start with it and say: this is how we're going to do it, and here's an activity. You can come in and play the fun role: we're going to come in and be the fun people, we're going to spend half an hour having fun, being creative and finding new solutions.

And people actually really like that. It's a great way to get to know people; you can just open up and talk about your ideas. Some cultures don't allow that, and they're going to say: get to work, give me those tests, I don't want any workshops, my team doesn't have time.

Yeah. And so if that's the case, then you can delay it and build towards it. In cases where the conversation is very transactional, where they just want to improve conversion and aren't thinking about whether we're taking the best approach, I like to interject as we're talking about research, as we're talking about AB testing findings.

So if we see a test that lost, or had an unexpected outcome, then we can talk about: are there other ways we could have approached this? And so you can start injecting those "let's talk creatively about this" moments. I often find that if you take the same problem and tackle it twice and don't find a solution, that's when people are open to really having these brainstorming sessions.

Or to take in some qualitative input, as opposed to saying: you guys know it, you've done this a thousand times, I just trust you to find a solution. Because their audience is different from every other audience, and so you do need that creativity and focused discussion around that particular audience.

Guido X Jansen: [00:10:57] When I get new clients, I like to do a little exercise called the marshmallow spaghetti challenge; maybe you're familiar with that one. Do you have any other specific exercises to get them in the mindset of being creative and open, maybe before actually looking at the UX problem?

AJ Davis: [00:11:18] Yeah, I have some really basic exercises. I think it's really important to do something like an icebreaker, so that everyone in the room has made themselves vulnerable in some semi-personal way. It might be something like: name the worst haircut you've ever had, or tell me about the first concert you ever went to.

And there's an angle of just opening up a little bit personally and allowing everyone in the room to open up; that sets a standard of: hey, we're human, and we're all trying to work together to do something, as opposed to: I'm coming in with my suitcase and my suit and putting on a show for you.

So I think that is a really important aspect, to get started on the right foot and to present the ground rules like we talked about earlier. I also love giving introverts a space to be creative as well; a lot of the time there's just this culture of sitting and throwing out ideas at each other.

And so I think sticky note exercises are really great, where you can sit and doodle or have sketchpads, and everyone can sit quietly and then report back. If you take the time for everyone to sit and reflect individually, and then go around and have everyone present before any feedback is given, it avoids this herd mentality that sometimes happens in creative discussions, where people jump on a bandwagon because the first person said it, and those other ideas get left behind.

So those are some of the principles I like to operate with: give everyone a chance to think, and give everyone a chance to speak, before even weighing in on any of the options.

Guido X Jansen: [00:12:48] The default image I would have of a brainstorm session is indeed sticky notes.

AJ Davis: [00:12:54] Yeah, it's always sticky notes.

Guido X Jansen: [00:12:56] When you go and do these kinds of exercises, trying to be creative, is there a certain group composition that you would be looking for, that's ideal? Maybe certain people you specifically don't want to have in such a session, or do want to have in?

AJ Davis: [00:13:13] Personally, I think having people who represent different points of view within the company can be really helpful.

So from a product background, that would mean having someone from the engineering team, someone from marketing, somebody from sales: having representation across the board, because everyone has a different experience and lens into your customer experience. Having those different inputs just leads to very different ideas, and then you can feed off each other.

What I wouldn't recommend is having a room just of analysts, or a room full of just engineers; you want to make sure that the group is diverse. On the negative side, people you might want to leave out: especially in cultures that are very top-down, you may want to leave out leaders.

You might want people to be pretty much at the same level, so that you have a chance to align as a group and then present the idea, as opposed to feeling like people are trying to impress the boss. So that's the vision: a diverse room with different perspectives, while keeping people where they can all be on the same, equal footing.

Guido X Jansen: [00:14:11] Yeah. Would you also include actual customers or actual users of the product or service into such a session or would it be a different session?

AJ Davis: [00:14:19] I personally do a group internally first, so people get on board with the creative process before bringing in customers. I think often with customers, you want to have the customers talking to each other.

In an ideal world, where there's no budget constraint and time is open, let's do the best version: you would do an internal session, you would do a group with just customers, and then you would do a group where you're mixing customers with internal people. That way you get those different mixes of perspectives.

There's a tendency when you bring customers in that they become the centerfold of the conversation, and they're the boss, right? They're people that you want to make sure feel like your team is smart. So you want to make sure that you can give people really comfortable situations to really open up and talk about different solutions.

Guido X Jansen: [00:15:03] Usually those sessions where I'm mixing clients and customers, it's usually, it's not about the session itself. It's more about. Your awareness of, getting those clients aware or the people working in, for example, digital marketing teams aware, Hey, there's actual. They're our customers and this is how they look.

There are real people

AJ Davis: [00:15:23] on the other side. Yeah. The value of bringing customer voices in is just unlimited; there's so much power in bringing them in at the right time and for the right moment. So when we're thinking about brainstorming sessions about solutions, I tend not to think about bringing customers in, but you absolutely need them, because if their voice isn't in the room, you're really missing out on a huge opportunity to learn and to get inspired.

So maybe in my ideal situation, instead of those three buckets, you would first start with the user research, so you have the data on how users are responding to some of their challenges, and then the creative solution finding would be those three things.

Guido X Jansen: [00:16:00] And in the overall optimization process, how often would you do these kinds of sessions, and where in the process would you do them?

AJ Davis: [00:16:08] Ideally, you do them every quarter.

So you're always thinking about what we have learned and then where we go next. I think quarterly feels like the right pace for a lot of businesses, because you can get through several cycles of solving problems and then reflect on it. For some businesses that might realistically be once a year, and then you're building out the roadmap for the rest of the year.

But in an ideal scenario, if you're really moving fast, you want to make sure you take that time to reflect and be creative moving forward.

Guido X Jansen: [00:16:36] Of course, a lot of people unfortunately don't do user research, or work at companies that don't include it yet in their optimization programs and mainly focus on analytics, especially on the hard data

that they do have from tools like Google Analytics or whatever analytics tools they use, and then do experiments based on that. So what would you say is the value that user research brings to that in general?

AJ Davis: [00:17:02] I think, very simply: we get the "what" from the quantitative data and the "why" from the qualitative data.

So we can always see what customers are doing by looking at Google Analytics, but we can only infer why they're doing it. If we talk to customers and have them think aloud in user research, we really get a chance to understand what's happening and why they're doing it. So I love to bring research in when we just don't know why a test didn't turn out the way it did.

I have a client who's been working at the same concept on their product page over and over for the last six months, and we're at a stage now where we have to do user research, because otherwise we're going to have this endless cycle of putting something out there that isn't informed by why customers are not responding to this very thoughtful approach.

So for businesses that aren't really thinking about user research in the process, I think the easiest and most natural place to first plug in user research is asking: why didn't this test perform the way we expected, and can we get inspired to do something better and improve on it?

Guido X Jansen: [00:18:00] For over 10 years now, Online Dialogue has advised on evidence-based conversion optimization with a focus on data and psychology. They see that analyzing data and recognizing customer behavior results in a better online dialogue with your clients and a higher ROI. Their team of strategists, analysts, psychologists and UX specialists gathers valuable insights into the online behavior of your visitors, and together with you they optimize the different elements of your CRO program through redesigns, expert reviews, AB tests and behavioral analysis. For more information about their services, go to the Online Dialogue website.

You just see a weird drop-off point somewhere in your funnel,

and it can be very eye-opening to do those. It's fairly easy to start with, right? You can of course make user research as complicated as you want, but basic user research, an interview with someone, or just having someone perform a task like buying something on an e-commerce site, can be a very simple thing that you just observe people doing.

And, like I said, you can brainstorm for hours on why something is the way it is, but you can also just look at those users. What are some examples that you've come across before, where something was unclear looking at Google Analytics, but very obvious when you did user research?

AJ Davis: [00:19:22] I think navigation is often something where you can't get a really clear sense of why people are doing the things they do.

And if you do a tree test or a card sort, you're suddenly having these aha moments like: oh, our navigation is reflecting our business, not what our users need or expect. So I think navigation is a case where, over and over, people are really surprised by the kinds of things they find. We also did a test with a customer where we took some messaging from their homepage and moved it to their product page. It was value proposition messaging, a very foundational marketing thing where you tell customers why to buy from you. And the test didn't win. It didn't do anything; it was inconclusive

when we moved that content over. So we were trying to figure out why that would be the case, because we saw in the heatmaps that people were scrolling and seeing it, and it wasn't impacting their decision. But when we did a user study on it, we got a chance to get qualitative feedback on how people were responding to it.

Time and again, across all 10 participants, people were like: so what? This is just the same as every other company would say. And so we learned that the messaging itself, which they had used elsewhere, in their sales and their marketing and all over, wasn't resonating.

They hadn't ever really taken the time to reflect on the words themselves; we had all made the assumption that the messaging was effective. And by the time we got to the point of user research, it created this tidal wave across their company of things they needed to go revisit.

So user research can be really insightful and can really pull back some assumptions that the business has been making time and time again for a really long time. Yeah.

Guido X Jansen: [00:20:57] The most obvious one from my side, and the example I mentioned just before, is that we saw a big drop-off at a certain step in the checkout process.

I don't think we had field tracking, so we didn't necessarily see which field was the problem, just the page it was on. And of course there are lots of little things happening on those pages, so you can think of anything and brainstorm for hours. But we did user research.

It was very obvious: it was a gifting company, and there was this field that required you to enter a phone number for the person you were sending the gift to, with no explanation of why the company needed that. When we sat down with those customers, five out of six of them started complaining: oh, I don't have it,

I don't want you to call that person, or: it's a surprise, so why would you,

AJ Davis: [00:21:49] what are you going to do?

Guido X Jansen: [00:21:50] The surprise, yeah. It stands out so clearly if you do user research. Sometimes, not always of course, but sometimes it stands out so clearly what the problem is on such a page.

AJ Davis: [00:22:00] I find that every time I deliver a user research report, I'm like: this is going to feel obvious. And that's because it's now obvious; it wasn't obvious before. We got to this point by having to ask and learn, and it becomes obvious because you're hearing it over and over from customers. But it also makes sense.

Like in that example you just gave, there are a lot of questions of: why are you asking this? Whenever you introduce uncertainty, you almost inevitably hear in user research that people are also feeling uncertain. So set expectations; it's just a good principle. Set expectations for how information will be used and when it will show up.

And then that will remove a lot of that friction. Another study that came to mind was actually a survey that we did on exit, where we were wondering why we were seeing such a big drop-off in checkout. Overwhelmingly, we saw that people were leaving because of coupon codes. It was something we thought could be the case, and to the client it made sense, because they said: oh, there are coupon codes all over.

But 80% of the people said: I'm leaving to find a coupon. And it really changed our strategy around coupons: how do we present it, where do we present it, how do we make it easy? If we want to be a discount-focused brand, how do we make it easy to grab the code or apply the code? So just ask the question when you're seeing that drop-off in the data; that survey took five minutes to set up. These things are not complicated.

The tools are out there now that make it really low-cost to get feedback from customers, whether it's a usability study or

Guido X Jansen: [00:23:25] So you just mentioned delivering those reports to clients. What's your favorite way of basically telling them what's wrong,

what the results were from those studies? Is it video, a presentation, or how would you

AJ Davis: [00:23:37] go about doing that? It really depends on the scale. For the example about the survey, we just shared quotes, in context, in our weekly meeting, and everyone was like: yeah, makes sense, let's move forward.

So sometimes I like to present it really simply, so it doesn't feel like an over-thought-out result. It was an insight, just like an analytics insight: we pulled it from GA, here's the data; we pulled it from customers, here's the data. But in cases like the one I mentioned, where it really had a major cultural impact on the business, it was really important that we showed recordings.

So I like to deliver a very simple executive summary, so it's easy to share. A lot of the time, C-level folks don't have a lot of time and are used to flipping through things in email, so we like to have quick reports, quick insights pulled out in the email, and a very short report with some simple takeaways on the first slide.

And then for people who want to really understand and hear the customer and have the time to do that, we'll build video clips where we'll weave together audio clips or video clips of customers. So it all depends on what you're trying to communicate.

Guido X Jansen: [00:24:41] It might also depend a bit on the maturity of the client already.

How much.

AJ Davis: [00:24:46] And also how much they know you. So if you're very early in a relationship with a client, sometimes you need to present a little bit more information to build that credibility. And for customers we've been working with for several years, They don't really need us to spend that time to build a formal report because we got the insight we needed to be able to move.

Guido X Jansen: [00:25:03] So what would be your favorite method of doing research? Do you have a favorite, or does it totally depend on the issue that you're facing?

AJ Davis: [00:25:12] From the lens of creativity and getting to the best test ideas, I think usability studies are the best. The reason is it gives you a chance to really see somebody use your product from start to finish.

And in e-commerce it's a pretty standard flow for most sites, so it's easy to catch the main things at every part of that website funnel. For me that's my favorite, because it lets me level up all my test ideas: at every part of the site, I'm inspired by some new insight that I didn't have before. And it's a great tool to get a baseline.

So you can do a usability study every quarter or once a year, and you can keep rerunning that same study and understanding changes over time and what customers expect and how they're responding to the new elements on the site. So I think that's by far my favorite

Guido X Jansen: [00:25:54] With a usability study, you mean inviting participants, giving them a case to work on, and having them walk through the website?

AJ Davis: [00:26:03] Yeah, I should explain what it is. There are several different approaches to a usability study. Where it started was often in a lab setting: people would physically come to your lab and sit at a computer, and you would sit next to them, give them a script and moderate as they work through an application or a website. Technology has enabled us to do a lot more remote research, as well as remote unmoderated research.

Tools like usertesting.com, TryMyUI, userinterviews.com: all these tools enable you to quickly get feedback from customers who are a good representative sample of who your target customers would be. For me, I like the unmoderated remote studies because you can get that feedback really fast.

And it's low cost, so that trade-off of time and cost against inspiration and insights is one of the best for CRO, I think.

Guido X Jansen: [00:26:55] And when it's online, it's relatively easy to segment the type of users you get. Most of those services allow you to pick people from a certain country or with a certain experience level.

AJ Davis: [00:27:07] They also let you add screen our questions. So you can ask we've segmented our user usability studies based on, is this a current customer of this business? Is this somebody who's never purchased from them, but a purchase from their competitors, you can tier them based on their spending habits. There's all kinds of things you could introduce into the survey so that they get closer to your target market.

And you can get insights that say: for new customers, these are things we might need to do; for existing customers, these are some of the things I might expect. So you can go really deep and start really segmenting down. But at a very basic level, just getting people who reflect the country, the basic income characteristics, maybe certain brands they've shopped before, is a great starting place. It's going to be fine.

Guido X Jansen: [00:27:53] Maybe your budget for A/B testing has been impacted too. If you want to keep testing to enterprise standards but save 80% on your annual contract, you can consider Convert. You can take advantage of full-stack and hybrid features, strong privacy compliance, no-blink testing, and enterprise-grade security. Feel good about your smart business decision and invest what you saved back into your CRO program.

A comment I got from a client a couple of months ago: they said we can do a user study, but we don't want to use any of those kinds of services, because those are people that are used to doing user studies. They're very adept at going through those websites, so are those not normal users? How would you respond to that?

AJ Davis: [00:28:49] Yeah, I think there are two ways to mitigate that risk, because it's true: there's going to be some bias introduced by the way you're doing the recruitment. But that would be true if you recruited people from Twitter, too. If you decided to post and say, we need people for our study, sign up on Twitter, you suddenly are just looking at the Twitter segment of your customers. So for a tool that's doing unmoderated research, you can mitigate some of those things by asking how many studies participants have done in the last six or twelve months, and then screening people out that way.

Some of those tools will have some transparency there, so that's another thing to look into when you're deciding which tools to use: how much are they enforcing that or providing that insight back to you. And then I think the most robust thing you can do is to introduce opportunities for people to opt into your user research across any interaction with your business. Many very large companies have their own databases for this. When I was at Google, we had a mix: we had our own internal database, we worked with external recruiters, and we worked with the unmoderated tools as well. So sometimes it's also just about diversifying where you're getting the insights from.

Guido X Jansen: [00:29:55] Yeah, exactly. Given the chance, I'd prefer doing a mix of different types of research, because then you can mitigate the biases that each method has on its own. And if you still get a consistent picture across these different methods, then you're probably fine and can move on.

AJ Davis: [00:30:12] It's the same reason we combine qualitative and quantitative.

If you just look at the qualitative data, it's not going to give you as much information as putting the two together. Same thing with recruitment and finding participants: you can do usability studies plus interviews plus analytics, and you'll be way better off than if you just did interviews.

Guido X Jansen: [00:30:29] So what are some things you're changing or improving in the coming 12 months with regard to the way you work?

AJ Davis: [00:30:36] One of the things we're really working towards is creating a product around this combination of user research and A/B testing. We've often just plugged in user research as the relationship develops, because people have a better understanding of what it means to work with a company for CRO services and might be more hesitant to make that investment in research. We're looking to really formalize and productize that, so that every company starts with a consistent user research and analytics exploration and our roadmap is automatically elevated, versus waiting several months before we start doing some of that work. So for us, it's getting the word out, talking about it, productizing it, and having that be the way we do our services.

Guido X Jansen: [00:31:18] We had some questions about this in our Facebook group as well: how do you manage billing clients? How do you go about that? Is it a fixed fee for a certain period, or do you have a monthly retainer, or what's your preferred model?

AJ Davis: [00:31:31] Good question. We are doing time and materials at the moment. So we will set a six- or twelve-month engagement, and then we'll work towards that budget and track to it each month.

We have a culture of transparency, so we meet with our customers every week or every other week, and we'll report out on how we're doing against that overall budget. But the short answer is we do time and materials.

Guido X Jansen: [00:31:54] Do you have some goals that you set, as in: we will improve your revenue by X percent?

AJ Davis: [00:32:02] Yeah. We tend to frame it in terms of how many tests we would be doing at which level of complexity. So we might be doing five simple tests, two medium-level tests, and one user research study. We'll estimate how much time all of that will take and talk about those deliverables.

When you get into the math of calculating specific revenue, we do estimates based on individual tests, and we'll talk about that, but we generally don't make it part of the commitment or requirement for the program.

Guido X Jansen: [00:32:34] Yeah, I think a lot of people are struggling with that. Ideally you would, I think, do it based on revenue, but of course, if you hit something like COVID-19 and a global pandemic, then you're screwed. And if you're doing a really good job, the client probably doesn't want to pay you anyway if you bill revenue-based; that's also an issue a lot of CROs encounter. On the other hand, you don't necessarily want to bill hourly either. That's also tricky, because it's not scalable.

AJ Davis: [00:33:01] I think the hardest thing about what we do is that if we're doing a really good job, we're creating so much more value than what an individual test is saying. We're creating insights that potentially the whole business can act on.

And to quantify and calculate that value is hard. I think all the models have shortcomings in how we can set those up.

Guido X Jansen: [00:33:21] All models are wrong, some are useful. So as the final question: any books you would like to recommend to our audience? We have some avid readers, audiobook listeners, and obviously podcast listeners in our audience. So what would you like to recommend?

AJ Davis: [00:33:35] I'd actually like to challenge them to insert a little creativity and fun. So rather than a book recommendation, I'm going to recommend two games that are really fun but still data-based. Dan Ariely, who I mentioned early on, has a game called The Irrational Game, which is a great game for guessing how people are going to behave and how irrational they are.

And then there's a really fun game called Charty Party, which is very simple, very similar to Cards Against Humanity, and it's all about charts: you're describing what you're seeing in a chart with different pairings of words. In the age of COVID we can read lots of books, but it's good to create some interactions with people.

So I would encourage them to check out those two games.

Guido X Jansen: [00:34:15] So can we do this online?

AJ Davis: [00:34:18] I bet you could. You know, if you each had a version of Charty Party, you could probably play it over Zoom.

Guido X Jansen: [00:34:24] We'll have a look at that. I'll include links to those games in the show notes of the podcast, so you can all have a look. AJ, thanks so much.

AJ Davis: [00:34:32] Thanks so much for having me.

Guido X Jansen: [00:34:33] Thanks for sharing your experience with user testing, being a bit more creative, and bringing the two together.

AJ Davis: [00:34:41] You're welcome. Thanks for having me.

Guido X Jansen: [00:34:42] Bye bye
