Product Management Webinar: Drive Growth at Scale

How to Drive Growth & Enable Better Decision-Making at Scale with Erin Weigel  

Watch our webinar with special guest Erin Weigel, Principal Designer, Senior Design Manager, and conference speaker, and host Janna Bastow, CEO of ProdPad, as they explore the transformative power of experimentation in driving impactful product improvements and enabling better decision-making at scale that can propel business growth and success.

About Erin Weigel

Erin Weigel delivers impactful, user-centric products and tells stories about how she does it.

Her career started in customer service when she worked retail and waited tables. This experience developed her service mindset, which guides her unique design approach.

She has A/B tested thousands of design changes at Booking.com, the world’s largest online travel website, where she worked as Principal Designer for 9 years. Her specialties are Conversion Design and building experimentation cultures. She’s currently writing Design for Impact: Your Guide to Designing Effective Product Experiments, published by Rosenfeld Media, which will be released in 2024.

Learn more about Erin and all the stuff she does on her website.

Key Takeaways

  • Understanding Conversion Design and how it will positively impact your business
  • How to leverage human-centered problem-solving
  • The role of experimentation and A/B testing in product design, and how to design effective experiments that generate actionable insights
  • Understanding the importance of value creation and transfer within the ecosystem, and how Conversion Design enables organizations to achieve sustainable growth
  • Learning how teams can leverage diverse skill sets to drive innovation and success
  • And much more!

[00:00:00] Janna Bastow: So welcome everybody. This is the ProdPad Expert Fireside Series that we run here. We run these things on a monthly basis and we always invite different experts from the field, people who are there doing the work and here to teach us about it.

[00:00:17] And today we’re joined by Erin Weigel, who is a senior design manager. Did I get that right? At Deliveroo. And she has some really interesting takes to share with us. She’s just finished writing a book on the subject, right? On how to design for impact, which I thought was fabulous.

[00:00:35] I’ve seen a preview copy of it. So we’re going to jump in and introduce Erin properly in just a moment, but in the meantime I want to make sure that everyone knows that this is a session that is recorded. All the past ones have been recorded, so you can go back on our site, prodpad.com/webinars, and take a look at those.

[00:00:53] Today will be recorded and you’ll get a copy of that. So feel free to send it around to your colleagues or review it, go back over it, whatever you like. You will have a chance to ask questions, so please use the Q&A section, or use the chat to chat back and forth between each other. And that’s all.

[00:01:06] And these sessions are either presentations from our experts or a fireside. Today’s going to be a fireside: me and Erin just chewing the fat about the work that she does and things that she can teach us about designing for impact and how to drive growth and enable better decision-making at scale.

[00:01:23] A little bit about ProdPad before we jump in. Okay. So you probably all know who we are. If you don’t, give it a try. For those who are using it, by all means put your hand up, let us know. We’d love to hear from you. ProdPad is a tool that myself and my co-founder built. We were both co-founders of Mind the Product, as you might know.

[00:01:39] But we built it because we needed a tool to actually help us do our own jobs as product people, and it didn’t exist. So we started hacking away, and it turned into something that we were using internally to solve some problems, and now it’s being used by thousands of teams around the world. And it basically helps you make more informed decisions about what it is that you need to be building.

[00:01:59] It helps give transparency into your product management space, so people don’t wonder where their ideas have disappeared to. It helps you synthesize all the information: what experiments you might run to solve which problems, who’s asking about them, and it just helps you put it all in one space.

[00:02:14] So you and everyone else on your team know what’s being done and why. We have a free trial, and we also have a free environment called our sandbox, which is preloaded with data. So you can jump in and start playing with it. It’s got example Now-Next-Later roadmaps and example sets of backlogs and whatnot, so you can see how it all fits together and try it out for yourself.

[00:02:37] You can explore it and change it up yourself. And our team is made up of product people. So it was founded by a couple of product managers, but our team also has product managers spotted throughout, helping us reach our mission. So if you have any feedback at all, we’d love to hear it. Try out the trial and then let us know what you think.

[00:02:53] To give you a heads up on what we are working on lately: we have some really interesting AI tools. It’s what we’re calling the AI coach. We do have tools within ProdPad that allow you to do things like generate specs or brainstorm key results, or help you write your user stories.

[00:03:12] And those are all very good, but we wanted to take it another step up from there, which is using it as a coach, like a sidekick. So it can help you judge whether the ideas on your roadmap are actually aligned with where your roadmap is supposed to be going based on your vision. It can give good critical feedback on your vision and tell you whether that’s any good or not, and help you improve on that.

[00:03:32] And there’s even a bot that you can talk to, and it knows everything that’s going on in your backlog. So you can say things like: could you synthesize last week’s feedback? Or could you tell me what in our feedback is aligning with what’s coming up on our roadmap? And it’ll synthesize that and give you answers, and you can talk to it like you would expect to talk to an AI agent.

[00:03:50] So give that a try. That’s all brand new. We’d love to get your feedback on that. But enough about us. Let’s talk about what we’re gonna be talking about. Let me introduce Erin. So I know Erin because she was up at the front doing a panel at last year’s Mind the Product leadership event.

[00:04:09] And I thought she had great insights and great energy, and I thought she’d make a great guest, especially because she was writing at that time and has just finished writing a book. Erin, you said it went to print today?

[00:04:21] Erin Weigel: It did, if everything went according to plan. If it didn’t make it to print, it was not my fault, because I did everything I needed to.

[00:04:30] But yes, it went to print today.

[00:04:31] Janna Bastow: Amazing. Huge congrats. So Erin is a senior design manager at Deliveroo. And she was just telling me about how she’s working on a wide range of different things, from design systems and accessibility to digital as well as real-world design. It seemed like a real mix that we could learn from.

[00:04:48] So without further ado, I want to say a big thank you to Erin for coming along and a welcome, everybody say welcome in the chat, please.

[00:04:58] And thanks so much for joining, Erin. To start off, you want to talk us through your background and how you ended up where you are? 

[00:05:07] Erin Weigel: Yeah. So way back in the day, I actually got a degree in abstract painting. I went to a place called Maryland Institute College of Art in Baltimore, Maryland, where that bridge just collapsed.

[00:05:18] It’s very sad. But yeah, that’s where I lived for a number of years. And after graduating with a degree in painting, I realized that you can’t really make money selling large color-field abstract paintings. And I spent most of my time drawing naked people in school, but apparently people don’t pay other people to just draw hands and feet.

[00:05:40] So I got out of school and then I taught myself how to code things. Because I was working in a teahouse, so I was being like a teahouse barista, but I was also working as a toy store manager. So I was putting two different service jobs together. And then on the side, as a toy store manager, I started doing things for their website.

[00:06:02] So, like, uploading products, starting an email marketing list. And it was from that experience that I then, like I said, taught myself how to code and started getting into the marketing thing. And basically, I didn’t realize it at the time, but I was running little experiments to entertain myself, standing in a store like eight hours a day, moving things around to see if I could sell more of them, or trying different pitches. Yeah, I was just entertaining myself.

[00:06:27] And then from there, because I learned how to code and I had some stuff that I built up from working in the toy store in terms of marketing and design work, I eventually got a job as an actual proper designer, designing and coding an email marketing program for a small weight-loss company in Maryland in the United States.

[00:06:46] And then eventually I got hired as a designer at a place called Booking.com when I moved to Amsterdam in the Netherlands. And then it’s just been a whirlwind since that point in my life, which kind of brings us up to this point.

[00:06:58] Janna Bastow: Excellent. That’s a great start, and how to get into the type of role that you’re in, right?

[00:07:04] That experimentation mindset sounds like it was pretty native to you. Part of your core, curious nature. And your book is very much centered around being experimentation-driven. Do you want to talk about where the idea for this book came from and what was driving it?

[00:07:21] Erin Weigel: Oh yeah, that’s a really good question. Something that I do on the side, or have done for over a decade now, is speaking at conferences. And I never really had any interest in doing that. I blame my old boss, Stuart Frisby, for getting me into it, because one day he came up to me at Booking.com and said, hey, we’re having the big work conference. Would you like to give a presentation? And I was like, no, thank you. And then he basically called and told me, okay, that was not actually a question. You’re going to give a presentation. So then I gave the presentation, and people ended up really liking it, because I was telling stories from the toy store, because I had some successful experiments that were run. And they were like, tell us about where you came up with those ideas.

[00:08:04] And it was just basically my own face-to-face sales experience, real merchandising within a store, that translated into real-life impact and improvement of the customer experience. So I just told some stories, connected some dots, and then eventually people internally saw it and started recommending me to speak outside of the company as well, and then it just kept on happening.

[00:08:29] But as I was doing those talks and everything, people internally would say, oh, you should write a book. And I was like, it’s so nice of you to say that. I’m not a writer, but that’s very sweet. I appreciate that. And then a couple of years ago, I was on stage at a conference in front of a few hundred people.

[00:08:50] And at the end of it, somebody asked me to make a checklist about how I approach design: what do you look for when you audit the effectiveness of an experiment, or to know why something might have failed? And I was like, ooh, that should actually be a book. That’s not a checklist, that’s a book. And I was like, oh my god, why do people keep saying this?

[00:09:11] So I tried it, and then I did it, and so it happened. That’s the long way of answering your question.

[00:09:17] Janna Bastow: I absolutely love that. That gives some really good context. I love that you’re not a speaker, but you definitely now are a speaker and you’re not a writer, but now you are officially a writer, right?

[00:09:25] And actually for folks listening in, I know a lot of people are curious about, doing speaking engagements or writing stuff down or that sort of thing. Speaking is a really good way to clarify your thoughts around what you might write about. Cause you get to see what resonates or what doesn’t resonate and you get to iterate on that.

[00:09:41] But I always talk to people who are like, wait, how do I start speaking? And like you, everyone that I know gets pulled in accidentally. It’s just somebody says, you should tell other people about this. And before you know it, you’re on the conference track, you’re doing more and more of these things, because people eat it up.

[00:09:54] They love it. But for anybody who’s curious about it: everyone who’s here has stories that they can tell that could be compelling on a stage. And it doesn’t have to be a big stage; it could be a meetup. And you probably have times where something was wrong with the business or with your product.

[00:10:08] And so you tried some things and you made it worse. So you tried some more things and made it even worse. And then you tried some things and it got better. And this is your story, right? That right there is a perfect arc for a product design type of talk. So great to see that you followed along that path and are now at the author part of the journey.

[00:10:29] Erin Weigel: Yeah. And for anybody who’s interested in speaking at conferences, or giving talks in front of crowds: it’s terrifying for everybody when they first start. I don’t think most people ever think, oh yes, I would love to stand in front of a whole bunch of people and talk.

[00:10:44] Sharing your innermost thoughts and stories about your life, and telling about how you’ve messed up. I don’t really think anybody actually wants to do that. But just do it, and then it gets less scary as it happens, because everybody’s really nice and it’s just a bunch of people who want to do things.

[00:10:59] So I encourage you to do it if you haven’t. 

[00:11:02] Janna Bastow: And good things come out of it. You clarify your thoughts, and you build connections, and that sort of thing. So yeah, the book itself. It’s called Design for Impact, but it’s not a design book per se. I was reading through it and I was going, this is perfectly applicable to what we talk about for product people. It’s about setting hypotheses and how to get the right data, so you’re making more informed decisions. I probably borked that description, but do you want to outline what it is that you’ve been talking about and how you would describe it?

[00:11:31] Erin Weigel: You totally did not bork it. That sounded great.

[00:11:34] Yeah, basically the way I use the word design, and I explain this in the book as well, is that there’s typically capital-D Design and lowercase-d design. And I really consider anybody who makes decisions intentionally with the purpose of solving a problem for a person to be a capital-D Designer, because design, at its heart and essence, is a way of looking at the world, interacting with things, and making changes to have the intended outcome that you’re aiming for.

[00:12:06] So I really consider many people who interact with the world in that way to be designers. And then there are the people who typically hold the title of designer; there’s a whole spectrum of other types of designers. So when I say design, I typically mean the people who have the ideas to change things.

[00:12:25] And then there’s the executionary type of designer, which is also super important, and I love doing that kind of design as well. But yeah, so basically the book is about a process to align your entire product team, and ideally your whole organization, around driving growth at scale. And it’s really about removing people’s opinions and looking at quantitative data to make really well-informed product decisions, and then shipping only the good stuff that shows a positive impact on your business.

[00:12:59] And that’s for the purpose of harnessing something called the compound effect. When you make many good decisions over the course of a very long period of time, they compound to become exponentially better, which then is what drives the growth. But what a lot of companies do is they ship good stuff and bad stuff.

[00:13:19] Like you said, sometimes you make things better. Most of the time you just make things different. And some of the time you make stuff worse, sometimes considerably worse. And then all those little things just negate each other, and then you don’t have the exponential growth that most companies are going for.
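The compound effect Erin describes is just multiplication. A toy sketch (with made-up effect sizes, not figures from the talk) shows why shipping only verified wins diverges so sharply from shipping everything:

```python
def compounded_growth(changes):
    """Multiply per-change effects together: +2% is 1.02, -2% is 0.98."""
    total = 1.0
    for effect in changes:
        total *= 1.0 + effect
    return total

# Fifty shipped changes, each a verified +2% win:
only_wins = compounded_growth([0.02] * 50)

# Fifty shipped changes: ten +2% wins, thirty neutral, ten -2% losses:
mixed = compounded_growth([0.02] * 10 + [0.0] * 30 + [-0.02] * 10)

print(f"only verified wins: {only_wins:.2f}x")  # roughly 2.7x
print(f"ship everything:    {mixed:.2f}x")      # roughly 1.0x: wins and losses cancel
```

That cancellation is exactly the point: the exponential curve only appears when the losers are filtered out before they ship.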

[00:13:34] So my book really goes through the entire process of how to align your entire team, and hopefully your whole organization, around this kind of counterintuitive, very untraditional way of working to get the kind of untraditional results that businesses really don’t usually achieve.

[00:13:51] If that makes sense. 

[00:13:53] Janna Bastow: Yeah. And more and more businesses are looking for ways to outperform others, right? They’ve had this traditional way of working, and yet it hasn’t been netting the results. And so you’re talking about how the best companies are doing this and thinking outside the usual box.

[00:14:06] Erin Weigel: Yeah, precisely.

[00:14:08] Janna Bastow: And you were talking about this compounding effect. I really love that framing, right? Because, as you said, it helps outline how you need to be building out things that bring the results up to a particular level. Everything gets better and better as you do this compounding work, as opposed to the scattergun approach of just doing a bunch of things and hoping that some of them stick.

[00:14:30] Can you share examples of how you’ve seen this compounding effect work and what sort of results you might get from them?

[00:14:38] Erin Weigel: Oh yeah, I think the best example is the lived experience that I had working at a company called Booking.com. I worked there for about nine years. I was principal designer for a number of years; I was also a product person and a people manager there. So I got to see it from a lot of different lenses and different perspectives.

[00:14:54] But at that company, the entire time that I worked there, we did not ship really anything unless it showed some sort of positive impact on both the user experience and very important business metrics. So if you were working in the main conversion flow of the product, you had to make a conversion improvement.

[00:15:14] And you did that by solving a problem for your customers, something that was blocking them from being able to make a booking, which is why people went there. If you were working in a post booking setting, then your goal was to reduce cancellations by providing them with information that they need, or, reducing the need for contacting customer service by allowing them to self serve better, or by giving them the right information at the right time to reassure them.

[00:15:40] Or if you were working on a back-end system for partners, you were also measuring things like increased signups, reducing the time needed to onboard people, or perhaps even reducing the need to call customer service, or churn. So there was always some sort of important business metric.

[00:16:01] And the metric that you tied it to was based on your hypothesis. So you think: if I solve this problem, what would I expect to see in terms of a positive business outcome, if I did indeed solve that problem in the way that I’m intending? Yeah, Booking.com is the best example I can give of that.

[00:16:17] Janna Bastow: Yeah. And it’s definitely a good one, because Booking.com is famous amongst product circles. I know so many interesting case studies that come out of them because they’re so focused on metrics and experimentation, right? They’re the champions of A/B testing. And it’s also a unique space because they’ve actually got the volume to really take advantage of A/B testing and different types of testing.

[00:16:38] But you mentioned you don’t ship something unless it shows a positive result. How do you know if it’s got a positive result if you haven’t shipped it? Are you talking about soft launches, or how does that work?

[00:16:48] Erin Weigel: So I’m talking about keeping the change after you’ve tested it. For example, if I’ve observed a customer problem and I say, if I add a line of content here, does that solve the problem, and do I see a metric be impacted because of that change?

[00:17:05] So I would A/B test it. About 50 percent of the traffic would be exposed to that change, and the other 50 percent would not be. And then I could see, in the end, in the A/B testing tool, whether or not there was a significant impact on the metric I hypothesized would be impacted if I indeed solved the problem I was trying to solve.
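For readers who want to see what "a significant impact" means concretely: a common way to read a 50/50 conversion test is a two-proportion z-test. This is a generic statistics sketch, not Booking.com's internal tooling, and the traffic numbers are invented:

```python
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)       # pooled conversion rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf, then two-sided p-value.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Control: 500 bookings from 10,000 visitors. Variant: 565 from 10,000.
z, p = two_proportion_z(500, 10_000, 565, 10_000)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is below 0.05 here, so the lift is significant
```

In a real tool the same arithmetic runs continuously behind the dashboard; the decision rule (keep the change, or not) is what Erin describes next.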

[00:17:24] Janna Bastow: Yeah. Okay. And if you’re running that many experiments, how do you know what’s actually had the positive impact? Or do you have to hold things back for a certain amount of time?

[00:17:31] Erin Weigel: So the way A/B testing works is because of randomization. Anytime you introduce the concept of a randomized sample into an experiment, you’re isolating the independent variable, the thing that you’re changing, as the only thing that could possibly be impacting the results that you’re seeing.

[00:17:52] So it’s the process of proper randomization that allows you to know that the change you made indeed has a cause-and-effect relationship with the impact that you see on the metric that you’re tracking.
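In practice, experimentation systems usually get this randomization by hashing a user ID into a bucket, so assignment is effectively random across users but stable for any one user. A minimal sketch (my illustration, not any specific vendor's implementation; the experiment name is made up):

```python
import hashlib

def assign_variant(user_id: str, experiment: str, split: float = 0.5) -> str:
    """Deterministically bucket a user: stable per user, ~random across users."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) / 0x100000000  # map first 32 bits to [0, 1)
    return "B" if bucket < split else "A"

# Salting the hash with the experiment name keeps assignments independent
# across concurrent experiments, which is why many can run at once.
counts = {"A": 0, "B": 0}
for uid in range(10_000):
    counts[assign_variant(str(uid), "input-cursor-color")] += 1
print(counts)  # close to a 5,000 / 5,000 split
```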

[00:18:04] Janna Bastow: Okay. That’s great. And so how do you make sure that you’ve got statistical significance? I know with something like Booking.com, you’ve got lots of people coming through; they’ve got the whole thing that pops up, “there are 47 other people about to book this one room.” So you know you’ve got traffic on those pages. But what about in the B2B world, when you don’t have that volume?

[00:18:21] Erin Weigel: Yeah, so with this book I’m hoping to dispel a common myth and make A/B testing more accessible to people who think that they don’t necessarily have the traffic to make data-informed decisions.

[00:18:36] Of course, at a company like Booking, where you have hundreds of millions of users every day, it’s very easy to detect very small changes. But when you’re a small organization, and maybe you have a few thousand people, you can still make statistically significant, data-informed decisions. When I worked at Booking, I worked on a brand new product called Booking Now. It doesn’t exist anymore.

[00:18:54] But we only had a few thousand users every day. I’d start an A/B test, I’d wait a couple of weeks, and I’d have to make a decision with 10 to 15,000 people in the sample. But the way you design your experiments then is that you have to aim for bigger impact, so bigger-impact changes; maybe you make more changes all at once.

[00:19:17] But there are definitely ways that you can change your minimum detectable effect to have a lower sample size. So you just have to think slightly differently in order to account for the amount of traffic that you’ve got. Because if you think about it, medicine makes data-informed decisions, but their sample sizes are sometimes only in the tens or hundreds, and yet they’re able to find statistically significant results.

[00:19:42] So even if you only have tens of people, let’s say a hundred or more, you just have to aim for really big impact. Which makes sense: if you’re that small, you should be aiming for really big-impact changes anyway.
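The trade-off Erin describes, where less traffic means you must hunt for bigger effects, falls straight out of the standard sample-size formula for comparing two proportions. A rough sketch (textbook normal approximation; the baseline and lift numbers are hypothetical, not from the talk):

```python
from math import ceil

def sample_size_per_variant(baseline, relative_mde):
    """Approximate users needed per variant to detect a relative lift of
    `relative_mde` over a `baseline` conversion rate (two-sided test,
    alpha = 0.05, power = 0.8)."""
    z_alpha, z_beta = 1.96, 0.84  # critical values for alpha=0.05, power=0.8
    p1 = baseline
    p2 = baseline * (1 + relative_mde)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(variance * ((z_alpha + z_beta) / (p2 - p1)) ** 2)

# A 1% relative lift on a 5% baseline needs millions of users per arm...
print(sample_size_per_variant(0.05, 0.01))
# ...while a 30% relative lift needs only a few thousand.
print(sample_size_per_variant(0.05, 0.30))
```

Which is exactly the medicine analogy: small samples can still reach significance, provided the effect you're chasing is large.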

[00:19:55] Janna Bastow: That makes sense. You’re not spending your time deciding what shade of blue to make your links, right?

[00:19:59] Google already did that anyways. 

[00:20:02] Erin Weigel: The biggest thing that I saw have an impact when I worked on that very small product that doesn’t exist anymore was changing the shade of blue of the cursor in the input field on the first step of the onboarding process, because some engineer had changed the default to have some sort of branded color.

[00:20:21] And I said, this does not look like it’s actually selected. Can you please just change this? And it had a profound effect, because you couldn’t really tell that it was actually selected. It wasn’t a strong enough visual signal, and people were bumping off there.

[00:20:38] And I was like, it’s because they don’t know that the field is selected, so then they’re blocked.

[00:20:41] Janna Bastow: Oh, interesting. Okay. I’m going to have to take this back to my team and ask the question. Cause we actually just did a change in the core colors that we use for links and buttons and things like that, not for conversion rate optimization.

[00:20:53] But for accessibility, we wanted to make sure that we had the right contrast and things like that. I don’t know if anybody here is a ProdPad user who noticed the minor switch that we did recently. But I might have a closer look into that and see if that’s had any impact on usage or whatnot.

[00:21:07] I would imagine hopefully a better one, because it’s a bolder color; it’s easier to see, more contrast. But really interesting to hear that that is in fact a potentially more impactful change.

[00:21:17] Erin Weigel: Yeah, I have so many examples of color being profoundly impactful, even with small sample sizes. So I think people really underrate this: they think big impact is driven by big changes, but big impact is just a big impact on solving the customer’s problem.

[00:21:34] And that could be something as simple as a five-minute typography change. It could be as simple as adding a single line of copy. That can have a profoundly large impact while taking very little effort and very little energy to change.

[00:21:48] Janna Bastow: Yeah, single lines of copy or little copy changes can make such a big difference.

[00:21:53] One of the things that I like to do is rewatch sessions of people trying to use the product and see where they might have slowed down, and then go, oh, if we just explained this a bit better. And you watch it through their eyes and you go, oh, yes, okay, this might help.

[00:22:07] And then that might help. And things like that can make a really big difference. As you said, an outsized impact for a small amount of effort.

[00:22:11] Erin Weigel: And when you do those things and you learn what’s really important, what actually drives value for your users, and you get to know them in that particular way, then you can also do more meaningful or successful redesigns and take larger steps.

[00:22:27] So a lot of people do horrific, failed redesigns. And the only way I’ve actually been able to successfully make large, conversion-boosting redesigns, introduce a new feature, or overhaul an entire flow is because I’d run hundreds of experiments to understand exactly where people were falling down and what actually provided the value on the page.

[00:22:49] And then I amplified that and took away everything else that didn’t matter. So yeah, watching customers like that and noticing those things, Janna, is the best way to drive an impact.

[00:23:02] Janna Bastow: That’s great. Good insight. And I’d like to talk a bit about decision-making, because I think key decision-making can really drive these cumulative effects that you’re talking about, these compounding effects.

[00:23:13] But when it comes to things like A/B testing, have you ever heard that saying: if you can’t agree on what to do, if you can’t make a decision, just A/B test it? Which is actually a really expensive way to disagree. How do you use this way of working to come to a clear decision about what to build, without having to build it twice every time?

[00:23:30] Erin Weigel: Actually, it’s a lot less expensive than shipping the wrong thing and hemorrhaging money, if you’ve spent a lot of time building something and then you ship it and then you maintain it and then you lose money. So I think it’s actually a myth that it is expensive. Because as you test things, you’re learning, which helps you make better decisions in the future as well.

[00:23:51] So then you stop working on the stuff that actually does not matter and you spend more time focusing on things that do matter. It actually is cheaper in the long run. And a lot of stuff that has to do with experimentation is often counterintuitive to traditional ways of working and traditional ways of thinking.

[00:24:07] That’s one of the reasons why I love experimentation: it often flips things on their head, and you really discover things.

[00:24:15] Janna Bastow: Yeah. And that cost that you talk about is absolutely real. We did a bit of research on this, and some back-of-a-napkin calculations, and worked out that shipping the wrong feature on average costs a minimum of like 30 to 40,000.

[00:24:30] And that’s not a major one, right? That’s a reasonable change that takes a sprint or two to crank out. And if you’re shipping the wrong thing constantly, just think about how much time you’re wasting, versus what you could be testing up front before you get there.

[00:24:45] Yeah, exactly. Excellent. So how does this scale? Does this work just for little teams or how does this work when you’ve got a whole bunch of people trying to run experiments and trying to make decisions at scale? 

[00:24:59] Erin Weigel: Oh yeah. So I think it works really well either for small companies or for large companies.

[00:25:06] I actually think the power really comes from the scale. Because the way a lot of companies work is that they have executives making decisions, or higher-up people who have better judgment but who might be further away from the product, making these strategic choices about what should or should not be shipped.

[00:25:25] Basically, what you’re doing then is taking the decisions about what to ship and pushing them down, but then giving the people that are now in charge of making the decision the high-quality data that typically is only saved for executives to see. So when you do that, you’re actually making the entire population of employees smarter.

[00:25:49] And then you’re taking the outsized impact of these singular people and pushing it down. But then you’re also getting the collective knowledge of all of these different people and ways of thinking, different perspectives. So you’re getting a diversity of ideas that you would not see only coming from people that have a very limited vantage point.

[00:26:08] So you’re really just You’re pushing the knowledge down and you’re expanding diversity of ideas and you’re really empowering people, which then takes all of the pressure off of the top level and the top level really becomes more of like a cultural so then they set things like, what are the important metrics, trying to figure out what are the proxies, what are behavioral proxies that are important to aim for that people then, and then, Then they have a lot less to do.

[00:26:33] They think more about really long-term strategy, and not necessarily about what products and features we should ship. I don't think that's what executives should be doing. They should be thinking about how to keep the business afloat years down the line, not about what should be shipped this year.

[00:26:48] That seems like a waste of their time, but that's how it often works.

[00:26:52] Janna Bastow: Absolutely. And that framing really fits with what I love about OKRs. One of the things about them is that you have your top-level management set what's important to the business, and then the objectives cascade downwards from there.

[00:27:07] And what's really happening there is that you are empowering, enabling, and making use of the collective knowledge of your company, right? Because what is a company besides a collection of people who are hopefully smart at what they do and can help you make more informed decisions? It's almost like having a thousand eyes around the business, and they can all spot things that might be useful for the business to tackle, or problems that they need to get around, or whatever it is.

[00:27:31] Put it this way: we're knowledge workers, so we've got to use our knowledge. And if you've got a thousand people in your company and you're having it all managed by one person here and one person above them, that's your middle and top management, you're going to end up with bottlenecks and fewer insights than if you'd enabled the rest of the team to ask:

[00:27:49] How are we going to break down these goals and tackle them? And how are we going to experiment our way through this? So it's a really good way of framing things and making use of the business, as opposed to trying to work the more traditional way.

[00:28:01] Erin Weigel: Yeah.

[00:28:01] And the really cool thing about experimentation is that it puts guardrails in place, so it empowers people to not have to ask for permission. There are so many companies where people say: oh, I'd love to do this, but it'll take months for me to get this idea through stakeholders.

[00:28:16] And it's really just me sending an email. But if it was an idea that I had at Booking.com, I didn't have to ask anybody's permission to do that. The quality guardrails that were put in place were the level of quality of the employees, getting feedback from my direct peers, and making sure that everything was based on research.

[00:28:34] And then running the A/B test would tell me: is this of a high enough quality to keep, because it's done what I wanted it to do? Or does it not, in which case I just stop and roll it back, and there's no harm done. You know what I mean? Or any harm that was done, I've learned from now, and I can take what I've learned and make a better decision next time.

[00:28:53] Yeah. 
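
(Editorial aside: the "guardrails plus painless rollback" setup Erin describes can be sketched in a few lines. This is a hypothetical illustration, not Booking.com's actual tooling; the function, user, and experiment names are made up.)

```python
import hashlib

def assign_variant(user_id: str, experiment: str, treatment_share: float = 0.5) -> str:
    """Deterministically bucket a user into an experiment arm.

    Hashing (experiment, user) together means a returning visitor always
    sees the same variant, and each experiment shuffles users independently.
    Rolling the experiment back is just removing this call site and the
    treatment branch -- no harm done, as Erin says.
    (Hypothetical sketch; names are invented for illustration.)
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 10_000  # uniform in 0..9999
    return "treatment" if bucket < treatment_share * 10_000 else "control"

# A user is sticky to one arm for the life of the experiment:
assert assign_variant("user-42", "new-checkout") == assign_variant("user-42", "new-checkout")
```

Because assignment is a pure function of the IDs, no assignment table needs to be stored, and deleting the experiment code fully reverts the product.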

[00:28:54] Janna Bastow: What's fascinating to me, and you made this point, is that this is not necessarily the most intuitive way to work. It's not the way that we've traditionally worked, certainly not in product or development. A lot of the time we've been held to somebody else more or less coming up with a roadmap and telling us to go run at it, and we're stuck with those constraints. And if you built the roadmap that you thought was a good idea at the beginning of the year, chances are that by the time you get there, it's not the right stuff.

[00:29:17] So we know that this way has got to change, but making that point to our, I was going to say elders, our execs, can be difficult. One thing that I find drives it home is that this experimentation mindset isn't brand new to the business. As a matter of fact, other divisions already do it, like sales.

[00:29:33] For example, they don't sit there at the beginning of the year and say: we're going to close this sale and this sale, we're going to get a million from so-and-so. They don't know that. What they're going to do is employ some people to run experiments, and by experiments I mean picking up the phone, shooting off emails, or doing whatever they do, and following this process of running a lot of experiments, many of which will fail because they'll get hung up on.

[00:29:57] That's what happens. Some of them are going to succeed, and you don't know which ones, but as long as enough of them succeed, you're likely to close the pipeline and get the deals you're hoping for. So you don't know who's going to sign a million-dollar deal at the end of the quarter, but you know that somebody will, because historically that's what's happened when you've run this process of experimentation.

[00:30:15] The same thing goes for marketing. You've probably heard the saying: half your ad dollars are wasted, you just don't know which half. Marketing is able to experiment. They throw things out there and ask: does this work? Okay, this one doesn't work.

[00:30:29] This one does work, so let's do more of that. And we're not asking for anything different. We're just asking to be experimentation-driven here in the product sphere as well. We'd love to hear from people in the chat if anybody's struggled with making this point, or if anybody's got tips on how they've convinced their company to move to a more experimentation-driven way of working.

[00:30:48] But where would you suggest teams start with this? If you're in a more traditional frame, how would you help someone make the point to their company that they should be working this way? What sort of first steps would you take?

[00:31:01] Erin Weigel: Ooh, that's always very tricky. I think every company is very different.

[00:31:07] So there's no one-size-fits-all solution for getting a company to adopt this particular way of working. But the biggest tip I would give people is to read up on company culture, because when you think about it, it's basically changing the behavior of a large group of people, and there are a lot of interconnected social dynamics involved in getting people to behave in a certain way.

[00:31:29] And one of the best books that I've ever read is called, oh God, Organizational Culture and Leadership, by Edgar Schein. Yes, Brian, thank you. Good one.

[00:31:42] Janna Bastow: All right. For everyone listening, we’ll put that in the show notes and all that sort of stuff. We’ll give you a link to it afterwards.

[00:31:48] But tell us about this book and what we can learn from it.

[00:31:52] Erin Weigel: Oh, yeah. There are a lot of really great case studies in there. First of all, it explains what organizational culture is and how you can influence it, and it gives you a lot of great anecdotes about change initiatives that failed and what made others successful.

[00:32:06] Another really great book that I've read, and that I've used a lot in different types of organizational change, is John Kotter's Leading Change. Dr. John Kotter is, I think, an organizational psychologist slash business school professor, and that book has a really great change model.

It's a model I've leaned on before. I think it starts by assessing the culture you have now, trying to identify who the key change agents are, and how or why the culture is the way it is now. Then you figure out who you need to influence, what systems and processes need to change, and what artifacts need to be created to get people to behave in a certain way.

[00:32:47] So I would start by educating yourself, then figuring out what your culture is now and who shaped it that way, and then trying to influence them.

[00:32:57] Janna Bastow: Yeah, absolutely. And the reality is that how you start making this change, well, it depends. It depends on what kind of pushback you're getting.

[00:33:03] Are they pushing back because they've always worked this way and don't know anything better? Are they pushing back because they think that dictating what you should do will make you work faster and get more done? It doesn't. We know this now. Is it because it's a sales-led culture, and you're stuck working for an agency that's trying to build itself into a product company?

[00:33:22] There are a lot of different reasons why you might get that pushback. It's always good to look at those and then address them individually.

[00:33:29] Erin Weigel: I'm curious to hear from the webinar chat: how many of you run A/B tests or do experiments?

[00:33:37] Janna Bastow: That's a really good question. Here at ProdPad, we occasionally do, but we don't do it relentlessly the way Booking.com does, for example.

[00:33:42] I think that's partly because we don't have the traffic. We've always said that unless we have a statistically significant difference, it's hard to really say anything. But I take your point about using it to tell the difference between more and less impactful features.

So one of the things that we do, and I guess this is like an A/B test: we have a beta group, and we'll see how that beta group gets on with a feature versus the non-beta group who doesn't have access to that change or that area.

[00:34:10] Erin Weigel: What's interesting about statistical significance and having lower sample sizes to work with is that you decide

[00:34:17] what level of impact and how much uncertainty you're willing to shoulder. Those inputs are actually variables. So if you say, I don't have that much traffic, but I think this could have a pretty big impact, you decide what amount of impact you care to see. Let's say you want to see a 2 percent absolute change, but it's a pretty low-risk thing.

So you'd also be happy just to know that, say, 80 percent of the time it would outperform base. You simply adjust your power and your significance level to account for the sample size that you have. And as long as you're hedging your bets in the appropriate direction and making more good decisions than bad, you're growing your traffic that way.

[00:35:03] And then it becomes easier to experiment. So you might have to deal with more uncertainty the smaller your organization is, and only care to see larger effects. But you decide what you consider statistically significant. I think a lot of people believe you always have to have a significance level of 95 percent, or a power of 80 percent, but those are actually variables, and you can change them based on your situation.

[00:35:30] Right. 
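
(Editorial note: the dials Erin mentions, significance level and power, feed directly into how much traffic a test needs. Below is a rough sketch using only Python's standard library and the standard normal-approximation formula for a two-proportion test; the 5% baseline and 2% lift are illustrative numbers, not figures from the webinar.)

```python
from statistics import NormalDist

def visitors_per_arm(p_base: float, lift_abs: float,
                     alpha: float = 0.05, power: float = 0.80) -> int:
    """Approximate visitors needed per arm of a two-proportion A/B test.

    p_base:   baseline conversion rate, e.g. 0.05 for 5%
    lift_abs: smallest absolute change you care to detect, e.g. 0.02
    alpha:    significance level (two-sided); power is 1 - beta.
    Normal-approximation formula; treat the result as an estimate.
    """
    p_test = p_base + lift_abs
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p_base + p_test) / 2
    n = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
         + z_b * (p_base * (1 - p_base) + p_test * (1 - p_test)) ** 0.5) ** 2
    return int(n / lift_abs ** 2) + 1

# The textbook defaults (95% significance, 80% power) need far more
# traffic than a deliberately relaxed, lower-stakes configuration:
strict = visitors_per_arm(0.05, 0.02)
relaxed = visitors_per_arm(0.05, 0.02, alpha=0.20, power=0.60)
assert relaxed < strict
```

Loosening alpha and power shrinks the required sample by a factor of three or so in this example, which is exactly the trade-off Erin describes for lower-traffic products.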

[00:35:30] Janna Bastow: And so these variables, it sounds like this gets pretty deep into some complex stuff. Are there set rules on how to measure this? Are there rules of thumb, or is it all based on learned experience and gut feel that you develop over time?

[00:35:46] Erin Weigel: Sorry, I was just reading Dominic's point in here.

[00:35:48] Can I read it out to everybody and you can ask me that again? 

[00:35:51] Janna Bastow: Yeah, absolutely. Go for it. So Dominic in the chat was talking about how they do experimentation. 

[00:35:58] Erin Weigel: Yeah. So he said: my dev team is usually excited to try experimentation. They really, well, devs love logic. You know what I mean? It either works or it doesn't work.

And I think a lot of them understand this concept inherently. Anyway, you got me on the first sentence: the extra effort is usually the biggest barrier. So true, because devs are inherently lazy people. That's why they're developers. They like to automate stuff.

[00:36:24] They like to write as little code as possible. Also agreed. So the extra effort is usually the biggest barrier, but I've found that once you're over that hurdle, the increased confidence that comes from running experiments is something they really appreciate.

[00:36:36] Agreed. One of the biggest pushbacks I've heard from engineers, beyond the extra effort, is that they say: oh, it makes the code messy. But I've always pushed back and said: then you just clean the experiment up when you're done. It's only there for a little bit of time. You know what I mean?

Yeah. After the experiment, you just clean it up. It's literally not a problem: just slow down a little bit, clean it up, and then it's totally fine. But often devs are just told to build stuff and given a business justification, which does not really inspire full confidence.

[00:37:09] Yeah. So you're really not getting the best out of these engineers. They're typically really brilliant people with a lot of ideas for how to improve the product that product people or business people could never even think of, in terms of reducing errors or warnings, improving performance, or doing something innovative with a new technology. You're not getting those good ideas because they're not empowered to think in this way.

And they're not enabled to just try something to see what happens and whether or not it actually has the impact they're aiming for. Yeah, absolutely. I completely agree with you.

[00:37:44] Janna Bastow: And you made some good points there about developers. One of the ways I've gotten developers on board with A/B testing in the past: there's often this pushback because the work they're creating might get thrown out, right?

[00:37:55] It's this temporary test that they make, and then it's tossed, and that can be disheartening. But the way I've framed it is: that is the work, right? Your work is not how many lines of code end up in the final product. The work is doing these experiments and learning from them.

[00:38:11] And yes, it means throwing stuff out, but it's no different from how our writing team writes stuff and then edits it down. We've got to be able to put things out there, see what works, and not be ashamed of removing features if they don't work.

[00:38:24] Erin Weigel: Yeah, the work is delivering value to your customers that then transfers value back into your business.

That is the work. Yeah. And just like you said, and this is again about thinking counterintuitively, again the main theme: people have a bias in their brains, the sunk cost fallacy. It hurts our brains to have to throw stuff away after we've made it. So it's almost anti-human to work in this way, which is probably why it's so effective. A lot of people don't want to work this way, but the way to make things better is to be okay with throwing out stuff that doesn't work.

[00:39:00] Janna Bastow: Yeah, and being wrong as well. One of my go-to phrases is "I bet," because "I bet" gives you this freedom to take a chance on something, right? I think we do this: I bet that this is going to happen. And the thing is, you're going to be wrong a lot as a product manager, but "I bet" takes that pressure off.

You're not asserting what's going to happen. You are supposing what might happen. And if that bet is wrong, that's fine, because you made three other bets. Now you can go test those ones without feeling like you were in the wrong, or somebody else was in the wrong, for their bets. Yeah.

Erin Weigel: I wonder if, with "I bet," you're just asking a question and seeing what comes of it, yeah, exactly. I'm looking in the chat here too, to see what everybody's saying. David wrote: I've been fortunate to live on the innovation side of the world, and oftentimes the bleeding edge, for most of my career. In all of the teams that I've been a part of or have run, and, by the way, God forbid, the business-developer egos were checked at the door, experimentation, critical thinking, and exploration were the objective, as these teams were tasked with developing brand-new business opportunities. At the very least, A/B/C testing was required at the management checkpoints.

[00:40:05] That's great. It's not very often, and thank you for sharing that, David, that you come to a place where testing is actually required in that respect.

[00:40:13] Janna Bastow: Yeah, those companies definitely do exist, and there are more and more of them. When we were talking about this stuff a decade ago, it was inherently uncool.

[00:40:21] But the way we do product management these days has been reframed around making space for this. It's one of the hallmarks of the Now, Next, Later roadmap. Three columns with Now, Next, Later at the top do not alone make a Now-Next-Later roadmap. The framework is all about being outcome-driven and experimentation-driven, aligned with your objectives.

[00:40:39] And one of the things you can do with it is basically say: here's the problem we're trying to solve, and then ask the question: how might we solve this problem? And there shouldn't be just one idea on there. There should be multiple ideas, multiple experiments you might run in order to solve that particular problem.

[00:40:53] Teresa Torres calls it not falling in love with your idea, and that's why she has the opportunity solution tree. You branch out and ask what other solutions you might try in order to solve that problem. And that's what you're really doing: saying, yeah, we've come up with this potential solution.

But what problem does it solve? If you can't identify that, it doesn't deserve a place on your roadmap. If you can, put the problem on your roadmap and then say: okay, that was one solution, but what are three other ways we might do this? Just spitball things you might try. And some of the changes might not be adding code or adding new features.

[00:41:25] A lot of old-school roadmaps basically just say: here's a bunch of features we're going to add, and when we hope to have them by. And that just ends you up with a whole bunch of stuff built, right? That's not going to help you with this compounding effect that you're talking about. What you're trying to say is: here are three things we might try. With this one, we're going to try changing the feature.

With this one, we're going to take the feature away. With this one, we're going to edit something, just change the feature a little bit. With this one, we're going to change the pricing, or the packaging, or the way we talk about it on the home page. So there are experiments you can run that don't necessarily mean adding new product code, right?

There are so many other ways that you can experiment.

[00:41:59] Erin Weigel: I feel like a lot of the things we do in product and business are because of these cognitive biases that we have in our brains. People like to check things off of their lists because we get a dopamine hit. It gives us the perception that we're making progress, but just because we perceive it that way doesn't mean it actually is progress.

[00:42:23] And like you said, Janna, I love your Now, Next, Later thing, and I would like to propose an addendum to it. Can I? Because I love it so much and I've been thinking about it. Never. Now, Next, Later, Never. Brilliant. Because it captures, yeah, it captures part of a business strategy. People always think about strategy as what you will work on and why.

But I feel like the best strategies also clarify what you will never do, because of XYZ strategic reasons. So I'd like to add a Never column.

[00:42:56] Janna Bastow: I love that. And actually, that's how we've seen people using the app, because we have the Now, Next, Later columns, but we also have a hidden column off to the side called Candidates.

And we'll see people making use of that to say: this is the stuff that's not on the roadmap. On the flip side, we have the Completed roadmap, because it's all about making sure you're actually reaching your outcomes. For everything you put on the roadmap, you have to say what the target outcome is.

[00:43:20] And then once something's finished on the roadmap, you don't want it to just fall off. You want to put it somewhere where you can say: here's what actually happened. This is where you go through the test results and tot up what worked and what didn't, so you can say: was this a success, or was this a failure that we need to scrap and try again?

[00:43:36] Yeah. 

[00:43:37] Erin Weigel: Do a serious retro.

[00:43:39] Janna Bastow: Yeah. So I love that sort of usage of the roadmap, where you're identifying what needs to happen, what doesn't need to happen, what's winning, and what's not winning.

[00:43:47] Erin Weigel: Yeah. I see there's a comment in the chat here. It said: great North Star, delivering value to your customers, which in turn delivers value to your business.

Yeah. I really do consider it that. And that's another big theme in my book: the concept of the value cycle, which is literally the point of all commercially driven work. Most of us do commercially driven work; if you work in the nonprofit sector, they have their own value cycle as well.

[00:44:17] But it's really about delivering value, and finding those things you can do for your customers that then transfer value back into the business, which then transfers value back to your employees. So when you deliver value to your customers and then to the business, ideally, if the system is working appropriately.

Yeah, capitalism breaks down so often, but you should be reaping the rewards of actually moving that value cycle forward for your customers and your business.

[00:44:47] Janna Bastow: Amen. Absolutely. That's a really good way of thinking about it. So, there's a question from Sylvia in the Q&A area. If anybody else has other questions, drop them in there.

But Sylvia asked: how do you measure the impact of larger features, the ones that can't be measured as easily as A/B testing a content change?

[00:45:05] Erin Weigel: Oh yeah, larger features can also be tested. I've tested really large features before, and I actually think the larger the feature, the riskier it is for your business, because it's making a new part of a flow, or taking up real estate that could be spent on the thing that actually delivers the core value of your product.

[00:45:28] So I think larger features should be tested as rigorously, if not more rigorously, than the small content changes or little tweaks, because they can be really high risk. You never really know what delivers the value, which is why you test. And if you're going to invest a lot of time building something, maintaining it, and marketing it, it should be doing what you expect it to do.

And I've definitely released large features in an A/B test before, run it for a couple of months, and then crunched the numbers afterwards to see: did this have the impact we thought it would have? Should we release this large feature to the rest of our customer base, or should we toss it in the trash can and try something else instead?

[00:46:19] So that's the way to answer that question: measure it in the same way you'd measure any A/B test.

Yeah. Okay. And that makes sense. Maybe also more work to make sure that you're keeping your control and treatment groups properly isolated.
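
(Editorial note: the "crunch the numbers afterwards" step is, at its simplest, a pooled two-proportion z-test, whether the change was a small copy tweak or a months-long large-feature rollout. A sketch in standard-library Python; the conversion counts below are invented for illustration.)

```python
from statistics import NormalDist

def ab_test_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a difference in conversion rate (pooled z-test).

    conv_*: converted visitors in each arm; n_*: total visitors per arm.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)           # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# Hypothetical numbers after a couple of months of exposure:
# control converts at 5.0%, treatment at 6.5%, 10,000 visitors each.
p = ab_test_p_value(500, 10_000, 650, 10_000)
assert p < 0.01  # very unlikely to be noise at this sample size
```

The same function works for a large feature; the practical difference is, as the discussion notes, keeping the two groups properly isolated for the whole run.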

[00:46:36] Janna Bastow: Yeah, absolutely. Thanks for the answer there. And we've got another one here from Amadreza, who asked: I have the same challenge with the dev team about experimentation when the culture of the business is just to deliver roadmap items with rigid due dates. It's really hard to persuade the team to handle experiments where the results will shape the solution.

[00:46:56] What do you suggest to change this culture? 

[00:46:58] Erin Weigel: Do you know what I would maybe do? I would put it back on them and ask what they would like to test. I guess it also depends on the devs, and there are so many "it depends" answers, but they seem like curious people.

Sometimes developers just want to make their own mark on things. So maybe ask: what's a problem that you've experienced? Or what's something you could potentially see going wrong? Let them come up with the ideas, because then they have a stake in the success of the experiment. And if it didn't succeed, you can ask: should we still go in that direction, now that we know it didn't work, or didn't move the needle in the way we thought it would?

[00:47:36] So I think once people get the experimentation itch, or bug, they want to do more of it, and they start to ask: do I actually know that, or is this the right thing to do? And then it becomes a more natural way of working for them. That's how I'd handle it.

Maybe let them come up with some ideas, or spitball, or whatever.

[00:47:53] Janna Bastow: Yeah, I love getting them involved with what the actual problem is, right? So if the business is saying, we need you to go do these things: okay, great. Your job as a product person is to go back and ask: why are we doing these things?

What problem is it solving? What are we trying to do here? Help me understand. Because maybe the things they've outlined are, in fact, the right things to go do. You don't know. Maybe they've done their discovery. Probably not. But we want to get to the heart of it and figure out why they think these things are going to be impactful.

[00:48:19] And once you have the why, or the problems that need solving, you can take that back to the rest of your team and say: the problem we're trying to solve is this. One experiment is the thing we're being told to do by a certain date, but what other things might we do to get there faster or more effectively? What could we try against this thing to see if it's in fact the right thing? How can we prove that this is, or isn't, the right thing to do?

And what other things should we be considering? That's how you can get your developers involved: if they see the problems, they know why they're working on stuff. And you might end up validating everything you've been told to do, and you build exactly what you've been asked for.

Chances are, you're going to learn things along the way, and it's going to provide more value for the business.

[00:48:59] Erin Weigel: Blake had a really interesting comment in the chat over here. He said: Never on the roadmap is really interesting. It's effectively representing all the rabbit holes and distractions we hopefully consciously avoided, to be compared to the done list.

With the exercise of comparing what outcomes we delivered in the last two quarters versus the outcomes, features, and shiny objects we didn't spend time chasing. And honestly, Blake, I think you nailed it on the head with adding the Never there. In businesses, you'll often see people come in and say: oh, we could do so much if we just sold XYZ as well.

You know what I mean? And then they spend time writing a business case and trying to pitch that idea or get buy-in for it. And they probably don't know that 50 other employees came in before them and had the exact same idea; maybe they even did the discovery. But when you put a Never on there and say, these are the reasons why, that lets them know: okay, yes, it is never, and for these reasons. And then you can say: but please focus on this instead, for these other reasons.

So, like you said, it shuts down those conversations, gets rid of distractions, and keeps people laser-focused on delivering tangible value to your customers regularly.

[00:50:16] Janna Bastow: Yeah, absolutely. And you used a turn of phrase a few minutes ago, when you were talking about being data-informed. I love that turn of phrase, because I see it as different from data-driven.

Do you see it that way as well? How does that differ for you?

[00:50:28] Erin Weigel: Oh, yeah, totally. I intentionally use data-informed. It's second nature to me at this point, because data-driven makes it sound like the data is making the decision for you. And I am very intentional that data makes no decisions. It should not be making decisions, because data itself has no understanding of the why behind the results

that it's reporting, basically. At the end of the day, human beings, capable of feeling, thinking, empathy, and of weighing multiple forms of evidence all at once, are ultimately the ones that make the decisions. So being data-informed means being able not only to take into account

yes, that hard number that got returned, but also to counterbalance it, to weigh it against maybe some other qualitative data you have, and then to make a highly informed, intentional decision that benefits the company in both the long and the short term. Ultimately, the goal is to make highly informed, optimal decisions.

And just following the data is not you thinking. This is really about critical thinking and making highly informed, optimal decisions.

[00:51:43] Janna Bastow: That is wonderful advice to end on. We're running out of time here. But Erin, do you want to give us a headline on where we can learn about your book, when it's coming out, and all that sort of stuff, where we can learn more about you?

[00:51:53] Erin Weigel:

[00:51:54] Yeah, so I have a website that's probably a hot mess at the moment, because I'm currently making all kinds of changes to it, but it's erindoesthings.com, because I do a lot of different things. And my publisher is Rosenfeld Media. They make a lot of amazing user experience books, which is the only reason I pitched them, because I grew up with them.

[00:52:15] I learned so much from their books. So check out rosenfeldmedia.com as well, because that's where my book will be. It'll also be on my website, and you can check out all the other awesome books you can learn from as well.

[00:52:27] Janna Bastow: Awesome. That's really good to hear. Thank you so much. And before we go, I just want to let people know about the next webinar that's coming up.

It's not going to be one of our outside experts coming in; it's going to be me and one of my colleagues talking it through. We're going to be talking about how to stop your roadmap from being derailed, and how to avoid falling into what I call the agency trap: when you take on that bit of custom work for that one client because they're willing to pay a little bit extra, and you're like, that's fine.

And then you do it again and again, and you never seem to have time to work on the important things that are going to appeal to the wider market and help you go from here to there. I call that the agency trap. I've been in it myself at a previous company, so I'm going to be talking about how to spot it and how to get out of it.

There's more on this topic that we've been talking about, the Now, Next, Later stuff. We've got a guide on how to get out of the timeline roadmap and how to move into the Now, Next, Later way of working. So do check that out.

On that note, I want to say a big thank you to everybody. Thank you for jumping in on the chat and for your questions. I'm sorry we didn't get to everything; there was a flurry at the end there, but that always happens. And Erin, thank you so much. This has been really good insight, and I've learned a ton.

[00:53:41] Erin Weigel: Thank you so much for having me.

Watch more of our Product Expert webinars