
Product Management Webinar: Product Idea Testing

Test Your Way to Success: Mastering the Art of Product Idea Testing with David Bland

Watch our latest webinar with special guest David Bland, CEO and Founder of Precoil, and host Janna Bastow, CEO of ProdPad and inventor of the Now/Next/Later roadmap, as they guide you through the process of testing and validating product ideas to increase your chances of success. If you’re looking to reduce the risk of failure for your new product, then this is the webinar for you!

About David Bland

David Bland helps people test business ideas and is the co-author of ‘Testing Business Ideas’ with Alex Osterwalder. David pioneered GE FastWorks with Eric Ries, coached emerging product teams at Adobe, and even helped Toyota apply lean startup practices. Before his transition into consulting, David spent over 10 years of his career at technology startups. He stays connected to the startup scene through his work at several Silicon Valley accelerators.
You can get his latest book on testing business ideas here: https://davidjbland.com/my-book/.

Key Takeaways

  • How to systematically test product ideas to reduce the risk of failure
  • How to integrate Assumptions Mapping and other powerful lean startup-style experiments
  • Practical tips for making experimentation a continuous, repeatable process
  • The importance of prototyping and experimentation
  • And so much more!

[00:00:00] Janna Bastow: Welcome to the Product Expert Fireside that we’re running here at ProdPad. My name is Janna Bastow, and I’m joined by David Bland. Today’s session is going to be about testing your way to success: mastering the art of product idea testing.

Before we jump into the heart of the content, I just wanted to introduce ourselves. ProdPad is a tool that we built when we were product managers ourselves. I used to be the head of product at a company in London, where I needed a tool to help me do my own job. We needed something to help us keep track of experiments and feedback and objectives and all this other stuff we were trying to keep our hands on, so we built it. ProdPad’s now used by thousands of teams around the world. What it does is give you control, organization, and transparency; it helps you create a single source of truth for all of your product decisions by helping you build a better roadmap, gather your customer feedback, and prioritize the ideas and experiments that you’re working on. It’s something you can try completely for free. We even have a sandbox mode you can try out yourself if you head to sandbox.prodpad.com, with example data, including lean roadmaps and OKRs and other pieces that fit together, so you can see how it all works.

Our team is made up of product people and was founded by myself and Simon Cast. We’re both also founders of Mind the Product, so we’re product inside and out. We’d love to hear your feedback: what’s working and what’s not. From there, we can constantly improve the product. We do releases every week.

And speaking of releases every week, here’s one that we did just the other week. Many of you have probably been playing with the latest stuff you can do with GPT and the other AI tools out there these days. A new feature that we’ve just launched is the ability to fill out your idea details using AI. Give it the high-level details and it’ll fill out things like the target outcomes, give suggestions on what problem you might solve, and help you fill out your user stories. It’s something you can review and make changes to, so it just speeds up that process. ProdPad is very much about trying to reduce that grunt work for you.

Another one that we just released is an AI product coach. The first version of it allows you to take any idea from your backlog and compare it to your product vision. It basically looks at your idea, tells you whether it’s on track with what you’re trying to build on the whole, and gives constructive feedback on whether it fits or not. Steve Daniels in the chat there said that he’s been playing with it today and that it’s very good, particularly the user story stuff. Give it a try; we’d love to get your feedback on it. It’s all brand new and in beta, but if you start your trial today, you’ll be able to start playing with it right away.

So that’s enough about ProdPad. I am really excited to introduce our special guest for today’s webinar, which is David Bland. As a leader in the world of business innovation, David is renowned for his insights on how to test and validate business ideas. He is the CEO and founder of Precoil, a company committed to helping other businesses make informed and strategic decisions about product ideas. His experience spans a diverse set of industries and roles, from pioneering GE FastWorks with Eric Ries to coaching product teams at Adobe and even advising Toyota on lean startup practices.

David has a unique perspective on business innovation. Before transitioning into consulting, he spent a decade in technology startups, where he nurtured an instinct for understanding what makes a successful product. He stays close to the startup pulse through his work with several Silicon Valley accelerators. He co-authored the book Testing Business Ideas with Alex Osterwalder, which has become a key resource for business leaders and innovators alike. And he’s known for delivering engaging, thought-provoking presentations and workshops that challenge audiences to rethink their approach to innovation and product development.

I think we’re really lucky to have David here with us today. David and I were chatting earlier about when we first met, which seems to have been at a meetup in London 10-something years ago, and about some of the takeaways we had from that. So it’s great to have him joining us for this conversation. Thank you, David, for joining us. And without further ado, let’s delve into the mind of David J. Bland.

[00:05:04] David J. Bland: Yeah. Thanks for having me. It’s great to catch up [inaudible 00:07:57].

[00:05:06] Janna Bastow: [laughs] Absolutely. Could you share a bit about the journey that led you to start Precoil and to where you are today?

[00:05:15] David J. Bland: Yeah, my background initially was in design. I went to school for design and got pulled into the whole dot-com era, so dating myself a little bit there. I thought I would just retire somewhere on an island two years into the dot-com startup scene, and that didn’t work out, obviously. But I learned a lot. The first startup I joined, we thought we were a business-to-consumer startup and we ended up being a business-to-business startup, with all the growing pains of doing all that. I was there for almost seven or eight years before we were acquired, I think for $16 million back in the day.

I learned a lot from that experience, and I went to other startups where we just kept persevering no matter what. We were like, “Oh, that customer doesn’t get us. That’s okay. We’re gonna keep building this stuff.” We cratered a couple of companies as well. By the third one, as I was packing my stuff up in a cardboard box, I thought, “Well, maybe I can help other companies learn from my mistakes.”

I moved to the San Francisco Bay Area and really started coaching and advising a lot of big companies at the time, while also staying connected with startups. I work with some VCs in Silicon Valley and some accelerators and really just try to share what I’ve learned and help them test stuff, usually from idea to product/market fit, which is where I’m focused. Fast-forward to today: I’m still doing it, still enjoying it, still working on a lot of different cool ideas. And they’re not all software-related; I work in a lot of different industries. So it’s been a really fun journey.

[00:06:33] Janna Bastow: Yeah. That’s excellent. That’s really cool. You mentioned that you’re working in a lot of different industries. Could you tell us a little about the similarities in testing across different industries, and some of the differences? Namely, I’ve talked to people who work in hardware who think that software is completely alien. When it comes to idea testing, is it?

[00:06:55] David J. Bland: I think a lot of the principles remain the same. I love the framing of desirable, viable, and feasible, and I’ve incorporated that into assumptions mapping in a lot of my work. No matter if I’m working on something in space or something here on Earth, [laughs] hardware or software, I find you still have to answer those three questions, right? With desirability: do they want this? Is there some kind of need, a job to be done or unmet need you’re trying to fulfill, a pain you’re trying to solve, or a gain you’re trying to create for a customer?

With viability, it’s a lot of should-we questions: can we do this with enough impact or revenue to keep it sustained and keep it going? And then feasibility: can you do this? It’s not just technical feasibility; it’s also the regulatory, governance, and compliance things you need to work out, especially with things like AI. With stuff emerging now, even if you can do it, with regulatory shifts and everything it’s really challenging to address all of that as well. So regardless of whether it’s hardware or software, I find that framing still holds up, no matter what I’m working on. Now, the differences: obviously, with business-to-consumer, you have many more customers you can test with. With business-to-business, you might only have 8 to 10, or maybe 15.

So I’m finding with the B2B work that I do, especially hardware, it’s a lot of preference and prioritization work. It’s a lot of jobs-to-be-done work. It’s a lot of discovery, trying to figure things out and co-create with your limited sample set of customers. There’s not a lot of, say, “let’s run a crowdfunding campaign” or stuff like that. But a lot of the principles apply; it’s just the practices that vary a bit between B2C and B2B, and between hardware and software.

[00:08:36] Janna Bastow: Yeah. Excellent. And actually I’m really glad that you mentioned assumptions mapping right up front, because in the previous webinar we did earlier this year with Itamar Gilad, he mentioned assumptions mapping and then said, “Uh, but you’ve got David Bland on. He can tell you about that.” [laughs] Can you walk us through what assumptions mapping is? ’Cause I’m not sure everyone will have heard of it.

[00:08:56] David J. Bland: Yeah. It was kind of a winding journey for that as well. Here I am inside these giant companies, getting them excited about experimentation.

[00:09:03] Janna Bastow: Yeah.

[00:09:03] David J. Bland: But we didn’t always experiment on the things that were best for reducing risk. And so you have to be careful, ’cause, you know, I was just happy to get people excited about running some experiments, like landing pages and ads and clickable prototypes and all this. But in the end, I was having a hard time tying it back to the riskiest assumptions.

What I ended up falling into was working with Jeff Gothelf and Josh Seiden, who I love, and Giff Constable in that crew. A lot of the Lean UX book has this two-by-two, and I certainly learned a lot from Jeff and Josh and Giff. I was also learning a lot more about design thinking, digging deeper into the Stanford d.school and IDEO and Larry Keeley’s work.

What tended to happen was I ended up taking all these desirable, viable, feasible themes and helping people extract their risk in a very specific way. And then with the mapping, getting to: what’s the most important assumption we’re making with the least amount of evidence? I’ve open-sourced it; it’s part of Google’s Design Sprint Kit. They reached out to me and asked if they could include it, so if you go to Google’s open-sourced Design Sprint site, it’s there; I can also send you the link. It’s also in the Testing Business Ideas book.

What I found is that it’s great for a structured conversation about risk, because in the Testing Business Ideas book we have 44 different experiments, but they’re all tagged with desirable, viable, feasible, and a bunch of other different things, and I want to help people choose the experiment they can run. So I find that mapping exercise is almost foundational, because you need to have an idea, at least with your team, of the most important things with the least amount of evidence: “Oh, let’s start there with our experimentation,” rather than just doing things that are fun or trying stuff to see what happens. I want them to really narrow in on how they reduce their risk.
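The mapping exercise David describes can be thought of as plotting each assumption by how important it is versus how much evidence you have, then testing the important, low-evidence ones first. Here is a minimal illustrative sketch of that prioritization; the numeric scales, field names, and example assumptions are hypothetical, not from the book:

```python
# Sketch of assumptions mapping as a prioritization: score each
# assumption by importance and evidence, then surface the most
# important assumptions with the least evidence first.
from dataclasses import dataclass

@dataclass
class Assumption:
    statement: str
    risk_type: str   # "desirable", "viable", or "feasible"
    importance: int  # 1 (minor) .. 5 (could kill the idea)
    evidence: int    # 1 (no evidence) .. 5 (strong evidence)

def riskiest_first(assumptions):
    """Sort so high-importance, low-evidence assumptions come first."""
    return sorted(assumptions, key=lambda a: (-a.importance, a.evidence))

backlog = [
    Assumption("Customers will pay $20/mo", "viable", importance=5, evidence=1),
    Assumption("We can ship on our current stack", "feasible", importance=3, evidence=4),
    Assumption("Users hit this pain weekly", "desirable", importance=5, evidence=2),
]

for a in riskiest_first(backlog):
    print(f"[{a.risk_type}] {a.statement}")
```

The viability assumption surfaces first here because it is both high-importance and low-evidence, which is exactly the quadrant David suggests starting your experimentation in.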

[00:10:47] Janna Bastow: Yeah. And it’s actually really interesting that you talk about risk, ’cause I was having a conversation with a product person recently about how product people don’t talk enough about risk, the risk we take by building the wrong thing, and how we reduce that risk. It sounds like your book talks quite a lot about that. Is that right?

[00:11:05] David J. Bland: It’s more that I find it’s stuck in people’s heads; they’re worrying about it at night when they go to sleep. [laughs] And yet we don’t make space in our organizations to really write it down and talk about it. Obviously, it needs to be a safe space where we can talk about the things we’re worried about in our work. But I find we’re so busy planning and sprinting, and then finding more stuff to plan and more stuff to sprint on.

We need to make space to say, “Hey, let’s look at that backlog, let’s look at our roadmap. What kind of big risks are we taking around customer needs, around whether we’re able to make money with this, around whether we’re able to execute? Can we document those and work that into part of what we do every day?” ’Cause it’s not stuff you do on nights and weekends. It’s not stuff you do in your spare time. Discovering what to deliver should be part of the work.

So I find that anything we can do to structure that, to help people make space for it and capacity-plan for it, is needed. Otherwise it almost gives us a false sense of security: “Oh, as long as we hit our sprint goals and all that, we’re good.” And I’ve learned the hard way: I’ve worked on some amazing, highly functioning teams where we just efficiently built stuff that nobody cared about. [laughs]

[00:12:10] Janna Bastow: Yeah.

[00:12:10] David J. Bland: So that’s why I’m so passionate and drawn to this work.

[00:12:13] Janna Bastow: Yeah. I’ve had that same experience. I remember working at a startup some years ago that had just been funded, so it had grown quite a lot. It went from one team of developers to the equivalent of four. They broke it down into four different dev teams and got a scrum master to help organize things. And you know what? Granted, I’ve never seen a team get whipped into shape so much and just get everything out the door, and yet still build all the wrong stuff, because they were so focused on becoming efficient at building. There was no time, no effort, no permission given to spending time on the discovery work, which wasn’t really part of the vocabulary back then. That’s something I see time and time again: teams tend to lean towards the easier-to-measure things, the efficiency of getting things out the door, right? The number of points done, or the number of features or user stories finished, or whatever else, and lean away from the more intangible, the time spent in discovery and testing. Do you find that as well?

[00:13:19] David J. Bland: Yeah. I think from a company point of view, there’s almost a transactional level of trust. It’s “Oh, I trust you to deliver this thing at the end of the sprint and have it work and all that.” And while that’s a great starting point for trust inside organizations, I find there needs to be a deeper level. We say accountability, but when people hear the word accountability, they think, “Oh, I’m held accountable to releasing a thing.” There’s a different definition of accountability, right? It’s being able to give an account of how you’re making progress. And I think with discovery work, the latter works better, because you’re trying to give an account of, “Hey, we’re trying to reduce uncertainty here, we’re trying to find out if we’re on the right track, and we’re trying to give an account of whether we’re finding a signal with our value prop with these customers, their willingness to pay, and our ability to execute.”

I think we’re slowly starting to shift that leadership style a bit, but we have a long way to go. In most of the companies I come into, there’s still this transactional level of trust: you delivered a thing on time at the end of a sprint, and therefore I trust you to deliver more stuff. We need to get below that. We need to go deeper on outcomes. We need teams to feel safe enough to give an account of how they’re making progress versus just being held to delivering things.

[00:14:39] Janna Bastow: I absolutely love that framing. I haven’t heard accountability looked at that way before. And you’re right: if you’re giving an account of what you actually did, I think it changes that habit away from teams just saying “Here are the things that we delivered” and towards “Here are the things that we learned. Here are the things we did in order to learn stuff.” It’s a little bit like math class years ago. Remember when you were told that if you just put the answer at the end, you’d get some points? Maybe that answer was right, maybe it wasn’t, but you’d still get half the marks, or a good chunk of the marks, if you showed the work along the way. I think product management is partly about showing the work, showing your learnings, because people learn from those learnings.

[00:15:24] David J. Bland: And I think we severely lack a way to communicate what we’ve learned inside organizations, and we haven’t necessarily helped leaders understand what questions to ask to probe into what a team has learned. So for example, let’s say you’re doing something like assumptions mapping. Usually early on in an idea, we’ll have a lot of desirability risk: “Here are the things we have to prove, or we’re done.” It could kill this entire product idea or feature idea.

That map should look very different three or six months in, because you’ve learned and your risk has moved around. Some things you’ve learned about now have more evidence behind them; new risks might appear. So something as simple as showing the before and after, showing how things have changed, I think, helps communicate what you’ve learned.

And when you think back to the scientific method, I feel the whole lean startup movement is probably closer to the social sciences than anything else. But if we can communicate what we’ve learned, “Oh, here’s the risk we had and the hypotheses, we ran some experiments, this is what we learned, this is how it changed our actions and our strategy, here’s how we put it into action,” I think we need to do more of that, socializing that inside a company, versus just delivering things and then measuring “Did we deliver those things or not?” Getting to what we’ve learned and how it’s impacted our risk, there’s a lot of work still to be done there, but I think some of this is headed in the right direction.

[00:16:42] Janna Bastow: Yeah. Absolutely. That’s the ethos behind what we call the completed roadmap in ProdPad. We’re known for the Now/Next/Later format of the roadmap, but a roadmap shouldn’t just be something that looks out into “Here’s what we’re going to do.” What we’ve done is create a space off to the side where you can take stuff and say, “Here’s what we did,” and not all the things you did will have been successes. So it gives us space to say, “Here was what we wanted to do, here’s what we actually did, and therefore here’s what we learned out of this thing.”

So you’ve got this history of all the things that you’ve tried, what worked and what didn’t work in the past: this log of decisions made, this log of learnings over time. And it just gains value over time, ’cause when somebody says, “Hey, what did you actually do as a team?” you’re able to look back and say, “Here’s what we learned last week,” or “Here’s what we learned last quarter,” or last year, or whatever else.

[00:17:32] David J. Bland: Yeah. I had this with some of my Silicon Valley clients, where we were trying to work some of the assumptions we were making into the roadmap: “Are we in the problem space or the solution space?” I’m by no means an expert in roadmaps, so I was always working with product people, trying to understand how we weave in all this discovery work we’re testing, and have leadership understand that this isn’t just a series of dates and features we’re releasing, but that there’s something we’re going to learn along this journey, and it’s going to shape what we deliver.

And I’m just glad to hear other people speaking that language, [laughing] ’cause it gets really dangerous, no matter what methodology you’re using, if you just have a list of things you’re delivering on, assuming it’s all fact and that you’ll be successful if you just deliver everything listed.

[00:18:12] Janna Bastow: Yeah. Absolutely. One thing that gives me faith is the fact that the language has changed so much. When we first met about 10 years ago, people’s attitudes were very much that a roadmap was a list of things to do, a list of deliverables, very much based on just getting things out as quickly as possible. Whereas now the attitude is much more “Hey, here’s the stuff we’re trying to achieve. Here are the problems we’re trying to solve.” The concept of the outcome-based roadmap has become its own thing, and I think that ties very much into this experimentation-led way of working.

[00:18:45] David J. Bland: Yeah. It’s so awesome, for me anyway, to hear a lot of the lean startup language permeate the product community, ’cause when I hear people on stage at major conferences, they’re speaking about “Here’s a hypothesis we had, and we have to go test it.” Granted, I still think there are [laughs] some challenges with the build-measure-learn framing, [laughs] which we can get into if we want.

[00:19:05] Janna Bastow: Yeah.

[00:19:06] David J. Bland: But, uh, I do, I feel hopeful that language has permeated, which was a pretty small niche like startup. led some of the, the meetups, lean startup circles in San Francisco, And it was very small at the beginning, It was just like mostly startup folks and very niche stuff. And then to see that kind of get beyond that and permeate into “Oh, here’s how it could really help product managers and, and the product community,” it’s been really cool to see that over the last, know, 5, 10 years.

[00:19:30] Janna Bastow: Yeah. Absolutely. And you went there! I’d love to hear your thoughts on the build-measure-learn framing: what’s strong about it and what’s not.

[00:19:38] David J. Bland: Oh, yeah. I keep ranting about it. [laughs] One of the challenges I have with the framing is that people start with build, this idea of “I have to build to learn,” and often that’s not the case. To give Eric Ries credit, early on he did describe it as a loop you have to think about in reverse. So you start with what you need to learn, then what you need to measure to learn that, and then what you need to build in order to measure that.

But I think that’s not how most of us interpret it, or at least not the companies I go into. They see build-measure-learn and they start with “We’re gonna start building stuff.” That in itself leads to almost a build-to-build mentality versus a build-to-learn mentality. So they don’t really have an assumptions map and a view of their risks from which to say, “Oh, we have to do, let’s say, a concierge MVP to learn about this thing.” It’s more, “Let’s start building the first version of the thing, get it in front of customers, get their input, and then change it.”

And I think there’s still some confusion in the community overall about that approach. I’ve read articles even a few months ago where people said, “That’s why we don’t use lean startup, because it’s about getting a really crappy version of the product out in front of people and then getting feedback and making it better.” I’m like, “Uh, I don’t know if that’s really what it’s about.” [laughing] I think it’s more about what you need to learn. That could be through interviews. That could be a letter of intent. That could be a concierge MVP. It could be all these different things. And I feel like we still start with build quite often. That’s the major challenge I have with the loop as it is today.
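Running the loop in reverse, as David describes, can be sketched as a small planning step: fix the learning goal first, derive the measurement, and only then pick the cheapest artifact that lets you take that measurement. The function name, cost scale, and example values below are hypothetical illustrations, not anything prescribed by the book:

```python
# Hypothetical sketch of planning build-measure-learn in reverse:
# decide what to learn, how to measure it, and only then choose the
# cheapest "build" that supports the measurement (often not code).
def plan_in_reverse(learning_goal, metric, candidate_builds):
    """candidate_builds: list of (artifact, relative_cost) pairs."""
    artifact, cost = min(candidate_builds, key=lambda b: b[1])
    return {
        "learn": learning_goal,
        "measure": metric,
        "build": artifact,
        "cost": cost,
    }

plan = plan_in_reverse(
    learning_goal="Do ops managers feel this pain weekly?",
    metric="share of 10 interviews describing the pain unprompted",
    candidate_builds=[
        ("working prototype", 40),
        ("interview script", 1),
        ("landing page", 5),
    ],
)
print(plan["build"])  # cheapest way to learn, not the first thing to code
```

In this sketch an interview script wins over a prototype, which mirrors the point that an interview or a letter of intent is often a cheaper path to the same learning than building the product.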

[00:21:03] Janna Bastow: Yeah. Totally. I get that. And if you’re going to take the framing of it, build is a strong word. It doesn’t necessarily mean building a whole bunch of code quietly in the corner and shipping it months later. Build could be putting together a very simple experiment that allows you to start learning from something. And that experiment, that MVP, could quite literally be taking a bit of copy and putting it in front of somebody, or making a survey. That’s your first build, and then you can measure from that. But a lot of people think of build as “Great, let’s crack open GitHub and start coding.” And that’s, I think, where the barrier comes in.

[00:21:38] David J. Bland: Building is often the most expensive way to learn. I know the cost has come down with a lot of low-code and no-code solutions in software today, and to an extent in hardware with 3D printing and things like that. But let’s say you put your first version of a product out in the world, and people aren’t using it, and you don’t know why. In the end, you have to reverse-engineer all of this anyway and get back to “What was the value prop of the thing we built? Who were we focused on? What job, pain, or gain did it solve for?” You end up doing a lot of the work after the fact, and it becomes more anxiety-inducing because you already have a thing in the world that people aren’t using. So even though the cost of building has come down, it still ends up costing you a lot, because you have something out there that people aren’t using and you’re trying to figure out why.

[00:22:27] Janna Bastow: Yeah. Absolutely. So, any tips for the product people here on articulating that cost, or that risk, to the execs on their team?

[00:22:39] David J. Bland: Yeah, it’s interesting, the language I use. Some of that’s been shaped by my co-author Alex Osterwalder over the years, ’cause I partner with him on stuff, and I had the pleasure of partnering with Eric Ries early on, too. What I’ve learned from them is that when you talk about the voice of the customer, you have to use words that your executives get and use daily, right?

What I wouldn’t recommend is going in and saying, “We’re gonna use a systematic approach of identifying and extracting our assumptions, refining those into hypotheses, and then running desirable, viable, feasible experiments in discovery and validation. And then we’re gonna learn from those and use that to update.” They’re going to tune you out. Plus all the canvases you can throw into that conversation.

So what I’ve been doing recently is really asking what they’re worried about. And typically, in the back of your head, you can map what they’re worried about to either desirable, viable, or feasible risk. Then I talk about making investment decisions: “Hey, how do we make a good investment decision on this? What are you worried about in making that investment decision?”

So I speak a lot about how we de-risk the thing we’re working on: “Wow, we’re really worried about this thing. I wonder how we could test that to learn sooner rather than later.” A lot of the terminology and language I use when talking to executives is around investment decisions, what they’re worried about, and how to de-risk what we’re working on. Those are words they’re comfortable with, and it doesn’t make them super defensive the way throwing around a bunch of terminology about applying the scientific method to business can. They can just tune you out, even though you’re well-intended and trying to help the company, because they have a hard time understanding what you’re trying to communicate. So investment decisions, de-risking: trying to use that kind of terminology, I find, helps quite a bit.

[00:24:24] Janna Bastow: Yeah. That makes a lot of sense. As product people, we can often talk about what customers want until the cows come home. We can talk about how to build it, right? We are builders, a lot of us. But that managing upwards often isn’t as natural. So it’s really helpful to have tips on how to speak to our execs, how to find the words that resonate with them, ’cause fundamentally they are stakeholders just like our customers are. We can be learning from them and doing discovery on them too, to understand their needs and desires and build the product towards what the business needs the most.

[00:25:03] David J. Bland: Yeah. And not everyone’s going to come on board, but I can say from experience, I’ve been working with one of the largest companies in the world for over a year now, just coaching them, helping them test new ideas beyond their core, more adjacent ideas. And what I have at the moment, which is pretty amazing, is their CTO really asking all of their teams, “Show me your experiment plan. What are you testing? Explain your risk to me and how you’re testing it.”

For anything new that’s not part of the core, that plan is now a requirement for getting funding and going through their funding process, which I won’t get into. But it’s really cool to see that language starting to permeate a bit of the C-suite, where they’re asking, “Oh, show me what kind of desirability hypothesis you have for this.”

And as the teams practice that, I find the leadership team also gets better at asking questions. They get better at telling which plan was thrown together at the last minute and which one the team has really been working on, actually doing the work, talking to customers, doing the discovery.

And I wouldn’t say it’s the norm yet, but from personal experience, at some of the companies I’m coaching, the C-suite is asking, “Show me your plan for how you’re gonna de-risk this.” And it’s not just a giant Gantt chart or sprints and sprints of work. It’s desirability, viability, and feasibility risks framed up as “Here are the experiments we’re gonna run and the proof we’re seeking.” So it’s really cool to see.

[00:26:25] Janna Bastow: Yeah, that’s a great trend to see. Absolutely. And what sort of things tend to trip up companies who are trying to get to this point of testing more?

[00:26:35] David J. Bland: Oh, there are a bunch. [laughs] One of the big traps I see personally is when you can’t talk to customers. I feel as if our industry swings back and forth like a pendulum. In the early days it was, “Oh, we have to put the customer right next to the team.” Then it became, “No, we can’t bother the team with the customers. We’re gonna put 18 different layers of bureaucracy in front of the customer.” And now the pendulum is swinging back a bit, and we’re trying to bring a closer connection to our customers again.

But if you can’t talk to customers, this is going to be really hard. I’ve also noticed teams get into this analysis paralysis trap, where they just want to analyze and reanalyze and they’re not great at putting things into action, and they don’t timebox it very well. And I see a lot of biases working against us. Not that we can eliminate biases, but mostly I see confirmation bias, experimenter’s bias, and overconfidence bias come into play: we’re just confirming our beliefs so we can check the box and go build what we wanted to build anyway. “Yeah, we did the customer interviews.” [laughs]

It’s like, “Did you look at any of the notes and generate any insights?” “No, we just did the interviews.” [laughs] And then the overconfidence one is probably the next most common I see: we’re so confident we don’t have to test. “Oh, I know the answer, so we don’t need to test that.” So there’s this interesting combination of lack of customer access, not putting what you’ve learned into action, and biases creeping in that holds teams back. Those are all very addressable, but I find that stuff is often working against some of the companies I’m coming into.

[00:28:11] Janna Bastow: Yeah, absolutely. And how does a company test when the space is a really new area? I’m asking for, let’s call it, a friend, because we ran into this problem recently. We were building out things like AI generation for your ideas, and then this AI coach, and we realized, “How do we even begin testing this stuff? What questions do we ask? Is this something that people want, or something people don’t even know they want yet?” We ended up just shipping something because it was faster, but we’d love to get your thoughts on that and whether there’s an alternative way of doing it.

[00:28:46] David J. Bland: Yeah. I still refer back to a lot of Steve Blank’s early work, which helped inspire a lot of the lean startup movement. He had this framing, whether he invented it or not, of, “Your customers have the problem. Are they aware of the problem, and are they actively seeking a solution to the problem?” I love that framing and I still use it a lot in my coaching; even this week I was using it with some teams. I like it because I try to find out where we are on that spectrum with, let’s say, early adopters.

So let’s say it’s something AI-generated: is there any observable evidence of people having the problem you’re trying to solve with this AI thing? And then, is there any observable evidence that they’re even aware they have that problem? That changes your strategy, because if they’re not aware of it, a lot of what you have to do is content generation and awareness building. You’re trying to make people aware: “Hey, this is a problem you have,” or maybe they’re symptom-aware but don’t yet understand the underlying problem.

And then if you go up that stack: is there any observable evidence of them actively seeking a solution to the problem? That’s where you ideally want to be when you’re releasing something new, because I find it’s much harder, and takes much longer, to generate awareness of something and get people through that realization that they need to buy your product.

So if you go after people who are actively seeking, it’s a smaller subsegment, but you can say, “Okay, if they have this problem, where are they going to try and solve it? Are they going online or offline?” Depending on the problem, it could be something you ask your doctor, or something you type into Google. Part of what makes early-stage product discovery and idea testing really hard is that you do have to brainstorm: “Okay, where are these people online and offline, and are they actively seeking or not?” Once you start figuring some of that out, you can start with smaller tests with them. If you don’t put that work in, it’s more like, “We’re gonna launch something and then look for the bright spots.” We’ll see who signs up and try to back our way into some segmentation from that.

And you can do that, but it’s usually a lot of work and there’s a lot of noise. I’ve helped teams launch on Product Hunt before, and if they launch too early, they get thousands of people coming to their landing page, and then you’re asking, “Are you here because you’re really interested in the problem, or just because it was on Product Hunt?”

[00:31:02] Janna Bastow: Yeah.

[00:31:02] David J. Bland: And I find it’s much harder to just put it out there in the world, see who signs up, and then double down on them, because there’s a lot of noise and you have to have the time and ability to go find the bright spots and do some segmentation, but-

[00:31:16] Janna Bastow: Yeah.

[00:31:16] David J. Bland: I like to have that aware-and-seeking framing. I tend to use it quite a bit.

[00:31:20] Janna Bastow: That is great framing. Thanks so much for sharing. And speaking of AI, I’m wondering if you’re keeping a beat on this: what do you think the future of testing ideas is gonna be like?

[00:31:32] David J. Bland: Oh, it’s so hard to predict. I’m a little over 10 years into this work, just focused on it, and I still think we’re at the early adopter stage of a lot of it. I joked back in the day that you’d just press a button and it would generate your whole startup. [laughing] I don’t know if that’s a good thing, by the way. With AI, the way I’m using it right now is more to stress-test things in early discovery.

[00:31:58] Janna Bastow: Hmm. 

[00:31:58] David J. Bland: To see if there’s something I should be thinking about that I’m not. But I’m shying away from using AI for a lot of the generative discovery work yet, because you can type in whatever and it will spit out a persona for you. To an extent, it’ll spit out a value prop canvas, which I’m also intrigued by, because it’s copyrighted and I don’t know why AI [laughs] is spitting out copyrighted material. But whatever, we’ll get to that later. It’ll spit out a business model canvas; it can do empathy maps. And it does it with such a sense of confidence that you feel, “Oh, yeah. If I was gonna start an e-bike sharing business in London, that’s the customer I would start with, and those are all the real problems they have.”

[00:32:36] Janna Bastow: Mm-hmm.

[00:32:36] David J. Bland: And so it almost gives you a false sense of confidence. And then, not to rant here, but if you don’t like the answers, you just hit the Generate button [laughing] and it does a whole new one, and you’re like, “I’ll just tap this button until it gives me the answer I want, and then I’ll go build the thing.” I’m a little worried about that. I know there are some startups working in that space, and some of them are really being dragged for it publicly. I’m withholding judgment, but I find it a little scary that you can just regenerate discovery work until it matches your worldview and then go build. What I do like is asking it, “Tell me what’s wrong with this,” or “What’s wrong with my assumptions?” I’ll often ask that, and it’ll give me some pretty valid points about the shortcomings. So I tend to use it more as “Give me critical feedback and help me think about this a different way” versus “Generate a bunch of user research that I can then claim is all real and go build.” That’s the scenario I’m a little worried about right now.

[00:33:36] Janna Bastow: Absolutely. I saw one the other day that was, like, user research without the users, [laughs] which, of course, is gonna sound convincingly like users. But you’ve got to talk to your users. I don’t have any proof that these AI users aren’t good enough, but surely they don’t have the life experience and the human connection to back up why they feel particular ways. And that, I think, has got to stay at the core of what we’re doing, because we’re still building for humans. We’re solving problems for humans, not for AI.

[00:34:10] David J. Bland: Yeah. And we tend to bake all our biases into AI. And that’s the part that-

[00:34:13] Janna Bastow: [laughs] Yeah.

[00:34:13] David J. Bland: … I’m a little more worried about than anything else: we’re scaling our biases in ways we’ve never been able to scale them before.

[00:34:20] Janna Bastow: Mm-hmm. 

[00:34:20] David J. Bland: And I’ve dealt with this personally. I’ve worked in prison reform and some other things in the past, professionally and on the side. If you take something and bake it into, let’s say, judgment software for a judicial system, and you bake your biases right in there, that’s at scale, right?

[00:34:37] Janna Bastow: Yep.

[00:34:38] David J. Bland: The real-world impacts on human life are pretty dramatic and horrible. So I look at what we do already and think, “Oh, no, we’re gonna take that and scale it even further.” I think we have to take a step back and realize that the people creating a lot of the AI are putting their biases into it. And this is why some of the more controversial stuff I speak out on shouldn’t be controversial. I talk about diverse teams, for example, a lot in my workshops, my coaching, and my writing. Having diverse teams puts you in a stronger spot, because you’re not all thinking alike with the same shared life experience and baking that into a platform or into an AI, right?

If you have diversity on your team, someone can say, “Oh, this is how this could be abused; this is how it could be used to stalk somebody.” If you don’t have those personal experiences, you tend not to include that in the design of the thing. So much like in product, right? We want diverse teams, we want people with different backgrounds, and you’re gonna be more resilient as a result. If we have just a select few working on AI and they’re not very diverse, that’s a little worrying to me.

[00:35:42] Janna Bastow: Yeah, absolutely. I agree with that. And actually, we talked earlier, before this, about what product managers are responsible for. We talked about creating something that’s valuable for the business, desirable to the customer, and technically feasible. Those are the three things we’ve always said product managers are responsible for, but also something that is ethically good, sustainable, accessible. What’s your take on product managers and their responsibilities these days?

[00:36:12] David J. Bland: Yeah. I’ve heard some pretty scary stories, [laughs] I have to say, from the early days of lean startup circles, especially with experimentation. I’m a big believer in experimenting with your customers, not experimenting on them. And this is where I think we can learn a lot from the social sciences, because there’s a lot of groundbreaking work already there, principles we could adhere to and build upon.

So when I think about some of the stories I’ve heard, it’s, “Oh, we don’t have any testimonials, let’s just make some up and put them on our landing page.” Okay, so you’re trying to create a false sense of social proof to get people to sign up, but these aren’t real. And with AI, that can be even worse now, but I saw it even 10 years ago, people throwing fake testimonials up on landing pages. In the early days I’d also see things like the cancelable purchase order: “We don’t know if customers want this product, and we’re B2B, so we’re gonna have people sign a purchase order as evidence that there’s demand, and then we’re just gonna cancel it, because we don’t actually have anything.” I was like, “That’s a good way to burn through whatever [laughs] customers you have and create a lot of ill will with them.”

So in the book, I didn’t really talk a lot about this, but I made it a point not to include a lot of the experiments I feel are more manipulative. People ask me, “Why didn’t you include the imposter judo one, where you literally take a competitor’s website, throw your logo on it, say it’s yours, and test?” I was like, “That’s a great way to get sued.” [laughs] And also, I’m not really on board with copying somebody’s product verbatim and putting your logo on it.

So a lot of that I just don’t evangelize anymore. And maybe I get grumpier as I get older, I don’t know, but I’m really trying to push this idea of testing with co-creation, of really understanding your customer and not just testing on them. I’m not saying my thinking is perfect yet, but there are certainly things I’d shy away from now that were common in the early days of the lean startup movement, like, “Oh, we’ll just pay users $5 for their Gmail and then crawl their email.”

And I’m just like, “Yeah, I don’t really feel comfortable doing that. What are some other ways we could test this and still learn, without doing more manipulative stuff to customers?” I think it’s trending in the right direction, but having seen a lot of this stuff weaponized over the years, I’m a little worried about some of the techniques we evangelize, even in product, being used to manipulate people more.

And so I don’t know if we solve that by adding a circle to our Venn diagram, but I think there’s a lot of work to be done around having the conversation internally when you’re asked to do something you think is unethical-

[00:38:40] Janna Bastow: Yeah.

[00:38:40] David J. Bland: … or an experiment you feel is testing on and not with.

[00:38:43] Janna Bastow: Yeah. And you know what? I think it’s so important that we have conversations like this, that people hear you saying, “These are experiments I recommend, and these are experiments I don’t evangelize.” Then the people listening in today can take this back to their team, and if somebody says, “Hey, we could do this,” they can say, “You know what? We’re not gonna do that, and here’s why. Here’s where I’m drawing some lines.” Because ultimately it comes down to people drawing the lines, and product managers are often the ones in charge of figuring out where those lines are.

[00:39:11] David J. Bland: Yeah.

[00:39:13] Janna Bastow: Yeah. 

[00:39:13] David J. Bland: I totally agree. So “with, not on,” that’s kind of my guiding statement: can we experiment with customers, and not just on them? Because when you’re experimenting on them, it creates this almost early-days focus group dynamic where you’re behind the mirror and they can’t see you. That doesn’t help create a lot of empathy. And what tends to happen on the other side of that mirror is that it’s easy for the team to start laughing at some of the statements the customers make, and it creates a kind of toxicity. So “with, not on,” that’s my guiding principle on this at the moment.

[00:39:42] Janna Bastow: I love that. That’s great, thanks for sharing. Now, Julie had a really good question here as well: how do you effectively track decisions made without sliding into old-school 150-page documents that don’t get read?

[00:39:54] David J. Bland: [laughs] Yeah. Uh, I don’t wanna slide into that either. That’s not a good time.

[00:39:57] Janna Bastow: [laughs]

[00:39:57] David J. Bland: I remember reading one of these once, at my first startup, and in the middle it had “cats and dogs living together, mass hysteria.” And I was like, “What is this doing in the middle of this document?” I reached out to the guy who wrote it, and he said, “You’re the first person ever to mention that line.” It’s a line from Ghostbusters, basically. And I was like, “No one reads this stuff.” [laughs] I have to say, I think we can take a lot from visual management here. So the canvas work, the assumptions mapping work, anything you can do to visualize your strategy as simply as possible, but not too simple, and be able to communicate progress and change. There’s a reason these tools have stuck over the last 10, 15, 20 years: you can show something, and there’s an element of storytelling to it. We come back to our roots; we still learn through storytelling.

[00:40:45] Janna Bastow: Hmm.

[00:40:45] David J. Bland: That’s how we learn. We don’t learn through a PowerPoint with 30 bullet points in 10-point font. We learn through storytelling. And I think the reason a lot of these tools work is that you can, say, show a canvas and say, “Here’s where we started and here’s where we are now,” and tell a story about those changes, right? Same with an assumptions map. You can say, “Here’s the risk we had earlier. We went off and learned some things. Here’s what we think the risk is now, and we should either pivot, persevere, or kill this based on what we’ve learned. Let’s have a conversation about it.”

So I do have some templates I use, but overall, err towards visual management and err towards storytelling. That can solve most of your problems right there, versus a really elaborate document or PowerPoint with thousands of bullet points. No one learns well that way. Visual management and storytelling are the things I keep coming back to.

[00:41:34] Janna Bastow: Yeah. That reminds me of Matt LeMay’s approach. He did the one-page, one-hour pledge: instead of writing up these massive things or getting lost in creating a huge PowerPoint deck, spend no more than one hour or one page’s worth, and at that point stop and share it around with other people to get feedback, and just make sure you’re on the right track, rather than getting bogged down trying to create the perfect document or capture everything at once.

[00:42:03] David J. Bland: Yeah. I don’t know the current state of Amazon at the moment; I know the narrative-based approach was something where you had to read the document. I know they also use the Testing Business Ideas book, because I’ve spoken to some of their product people. But this idea really comes back to storytelling again, right? Is there a story you can tell that really shares what your vision is, what you’re trying to accomplish, what your risks are?

And the artifacts you use to tell that story can vary, but I just wouldn’t lose sight of the story, because that’s where a lot of these tools break down. I’ve seen thousands of canvases over the years, and I still teach it, even to VCs and startups funded by VCs in the Valley. When you treat it like a checklist of “Yep, have a value prop; yep, have a customer,” and it’s a bunch of bullet points, it doesn’t really help you tell your story. So mostly what I’m doing is reteaching a lot of these tools in a way where you can ask, “What’s the story we’re trying to tell here, and does this help you tell it?” If you keep that top of mind, it’ll strip away a lot of the extra stuff that can actually work against you.

[00:43:03] Janna Bastow: Yep, very good. And so how does this testing stuff differ when you’re in a micro startup or a startup versus a much more established company?

[00:43:14] David J. Bland: Yeah. I mostly focus on the journey up to product/market fit on whatever opportunity I’m working on. It could be a product, a service, a feature; it could be different things. But I tend to play in the space where we have a semblance of an idea but we’re not sure it’s worth investing in. Once it gets to the point where it’s repeatable and scalable, I tend to back away. I’ve helped scale startups before; it wasn’t one of my favorite things to do. I like playing in the messiness between idea and product/market fit.

So whether I’m in a startup or a big company, a lot of the same tactics apply. I would say the biggest difference is brand. When I’m working with a startup, they’re all, “Oh, we’ve spent weeks coming up with the perfect logo and brand,” and really, nobody cares. If you look at any really successful company, like DoorDash, they didn’t start off as DoorDash; it was Palo Alto Delivery Service, right? Ring didn’t start off as Ring; I forget what it was called on Shark Tank, but it was something else. A lot of these branding conversations emerge over time, and it’s not something you need to focus on right away as a startup. With big companies, though, I work with companies whose brand is over 100 years old, and they’re terrified of doing anything less than perfect that could “damage” the brand.

So what I find when I’m working with companies like Adobe is we end up doing projects-and-labs-branded stuff. It’s not exclusive to them; I have other clients that do this as well, but they’re one of the more high-profile examples I can think of and talk about. If you look at Adobe Express, it started off as an off-brand MVP that I helped out with, seven or eight years ago maybe at this point.

And a lot of it was, “We don’t wanna put the Adobe logo on it right away, so can we still test it and see if it solves the jobs for designers, but without the distraction of driving a bunch of people to it just because it has a logo on it?” If it doesn’t work out, you quietly kill it; if it does, you bring it on brand. And that’s what they did with Express.

So I’m finding that companies with air cover from their executive team are able to do that. Not everyone’s on board with it, but the challenge with not doing it is that you internally refine in this customer-free zone until it looks perfect and then launch, and that’s a big “either you got it right or you didn’t” moment, [laughs] whereas I would much rather go through a series of experiments and learn whether or not we’re on the right track. And I want to do that with actual customers, without necessarily damaging the brand along the way. So projects-and-labs-based branding is what I tend to recommend to my clients. I’ve seen that permeate into crowdfunding, by the way. I have a lot of examples of Indiegogo campaigns for something from 3M, or Bose headphones, or Delta faucets; GE does this as well. They create these little labs brands on Indiegogo and fund new product ideas. It’s not that they need the money; it’s about whether there’s strong evidence that people want this, the desirability and viability. Feasibility they can figure out; they know how to build it.

So I am seeing more of that. I wouldn’t say it’s very common yet, but I have a list of 15 or 20 Indiegogo campaigns that are all big companies doing labs-branded testing of new ideas, for the most part. So there are ways to approach it, but I’d say brand is by far the biggest difference I see between the startups and big companies I work with.

[00:46:33] Janna Bastow: Yeah, okay. That’s really interesting, thanks for sharing. And Kamal asked a question: in a high-traffic e-commerce context where the top-level objectives are largely revenue- and profit-focused, is A/B testing the only tool available, or are there alternatives?

[00:46:48] David J. Bland: I’d probably have to dig in a bit more, but A/B testing is okay. It’s one of the staples everyone goes to. My challenge with A/B testing is that sometimes we don’t really have a hypothesis behind what we’re trying to test. So I always try to make sure that if we’re trying to impact revenue or something high-growth, we do have a hypothesis behind it. I also make it a point to measure what we don’t want to go down. [laughs]

[00:47:09] Janna Bastow: Hmm.

[00:47:11] David J. Bland: And I think sometimes we only measure what we want to go up. [laughs] I’m like, “All right, we can optimize for this part in our A/B tests, but we also have to measure the stuff we don’t want to go down.” That was something I learned early in my career. So A/B testing is a great staple. There are also other books I really like on growth experiments, and other people I’ve worked with in the past. I really love Lex Roman’s work; she’s awesome at looking at growth experiments and how to apply this methodology to more growth-stage stuff, so I definitely recommend her material. There are a couple of other books I like for growth-level experiments, but I would say A/B testing is fine. Just be careful, one, to have a hypothesis, and two, to measure what you don’t want to go down. For the most part, though, I’m focused pre-product/market fit, so I don’t have a lot of growth experiments in my back pocket that I could give you at the moment.
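The “measure what you don’t want to go down” idea can be sketched as a guardrail check alongside the primary metric. This is a minimal, hypothetical example (all counts and thresholds here are made up, not from the conversation): a two-proportion z-test on the metric you want to go up, plus a check that the guardrail metric hasn’t significantly dropped.

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for comparing variant B's conversion rate to A's."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled rate under the null
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Primary metric we WANT to go up: checkout conversion (hypothetical counts)
z_primary = two_proportion_z(conv_a=480, n_a=10_000, conv_b=560, n_b=10_000)

# Guardrail metric we do NOT want to go down: repeat purchases (hypothetical)
z_guardrail = two_proportion_z(conv_a=300, n_a=10_000, conv_b=290, n_b=10_000)

# |z| > 1.96 is roughly significant at the 5% level (two-sided).
# Ship only if the primary metric is significantly up AND the guardrail
# shows no significant decline.
ship_it = z_primary > 1.96 and z_guardrail > -1.96
```

The point is the hypothesis structure, not the statistics: the test declares up front which number must rise and which numbers must not fall, so you can’t quietly optimize one metric at another’s expense.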

[00:47:59] Janna Bastow: Cool, thank you. And so what’s the scope of what product managers should be testing? Should they just be focused on testing things that change the product itself, or should they be testing around pricing, messaging, positioning, packaging, and the wider gamut to reach product/market fit?

[00:48:20] David J. Bland: So think about having some kind of cross-functional leadership team you’re working with, right? I know we talk about the trio: product, design, engineering. And I said this at my Mind the Product keynote a while ago, before the pandemic: as a product person, you’re in a place where you can influence the business maybe now more than ever.

Because if you look at a business model: if you have the wrong product, your business model is gonna fail, and if you have an amazing product and the wrong business model, you’re still gonna fail. I think we can all point to products we loved that are no longer around, usually because they weren’t viable. So I’ve been pushing product people to very much play in the viability realm. Feasibility can be more of a joint collaboration with technology, legal, compliance, whoever needs to be in the conversation about “Can we execute on this?” But when you think about cost and revenue, having product lead that, or at least facilitate it, is really important. So for those of you in existing companies, when you’re thinking about new product ideas, don’t just take the existing business model and apply it to a new product. I’m coaching a lot of teams this week; I had four coaching sessions yesterday with a really big company, and I was constantly asking them, “What are other ways you could make revenue here that you’ve never tried?” Because they have a new version of a thing and they’re just gonna slap on the same business model. And business models expire; they don’t always work. Look at Nespresso. Look way back at Blockbuster. A lot of these business models just don’t work forever.

So from a product point of view, definitely get more comfortable with numbers, or pull in accounting or finance if you need to. Every product person has their own strengths and weaknesses, but definitely get into the viability realm, because there could be a way to make more money with your product through a tweak to the business model, not just by taking the existing model and applying it to a new product. So at least brush up on viability and get more comfortable with that testing, because with an amazing product and the wrong business model, you can still fail.
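“Getting comfortable with the numbers” on viability often starts with basic unit economics. Here is a tiny sketch under entirely made-up assumptions (the prices, margins, and thresholds are illustrative, not anything David recommends): does a customer pay back their acquisition cost, and is lifetime value comfortably above it?

```python
def lifetime_value(price, gross_margin, monthly_churn):
    """Simple LTV: monthly contribution profit times expected lifetime.

    Expected lifetime in months is approximated as 1 / monthly_churn.
    """
    return price * gross_margin / monthly_churn

def payback_months(cac, price, gross_margin):
    """Months of contribution profit needed to recover acquisition cost."""
    return cac / (price * gross_margin)

# Hypothetical inputs for a subscription product
cac = 250.0    # cost to acquire one customer
price = 50.0   # monthly price
margin = 0.8   # gross margin
churn = 0.05   # 5% monthly churn ≈ 20-month expected lifetime

ltv = lifetime_value(price, margin, churn)      # 50 * 0.8 / 0.05 = 800
payback = payback_months(cac, price, margin)    # 250 / 40 = 6.25 months

# One common rule of thumb: LTV at least 3x CAC, payback within a year
viable = ltv / cac >= 3 and payback <= 12
```

The formulas are deliberately crude; the point is that a pricing or packaging change shows up in these numbers immediately, which makes viability something you can test rather than assume.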

[00:50:24] Janna Bastow: Yeah, that’s a great answer. Thank you so much. This has been a great conversation, and we’re running out of time. But David, how can everyone find you and follow up with you, find your work, find your book, and all that sort of stuff?

[00:50:36] David J. Bland: Yeah. The book is pretty much everywhere now. It’s on Amazon, and it’s in, I think, 20 different languages at last check. I think it just came out in Spanish in January, although I’m trying to confirm with Wiley that it did. [laughs] They don’t always keep me in the loop. So anywhere you find books, you’ll probably find it. For me, I’m at davidjbland.com; that’s a good way to find me. I’m also really active on LinkedIn, although I have to say you’re gonna get a lot of business memes if you follow me there. I don’t take myself very seriously on LinkedIn; my style is very much to try to make you laugh and also educate you. So just be mindful that you’re not gonna get a lot of stuffy content from me. But I’m also on Twitter, Instagram, and all that, not all the socials, but most of them.

And my style is very much to give away what I’m working on and what I’m thinking about for free. So for a lot of the stuff we talked about today, there’s a good chance there’s a video, a podcast, or an article somewhere that I’ve done on it. So yeah, just reach out. I’m around.

[00:51:38] Janna Bastow: Excellent. You’re the type of person who actually makes LinkedIn worth going to, so I appreciate the meming and the fun over there. It’s great.

[00:51:44] David J. Bland: I have to say, though-

[00:51:46] Janna Bastow: [laughs] 

[00:51:46] David J. Bland: I do regret asking how many chuggas before the choo choo, because I think people started using that as an interview question.

[00:51:54] Janna Bastow: Oh, no. [laughs]

[00:51:55] David J. Bland: And that’s not my intent. So I apologize for that one-

[00:51:57] Janna Bastow: You just started it now. 

[00:51:59] David J. Bland: If you’re in an interview and they ask you how many chuggas before the choo choo, you can probably blame me for that.

[00:52:03] Janna Bastow: Okay, that is good knowledge there. Excellent. [laughs] David, thank you so much for joining today. This has been fascinating, getting a peek into how you think about testing business ideas. Big thank you from everybody; everyone, say thank you to David. And thanks again, David. Thanks for being here.

Watch more of our Product Expert webinars