Sean [00:00:18] Hi, welcome to the Product Momentum Podcast, a podcast about how to use technology to solve challenging technology problems for your organization.
Sean [00:00:29] Paul, this was a great interview. I’m super excited about it.
Paul [00:00:32] I enjoyed it thoroughly. I learned a ton from Ash. I think the ways that he analyzes the things that make people tick is a really fresh perspective on the Lean concept.
Sean [00:00:46] He’s been in this industry for a long time. He’s really purposeful and thoughtful. I loved his analogy about the hamster wheel of the killer feature.
Paul [00:00:55] Let’s not spoil it too much. Let’s just get into it.
Sean [00:00:58] Alright, sounds like a plan.
Paul [00:01:02] Well hello, product people, welcome to the pod. It is a pleasure to welcome Ash Maurya today. Ash is praised for offering some of the best and most practical advice for entrepreneurs and intrapreneurs all over the world. Driven by the search for better and faster ways for building successful products, Ash has developed a systematic methodology for raising the odds of success built upon Lean Startup, customer development, and bootstrapping techniques. Ash, we’re so happy to have you today. Welcome to the pod.
Ash [00:01:28] My pleasure. Excited to be here.
Paul [00:01:30] Absolutely. So just to jump right in, what’s been on your mind lately about assessing the state of things in products and how businesses view product teams? What impacts do you see product specifically having on the way businesses think?
Ash [00:01:44] So I think there’s kind of a big movement putting product at the center. I guess in some ways it’s always been there; there’s just a new awareness of it. What I tend to stress a lot is really how to measure a product working. We sometimes fall in love prematurely with solutions and features. But to me, it’s really a lot about measuring the product working and getting some metrics, and the right kinds of metrics.
Paul [00:02:10] So some of the metrics that people often tout are the things they want to see as a decision output or ROI, but really ROI is a result of the people, the stories, the experiences, the sweat equity that goes into it. So how do you generate a top-line input of teams that has this capacity for empathy and experience and can speak business and ROI at the bottom line?
Ash [00:02:39] Yeah, so I think that, in the new world we’re living in, it’s a skill that’s increasingly becoming important. So I came from a technical background and I know when I would build stuff, I would just say, “give me the specs, give me the requirements,” and my work was done when I was feature complete or spec complete. So that’s where the buck stopped. But increasingly so, because cycle times are getting shorter and our relationship with customers is getting more direct, it’s become increasingly important to bring the customer voice in. So when I talk about metrics, it’s both the qualitative and the quantitative, and we can measure almost anything these days. So to me, it always starts with doing a bit of both. And qualitative is the easier place to start because we get to see patterns and learn a lot of the big themes, the signals and the noise, as I call it. You know, validate qualitatively, but you still have to verify quantitatively because you can get a lot of false positives. And there, we try to make it less about output metrics, not things like build velocity or, “are we doing things on time and on budget?” That’s important from an ROI perspective, but the first order of business is, “are we building something that gets used?” Are customers engaging with this? So that’s where I like to start, and then everything else layers on top of that.
Sean [00:03:52] I like that, “are customers engaging with us.” I’m going to pull a quote out here. You just said, “validate qualitatively but verify quantitatively.” I love that. Give me some tactical, real-world examples of how you apply that.
Ash [00:04:04] Yeah. And part of the reason those two phases are required is because of that speed of implementation. We want to learn quickly, and especially if you’re a startup, or even a big company launching a new product, we don’t yet have enough time or data points to really verify things with statistical significance. But you can do a lot of qualitative testing fast. And so, places where this can be used: the classic one that designers use a lot is usability tests. You know, you may run five to 10 usability tests and uncover a lot of the big problems that are maybe not obvious to the folks building the product, but very obvious to the customers trying to make sense of it, because they are looking at it for the first time. The other place where I see this a lot in the startup world is testing pricing. So a lot of folks would say, “well, pricing is very complex,” but if you go and try to sell your product to 10 people and they all say no to you, it’s pretty easy. They are just rejecting you outright. So that’s where the qualitative can give you a very strong signal, one way or the other, that you may be onto something. Or, more in that invalidation phase where something doesn’t work, it’s very effective in finding problems.
Sean [00:05:09] Yeah, so I think you’re onto something here. It’s a new way of looking at testing, I think, for me anyway. So validation and qualitative research being a way to quickly get these answers that point you towards the truth. We really don’t know what the truth is until we deploy something and see it working in action, right?
Ash [00:05:26] Yes.
Sean [00:05:27] But using validation and qualitative research to get at answers fast versus the quantitative stuff that really is going to be useful for us over longer periods of time, so it’s more like a time horizon sort of footprint.
Ash [00:05:39] That summarizes it. I’ll even qualify it a bit further, because the scientists in the audience would probably be cringing, saying, “well, qualitative can give you a lot of bad signals, too.” And so that’s where, to me, it’s more about signals and noise and about insight generation. So when we are searching for ideas, all ideas look the same. They all look very promising to us when we come up with them. But when we go and do some of this qualitative learning, we get to see which are the really bad ideas fast and we can turn them down. And for the ones where even 10 out of 10 people are willing to buy or people are engaging, we don’t know if that’s going to scale, so that’s where the metrics come in. So it is that time horizon. So even when we get the validation to move forward, I call it validation and not verification because it may still fall apart when you try to scale it.
Paul [00:06:24] An interesting distinction, so validation versus verification. I think that’s often lost in sort of the checkbox mentality that teams fall into looking at the features that they’re shipping versus the outcomes that they’re having, right.
Ash [00:06:36] Yeah.
Paul [00:06:36] I’m going to sidebar into a little bit of a buzzwordy topic, but you’ve written a few posts over the years about Jobs to be Done and the way that it’s started to have an impact on product culture overall. And it’s funny how polarizing Jobs to be Done is. First of all, just stating it as a cultural touchstone, Jobs to be Done is one of those things that you either love or hate. And I think there’s something we can learn from some of the things its proponents have said. And one of the models that you shared that stuck with me was that people go to the store to look for a drill bit so they can drill a hole. They’re not buying the drill bit, they’re buying the hole. But you extrapolate that out to, well, maybe they’re buying the drill bit to drill a hole to hang a hook to hang a picture to express themselves so they feel more emotionally complete. And that story is just way outside the narrative that a lot of Jobs to be Done is focused on. Is there value in pushing this model further? What thinking have you started to pull out in your own practice, in your own models?
Ash [00:07:44] So I would say, from my first exposure to it, which for a lot of people is the Milkshake Study, I was fascinated that there was something here. But the thing that always troubles me is when I see things like it, it feels a bit like a magic trick. I see the result and I don’t know how they got to that result. And so I searched for more. Back then, there was not a lot of information when that first case study was shared. I eventually ran into some of the consultants at the Rewired Group who were part of that project and got a little bit of the backstory and began to understand more about their process, and along the way, I read many other frameworks from many other thought leaders.
Ash [00:08:19] And it is a bit unfortunate that there’s a bit of factioning happening and lots of breaks in the framework. I did find that the big idea there, which is hiring and firing products, seemed to resonate with me. And I’ve looked far and wide for exceptions to the rule. And what I mean by that is, for every product that you build, there’s always a product that already exists, an existing alternative, that you are potentially replacing. And I’ve looked even across disruptive products and I’ve always found those existing alternatives. So some of those principles, to me, are still very sound, and I still teach them when I am coaching teams to think about hiring and firing.
Ash [00:09:00] The piece where I have tried to go further is trying to understand the bigger context. So in jobs theory, there is the functional job, and then there’s the emotional job. And I find, as in the drill bit example, people don’t care that much about the functional job. In my mind, the perfect job is, “get me the outcome without any work.” That’s just our human condition. We just want to do nothing and we think that’s utopia for us. I always then wonder what we’ll do with all that free time, but that’s what we are always chasing. All engineers, all scientists are trying to essentially eliminate the work part of the outcome. So that is the perfect ideal solution and we can never achieve it. There are always going to be some issues. That’s how I like to think of it. So to me, the functional job is not where I like to settle. So in the drill bit example, when I’m interviewing, say, around the drill bit and I see all these problems with shavings on the floor and the hole not being perfectly round and the drill bit breaking, that to me is still in that functional realm, and I want to get to the bigger context, which is: why are you drilling the hole in the first place? Is it to hang a painting? And if you’re hanging paintings, maybe I’ll sell you a TV. I just ran into the Samsung Frame TV. I can sell you a TV instead of a drill bit. And those, to me, are the powerful insights that come from thinking bigger.
Paul [00:10:17] I love that, and I think the anecdote piece of it that you start to get to is, first of all, you have this experience that you’re trying to manifest in somebody’s life. They’re not trying to sit back and do nothing, right. They are trying to do something. So it’s not this utopian existence. It’s that they’re trying to do something and do it effectively so they feel fulfilled. But I think one of the takeaways from this for me is that the hiring and firing aspect is something we do not because of a failure, per se, but just because the job may be different for me than it is for the person who built the product. For example, on LinkedIn, the messaging app works perfectly fine on desktop, but for whatever reason, I always fire the desktop and pick up my phone and chat using the in-app messaging. So I don’t know why I’ve developed this personal practice, but I’ve essentially fired the desktop function and hired the mobile function, and I don’t know why. It’s just that it’s in my thumbs and it doesn’t feel like I’m trying to manage another email inbox.
Sean [00:11:20] You have a bias for using your thumbs for communicating.
Paul [00:11:23] Apparently.
Sean [00:11:24] I don’t know what else that says about you, I’m going to leave it at that.
Paul [00:11:27] I think the other big takeaway there that I heard, and that I think is important, is just getting out and talking to users, right. We are never going to know what job they’re trying to solve until we’re close enough.
Ash [00:11:39] That’s exactly right. But what’s even more fascinating, and this is where I’ve also more recently gotten fascinated with the whole Kahneman/Tversky behavioral economics body of work, is that even the users and customers sometimes don’t know. Just like you were saying, you know, you don’t know why you’re using your thumbs for LinkedIn. But even if you’d bought a product, people will often give you the surface answer: “I bought the phone because it was on sale.” But if you dig a bit deeper into what causes that behavior, it’s sometimes very irrational, and sometimes that emotion comes in. We make an emotional buy and then we rationalize it later. But as a marketer, as an entrepreneur, as an innovator, understanding that irrationality is very powerful because you can better position products, you can better market products, you can build better ads. A lot of Apple ads are essentially Jobs to be Done ads. And I know Steve Jobs, when he was alive, was gifted, and his last name was Jobs too, so he must have been a natural. But a lot of his ads really tell these kinds of bigger-context, emotional stories, and it works really well once you can pull on those strings.
Sean [00:12:40] To your point from earlier, though, even those can be misleading.
Ash [00:12:44] Yeah.
Sean [00:12:44] Here’s the reality: whatever we think we know about our users, we find out it isn’t always how the product is going to be adopted in the wild or how it’s actually going to play out. Which is why I believe in the power of thorough and persistent experimentation.
Ash [00:12:59] Yeah, that absolutely is the gold standard. So going back to what we were saying early on with the qualitative and the quantitative. The quantitative is where the data kind of proves the thing working at scale and everything else, in my opinion, is all insight generation and we are generating, hopefully, better insights and that’s where all the interviewing and the qualitative learning comes in. But there is no substitute for the experiments.
Sean [00:13:22] I love that. So insight generation, there’s another great little buzzword we can start to throw around. We are insight generation machines, Paul. That’s what product teams do. You know, you touched on Dan Ariely, Daniel Kahneman, Amos Tversky, behavioral economics. I’d like to pull on self-determination theory, the work of Ed Deci and Richard Ryan, sort of the science of intrinsic human motivation. Going back to what you said earlier, figuring out what is the emotional need that we’re solving beyond just the functional need. That’s what you’re talking about.
Ash [00:13:54] Right.
Sean [00:13:55] What is it that underlies this human’s behavior that we’re trying to motivate, that we’re trying to influence? And it’s always about going all the way back to the father of positive psychology, Abraham Maslow. It’s always about some higher-level need if you pull hard enough.
Ash [00:14:10] Yeah.
Sean [00:14:10] Right? And that’s what we have to figure out how to systematically go and find. That’s the hard work of product.
Ash [00:14:15] Right. And I think there are two pieces. I mean, this is where, when we say customers are irrational, it’s that needs and wants are often at odds with each other. So we work with a lot of entrepreneurs and they will, for example, say, “I want to raise money.” And we tell them that’s not the right thing to do. But if I lecture at them, I’m going to lose them, because they’ll go to someone else who will tell them how to do the thing they want. So this is where we sometimes, as product people, have to understand, if you visualize a hill, when we talk about that progress, from a customer journey perspective, they’re trying to get to the top of the hill, but sometimes where they think they’re headed is not really in their best interest. And as product people, we have to use that momentum they have, that motivation they have, to maybe get them started, but then steer them in the direction where we would like to leave them. And that’s where it gets even more interesting. You know, when we talk about funnels and changing behavior, new products are fundamentally, I find, about some kind of behavior change. So understanding those behaviors is first, key, but then being able to influence them is the next thing to start building into the product.
Paul [00:15:25] So you talked about needs versus wants, and I think that’s a topic that doesn’t get a lot of attention, because the fundamental user story that we work our backlogs around every day is, “As an X, I want Y so that Z.” It’s driven around wants, but the needs are really where we get the insight about what utility they truly need; the thing that they want might not actually be what they need in the product and the behavior change that we’re trying to make manifest in the world. And the hill metaphor, I think, is apt. I’ve seen the diagrams that talk about the job done at the top, the motivating forces pulling you up the hill, and the forces of inertia. The inertia is the status quo. It’s, “I’m getting this thing done in this way, so why bother changing?” And I think that kind of psychology really separates the good product managers from the great ones: the ones who can build a functional feature and ship it on time and under budget versus the ones who have the insight into what stories are being told and what experiences they’re making.
Ash [00:16:32] Yes. And even, you know, that journey to the top of the hill sometimes needs to be broken up. And this is where we get into a lot of those sub-customer types of metrics and habit loops and reward loops. But if you, for instance, don’t get an outcome for six months, you’re just going to lose a lot of people, and so you have to break that up and sometimes even add artificial ingredients. There’s a book that I read many years ago, Charles Duhigg’s The Power of Habit, where he talks about the habit loop. And there he talks about Pepsodent, you know, and brushing wasn’t a habit. They had to add mint fresheners in there to give you this feedback loop, so you get this minty freshness as a reward when you brush. But before that, we were not brushing regularly because we just didn’t think the stuff worked. So that kind of product design is what I was getting at. And that’s a bit of that irrationality of human behavior: we don’t stick with things necessarily long enough till we get some kind of feedback that this is the right thing. And as product folks, we sometimes have to even add some artificial [methods], whether it’s gamification, and intrinsic is better than just, you know, arbitrary badges, but some kind of feedback loop that this product is working and you are making progress.
Sean [00:17:35] Awesome. Well, I’m going to take us in a little different direction. Most of your work, at least the books I’ve read that you’ve written, has been focused on Lean: taking the concept of Lean and the Lean Canvas to the next level, and running Lean. I’d like to talk about that a little bit, because about a year ago you wrote an article on Medium. Here’s a quote from you: “Using a Lean Canvas does not a Lean startup make.” I’m going to let you run with that.
Ash [00:18:01] It ties into the bigger context, so the drill bit analogy. I find, unfortunately, that I go around and do workshops and some coaching, and sometimes people bring me into conference rooms and they say, “look at all the Lean Canvases we have.” Some of them are filled, some of them are blank, but the dates on them are very old. And so to me, that was a team that was following process because they were maybe forced to or asked to, and they’re not really using the tool for its intended purpose. So that’s what that gets into: with every one of these, and it’s not just Lean, it’s Agile, it’s every process, every framework that we have gone through, it comes down to a set of ceremonies sometimes. And we are going through the motions because we are being asked to do so. But without the right mindset, we kind of lose the real essence of it. And it happened even with the original Lean when it got brought here. I think there’s a quote in Jeffrey Liker’s book The Toyota Way, where some folks went back to Toyota and saw that they were operating at a very different level. Lean to them was a way of thinking. It was down at the cultural level, while for everyone else it was just copying a process, and you can’t just copy a process and get the outcomes. So that’s where that quote really came from: there is definitely a kind of checklist process that ensues with every framework, and if you just do a Lean Canvas, you think you’re being Lean. And that to me is not the right way to think about it.
Sean [00:19:25] Yeah, which brings me to the importance… You talk a lot about, at least on this pod here and in some of your later work, the context within which the team is operating, which always leads me into the conversation around culture and where culture comes from. And I believe one of the most important things a product team does is to work on its culture, to think about its culture. You know, and culture comes from context. So what happens is, you have this other context. Even though you’re trying to operate Lean, you still have to overcome all of those old habits and all the anxiety of doing things in a new way.
Ash [00:20:02] In many ways, that’s where my most recent work has shifted. It’s less about the merits of the framework, because I think people get it, and sometimes some of these things feel commonsensical. It’s like, “yes, we need to test; yes, we need metrics.” We need to do all these things. But when you go into practice, it’s exactly what you said: old habits die hard. So you’ll find people stuck at, “yes, I need to go and talk to customers. Maybe I’ll do that tomorrow. I’m more comfortable building product today.”
Sean [00:20:30] Yeah.
Ash [00:20:30] And so those kinds of triggers are things that either the team self-realizes one day, and the culture starts to change and we talk about transformation, but it starts with these kinds of triggering events, or you get some external agents who can come in and, through coaching, help create those triggers. But I find that change doesn’t happen without triggers. We can only bring the horse to the water. We can’t force it to drink.
Paul [00:20:52] So I want to go back to sort of the bigger concept of Lean in general. I think when it’s applied traditionally, it’s in the entrepreneurial startup mindset. But many of our listeners are product managers at a corporation or product owners at an agency. Is it applicable to take entrepreneurial concepts and apply them in the scrum team, in the design sprint? Are these things applicable outside of Silicon Valley? Can you take these principles and tools and use them day-to-day? How do you see that working? And how has that changed since you started writing about it?
Ash [00:21:31] Yeah, so I strongly believe so. And definitely in the early days, and we’re talking about Lean Startup here, the original Lean goes back to manufacturing. But with the original Lean Startup, Eric Ries was inspired by some of the principles in the original Lean, and so that’s why he called it the Lean Startup, and he was in a startup, so that’s how it got named. He then tried to change the name because it was too limiting. It’s definitely expanded since then. But I would say that there are aspects of it. The way I look at it, I sometimes make a distinction between entrepreneurial thinking and entrepreneurship. So not everyone has to go to a startup or be an entrepreneur to apply these techniques. These are all problem-solving techniques, and there are some fundamental principles that go into product. If we believe that the products we are building are ultimately about creating value for customers, and then value capture, which is monetization and all those things, comes as a side effect, then that should be the first order of business as we see it. Those are the kinds of principles that we take from Lean, and if we try to model this in any kind of business, as long as that business has customers, which is all of them, some of these ideas can be applied. So a Lean Canvas is not startup-specific. It talks about customers and problems and solutions, and it can be used for low-tech and high-tech businesses. So from that perspective, a lot of these ideas can be brought in.
Ash [00:22:49] Now, when we talk about measurement and metrics, in some arenas things are easier to measure. Especially if you’re digital, you can get a lot of metrics and report progress faster than if you were, say, a physical product or a services business. But there are still metrics to be collected. You may have to do that manually, and even that’s changing. There are these cameras in retail stores now that can do facial recognition and identify people and faces and visitors. So even that world is changing.
Sean [00:23:16] All right. One of your posts from a couple of years ago was titled “I Don’t Start with an MVP.”
Ash [00:23:22] Yep.
Sean [00:23:22] Remember that one? I think it was kind of a story, more of like a parable.
Ash [00:23:26] Yes.
Sean [00:23:27] But I love it because it’s another one of those buzzwords that gets thrown around the product space. MVP, everything should be homing in on an MVP. And I firmly believe that you shouldn’t even be talking about an MVP unless you first have an MVA, a minimum viable audience. What do you think of that?
Ash [00:23:42] Absolutely. So the whole premise of the MVP was sound in theory: rather than building a big product over nine months or a year, let’s build something smaller and then iterate and refine. The challenge is that today customers have no patience. Rightly so, because they have so many choices. If I build a crappy MVP and say, “oh, you know, this was my best attempt to solve a problem for you,” and you say, “I don’t have this problem,” it’s like, I don’t want to talk to you anymore. I have other things to do. And so you lose that customer right away. In the old world, where you had that deep relationship, the customer could say, “no, no, you got this wrong, but let me sit with you,” and you could do all that requirements pulling and refining. There was the old joke, you know: raise enough money to build two products, one to give to the customer and completely fail, and you learn a ton, and then you build it again and get it right. That was the old waterfall world, but it doesn’t work as effectively now because customers just leave. We all have to level up as product people and try to get on the right problem worth solving right from the very beginning. And so we do not advocate starting with an MVP, even a small one, because you then get into this build trap where customers aren’t talking to you, and so we start guessing. We chase this mythical killer feature and it’s just a hamster wheel. We go around and around and we run more experiments, but we are guessing, not learning.
Ash [00:24:56] So when we talk about insight generation, we can generate all these insights, but it’s guessing in disguise. [inaudible] It seems like we are pivoting. We use those words. But to me, that’s not a real pivot. So I’d rather slow that team down and say exactly what you said: “let’s go and figure out…” And this was how the Lean Canvas was even built, and we’ve even created this variant called the Leaner Canvas, which is nothing more than the customer box and the problem box squished together. So it’s basically asking: who are you building this for, and what are they using today? Not what are you building, but what are they using today, and what’s broken with that? Because the incumbent always wins. If we believe from Jobs to be Done that there’s always an existing alternative, and that alternative isn’t broken in the customer’s mind, you either have to break it with your marketing so they switch to you, or they have to realize that their product is broken, and if you can highlight that pet peeve, you can cause that switch.
Ash [00:25:50] So that’s how we simplified it. And you don’t need a product to do this. This can be just an offer. So once you’ve got your minimum viable audience, you can put an offer in front of them and even just talk about the problem. You know, if you can say, “I believe you have these problems,” and you can articulate it well… In marketing, we call this the strategy of preeminence. If you go to a doctor’s office and you get correctly diagnosed, we believe that the doctor has seen this before and has the cure to our problems, and we take the prescription and go apply it. The same thing happens with customers. If you can, without even figuring out the solution, articulate their problems better than they can, they transfer expertise to you and that starts a conversation. And then you can go deeper and get to those deeper whys, the bigger context, and really find those problems worth solving. And then you can come back and say, “Hey, if I build this for you, would that work?” And you can do all that with offer testing. And if you can sell the offer, then build the MVP. So the way that we like to summarize that is: the old-world approach was, let’s build a product, let’s demo it, and then sell. In the new world, we do demo first. So let’s build a demo, let’s sell the demo. If the demo doesn’t sell, why then build the product? The nice thing with the demo is that you can iterate way, way more quickly. We can go and change it every day if we want to.
Paul [00:27:05] I love that concept of the incumbent always wins, right, your strategy of preeminence. Even with Jobs to be Done, the way we use it is: if we don’t understand how they’re solving this problem today, we have no business starting to think about how we’re going to help them solve it better tomorrow. So we’ve got to catalog all the key problems and how they’re being solved today. Because like you said, the incumbent always wins. I love that.
Sean [00:27:30] I also love your… I had a visual reference in my head of the hamster wheel of the killer feature. I can imagine trying to draw that out in an animation of some sort.
Ash [00:27:40] Sure. But I think that insight is actually a gift, but too many innovators just don’t see it. And what I mean by that is that we’re always fighting the innovator’s bias, which is: I want to build something cool and different, and I don’t want to solve the obvious problems. I want to solve new problems, and so we search for new problems. But the secret is that the new problems worth solving come from old solutions. And so all you have to do is go study those old existing solutions, and you can look at story after story of massive innovation. When Steve Jobs gets on stage, he says, “the smartphone is not so smart, half of it is keyboard, and so I’ll show you an iPhone that’s all screen.” He’s contrasting what’s already in your hands with what he’s going to give you. And that causes excitement. That causes massive switches. If you just demo the iPhone in isolation, sure, it would have an effect, but people have to connect the dots. Steve connects them for you.
Paul [00:28:32] That’s a super-compelling narrative. It’s a technical marvel when a feature is released all on its own and it’s a piece of functioning code and it works and it’s slick and it looks pretty. But when you do connect the dots and lead the horse to water, so to speak, you can see, “Hey, this is a great solution, but here’s the problem that it’s solving, and I didn’t even know I had this problem until you told that story.” One of the things from that strategy I want to pull on a little further is that what we’re really talking about is time. The faster we get to knowledge… the way that you’ve put it in the past is that the speed of learning is the only true, unfair advantage that you have. We live in a world of smart people with access to the world’s information at the tips of their fingers. Eventually, somebody is going to get there, and the faster you can get there, the more unfair your advantage is. So how do you think about learning in teams and solving problems? What are some of the strategies that you’ve seen succeed personally, and then within organizations, that make them thrive?
Ash [00:29:38] Yeah. So first, to frame the speed of learning: there’s a lot of information, as you said, and sometimes we chase the easy things, which are not what Peter Thiel likes to call secrets. Behind every inflection point of a feature, a product, an idea, there’s usually something that that product team saw that others didn’t. And so uncovering those insights, those secrets, is really what that learning alludes to. There’s no easy answer. With insight generation, good ideas can come from anywhere. So what I try to do with the teams that I coach, and even internally at LEANSTACK where I work, is to create an environment where we generate a lot of ideas and then very quickly test what we believe to be the most promising ones. And this is where we borrow some thinking from the folks at IDEO and design thinking: diverge, then converge.
Ash [00:30:27] If I wanted to work on something, I would rather get my team in and first align around some big problem themes, not specific root causes or the things that I think are the problems. These could be a metrics area, like people aren’t buying, or people aren’t activating on our product, or we are losing people in the first 30 days. I get my team aligned around: this is what I want you to go focus on, but let’s not brainstorm here in this room, because we’re all going to influence each other. You’d probably just let me give you all the ideas, and then you’d go and implement those. I’ve done all those types of things. So I’d rather the team go away and do their own research. Some of them, if they’re design-oriented, might go do usability-type things. Others might look at metrics. That’s where we break the curse of specialization. The curse of specialization is that if you just go to your developers and say, “I have problem A, how do I solve it?” you’ll get a lot of build-like things, you know, features and performance. Go to your designers, you’ll get design solutions. Go to your marketers, you’ll get marketing solutions.
Ash [00:31:27] And the way to break that is by saying, “Here’s a problem area; we don’t know what’s causing this. Go and do some of that insight-generation research, then come back and let’s look at all these ideas and rank them.” The ranking is also very empirical, based on evidence. So if a designer came to me, and we have done this as well, they would come up with a design on paper. It looks beautiful. It looks better than what we have, but we don’t have any evidence that aesthetics is the problem, right? Can we go and test that somehow? So the next question is how we test those ideas. And if an idea proves promising, then we just run an experiment. And that’s the last piece, which is time boxing. We try to figure out how we can test the riskiest assumption behind a feature or an idea as quickly as possible, so we can separate the signal from the noise. We are here doing a podcast. If I wanted to test podcasting, rather than going and buying all the equipment, what if I just shot one show and put it out there? Let’s measure the impact, and if nothing happened and people said that’s just not the right medium for our audience, we get that signal. Let’s not do any more of this; let’s go find something else.
Sean [00:32:32] Yeah. You’re going to make me rethink the podcast here. Quick question we ask all of our interviewees: how do you define innovation, Ash?
Ash [00:32:42] So I contrast innovation from invention. It’s not original thinking, but I look at the invention as a new way of doing things and I look at innovation as taking that new way, technology, method to market.
Sean [00:32:55] Cool, that’s a neat way of looking at it.
Paul [00:32:56] I think that’s one of the more unique answers that we’ve gotten. That’s really good.
Ash [00:33:01] OK.
Paul [00:33:01] So I want to thank you for taking the time to chat with us today. I heard a ton of things that I’m going to take away and apply in my own practice, not the least of which being that the time we spend on process and checking boxes, you know, where the rubber meets the road, is sometimes a placebo for the hard thinking that takes time and energy. What I’m going to try to do is carve out some time in my calendar away from scrum meetings and ceremonies and start to get to a place where the more important “what problem am I solving” kind of thinking gets done. It’s been a really refreshing conversation from that point of view, hearing how the products that get built are going to be a direct manifestation of the teams that build them. And the teams that build them are only as strong as the leaders articulating the vision. So I think that’s an encouragement to anybody in a position of product leadership to take a look at the time you’re spending on the problems you’re solving.
Sean [00:34:03] Mm-hmm. I like that. Thanks for joining us, Ash.
Ash [00:34:06] Yeah. Thank you, Sean, thank you, Paul. It was a blast.
Paul [00:34:09] It’s been a pleasure. Cheers.
Paul [00:34:14] Well, that’s it for today. In line with our goals of transparency and listening, we really want to hear from you. Sean and I are committed to reading every piece of feedback that we get. So please leave a comment or rating wherever you’re listening to this podcast. Not only does it help us continue to improve, but it also helps the show climb up the rankings so that we can help other listeners move, touch, and inspire the world, just like you’re doing. Thanks, everyone. We’ll see you next episode.