How to build a Product-Market Fit engine

https://www.dropbox.com/scl/fi/3wc7gymehtmrz3vd1o1bv/Product-Tank-Building-a-PMF-Engine.key?rlkey=tsd8bth189v6vovug7x3jvequ&dl=0

https://app.fireflies.ai/view/Javi-talks-at-Product-Tank-Malaga::m1Hulm06iELw60Xy?channelSource=all

Speaker 1: 00:00 You know, why is this talk interesting for you? It's about product-market fit engines, or the process that powers them, which is product-market fit discovery.

As I was saying, it's a repeatable framework that increases the likelihood that you will build a product people want before you run out of money.

And those are very specific words. "Likelihood", because most startups fail to build a product people want, and that's really the aim: to find out what people want and then build something that meets those needs.

And then the classic, especially if you're in an early-stage startup or in a project with a very fixed budget: "before you run out of money". There's always a clock ticking in startups and in projects. So the goals for today are to provide a first-principles understanding of what a product-market fit engine and product-market fit discovery are, and then to walk through a bit of the process, at least the first layer; the second layer is a bit too technical for today.

-

Quiz: What is your favourite tool to do X? (example)

-

So first I want to kick it off by understanding what progress really means. Think: what are the keys to victory? What does success mean in product-market fit? So let's take it to the highest level, and then we're going to bring it down into the tactics.

Because as product managers, we can lose sight of what the real goal is. And specifically with a startup, it's that we're creating a financial product, either for you or for your investors, right?

Because if you're going to grow your company, you're going to need resources. And if you need resources, either you need the revenue to invest in the team and the resources, or you're going to have to raise money from investors, right? So one way or another, you need to make it work. 

Speaker 1: 00:00 So you know, one of the cool things is that you can think about it in these three sections, right? You're creating value with your demand, you're delivering value through fulfillment, and you're capturing value with your business model. So the idea is that you need to create more value than you capture, and you need to capture more value than it costs you to deliver that value, right?

The way that we're measuring success is through a word called traction, and the idea is that it's repeatable. What's traction? It's the rate at which your model can capture monetizable value from its users. Whether you're charging users or exchanging signups or eyeballs for advertising, it's monetizable value that you're capturing, right?

So the way that we're doing this is through a customer factory.

-

We have an anonymous visitor that goes through a customer factory. We have a distribution model, we've got a product model, and then out of the other side of the factory comes a happy customer and revenue, right? So why is it important to think about it in this model? Because what we're really trying to do in this process is maximize throughput, the number of prospects that are converted into happy users and money, while minimizing the costs, right? And on the cost side, there are three variables inside the cost of acquisition. We've got features, which are actually the number one enemy of product managers and operations. Really, features. And as a product manager, you might say: what do you mean, the number one enemy?

That's literally what we're here to build. But the thing that creates the biggest constraints in the customer factory is always features. The more features you have, the more problems you have, right? So what this is telling you is: what is the minimum number of features that you can build in order to create value? So when you look at big roadmaps and backlogs and all those things, they're not good. You have to look at them with a very critical eye, because you have to be incredibly selective. If not, you become what they call a feature factory, and that's not what you want to be doing. So why is this important? We want to do this before we run out of money. That's important. How many startups close?

How many projects inside big companies also close? Mostly everything closes because we're not showcasing enough value. If something is showcasing a lot of value, normally it's not going to close. So we want to minimize the time it takes us to showcase traction, to show value, right? And there's only one way to do that: we need to reduce waste, and waste is these things that we've seen down here.
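To make the throughput idea concrete, here is a toy model in Python. The stage names, conversion rates and cost figures are invented for illustration, not from the talk; the point is just that happy customers out the back are the product of the stage conversions, so every leaky stage raises the cost per customer.

```python
# Toy model of the customer factory: prospects go in, happy paying
# customers come out. All stage names, rates and costs here are invented
# for illustration; plug in your own funnel data.

def factory_throughput(prospects: float, conversion_rates: dict) -> float:
    """Happy customers out = prospects * product of the stage conversions."""
    out = prospects
    for rate in conversion_rates.values():
        out *= rate
    return out

def cost_per_happy_customer(total_cost: float, prospects: float,
                            conversion_rates: dict) -> float:
    """Total spend divided by the customers the factory actually produced."""
    customers = factory_throughput(prospects, conversion_rates)
    return total_cost / customers if customers else float("inf")

rates = {"signup": 0.20, "activation": 0.50, "retention": 0.40}

customers = factory_throughput(1000, rates)        # 1000 * 0.2 * 0.5 * 0.4
cac = cost_per_happy_customer(8000, 1000, rates)   # spend / happy customers
print(customers, cac)
```

Because the rates multiply, improving the weakest stage (or removing a stage a feature added) moves throughput far more than piling new features onto the end of the funnel.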

-

So the main goal here. There are three phases, right? We've got an idea or a product. We've got three processes: go-to-product, when we're launching an alpha or something; go-to-market, when we're launching an MVP, version 1, version 2; and then we arrive at minimum viable traction, and then scale, right?

So the goal is to arrive there at the right time with the right product, so that we can put the right amount of resources in to go to scale, right?

-

But what does product-market fit even mean, right? There's a lot of discussion about this. There are very qualitative answers around it, there are more studies about it now, but it's still quite open. Even Marc Andreessen, a big guy in the space, pretty amazing guy, defined it as being in a good market with a product that can satisfy that market. Pretty hard to measure that, pretty hard to understand how you're arriving at it, right?

So first of all, how do we measure product-market fit? There are specific signals that we can use, and you can use three different modes at three different stages of where your company is in terms of fidelity.

At the start, where you might not even have analytics, you might not even have enough users for that, you can just use quick surveys, for example the Sean Ellis test, if you know it. You just ask: how disappointed would you be if you could no longer use this product? And the answers are very disappointed, somewhat disappointed, a little disappointed. If a lot of people say they would be very disappointed if your product went away, that's a measure that there's some kind of stickiness there. So that's the first level.

Second level: if you have a running product and you have metrics in there, Google Analytics or Mixpanel or anything, you can start measuring leading indicators for engagement. These are specific actions within your product, whether a user clicks on something specific, a core action. If it's Airbnb, it might be bookings, right?

And then there's the retention curve, which is really the real measure of product-market fit; this is what we want to get to. How many of you are aware of what a cohort analysis is? Does that ring a bell? You're measuring the usage of the product grouped by when users signed up.

So if you have the months of January, February and March, you measure: of the users who signed up in January, how many of them came back in February, March, April? You're measuring them in buckets. And depending on the changes that you make in your product, let's say that in January you make a lot of improvements, then the share of the February cohort that comes back in March will be higher. So you're improving consistently, and you can show it with this type of analysis. That's where we want to get to, and ideally you want to see what they call a flattened retention curve. Okay, so now we know a little bit about what product-market fit is and how we measure it, right?
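As a concrete sketch, the cohort analysis described here takes only a few lines of Python. The data below is made up for illustration; a real version would pull signup and activity dates from your analytics store (Mixpanel, Google Analytics, or your own database).

```python
# Minimal cohort-retention sketch: group users by signup month and
# measure what fraction of each cohort is still active in later months.
# Users and months are invented for illustration.

from collections import defaultdict

# user -> signup month
signups = {"a": "2024-01", "b": "2024-01", "c": "2024-01",
           "d": "2024-02", "e": "2024-02"}
# user -> months in which the user was active
activity = {
    "a": {"2024-01", "2024-02", "2024-03"},
    "b": {"2024-01", "2024-02"},
    "c": {"2024-01"},
    "d": {"2024-02", "2024-03"},
    "e": {"2024-02", "2024-03"},
}
months = ["2024-01", "2024-02", "2024-03"]

def retention_table(signups, activity, months):
    cohorts = defaultdict(list)
    for user, month in signups.items():
        cohorts[month].append(user)
    table = {}
    for cohort_month, users in sorted(cohorts.items()):
        start = months.index(cohort_month)
        # fraction of the cohort active 0, 1, 2... months after signup
        table[cohort_month] = [
            sum(m in activity[u] for u in users) / len(users)
            for m in months[start:]
        ]
    return table

table = retention_table(signups, activity, months)
print(table)
# January cohort decays (1.0, ~0.67, ~0.33); the February cohort stays
# flat at 1.0 -- the "flattened curve" you want to see after improvements.
```

Each row of the table is one cohort's retention curve; comparing rows shows whether product changes between cohorts actually improved retention.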

-

So then we move on to constraints, right?

Speaker 1: 00:00 Like, what are the things that are limiting us, what are the obstacles to product-market fit? Paul Graham, founder of Y Combinator, said there's just one mistake that kills startups: not making something users want. And 42% of startups failing for that reason is pretty significant, right? So it's really telling us what the main failure modes are that we need to watch out for.

So this is the traditional route to product-market fit, or rather to low or no product-market fit. We've got traction on one axis, and again, this could be measured by your survey, your metrics, your cohorts, and progress on the other. What regularly happens, and I'm sure a lot of you are aware of this, no matter if it's an early-stage company's first product or a new product that you're launching: you build an MVP, that's pretty classic. Then not many users retain and keep using it. So you get a bunch of feature requests, and you start building all those feature requests. And then you're like, oh, we don't know if we should add this or that, and somebody comes in with, we should also add this. And then your releases become bigger. Maybe you get a bit of a spike, so you get excited, but that was maybe just some newness in the market. And you keep making the iterations bigger and just kind of coast along, not really progressing, and the product gets bigger and bigger. Then your instrumentation systems degrade, it's harder to get data from people, you have many systems, you get a bigger team, and then it's just chaos. You can't really understand what's going on, and everybody's just winging it and hoping for the best. That's very, very classic, and it happens even with millions of users. I've been in experiences where we had 4 million users and there was some kind of fit, but we weren't getting enough revenue to get off the investor treadmill. And then, okay, let's build the feature requests, this is what should be happening, and somebody else comes in, and, you know, it's very dangerous. So there's a real psychological process happening around this. And why are these things happening?

So the first part of the problem is that we're choosing the wrong market, meaning the customer segment or the job to be done. And it can be multiple layers: you could be hitting the right top-level customer job to be done and still get it wrong below that. Again, if you're doing an Airbnb, renting homes to people, it might just be that you're targeting the wrong segment; it might be that you're going for people who don't have the budget. So that's choosing the wrong market: you're choosing the wrong segment. Or you have the right segment but you're choosing the wrong needs, problems that aren't important enough. Once you have that, you've got the wrong requirements, so you're building the wrong product, right?

You're building the wrong product. And what happens after a while? This unfortunately happens: you've got investors, you've got people that you report to, and you've got to send out dashboards and growth charts. And then what happens? Oh, where can we find value? Somebody's head is going to get cut here. So where are we looking? Okay, well, we've got a lot of signups. Let's showcase signups, they're exploding, right? So you show them that signups are growing. And what happens after signups are growing? The next quarter you have to bring in even more signups, because your product's not improving. Or maybe you're hitting great activation, but you're not hitting the retention numbers that you need. So then your team optimizes to improve those metrics that are already working, right?

And that's real trouble, because again, it's not that nothing is working; it's that you're not advancing the overall retention chart. So ultimately you're running out of money, you've got to raise money, and then you prematurely scale. And again, this can be applied to anything. If you think about how you normally launch a feature, it's the same exact approach, just with one feature: you might choose the wrong person, the wrong feature, some details are wrong, and then maybe you think that you have feature fit and you don't. Maybe you've got a lot of people trying it, a lot of people coming back to it, but there's no actual aha moment there.

So you pretty much scale it up to your whole user base, and it's just adding more problems into the mess, right? So what's the issue, and how does that look here? We have our go-to-product, go-to-market, go-to-scale. What regularly happens in the exploration phase? We pick the wrong market, we start building the wrong product, we chase the wrong means of growth, and this is the result, right? There's a report that something like 60 or 70% of startups prematurely scale. So you put in all the money, let's get on that rocket ship, and then most of them die, right?

Speaker 1: 00:00 Remember how many examples there are of big companies that raised like 2, 10, 20, 30 million and then, all of a sudden, six months later, just crash down. This is exactly the issue. And again, if you think about it at a feature level, on a day-to-day basis, it's the same exact thing. So what's the summary of this problem? This is the one-more-feature trap, right? And it's counterintuitive to everything in product management, because again, product management is about prioritizing features, delivering features. So: we don't have enough users, so we add more features. We still don't get enough users, so we add more features. You keep going in this loop, and then you move on to a new feature, and then you prematurely scale because you have to.

-

And then, you know, you're dead, you just don't know it yet. So what's pre-PMF discovery, product-market fit discovery, and why is it the safest way to go? Just some quick facts, taking a high-level view on why it's not a personal thing. I think people take it very personally, and we shouldn't. It's really hard to choose the right market and to make the right choice. These are real numbers on early-stage venture returns: you can see that something like 0.4% return 50x or more, and 1.1% return 20 to 50x. The main angle here is that it's really hard to choose the right thing, so most likely you're going to choose something wrong. The second thing is very important, and I think the new way of product management really does incorporate a lot of it.

-

But still, we have to remind ourselves of this, and I myself have to remind myself: anything that I come up with is just an assumption. It could be gold or it could be crap, because I don't yet have data that's validated in the market. It might be a great bet or a great guess, but it's still a guess, it's still a bet. So we have to take a step back. Next thing here: our job then becomes to quickly, cheaply and accurately locate value, in this case across the market, the requirements and the product, right? Because if we take our business model canvas, everything on it is an assumption. We want to be converting these questions, these assumptions, into validated data, in a stepwise process, right?

And you could never do that otherwise. Imagine juggling all the variables if you don't do it sequentially, with a structured process. So this actually becomes positive: our real job becomes to test and discover the best models and discontinue the rest.

-

Failure is a big part of the process, and this is a big cultural issue in many companies, because the metrics by which you are assessed are regularly not discovery metrics. And it's a big problem. For example, I was leading the venture builder for Kraft Heinz, the company that makes the ketchup and so on, and we were reporting directly to a CRO. So we were launching new, innovative companies for Kraft.

But the metrics by which they were evaluating the process at the start were things like cost of acquisition. It makes no sense: how much does it cost you to get a customer, at the very start? That's fine if you're a bit further down the line, but when you don't even know who your customer is or what you're delivering to them, it doesn't make sense. So culturally it's an issue, and we have to be aware of that. Of course, if you're a startup you're mainly in that discovery space, specifically at early stage; if you go more towards Series B, you'll definitely have some of these issues. But this is the main cultural thing that we do need to implement inside the company.

We need to have an exploration and discovery mindset, which means that our goal is to test a bunch of things methodically and then slowly find the model that we should scale.

-

So there are two fundamental premises of discovery. And I always love to go to real examples, because I found in my journey that everybody uses words about product-market fit and such, and honestly, to me they had no meaning. So I tried to find which other practices out there do this process at an incredibly sophisticated level, and I got really into oil and gold exploration and rock mining, because these guys literally do this process. And if you see how those companies operate, it becomes pretty apparent that some of the ways we regularly act are pretty crazy, right?

So the first rule is that we must locate value before we try to extract value. In their case, or better said in our case, it would be pretty crazy if we just went out to the middle of the sea, picked a spot, built a four-billion-dollar oil platform, and then checked whether there's actually oil down there. That would be absolutely crazy, right? What they actually do is send some inexpensive ships out first. They send signals out, the signals return, and you get a map of what the seabed looks like. If there's value there, if there are signals of oil, then you start doing drill tests, and there are a bunch of other things you do before you build the platform.

And normally, what do we do today? We build a product first, or we build a feature first, without doing the right exploration and the right discovery. It's just that for them it's billions of dollars, while for us it's €10,000, or much more in some cases; you might have a half-a-million startup. And because it's software, we think it's not as serious, but this is exactly what's happening.

And the next one, equally important, is that we validate, always at small scale, before scaling. So again, this is how they do it: first you put one platform in place, and you're validating your risks, not only that you can extract the value, but that you can build the platform and that you have the right place to put it. Once you've covered all of those risks, then you start putting more platforms in place. First you find a fit, you locate and extract the value at small scale, and then you scale up the operations.

So with those two in place, here's how discovery looks from a first-principles perspective: first you start with modeling. You model the demand.

You try to understand, not through market sizing but with real market contact, what the market is, what the willingness to pay might be, what problem you're solving, and how important it is for people. And once you have some hypotheses around that, you do sample runs, small runs. You don't build a whole product that takes three months; you try to do something very small, at low fidelity. We're going to see a few things later about how to do this. But I want to make sure we keep these concepts, because if you understand these concepts, you can translate them to your own context and your own product without me showing you the example. The tactic is really unimportant.

What's really important is this approach and this level of thinking. So, in some of these specific phases, you release a small stand-in: you do a prototype, you can do a skeleton scope. Very few features, very crappy design, very low performance, most likely not even fully working; you kind of fake it, you quickly put it together, and you find 10 customers. You don't need a thousand users or 5,000 users. I mean, we've had waitlists of 50,000 users in the past. Why do we have a waitlist of 50,000 users? We can't put 50,000 users into the factory. We just need 10 people to get it started. And once we have that, then we start going to bigger scale.

-

Version one, version two: you increase the scope, you polish, you add the design, you add more users, et cetera. So let's bring it down to us. Tactically, what's the process? Something very important at the high level before we get started: when we talk about identifying the right market, we should also talk about the right part of the market. There's another important segmentation, which is what type of users are in that market. I'm sure you're aware of Crossing the Chasm, the book by Geoffrey Moore. It touches on a concept he calls the innovators: the hobbyists, the people that are really excited, working on something that nobody cares about yet. And then you've got the early adopters.

Early adopters are the type of people that, if you approach them with a product that's unfinished, are very excited about it and willing to try it; they don't care that it's not finished. They're willing to give you some money because they're innovative, and they don't need too much reassurance on some things. And that's the type of user we want to kick off our tests with. Why? Because our first releases are going to be really crappy. We're not even going to have releases at the start; we're going to create a napkin sketch, a wireframe or a mockup. So we want early adopters.

Then there's the early majority, where a lot of the market is; that's the bulk of the market, or half of it at least. We don't want to target those people, because when you ask them, hey, is this a problem for you, is this a product that could be interesting, they're going to say: this design sucks, or what about privacy, or what about security? The early adopter doesn't care about any of those things. They're just trying to solve the problem; they want a quick solution. So those are the people we want to target. And I've seen a lot of problems when launching something where missing that distinction is really fatal, because if you contact 20 people and 10 or 15 of them, because statistically there are more of them,

Early majority. And when they say that the product is crap, you're going to get demoralized, or you're going to get feature requests from them, build them into your product, and they're not going to buy anyway, because they have big risk aversion, and you start this one-more-feature cycle. So always try to identify the early adopter. As they say, it's a person who has a problem, is aware that they have the problem, is actively looking for a solution, most likely has stitched some other solution or a few solutions together to make it work, and has a budget. That's the definition of an early adopter.
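That early-adopter definition (has the problem, is aware of it, is actively looking, has stitched a solution together, has a budget) can double as an interview screening checklist. A minimal sketch, with criterion names of my own choosing:

```python
# Screen interviewees against the early-adopter definition from the talk.
# Criterion names are invented; map them to your own interview questions.

EARLY_ADOPTER_CRITERIA = [
    "has_problem",               # experiences the problem at all
    "aware_of_problem",          # recognizes it as a problem
    "actively_seeking_solution", # is looking for a fix right now
    "hacked_own_solution",       # has stitched something together already
    "has_budget",                # could actually pay for a solution
]

def is_early_adopter(interviewee: dict,
                     required: int = len(EARLY_ADOPTER_CRITERIA)) -> bool:
    """True when the interviewee meets at least `required` criteria."""
    score = sum(bool(interviewee.get(c, False)) for c in EARLY_ADOPTER_CRITERIA)
    return score >= required

prospect = {
    "has_problem": True,
    "aware_of_problem": True,
    "actively_seeking_solution": True,
    "hacked_own_solution": True,
    "has_budget": True,
}
print(is_early_adopter(prospect))  # True
```

Requiring all five by default is deliberately strict; talking only to people who pass the screen is what keeps early-majority feature requests out of your first tests.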

So once we find that early adopter, we want to discover their hair-on-fire problem, as they say, the vital job to be done. Then we want to deliver a core MVP, a very small minimum viable product. MVP is an established term already, but I think it's blown up; MVPs are absolutely gigantic these days. So the idea is to re-identify one core use case, a very small thing that you can deliver, and then you start measuring product-market fit, like we talked about before. If you have high product-market fit, regularly you're going to start getting word of mouth, and that's going to drive you to the early majority. If not, the main idea of testing is that we have these loops that we're trying to discover around. So remember what we talked about: our job is to explore and test. This is where it starts. Is this an early adopter? Is this a hair-on-fire problem? And then we look around: this is not it, this is not it, this is it. When we find something, we go to the next level and look around again: okay, for this early adopter with this problem, I want to build this feature, this core MVP. Does it have retention? Does it work? And then we loop around it; we're constantly looping around these items.

So here's how this looks in terms of the specific customer factory. The customer factory is broken down into these levels: you've got a reach moment, sign-up, setup, the aha moment, habit, and then a referral moment. Are any of you aware of the pirate metrics of Dave McClure? Does that ring a bell? That was a very cool system; I used it for a long time. But this is a mix of that and some Reforge, I don't know if anybody is aware of Reforge. They have the setup, the aha moment and the habit, and I used that, changed a few things, and added a couple more important elements, which hopefully I will publish soon, and hopefully with them.
The main idea here is that most products, and I'm sure you've been in these conversations of, okay, which features should we release, gear almost everything towards the habit moment. And this is the really big problem.

So let's quickly cover the definitions. Reach means that, for example, someone has clicked on an advertisement or has replied to a message that you've sent: hey, are you interested, or do you have this problem? Sign-up is once they've arrived at your landing page and put their email in. Setup is most likely something where you're asking them for information: what did you come here to do today, or what type of products are you looking for? Anything that brings you to the aha moment, which is really the key in all of this. There have been a few definitions out there, but the one I find most useful is: the moment when the user finds real value. That's why I was asking before: if you use WhatsApp or Outlook or Calendar, when is that moment where you're like, holy shit, this is really something?

A quick example for me: the scooters, the Lime scooters, you're aware of these, right? In Berlin, I hated these scooters. They were everywhere. I had a real hate for them. I thought they were making the city ugly, and I couldn't walk. And one day I had to go to a meeting, I was really late, and I saw a scooter. I thought, I'm not going to make it, even an Uber is not going to make it in time. I took out my mobile, I signed up in a moment, the setup time was incredibly fast, I got on the scooter, and I arrived there on time. I was really worried, it was a very important meeting. And then I arrived, and I had this glow after the meeting of, oh my God, what just happened? It saved my meeting.
And then all of a sudden, a whole new world opened up in terms of exploring Berlin, and I had this realization that I loved scooters, and I cared a lot less about them making the city dirty, because I could get anywhere. So that's the aha moment.

The habit moment is the next time I was going to use it. What other things are important for me to come back and use it? There are your classic notifications, which are a very simple way of doing it, but there are other mechanisms. There's normally one core feature, one very important feature that makes you stick, and then there are other features that you start using. If you're a designer using Figma, there might be something very specific, and once you're locked in and using that: oh, it also has this, it also has this. So most products are building this long list of features for habit: everybody says let's do notifications, let's do gamification, let's do that. But you need to get to the aha moment first. That's the most important thing in the customer factory: we've got to work backwards from the aha moment. That's what product-market fit engineering does, to put it really simply.

Here is the process that we talked about. There are two modalities that we're in: model and release. In model, we have our hypotheses, our guesses, our bets. And in release, we want to have some kind of plan, not too far out, but some kind of plan for what we're going to test. Then we check if we have PMF; if not, we go back to our model and then we release again. And this is our fail point here: we cannot go into the exploitation phase or scale. If you want to extend your lifetime a little bit, you could do that, but you're still going to die. So you shouldn't scale until you have good measures of product-market fit.
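The customer-factory stages described above (reach, signup, setup, aha, habit, referral) can be instrumented as a simple funnel. This sketch, with invented counts, computes stage-to-stage conversion and flags the biggest drop-off, which in this example is the setup-to-aha step, the one to work backwards from:

```python
# Customer-factory funnel: count users at each stage, compute the
# conversion between adjacent stages, and find the weakest link.
# The counts below are invented for illustration.

STAGES = ["reach", "signup", "setup", "aha", "habit", "referral"]

def stage_conversion(counts: dict) -> dict:
    """Conversion rate from each stage to the next."""
    return {
        f"{a}->{b}": counts[b] / counts[a]
        for a, b in zip(STAGES, STAGES[1:])
        if counts[a]
    }

def biggest_dropoff(counts: dict) -> str:
    """The adjacent-stage transition with the lowest conversion."""
    conv = stage_conversion(counts)
    return min(conv, key=conv.get)

counts = {"reach": 1000, "signup": 300, "setup": 240,
          "aha": 60, "habit": 30, "referral": 9}

print(stage_conversion(counts))
print(biggest_dropoff(counts))  # setup->aha (0.25) is the weakest link here
```

With numbers like these, the data says the same thing as the talk: don't pile on habit features while only a quarter of set-up users ever reach the aha moment.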
And that's really a lot of the work with startups: mainly getting this model internalized. Because everybody, and the founders, high up and disconnected, will say, I don't care, we need to move, I've got people to answer to. I'm working with some guys in Switzerland right now, and they understood it. When you really explain it to them, what they ultimately understand internally in the company is: hey, we can do it, but we're going to find ourselves in a worse-off situation, having invested all of this money, and there's going to be more complexity at that level, so pivoting or making the right changes is going to be almost impossible. Really important, right?

So, this model-and-release loop. I kind of created this. How many of you work with build-measure-learn, from Eric Ries, the Lean Startup? Very classic. I love it, absolutely love the Lean Startup; it got me started on a lot of this journey. But I think it could do with an update, and this is the one that I created for myself, where you're either in modeling or in release. Each of those has sub-elements, and to get started, which is what we're going to cover today, let's run through them. In modeling, we're going to cover identifying the job-to-be-done gap, defining the value proposition, and prioritizing risky assumptions. And in release: prioritize atomic tests, release the test, and check the data.

So, model assumptions. The first part. Ultimately a lot of people start here, and that's fine.
You guys remember the value proposition canvas: you've got your customer segment and you've got a product. It's a good starting model to get a high-level understanding, with a lot of people, of who your customer is, what the jobs to be done are, what needs they have on the negative and on the positive side. And then on the other side, the main idea is that you're creating a solution with features that are directly tied to solving those specific problems. So you can very clearly see if there's a mismatch, if you're just making stuff up. With a lot of the teams I work with, if you see the customer side is empty and the product side is pretty filled up, you're like: yeah, well, you've got to go back and do some exploration, because you're not familiar with the problem we just described.

I think a lot more powerful is this. In the Techstars workshop we don't cover it, for example, we start from here; but in the Founder Institute we always start with this: the jobs-to-be-done map, which is a completely customer-centric map of the jobs, the actions that need to be taken by a user, irrespective of your product. So if it's Outlook Calendar, we're asking: what would you do without it? You need to know what meetings you have, you need to know what the times are, you need to know who the guests are. Those are the jobs. If Outlook didn't exist, you would still need to do all of that. It's important to understand that level, because that's where you can have real innovation. A lot of the products out there are just little improvements, but when you start from the job to be done, you can come up with an innovative way to deliver on that job, right?
So once you understand the job steps, you look at the hires, the existing alternatives: the different ways by which you accomplish those steps today. Maybe it's paper, maybe you write them on your mobile. And then the unmet needs: what are the problems in terms of time, effort, accuracy, cost? Once you have that, you want to map it.

Let's cover a real example here, which I think helps with what's coming afterwards. This is a real company from the Customer Development Lab by Steve Blank, who was the mentor of Eric Ries. It was a company trying to deliver drones for farmers to capture data. They thought that farmers would need to manage and treat their fields, because they wanted to forecast production, and they'd need that because there are a lot of diseases and they want a profitable harvest. All farmers want a profitable harvest; they want to avoid disease. So they thought: okay, we're going to get drones, these drones are going to fly over the fields and capture hyperspectral image data, and that's going to improve the forecast and help with targeted interventions. This was the model they came up with. Very, very exciting model.

The first thing, from first principles, is knowing that these are assumptions. They're question marks, they're bets. So we want to prioritize these assumptions in a way that reduces risk at the start. There's really a process: we identify them, we evaluate them, we prioritize them. (By the way, how am I doing on time? Yeah. Okay, maybe I can speed up a little bit.) So the first thing we want to do is ask ourselves: what needs to be true for this idea to work? It's a very simple exercise, actually, more mental reframing than anything, and you can do it as homework.
So you're working with "manage and treat fields", right? Instead of just acting on that, you take a step back and say: well, for that to be true, farmers have to be interested in more data for their fields. That's a statement we can now actually go and validate: are farmers interested in this or not? For example, we want to help them with diseases; the assumption is that farmers struggle with diseases. That's very easy to validate: you go to a farmer and ask, are you struggling with disease? How often? What's the consequence? What's the cost? And then you can validate the problem with them. Not only that: if we want to do targeted interventions, the assumption is that farmers will use data for interventions. Again, you go to the farmers who told us they're interested in the disease data and ask: will you use this data to help you with interventions? They say yes or no. All of these things we're doing right now involve no product. We don't need to build anything to validate any of this. Just that simple reframing really kicks off the process. So we take all of those, and there are others: that they have a budget, that they're willing to pay, that we can actually do this technically.

When we look at assumptions, one of the mistakes I see a lot is that people treat them all equally. You have to break them down; you have to rank them according to the category they're in. We saw this at the start of the presentation, if you guys remember: demand, business, fulfillment. The first, most important thing we need to validate is demand: the problem first and then the solution afterwards. Then fulfillment and business, depending on the company. But regularly what happens, especially if you're an engineer or a product guy, is that you start coding and building.
Let me see if we can put this together, right? Let me do a technical spike. And you start pouring in all this effort, and it's like: well, you don't even know if they have the problem. You don't even know if they want the solution. You don't even know if they're willing to pay for it. So always demand first. That's critical.

So there's the canvas used by David Bland; I think he came up with the first version. Once we have all these assumptions, we want to evaluate them, and the way we evaluate them is by how important they are and how much evidence we have. Importance we already covered: problem first, then solution, then fulfillment or business depending on the company. And then the evidence. The lowest level of validation is opinion. It doesn't matter if you've been in the business for 20 years; great, it's a great opinion, and it could be a good one, but we still need to validate it at lower levels. Second is market data or expert data. What we're really looking for is qualitative proof or, even better, quantitative proof. Qualitative proof could be you talking to people and getting their impressions, whether it's customer interviews or surveys. Quantitative proof is when you have a numerical result, ideally around what people are doing, not what they're saying. So if you have an app and you're using Mixpanel or Google Analytics and you have numbers, say 400 people signed up, that's going to be better than "I've been in the business for 10 years and people will definitely sign up." That's the level of difference we're talking about. In between is "I've spoken to 20 people and 10 of them told me they have a four-out-of-five level of interest in using the product once it comes out." That's what we want to think about. So the idea is that we're placing these on the canvas, right?
We ask those questions, we take those assumptions and we put them on the canvas. At the top, the things I circled in yellow are the things we want to test first. At a high level you can see we're getting a roadmap here; it's pretty clear what direction we need to take. So at the top, and again this is a project just starting out: farmers are interested in data for their fields, highly important, no evidence yet, but they can speak to farmers. Farmers struggle with diseases, same category. Farmers want to forecast production. Then the solution: they haven't offered them a solution yet, so we still need to validate that. And the ones where we do usually start with data, like whether the drones can actually capture the data, all of those sit much further to the right.

So how can you bring this into your current process? We all have product roadmaps. The main problem is that in most product roadmaps, a lot of the elements right at the top are not actually driven by what we'd call this assumption program; we're targeting items that are incredibly risky and incredibly wasteful. When we look at their product roadmap, it was: one, build drones; two, engineer a super complex solution; three, create an API for the data pipeline; four, try to sell to farmers. You can see that all the first items, the ones that take a long time and are super expensive, are all unproven. So this is how we can bring it into our organization: there's a high chance the product roadmap is quite expensive and wasteful, and this canvas is what's going to help us guide it. I think this one we're going to skip. So now we know what we're targeting, which assumptions we need to go after and test next.
So we've done the modeling; now we go into the second part, release: prioritize atomic tests, release, and check the data. A little bit faster through this. The main idea is that we're launching what we call minimum viable tests, which is just a play on the words "minimum viable product".

We're launching an experiment. We have a question: would farmers pay for this product? We don't know, so we need to treat it as an experiment. We use the scientific method: assumption, then experiment; if it hits the number, it's true, if it doesn't, it's false. And then we iterate. This is how we turn those question marks into checks. Why do we want to do it this way? Because we're going to learn faster and cheaper. In the case of these guys, or any of our projects, the idea is that you bring items that can take maybe 30 days and cost a thousand, five thousand, ten thousand, a hundred thousand, down into much faster cycles. We're going to get the answer faster and cheaper. That's a pretty good selling point, I think: you hit your objective, you do it faster and much cheaper. In this case, they wanted to first buy the drones, buy the cameras, buy the software and engineer everything. That's going to take them forever and it's a lot of money, and you want a small scale at which you can deploy the experiment. So what they came up with was: instead of buying drones, let's hire existing planes. There are already planes flying over the fields, right? So let's hire those, attach some cameras, fly over the fields and then hand-process the data. No APIs, no integrations, nothing.
And we're going to deliver the value and then see if it's useful; if not, we iterate. So, main idea, we already talked about it: we have these test cards. This is what Strategyzer uses, and I think it's a great way to get started. We don't use this as such right now, we have something a bit more sophisticated, but it's a great way to kick it off inside the organization. You have a card that lets you document the hypothesis. A hypothesis is, for example: customer segment one struggles with the problem. So, farmers struggle with disease. The experiment is: we're going to talk to 50 farmers, and after we talk to them we're going to send them an email that drives them to a page, and if they enter their email there, that's the experiment. We measure how many people click, how many sign up, how many pay, and the hypothesis is validated if 10 or 20 people go through. By the way, this threshold is more art than science, to be honest, because it's very hard to get benchmarks at the start for these things. It's more about extremes: if nobody gets through, you really need to make a change; if a lot of people get through, you're on to something. But yeah, there are still a lot of conversations around that.

The cool thing about this process is that right at the start, and I'm sure you can imagine it, you have your boss saying "I want you to do this idea", then somebody else comes up wanting to do that idea, and then somebody else gets mad because they're not included. With this, you start the process by getting everybody's ideas: open table, everybody writes all their ideas down. This is great even if somebody just has terrible ideas, especially if it's your boss, because you put everybody's ideas through the same process, and that's going to show which idea is better.
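A minimal sketch of this kind of test card, assuming nothing beyond the fields mentioned in the talk (hypothesis, experiment, metric, pre-agreed pass threshold); the class name and the farmer numbers are illustrative, not Strategyzer's actual template.

```python
from dataclasses import dataclass

@dataclass
class TestCard:
    hypothesis: str   # "We believe that <customer> struggles with <problem>"
    experiment: str   # what we will do to test it
    metric: str       # what we will measure
    threshold: float  # pass mark agreed BEFORE running; early on, look for extremes

    def evaluate(self, observed: float) -> str:
        """Turn a question mark into a check: validated or invalidated against the number."""
        return "validated" if observed >= self.threshold else "invalidated"

card = TestCard(
    hypothesis="Farmers struggle with crop disease",
    experiment="Talk to 50 farmers, then email them a link to a sign-up page",
    metric="sign-ups",
    threshold=10,
)
print(card.evaluate(14))  # validated
```

The point of writing the threshold down on the card up front is that the result is binary and pre-committed, so nobody can reinterpret a weak signal as a pass after the fact.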
You put in all the ideas, everybody feels included, and then you prioritize them; I'll show that in a second. Then comes what I call atomizing the tests. You grade all the tests on these evaluation criteria. For example, in the case of the drones and hand-stitching the data: the time it's going to take is long, so three out of three; the cost is really high, three out of three; data reliability is really high in this case; and criticality is low, because you're not targeting the riskiest individual hypotheses there. Now put the other test next to it: we believe farmers struggle with this disease, and to verify that we're going to speak to 20 of them and measure X. Very fast: you go to a farm, and in five days you can talk to a bunch of farmers. The cost is nothing, or pretty much nothing. Reliability is much lower, because you're believing what they're saying; that's the trade-off, you're trading reliability for speed and cost, but it's still good insight to take you to the next level. And criticality is very high, because you're targeting the highest-level assumption, which is the problem.

Once you have that, the next part of the process is to lay out your experiment. You map it out with a user story map, and then you storyboard it by front stage and back stage: the front stage is what the customer goes through, the back stage is the operational action you need to take. In the case of the drones, the first step of the experiment was that the farmers need to receive the drones, then the drones scan the field, then the farmers receive the data. And in the back stage, you have to buy drones, program the field, integrate the data. What we do here is map out the constraints to learning, right?
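The atomizing scores above (time, cost, reliability, criticality on a one-to-three scale) can be folded into a rough priority order. The formula here, criticality and reliability in the numerator, time and cost in the denominator, is my own assumption about how you might weigh them, not the speaker's exact method; the test names and scores come from the drone example.

```python
def score(test):
    """Cheaper, faster, more critical tests first; reliability acts as a bonus.
    Scales are 1..3 as in the talk (3 = slowest / most expensive / most reliable / most critical)."""
    return (test["critical"] * test["reliability"]) / (test["time"] * test["cost"])

tests = [
    {"name": "build drones, hand-stitch data", "time": 3, "cost": 3, "reliability": 3, "critical": 1},
    {"name": "interview 20 farmers",           "time": 1, "cost": 1, "reliability": 2, "critical": 3},
]
tests.sort(key=score, reverse=True)
print(tests[0]["name"])  # interview 20 farmers
```

Even with high reliability, the drone build scores low because it is slow, expensive, and not aimed at the riskiest assumption; the cheap, critical interviews win, which is exactly the "demand first" ordering.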
The main idea is: which steps here cost a lot of time or a lot of money? We can see very quickly that buying drones is very expensive, programming the field is going to take time, integrating the data is going to take a lot of time. So what do you do? You swap those out: you come up with specific ideas that get you the same learnings but much cheaper. In this case: attach cameras, use hired planes to fly over the fields, and then hand-stitch the data so the farmers receive it. All of a sudden, instead of an experiment that's very slow and costly, you have one that's very fast, and you're still going to get very good data.

On a hyper-tactical level, there really aren't that many tests to do. There are always posts about "the hundred tests you can do" and whole libraries, but there are really about six things. If you're trying to understand the problem: do customer interviews first. Always. No matter what anybody says, even if people say "we can't talk to them", we do customer interviews. That's it. It's really simple: the fastest way to verify whether somebody has a problem is to talk to them. You speak, you ask the question, and if they say yes, they have the problem; if not, they don't. Don't start by staring at data. I've gone through a lot of metrics and survey data at the start, but really, it takes five people and about two days to do interviews, and it's incredibly, incredibly useful. You can move on to surveys afterwards, which is great, but I would never replace interviews. My background is business intelligence; I deployed really high-level analytics systems, and I completely realized I was wasting so much time trying to get that level of understanding. Of course you measure where the constraint is with your analytics, but once you know where the constraint is, you've got to go to your interviews.
That's what's going to tell you the why, not just the where. For the solution, you do a demand test. Number one, you talk to somebody and ask: would you like this? Number two, you put that sentence in a form and send the form to them, which is like a low-level landing page: would you like this, yes or no? Next, you do an actual landing page, which involves more design. After that, you do clickable prototypes: just mocks or wireframes, they don't even need to be designed. Once you're getting those validations, then you go to a Wizard of Oz test. Ideally, unless you have a hyper crack team that develops incredibly fast, or your feature is very small, I would always do some kind of Wizard of Oz or concierge test. And then, lastly, you actually develop the feature. I would always go in that order. If you're building something like this and you haven't validated your problem, you're wasting effort.

Last thing, super important; I only included this last year, and we're almost finished. Product people especially, we love to think that building the product is the most important thing. But again, this is a factory: if there are no users going into the factory, you have no way of validating your product, which is even worse. There's a channel that brings users into the factory, and you need that channel. So you need to validate your assumptions for distribution at the same time: your minimum viable distribution. I don't know if you guys are aware, but Brian Chesky from Airbnb kicked off a really big discussion around product managers and product marketing managers: product management is dead, we're going to give product managers product marketing responsibilities. I think people might not have understood the real issue.
I think the issue is: don't build if you can't put users in, because then you're wasting everybody's time. So the idea is that we need to couple distribution with the product.

Speaker 1: 01:14:22 How do I get users into the feature or the product, and what is the feature I'm going to get them to use? Right at the start, most of my conversations are not only about what we're going to build, but: are we doing design partners? Are we getting a pilot? Is it a free pilot or a paid pilot? Are we starting off with just freemium? A lot of these questions have nothing to do with the product but are a complete constraint on it. The product is there, but you don't have any usage, and how are you going to reach that? Lastly, again, these are all first principles: you do things crappy before you do them well.

Speaker 1: 01:15:03 You show this to customers right at the start; you don't even build it. Those are the principles around these things. So once we've done that, remember: you model, you release, and then you go back to the start and ask: this is what we thought, what did we find out? What is the throughput of customers through our customer factory? Do we have a constraint? Where is it: in the demand, in the sign-up, in the aha moment? All right, boom, five-whys analysis: why is that happening? You keep asking, and then you either persist, which is continuing to do the same thing or moving on to the next feature or need; or you zoom in or zoom out.

Speaker 1: 01:15:46 Meaning we open the problem up to a bigger customer base, or we niche down. Or you pivot, you change direction: we're going to target this other customer, or this other customer need.
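The check-data step, finding where the constraint sits in the customer factory (demand, sign-up, aha moment, habit), amounts to locating the worst stage-to-stage conversion in a funnel. A minimal sketch with made-up numbers; the stage names follow the talk, the figures are purely illustrative.

```python
def find_constraint(funnel):
    """funnel: ordered (stage, users) pairs, e.g. visits -> sign-ups -> aha -> habit.
    The constraint is the stage with the worst conversion from the previous stage."""
    worst_stage, worst_rate = None, 1.0
    for (_, prev), (stage, count) in zip(funnel, funnel[1:]):
        rate = count / prev if prev else 0.0
        if rate < worst_rate:
            worst_stage, worst_rate = stage, rate
    return worst_stage, worst_rate

funnel = [("visits", 1000), ("sign-ups", 400), ("aha moment", 40), ("habit", 20)]
stage, rate = find_constraint(funnel)
print(stage, rate)  # aha moment 0.1
```

Here sign-ups convert at 40% and habit at 50%, but only 10% of signed-up users reach the aha moment, so that is where the five-whys analysis should start, before deciding to persist, zoom, or pivot.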
The way we do that, just to finish on the hyper-tactical side: you can run one- or two-week lean sprints. You start on a Monday: this is our model, these are the assumptions, these are the experiments we want to test. And then you release in small batches. That's how you do it. So, that was a lot of content; sorry for hitting you guys with all of that. Imagine how fast that is in half an hour.

Speaker 1: 01:16:27 So yeah, I'll open it up for questions if you have anything, challenges, anything.

[Audience] Yeah. So literally next week I have five meetings with hopefully potential early adopters. How would you structure a 30-minute meeting, if it's a first meeting and I'm just discovering the model?

Yeah, yeah, very good. So here I like the script that's in Running Lean by Ash Maurya; very good script. Number one, try to understand the customer: who they are, what their role is. Number two, what their... actually, I can explain this at two levels, it depends: are you fully in exploration? How long have you been talking to customers? Are these the first five customers you'll have spoken to? Okay, so you haven't had a meeting with anybody about it yet?

Speaker 1: 01:17:33 Okay, then I'll open it up a little bit more. I'll finish the Ash Maurya script and then I'll tell you another way I think you can complement it. Ideally, the most important part in exploration, if you have no understanding of the market, is to really gain an understanding of their jobs to be done. So normally I would tell them: take me through what you're trying to accomplish, what the problem space is.

[Audience] Our assumption is that real estate brokers are facing a lot of administrative work, and we can create a vertical enterprise solution with which we can automate it.
[Audience] And now I'm looking for those workflows to automate in these calls.

Super. So you don't know yet.

Speaker 1: 01:18:25 One of the big questions is which workflow is going to be important to automate, and that's why this is important: if you think step one, step two, step three, you're going to capture it. So: what are the workflows that you currently do nowadays? Take me through what a week looks like, or what the most important workflows you regularly do are. And they're going to talk right from the start. You'd build that map yourself; maybe you can get a transcript, you know, one of these tools that records the meeting, and write things down at the same time, and then you put it in there. You're trying to get, left to right, what the workflows are. All right, so how are you currently accomplishing these workflows?

Speaker 1: 01:19:07 Getting through the steps. And then: which of these workflows is most important to your business results, and which ones are the biggest pain in terms of frustration, time or cost? Once you have that, you can start saying: okay, so it looks like workflow X is one of the most important ones for you, and the one where you have the most problems. Are you looking into finding solutions to do that workflow better? And if they say "yeah, we're actually doing this, and we stitch this software together with that software, but it takes a lot of time", you're like: oh, that's interesting. That's a signal for you.

Speaker 1: 01:19:50 So normally what I would do, and you could do it a faster way or a slower way: you could do a pure problem interview in 30 minutes, which is what I just described, and then in the last five minutes finish up with what we're thinking.
"We're exploring this because we're actually trying to build a solution that would help with this work, so it might be something interesting for you." And then you schedule the next meeting, which I would call the demo or solution meeting: "would it be cool, you know, we're building something right now..."

Speaker 1: 01:20:30 "...some designs; would it be cool if you give me a week or two, then I take you through it and you give me your opinions?" And they say yes. So you do problem first, then get them into the solution. From the solution, you show it to them; you can share your screen: "hey, this is what we're thinking", and something simple, it doesn't have to be too crazy. "Is this kind of what you were thinking? Would this be helpful? From a one-to-five perspective, how much would this help you?" If they say five, great; if they say three: "oh, why is that? What would be helpful? What would make it great if it was there?"

Speaker 1: 01:21:07 Then they're going to tell you the feature elements, you write all of those down, you redesign it pretty quickly, potentially, and then you go on to your next customer. And I think that's the stage you're in right now: that first call, asking "is this person interesting, do they have a big problem?". Then "is our solution interesting?" is your second call. You just have a bunch of those, and you're building your hypotheses. Ideally what you're really doing here is trying to get some numbers: you'll start seeing overlaps in which workflows are the most important and unsatisfied.

Speaker 1: 01:21:55 What I would suggest: be careful with this, because a lot of the time in these interviews we mix different types of people.
So be careful with that, and also be careful with segmenting them. A big company and a medium-sized company: you're going to sell to them quite differently, and ultimately the problems they have are different. So make sure you first identify the company well: okay, how big is the company, how many realtors do you have, whatever it is that defines the company. Then you can slice your data according to that. And when I say slice, it's just a mental slice, right? It's not a survey right now. Is that helpful?

[Audience] Yeah, yeah, super. It sounds cool. I'm just kicking this off now, and essentially I have nothing really; we're just looking for a big idea. I'm pretty hopeful about AI agents, I'm experimenting a little bit, and I found the Y Combinator video that said: pick the most boring administrative work and put AI agents on top of it if it's relevant, and it should be able to...

Yeah, that's a great idea. Vertical AI agents. I think I saw your post. Cool. Good luck. Any other questions?

[Audience] If I would like to release a product that is not yet here in the market, that would be the first question.

Sorry, could you repeat that?

[Audience] If I would like to release a product that's not yet in the market here, how do you mirror information? And the second: how to be durable?

Okay, yeah, good question.

Speaker 1: 01:23:55 I think I'll start with the second question, because I think it's easier: how to be durable. We might be thinking too far ahead there, but I get it completely, I understand why you want to think about it, and I think there are a couple of things in there. You might also be thinking about your business model. If you're thinking about how to be durable: for example, in AI, I'm working with some guys who released something, and three weeks later ChatGPT released the same thing.
So that could be the angle of your question. What that tells us is that we need to define a value proposition that has some differentiation around it. I think there are two levels to the question.

Speaker 1: 01:25:02 Differentiation in the sense that you have some specific capability that others don't currently have. That's important. And if you use the Lean Canvas, so you've got the value proposition canvas and then the Lean Canvas, it has a box called unfair advantage, I think, or something like this; you can tease out what your unique advantage is there. So when it comes to the design, you can think about that. It also ties to your sector. The first question, how to mirror: it depends what business you want to be in. Let's say you're entering a mature market with a SaaS tool, a CRM, let's get specific: you go into CRMs and say, I'm going to make a better CRM.

Speaker 1: 01:25:54 Now with AI agents or copilots, that's opened up a new possibility for differentiation. But if five years ago you'd said "I'm going to make a new CRM", the answer is most likely that you have to niche down, because it's such a mature market, meaning the tools are so developed and there are so many strong features that for you to be durable in this market you need to change your strategy. And normally that means being hyper-specific in your segmentation. So: CRMs for lawyers. If you do a general CRM, either you have like 14 million in the bank or you're not going to get anywhere.

Speaker 1: 01:26:42 But if you do CRMs for lawyers that are just starting up, then you have a differentiation and a capability for putting your foot in the market, because nobody will be able to serve that niche better than you; you're hyper-specific and focused on that market.
So in terms of mirroring, that's the danger: if you're trying to mirror somebody, it means you're not going to have the capability of differentiation. There's a saying in this space; I used to have a slide, actually, that said at the start only copy like 80% of the features and innovate on 20%. I would take it a little further, because I think strategically it's a great frame, but it doesn't work tactically: you shouldn't be doing ten features.

Speaker 1: 01:27:40 You should probably be looking at the one feature, or the one aha moment, that your competitors are currently not doing well enough, and try to deliver that first. So again, let's just get into it: what type of business is it? Are you trying to copy somebody and do it just better, or cheaper, or faster? Or are you trying to do something innovative, different? So incremental or innovation: which one are you going for? Maybe you want to describe it?

[Audience] Anyway, there isn't just one... well, you know, there's much more, but I think the main one is tourism.

Okay, and within tourism, what are you thinking about?

[Audience] You know, when you talked about features: sometimes it can be confusing for the customer to decide what they want.

Audience: 01:28:43 I think, at the end, if you've been working a lot in the industry, that could help you to enrich, you know, also to modify the services. When I say modify the services, it could be an extra to enforce, to push clients or customers to sign up or to do something. Because if it were just one feature, it would be stupid, because there are many aspects that normally could be attractive.

So what is this specific business that you're...?

[Audience] Well, I don't want to be compromised.

Okay, okay. Yeah.
You know, I do have this kind of contradiction: I have a lot of features, and whether they confuse the customer, I don't think so.

Speaker 1: 01:29:40 If they respect the connection on the other side, they can be good value. That's why I think, like I said... Yeah.

I mean, look, ultimately if you go into a mature market, you're going to have to build a lot of features before the customer even considers switching to your product. And that's fine, no problem with that, just take it into account. So in that case, if that's the strategy, you can 100% go and take the best-in-class product for that market. You break it down, you break down the features, you understand what they're doing. You go to Trustpilot and other review sites, you see what people are saying is bad about it, and then you try to identify what they're missing.

Speaker 1: 01:30:38 The challenge there is that you need to come in understanding that it's going to take you a little longer to build that core and repeat. If you have the money and the team, that's fine. But if that's the case, what you need to do first is identify what your differentiation is and come up with an experiment to validate it. Don't build the whole product, don't build the sign-up, don't build the payment. Don't build all of that stuff. Take whatever feature you have and go to them. You could even take screenshots of the competitors, put them in a prototype alongside yours, and ask, "Hey, would this meet your needs?"

Speaker 1: 01:31:27 And when you get validation on that, then you can move into, "Hey, we're thinking of building this. Would you like to be a design partner?" You know, get them into a pilot program or something like that. So that's, I think, a good approach.
That would be the safest approach to it. Does that make sense?

Yeah, absolutely.

Cool. Super. Yeah, great question. Any other questions?

Maybe you have examples from experience? Where you followed this whole process, you had all those printed diagrams, you completed all the canvases, and only retrospectively did you understand where you were biased?

Absolutely, yeah.

Can you provide some?

Yeah, so let's see. We can do a bigger one and a smaller one. For example, one project I was working on maybe eight years ago, called Plug.dj. It's a social music platform.

Speaker 1: 01:32:46 People would go in and you would have these little music rooms with avatars, and people would DJ, and anybody could join. It's a virtual room like this one: somebody is DJing and puts on their song, then the next person puts on their song, and the next, and people chat, and there's a community. There were a few million users and there was some engagement. We were a small team, a product growth agency; we came in, three people, and took over. And a lot of the analysis... I mean, this was before I had a really deep understanding of assumptions. We were doing a lot of usability assumptions.

Speaker 1: 01:33:34 We were working really at the design level, not at the business-risk level. So we did a usability analysis, really well mapped out, and we broke it down in a proto-model. I did it a little bit like this, but I wasn't targeting the same risk. It was about your pirate metrics: your onboarding, your activation, your retention, your referral. We interviewed a lot of people, ran a bunch of surveys, and came up with redesigns, a bunch of improvements that we thought could turn it around.
In this case: get more people to retain, more people to onboard, and more people to pay.

Speaker 1: 01:34:20 So we came up with a pretty solid roadmap and started executing it. We improved a lot of metrics, like onboarding, because the setup was very design-centric and there it worked really well. People were having trouble finding music rooms, and the data showed that if people found a music room, they would retain well. So we got some improvements in retention and habits too. We improved some emails and a bunch of things in the experience. But a lot of the big changes we made, the ones I definitely thought would be most impactful, like the redesign of the room...

Speaker 1: 01:35:02 You know, the usability wasn't great, the play button was hard to find, it was hard to find stuff. So we thought, okay, we're going to do a cool design. We're going to study Spotify and SoundCloud and all these platforms, make some cool things, invest some money in design details. We launched that, and we didn't fix the main problem, which was: how do we get people to pay more? Even if they came back more often, it wasn't sufficient to warrant the investment we had made, honestly. So at that point I realized there was a business-level risk we hadn't targeted at all, which is: are people willing to pay? In our case, people paid for avatars. There were a bunch of avatars, because it's like a little...

Speaker 1: 01:35:56 It's like you all, but as avatars, right? You have your own avatar, and people would buy a different avatar, so now you're a pirate, and so on. And we could have easily tested these things without redesigning anything.
And we could have gone directly to the aha moment, you know, the classic Facebook example. We did try to do the analysis, but I don't think we were sophisticated enough yet. The Facebook growth team identified that people who added seven friends in ten days would come back to the platform and stick. Identifying that metric is unbelievable.

Speaker 1: 01:36:45 Because now everything you do can just push people there. The design can be crappy, the emails can be crappy; if you just get people to that metric, it will work. And if we could have done that... At that time I had just come from a company that was very design-driven and ended up selling to Microsoft, and I came in with this Apple mindset that every design needs to be beautiful. I learned the hard way that that was a mistake. If we had just gotten people to the right room, to play the right song and get the right feedback from people, like, "Hey, that's a really cool song, I really like you, do you want to be my friend?"

Speaker 1: 01:37:30 If we had just done that in a really scrappy, crappy version, I'm sure we would have moved the metrics much faster. We could have asked, "Hey, we have these hundred avatars, which one would you get?" We built a bunch of things, but we didn't really do those. It was eight or nine years ago, and it was a good experience; we got a lot of learnings. But that's why I was saying that ultimately you can be winning in some areas and still not create the metrics you need before you run out of money. In our case the investor money ran out and we needed to raise another round.
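As an aside, the kind of threshold analysis credited to the Facebook growth team here can be sketched in a few lines. This is a hypothetical illustration: the data, user names, and thresholds below are invented, not taken from Plug.dj or Facebook.

```python
# Hypothetical sketch of an "aha moment" threshold search, in the spirit
# of Facebook's "7 friends in 10 days". All data and names are invented.

def retention_by_threshold(action_count, retained, thresholds):
    """For each candidate threshold, compare retention of users who
    reached it early on against those who did not."""
    results = {}
    for t in thresholds:
        hit = [u for u in action_count if action_count[u] >= t]
        miss = [u for u in action_count if action_count[u] < t]
        rate = lambda group: (
            sum(retained[u] for u in group) / len(group) if group else 0.0
        )
        results[t] = (rate(hit), rate(miss))
    return results

# Toy data: songs played in a user's first week vs. 30-day retention (1/0).
action_count = {"ana": 9, "bo": 2, "cem": 7, "dia": 1, "eli": 8}
retained = {"ana": 1, "bo": 0, "cem": 1, "dia": 1, "eli": 1}

for t, (hit, miss) in retention_by_threshold(
        action_count, retained, [3, 5, 7]).items():
    print(f"played >= {t} songs early: retained {hit:.0%} vs {miss:.0%}")
```

The talk's point survives the toy data: once one threshold cleanly separates retained users from churned ones, every scrappy experiment can simply aim at pushing new users past it.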
And it was a tough market at that time, so that was hard to do. So that's one example.

Speaker 1: 01:38:16 I think that... Is that useful?

Yeah. This one is for me. I really like the idea of product-market fit, but say I was at a startup: do you have any quantitative data? How much money should I spend on this? Or, if I do this method, how much money will I save? What's the correct amount of effort to put into it?

Yeah, that's a really good question. I think it depends on your context. Do you work in a big company? Are you launching new projects? Or is it more of a personal interest? Okay, perfect. From a discovery perspective, it depends how much time you're investing in it. Honestly, that's important to say: are you full-time on it, half-time, on the weekends?

Speaker 1: 01:39:15 But you can take, say, two months of trying to talk to people.

Right, but at this level the question is not about me, because I'm not a company. Say you have a startup and it's running, you have a team. Would you expect to save 20%? Because you can use this as a way to test yourself.

Yeah, for sure. So I think the question is well-intentioned, but we probably need to refine it a little. Let me see, I'm trying to think of a slide that gets at that. It's a question that comes more from a production standpoint than a discovery standpoint. And in discovery...

Speaker 1: 01:40:17 I think the main question to ask is: if we don't do this, we're dead. So it almost doesn't matter how much time we save. It's a very good question, but it's a production question. You have discovery, which is the phase we're in here, and then production.
And production is when you have requirements. You're a real factory: this is what you need to build. And there you have this question: okay, if we buy this machine, how many hours are we going to save? If we use this material, how much money are we going to save? But in discovery it doesn't work like that; the question is almost irrelevant, because if you don't follow the sequence, you're dead, however much time you save or don't.

Speaker 1: 01:41:02 If I can get a little specific, because I have had to use this: it can save, say, six months. I had a conversation with somebody for an hour and, by my estimation, saved them about six months, just because they were telling me, "I want to build this product." So you ask: okay, how long is it going to take to hire the employees, set up operations, build the product, and get the users? Three to six months. Okay, or you can go next week and validate whether you even need to do it. So six months, for example.

I agree with you, and I see this is obviously very valuable. I'm thinking of something more statistical: a thousand companies started with this method...

Speaker 1: 01:41:52 ...a thousand companies started without it, and, you know, 3% more were successful, or more efficient, because they found it faster. Yeah. I guess there's no... I mean, there is a product-market-fit report where I got some of the data on premature optimization; that's very interesting. I'm trying to remember what the name was. I'm also trying to think in which context this is an important question, and the only reason I have searched for this kind of data is to sell to people, or to sell to my bosses, or to do a market study. In operations, when you're executing, this question is not relevant.
Speaker 1: 01:42:45 But if you are trying to sell to somebody, or do a market report or something like that... And, I'm trying to think, maybe it will come to me later, but if you Google "premature scaling startups," I'm pretty sure one of the first results will be this report. It's a great report: they analyze a bunch of different startups, how they did things, and at what stage they died. I remember it from six or seven years ago, something like that. You can take screenshots from it and post them. But on the metrics, most startups die, really. So another analogy could be to try to identify the accelerators...

Speaker 1: 01:43:37 ...that do apply lean methodologies, and which of them have the highest returns. For example, Y Combinator has a high rate of return on some of these things. If you go to Techstars, it's also a high rate of return. Corporate venture builders have a very low one, and they pretty much predominantly do waterfall. So you can take real-life examples to showcase that point explicitly. I hope that's helpful.

Thank you. Lauren. One first question: are you planning to share the deck with the audience? Because there's a lot of stuff in it, I know.

Yeah, so this is the Techstars deck that I usually use.

Speaker 1: 01:44:26 So I can't share it per se, but I am building a full document with this right now, which I will put out there maybe in a week or two, and it will contain the basis of this. I think I want to do a video about it too.

Cool. So my question is about what adjustments you apply in, let's say, a larger organization. Because in my opinion this framework works really well for startups.
But when you're in an organization where, let's say, work is cross-functional and there are dependencies on other teams to build even the most minimal viable thing...

Speaker 1: 01:45:05 Of course there are always ways to split features so that a single team is able to deliver independently, but many times there are the standard constraints, from security or compliance, or training, or internal communications, and depending on the architecture of the team and the company there's a lot of cross-functional work. So if there's a team working on the front end and you're working on the back end, it's like, okay, I'm waiting on these guys before I can deliver this. What are the adjustments?

The adjustments to the framework?

Yeah, for organizations where delivering even the tiniest feature requires coordination.

Yeah, that's a great question. It's not easy, for sure.

Speaker 1: 01:45:57 In my own experience, for example at Kraft Heinz, an incredibly slow company with a high level of regulation to do anything, we had to go through security, compliance, legal. I think the framework itself stays, meaning these are first principles; this is just theory of constraints. If we don't do this, we're in trouble, so we have to make that stick. And I think the question is, inside those blocks, at that second level: what are the adjustments we need to make operationally, organizationally, in communication, in order to get there, and what expectations do we need to set about the speed and cost of arriving there? The general answer is to be prepared to go into daily or weekly mini-battles. That's number one.

Speaker 1: 01:47:09 And that, honestly, sounds funny...
...but it's probably the one thing that is most important, because what happens in an organization is that everything is so slow, so cumbersome, there are so many fights, that everybody loses morale and you just give up. I found it in myself. I'm a crusader in these terms; I really live or die by these theories, in the sense that it's just physics, and you can't change physics, no matter who is in front of me. And even I, when I found myself in these environments, part of me was like, oh, you can just let it go. And I was really worried: wow, this really changes anybody. So I think the mindset is the most important thing.

Speaker 1: 01:47:55 Just giving a shit, really. And then being aware that there are going to be a lot of no's, and that you'll definitely feel like a bit of a ball-breaker. Accepting that was number one for me: okay, I'm okay to live with that, I just need to be super nice to people too, and get them involved in my mission. So first, the mindset. Number two, how can I put that mindset out there to everybody? "This is super important. I would love to get your help on this. I hope we can make it happen together." Really humanizing it. And we have examples: three years ago we worked with Q.ai.

Speaker 1: 01:48:44 This was an AI robo-investor that I built with the founder; we built the whole team and everything, with Forbes, the digital publication. And we were integrating with something called Apex, which is a broker, one of the regulatory and investing layers in there. And I think their regular integration took six months. I was like, you're out of your mind, I have three weeks to do this. And they were like, no, you're out of your mind.
And I said, okay, well, we'll see, right? And we did the fastest integration they had ever done, and this is a publicly traded company, like a year later.

Speaker 1: 01:49:25 So the main angle there, apart from trying to get people on board, was really strong planning around where the constraint is and what I need to cut out. Just reducing scope like crazy. I remember having this method: first elimination, then minimization. First we just eliminate: this is not necessary, this is not necessary. That's the first thing, and this was for integrations, but it applies to other topics too. And then minimization: okay, you want to do it at this level of quality and this level of sophistication; we're not going to do it at that level. So adjusting the scope, being super nice, constant follow-ups with people, and again, tying it to the company's values.

Speaker 1: 01:50:19 And honestly, one of the most important things I learned, and sometimes I did it wrong and failed completely, was getting a champion inside the company. Getting a champion was so important, because I realized that a lot of these fights I wouldn't have if I just talked to the boss in a back-room meeting: "Hey, this is so important, and this is the reason. If we don't do it this way, we're in real trouble." And then he can just overrule anybody else. Sometimes that didn't work, because another guy was better friends with him. So then it's about...

Speaker 1: 01:51:03 Yeah, that's a less happy experience; then it's more about just keeping up the pushing all the time, and honestly just breaking balls about it. Nobody wants to be that person, in a way, but if you do care about the product, then I think that's important.
I mean, in terms of launching something uniquely from the team, I think scope is probably the biggest lever, because again, features are the enemy; they're what takes the longest. More features means more interactions with people, and the fewer the interactions, the faster you can go. That's number one. Number two: just trying to find loopholes everywhere, honestly. Like with the legal team: "We have to do this, it takes three months." "Are you crazy? I have a day to do this."

Speaker 1: 01:52:00 What is the way we can jump around this? And sometimes even just moving forward without asking, honestly. Is the risk so high that they will fire me for moving forward with some things? That's very calculated, and you can't use it all the time, for sure, but making that evaluation matters. And then, I guess, just a lot of... There was a guy I remember a few years ago who said, yeah, the roadshow is super important. I'm like, what roadshow? I need to work, I can't do presentations. And then: oh, this guy's really smart, because he had been in that environment longer than me. I was more from the startup world.

Speaker 1: 01:52:49 So it's about showing people: this is what we're working on, this is why it's important, why it's really cool. And I thought afterwards that I should have done a lot more of that, instead of just sitting in my hole thinking that logic is going to work with people. Nobody cares about what I'm doing, really, unless I make them care about it, make them invested, and ask for their feedback. So all of those things definitely add to the pile. And just resilience, honestly, to get through it. But what have you found, for example?

So, an example: we actually built something in-house which is similar to this. And, yeah...
So the API integration itself was done in about two weeks.

Speaker 1: 01:53:51 The team actually did that in two weeks, and then it's been six months going through legal, because it was like, oh, did you go through the vendor procurement process? Well, no. Did you go through all this? Did you talk to finance? So the actual technical work was done months ago. There's value for the customer; we're explaining to everyone: hey, offering this payment method simplifies the customer's life, it's going to be a big improvement, we have customers asking for it. It's already done, it's already built. It's still not launched because these other teams are not aligned, and it's not a priority for them, which I understand.

Speaker 1: 01:54:43 And then usually there's pushback, and sometimes the CEO asks, hey, how come you haven't launched this? It's like, well, I'm still depending on legal, compliance, and finance to approve it, to get all the sign-offs. And then it's: what about training? What about...? So that's usually it.

Yeah, that's a great example. What have you done along this process that you feel could have sped it up? What have the initiatives been?

Well, really what you were saying: having a champion in one of these teams who would have heard about it from the beginning. Maybe communicating ahead: hey, we're working on this. Because it was kind of a surprise; we got the buy-in from the CEO very quickly.

Speaker 1: 01:55:46 And we thought, oh, we have the CEO behind this, go for it. But no, not yet; there are still others who have a say. So we thought we had a fast track to deliver because we had the CEO's backing in this area, and we made a mistake because we didn't realize there were other teams involved.
Now we go to all these guys early on: hey, you need to know what we're working on. And, let's say, make them some offer too, so they see something in it for them. And the other thing is, we never throw anyone under the bus. We never say, oh, it's the finance team's fault; we're not blaming anyone. We think that's very unfair.

Speaker 1: 01:56:46 I think what they have in place is, let's say, a different risk appetite. We are more in cowboy mode, let's see what happens, and they're like, wait a minute, who is this vendor we're sharing data with? So they're trying to check, and that's why they're there, and why they're, supposedly, the experts.

Sure, absolutely. And it sounds like a joke, but... In my approach (we also have KYC), what I did was take ownership of their process. Not everybody does it, but I did it anyway. I mapped out the complete value chain of the process. It's not about blaming, but we have to identify what the constraint is and where it is. If there's no culture in which we can identify the constraint, we're doomed.

Speaker 1: 01:57:50 So I mapped out the whole process: okay, break it down, I need to understand this. Is this the problem? Is this the problem? And then what happens? And at Kraft I did it like this, with me leading it. Who the hell am I to do that? Somebody who cares and has an urgent need. Nobody was doing it, and everybody was kind of shocked, because I didn't come from that world; I come from the startup world. So I'm like, all right, we're going to break it down, I'm going to help understand where the constraint is, and then I'm going to ask questions so we can ideate how to overcome that constraint together.
Speaker 1: 01:58:26 And what happens is that most of these people are completely locked into that mindset of not caring, not ideating ways to do it faster. They're just comfortable; that's their state. So my thinking was: all right, no problem, I'm going to help enable what you should be doing. And that's what a lot of product leadership really is, what I find mostly: trying to come up with these models, honestly, and then having people understand where the constraints are. So in your case, literally what I did there: we go to FigJam. Do you have one hour? And that was the first hour; it will probably take three or four, because maybe there are five or six people in there...

Speaker 1: 01:59:19 ...and the signal-to-noise ratio is not very high. So: let's map it out. Okay, where are we now in the process? What are the steps that need to happen in between? And then, honestly, the next level of what I was doing: all right, let's try to understand what the actual lead time is. What is the time it effectively takes you to do this process? Is it a day? A month? Okay, so we've been stuck here for three months; what's the issue, if the work itself takes a day? When you showcase the constraint to people, they themselves become alert to where the waste is, and they start taking initiative on these things. A lot of the problem is what theory of constraints calls...

Speaker 1: 02:00:06 ...bad multitasking, bad multi-project environments, where there are twenty-five million projects to do and whoever shouts loudest gets the attention. So first of all, you shouldn't shout, but you do need the effect of shouting: hey, this is important, all right?
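The mapping exercise described here, each step's hands-on time versus total elapsed time, with the constraint being the widest gap, amounts to a tiny calculation. The step names and durations below are invented for illustration, not from any of the companies mentioned.

```python
# Hypothetical value-chain map: for each step, the hands-on "touch" time
# versus the total elapsed time. The constraint is where the gap is widest.
# Step names and durations are invented.

steps = [
    # (step, touch_days, elapsed_days)
    ("build the integration", 10, 14),
    ("vendor procurement", 1, 30),
    ("legal review", 2, 90),
    ("finance sign-off", 1, 10),
]

touch = sum(t for _, t, _ in steps)
elapsed = sum(e for _, _, e in steps)
constraint = max(steps, key=lambda s: s[2] - s[1])

print(f"touch time: {touch} days, lead time: {elapsed} days")
print(f"flow efficiency: {touch / elapsed:.0%}")
print(f"constraint: {constraint[0]}, waiting {constraint[2] - constraint[1]} days")
```

Flow efficiency in the low single digits is common in exactly the approval chains described above, and putting that number in front of the room is usually what gets the waiting steps to move.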
And then you have to be super nice, because if you want to go into their environment and say, "Let's break it down," it's: who the hell is this guy to break it down? So I have to be light, and they have to see that I don't have bad intentions, that I'm really here to make the process achieve my objectives faster, while knowing a little about their objectives too.

Speaker 1: 02:00:45 Then we break it down: all right, it seems like this is the thing, and this is the step where it's stuck. Very open, collaborative talking, but really the angle is: let's find that constraint, let's make you realize that this is absolutely ridiculous, and let's give it the importance it needs to move, so that you can do something for me. That's the handle. That would be my approach personally, because if something is built and then spends six months waiting, that's unacceptable. It's just that we learn to accept these things because we've been in the company so long and we don't want to hurt people. That's normal; it happens to me too.

Speaker 1: 02:01:26 But the idea is that you have other teams waiting, you're pissed off because you put time in and you can't get user value out. And if it were only that one problem... There are probably ten other things in the pipeline that are waiting, and all of a sudden there's no delivery, and then: what am I doing here? This is just a waiting game. So that's, I think, an interesting approach to it. Then there's of course a more long-term angle, which is changing the company itself: putting work-in-progress limits on projects. And I've been there from the product side too: more projects, let's launch this, let's launch that. And what happens with that is that your lead times get insanely large.

Speaker 1: 02:02:24 Right.
And then you're doing nothing: you're doing a bunch of projects, they're all interleaved, and everybody's waiting. Theory of constraints addresses this very effectively: you put work-in-progress limits on projects, and you start something and you finish it. You can probably tell from your own lives, or your own to-do lists, how many things have been open for over a month or two while you take on more and more. But when you limit it to, say, three projects, then when you hit the constraint you kick into: all right, let's identify the constraint, let's break the constraint, let's come up with ways to lift the constraint, and let's do it with urgency. So that would be the second level, for sure the harder one.

Speaker 1: 02:03:10 But the first practical level is just: "Hey, I would love to understand what the process is to accomplish this operational workflow and where we are right now. Do you think we can take an hour so you can help me understand and break this down?" Share the screen, open FigJam, draw boxes, write out the process together, and then help them see it. By the third session I'm pretty sure somebody's going to come up with a solution. I hope that helps.

Really good question. Yeah. Thanks so much.

Thank you so much. Absolutely. And thanks to the organizers here. A lot of valuable content.

Thank you, I appreciate it. Thanks for having me.

Speaker 1: 02:04:10 And as for the rest, stick around if you want and network with people. There are a lot of people here you can meet. I think part of the value, beyond really good talks, is connecting with like-minded people you can talk products with; maybe you get a job opportunity, or you build something cool together.
So use this network as well; a network is power you can use. I think one of the best things for your career is to have a strong network, so use it. There's another one coming up in February; we're going to have another event. Thank you so much, and I hope to see you soon.