Listen Now

In this episode, Richard and Luke unpack a pattern they couldn’t ignore after running 224 sales call transcripts through an LLM:

The #1 issue mid-market ecommerce brands face isn’t that performance is down … it’s that they don’t know why performance is happening.

And when you don’t trust the numbers, you can’t trust the decisions.

Luke walks through a painfully familiar weekly business review scenario—multiple agencies, multiple dashboards, multiple “versions of the truth”—where 80% of the meeting is spent trying to figure out what’s actually going on, and almost none of it translates into confident action.

What you’ll learn in this episode:

  • Why brands don’t hire help because results are bad—they hire because clarity is missing
  • The hidden cost of “dashboard chaos” (and why more stakeholders often makes it worse)
  • The 3-step framework to go from confusion → clarity → action: integrated data, business context, and clear accountability
  • Why incrementality (geo holdouts) is the gold standard when ROAS debates never end
  • How to sanity-check your “real-time” metrics against your closed P&L (and what variance is acceptable)

They also tease what they’ve been building to solve this problem end-to-end: a higher-capacity operator role designed to reduce complexity and speed up decisions—so your team spends less time arguing about numbers and more time changing outcomes.

If you’re leading growth at a brand in the $10M–$100M range and your team spends more time debating performance than improving it, this one will hit home.

Show Notes:
  • Axon is offering $5K ad credit when you spend $5K. Go to axon.ai to set up your first campaign.
  • Explore the Prophit System: prophitsystem.com
  • The Ecommerce Playbook mailbag is open — email us at podcast@commonthreadco.com to ask us any questions you might have

Watch on YouTube

[00:00:00] Luke Austin: There are seven different windows up on the Google Meet, each representing a different stakeholder within the marketing org function and/or vendors and agencies.

So I think there were four different vendors and agencies, the leader of the marketing performance department, a couple of people on their team internally, and a creative stakeholder as well. And the whole meeting was reviewing the past week of performance for the dot-com business specifically, how they're pacing to the forecast, and what actions are being taken as a result. That was the framing of the discussion. 80% of the meeting was spent trying to parse out what's actually happening. Maybe more; that's not an overstatement. The large majority of the conversation was spent parsing out what is actually happening right now: why are we pacing behind on this set of metrics? Where are the problem areas, where are the opportunity areas? And 20% or less was spent on action ideation, net new things that could actually alter that trajectory.

I think that's important for all of us to think about in the spaces that we have, team meetings, meetings with agencies, et cetera: how much of that time is being spent on things that are actually going to alter the trajectory of the business, versus just uncovering what's going on?

[00:01:20] Richard Gaffin: Hey folks. Welcome to the Ecommerce Playbook Podcast. I'm your host, Richard Gaffin, Director of Digital Product Strategy here at Common Thread Collective, and I'm joined today as I often am by our VP of Ecommerce Strategy here at Common Thread Collective, Mr. Luke Austin, rocking the Yeti Bauer. What's going on today, Luke?

[00:01:36] Luke Austin: Oh man, I am in a bit of an existential state of mind, to be honest. If there's anyone out in the world following along with what folks have been accomplishing with OpenClaw, Claw bots, whatever the most recent name is, it's a fascinating, fascinating world. We're trying to push the boundaries of our own imagination in terms of how we leverage the continued improvement of AI technology and what's happening in this space on behalf of our customers, and realizing that imagination really is the main constraint.

The technology and the capabilities are less and less the constraint. So what can you imagine, and how can that come to life? That's the state of mind on this Friday.

[00:02:21] Richard Gaffin: Yeah, I was gonna say, as someone who has many existential crises himself: unpack what's the most existentially crisis-inducing part of this for you?

[00:02:31] Luke Austin: I think this idea of the main constraint on the output, or the ideas, being my own imagination and what I believe is possible is something I find myself thinking about in different contexts. I'll think about it in terms of leading teams and other individuals: being able to imagine something that's different than, or possible for, those individuals.

And then even for myself, something that's maybe different than the historical trajectory. And then being able to do that with the technology and the output and the tools that we're building. So when you're asking an LLM to expand your own imagination of what's possible for it to do in your world, that's where it's like, okay, well, what part do I even have in this process?

[00:03:23] Richard Gaffin: I don't have the imagination to know what I'm not imagining right now. Or something along those...

[00:03:27] Luke Austin: Yes. 

[00:03:28] Richard Gaffin: ...lines. Boy. Yeah, brave new world. Well, speaking of imagining, or of using LLMs to do somewhat new things, Luke, this is not necessarily the most imaginative thing in the world, but it is very helpful. One thing we've done recently is taken, essentially, recordings or transcripts of 224 sales calls we've done over the last little while and run them through an LLM, and had it spit out the most common sets of pain points that people share with us on these calls. Now, we have some sense of what they are, of course, since we're on a lot of these calls, but having the LLM parse out for us just how frequent this one main issue is was really fascinating. It also anecdotally aligned with a lot of our experience on sales calls and with client calls as well. So I won't bury the lede any longer.

The number one thing that brands in this range, which I'm gonna say is roughly $20 to $100 million, maybe $50 to $100 million, are most concerned about, whether they know it or not (and I feel like this comes out in these calls), is that they don't trust their numbers. That could be because they don't trust the platforms' reporting specifically, or because they don't have an agreed-upon source of truth. They haven't gotten everybody together and said, we're going to trust the numbers from this place, or we're going to trust this aggregate of numbers. And there's nobody particularly responsible for knowing what those numbers are. So what ends up happening is that a lot of the discussion revolves around: do we even know what's happening? Speaking of things that LLMs do well, the AI spit this out, which I think was an interesting way to frame it: the brands who sign with us don't sign because performance is down.

They sign because they don't know why performance is happening. I think that's a good way to frame it. Whether things are good or bad, they maybe don't even know if they're good or bad at this point. So Luke, why don't you talk to us a little bit about your experience with this, anecdotally, and then we'll get into how we address it.
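For readers who want to try the same exercise, here is a minimal sketch of the kind of transcript-mining Richard describes: run each sales-call transcript through an LLM, ask it for the pain points, and tally how often each one shows up. The OpenAI client, model name, prompt wording, and file layout are illustrative assumptions, not the exact setup CTC used.

```python
# Sketch: tally the most common pain points across sales-call transcripts.
import json
from collections import Counter
from pathlib import Path

from openai import OpenAI  # pip install openai; expects OPENAI_API_KEY in the environment

client = OpenAI()

PROMPT = (
    "List the main pain points the prospect describes in this sales call. "
    "Return only a JSON array of short lowercase labels, e.g. "
    '["doesn\'t trust reporting", "no single source of truth"].\n\n'
)

def pain_points(transcript: str) -> list[str]:
    # Ask the model for a JSON list of pain-point labels for one transcript.
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any capable chat model works here
        messages=[{"role": "user", "content": PROMPT + transcript}],
    )
    # Sketch-level parsing; production code would handle malformed output.
    return json.loads(resp.choices[0].message.content)

counts: Counter[str] = Counter()
for path in Path("transcripts").glob("*.txt"):  # assumption: one plain-text file per call
    counts.update(pain_points(path.read_text()))

for label, n in counts.most_common(10):
    print(f"{n:4d}  {label}")
```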

[00:05:30] Luke Austin: Yeah, and I think that last summary line is really helpful. A customer of ours that we've worked with for a long time now shared, a couple months ago, the main seat that they see us taking, which is that when things are going well, they know why, and when things aren't going well, they also know why.

There's a deep level of clarity about the state of the business, what is working and what's not, and what actions are necessary to alter that trajectory. That is the core thing we are after, and what the system and team that we built is focused on accomplishing.

So to provide some contrast in this conversation, I'll frame up a recent conversation we had with a prospective customer that gives more detail on some of these pain points that have come up in these sales conversations. Many of you will likely relate to this, at least to some pieces of it.

In the conversation with this prospective customer, we start walking through their org structure, team structure, agency and vendor structure, data structure, the tools and dashboards they use to make decisions, and then what the weekly flow actually looks like: how things actually happen on a day-to-day and weekly basis in the organization.

They pulled up a spreadsheet they had, a Google Sheet with multiple tabs, pretty detailed in terms of daily targets for each of their customer cohorts and for each of the ad channels as well. But then, as the conversation turned to their weekly business review, we started to see how that weekly workflow was actually happening.

This is the picture I want you to envision, and it may not be too hard for us to imagine; this may have been our own weekly meeting with our marketing org this past week. It's a Google Meet. There are seven different windows up on the Google Meet, each representing a different stakeholder within the marketing org function and/or vendors and agencies.

So I think there were four different vendors and agencies, the leader of the marketing performance department, a couple of people on their team internally, and a creative stakeholder as well. And the whole meeting was reviewing the past week of performance for the dot-com business specifically, how they're pacing to the forecast, and what actions are being taken as a result. That was the framing of the discussion. 80% of the meeting was spent trying to parse out what's actually happening. Maybe more; that's not an overstatement. The large majority of the conversation was spent parsing out what is actually happening right now: why are we pacing behind on this set of metrics? Where are the problem areas, where are the opportunity areas? And 20% or less was spent on action ideation, net new things that could actually alter that trajectory.

I think that's important for all of us to think about in the spaces that we have, team meetings, meetings with agencies, et cetera: how much of that time is being spent on things that are actually going to alter the trajectory of the business, versus just uncovering what's going on? I think it's a really sobering starting point for the conversation. And then here's how it progressed from there: they're looking at revenue pacing by day, spend pacing by day.

The business is clearly behind the revenue forecast for the month and overspending relative to that, so the MER efficiency is lower. That part is really clear. But then the next level, the why, becomes immediately murky. Our new customers, our weighted CAC, is at this level. Well, actually, no, I don't trust that number, because I don't think it's baking in affiliates and the TikTok Shop costs in addition to the performance fees. So I think our weighted CAC is probably closer to $83 instead of $60, for example. Okay, so maybe the efficiency is lower and that's where we're seeing it. Okay, agency one, what are you seeing on Meta? Does this line up with what you're seeing in terms of the CAC being higher than normal?

And the conversation just progresses along that line, everyone trying to draw correlations when every agency has a different deck and data system they're looking at. And with the core data for the business, there's a lack of trust, even at the leadership level, in the data that's showing up there.

And then the kicker: who's responsible for going and fixing it? Everyone left that meeting, an hour-plus of ten people's time across multiple different platforms and outside vendors, with very little clarity and specificity about who's going to do what to solve that specific problem.

Because we're actually less clear on what the problem is than we were when we started that meeting. It actually got murkier when ten people tried to talk about it in ten different ways. And even if there were actions coming out of it, the level of confidence anyone had that the thing being done was actually addressing the core problem we're seeing was very low.

So we could all end up solving problems that aren't tied to the most important one. I'm not saying those things aren't impactful, but they're not the most important problem to solve from a highest-impact, highest-leverage standpoint. That framing, I think, teases out more of the specificity of what we're seeing in the LLM output you referenced from the hundreds of sales and prospective-client conversations we've had.

What comes up is this: the ability to get to a high level of clarity on the state of the business and where we're off course, and then to index as much of that energy as possible into action to adjust and alter the trajectory of the business. That is the highest-leverage activity, and what we are focused on providing is clarity.

Because before you have clarity, you can't go on to anything else. You have to start there. If you don't have that as a starting point, it's going to lead to a lot of time and resources and energy spent on the wrong things, or on just trying to get to a level of clarity.

[00:11:28] Richard Gaffin: Yeah, no, that makes sense. I think the danger, of course, of the situation you're describing is that, like you're saying, you come out of this very expensive call without an action plan, but action has to be taken anyway, and action will be taken. This is just a classic example of "if you fail to plan, you plan to fail." There will be a sense of pressure to do something. Of course, in this case, maybe doing nothing would be better than doing the wrong thing, which is often what happens coming out of these calls.

So what we're advocating for, philosophically, is taking a break in a sense: stepping back, finding clarity, creating a plan that makes sense, and doing all the legwork before you take action. Now, part of what we offer, obviously, is that we'll do that, so you can continue to act while we do that strategizing for you. But let's talk a little bit about this: you're in that call, experiencing that level of confusion. How do we come in and fix that? Or maybe I should say, if it's so hard to find the truth, how are we able to do it?

[00:12:35] Luke Austin: Yeah. So I'd offer three sequential steps that we'll talk through, and as we talk through them, do a mental checkmark in your mind: do we have that? If no, then that's what you need to start on. If you do, then move on to the next one.

If all three of these things don't exist in your organization, you're going to have some level of misguided action and/or chaos, somewhere on that spectrum. So the first is: you need all of your data integrated into one place. You need all of your data, on a marketing and financial level, integrated into one place, which may sound like table stakes. But this exists in a very low percentage of the organizations we talk to initially: your data across your online store, so your Shopify store, WooCommerce store, whatever it is; every single one of your marketing channels, Facebook, Google, Apple, but also your TikTok Shop and affiliate fees, right down the line of your variable marketing costs. So: marketing channels, online store, and then your financial data. What is the product cost, the cost of delivery, all the variable costs associated with actually getting the product to the customer?

And then your fixed costs, or your operating expenses, so that you understand whether your plan is actually going to be able to meet the obligations you have for that point in time. All of that data needs to be integrated into one place. Otherwise, you are going to have 17 tabs open in Chrome, or be relying on data from partner A to match data from partner B.

And the amount of time spent triangulating between different data sources to come to a decision, to actually understand what's happening, is time and resource bog number one in most DTC organizations. So all of this data needs to be integrated into one place, and it needs to have a high level of data integrity. What does that mean? Ultimately, data integrity means there has to be consensus at the decision-maker, leadership level that this is the data we're going to trust. For us, we've thought about measuring data integrity in a number of different ways.

What makes it challenging is that in many cases there's no objective source of truth in this conversation. Like, hey, where do we go to look at contribution margin? What's the absolute-truth contribution margin for your dot-com business? In most cases, you don't actually have that.

It's some calculation you're doing, an estimation for the period of time. So for us, data integrity means all the data exists in one place, and there's consensus at the highest level of the organization that this is what we are going to trust: that contribution margin number in that dashboard is what we are going to use as the source of truth.

Until, of course, the P&L is actualized for the prior month, and that becomes the business's source of truth. That's what you measure against in hindsight: cool, this is where we were tracking contribution margin to land, and once we actualized the P&L and all our financials, that was really freaking close. Okay, cool, that's good. All the data needs to be in one place across your sales, marketing, and financial sources. And that needs to lead to the data points for the profit of the organization, contribution margin, sales, new customer and returning customer revenue, and then the ad channel data being at a high level of data integrity, which means there's consensus and agreement at the leadership level that this is the source we're going to trust.
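As a rough illustration of what "all your data in one place" should roll up to, here is a sketch of a daily contribution margin calculation that combines store revenue, variable marketing spend across channels, product costs, and delivery costs. The field names and figures are hypothetical, not CTC's actual schema or benchmarks.

```python
# Sketch: the daily roll-up that unified sales, marketing, and financial data makes possible.
from dataclasses import dataclass

@dataclass
class DailyRollup:
    revenue: float              # Shopify / WooCommerce net sales
    ad_spend: dict[str, float]  # spend by channel: Meta, Google, TikTok Shop, affiliates...
    product_costs: float        # COGS for units shipped
    delivery_costs: float       # shipping, pick-and-pack, other variable fulfillment

    @property
    def variable_marketing(self) -> float:
        return sum(self.ad_spend.values())

    @property
    def contribution_margin(self) -> float:
        # Revenue minus all variable costs; fixed opex gets compared against this later.
        return self.revenue - self.variable_marketing - self.product_costs - self.delivery_costs

day = DailyRollup(
    revenue=230_000,
    ad_spend={"meta": 48_000, "google": 22_000, "tiktok_shop": 9_000, "affiliates": 6_000},
    product_costs=62_000,
    delivery_costs=18_000,
)
print(f"Contribution margin: ${day.contribution_margin:,.0f}")  # $65,000 on these example figures
```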

[00:16:10] Richard Gaffin: Okay, so let's fill in some gaps there. Obviously, if you bring all your data into one place, that's step one. But the various platforms may still be reporting inaccurately; the numbers are just in one place now, so that's not the complete solve. You mentioned establishing data integrity and getting to that single source of truth.

So how do you build a structure to actually gain buy-in to trust a number? What are our methodologies for establishing data integrity, and then for getting everybody on the same page that this number being reported, in this case in our platform, Statlas, is actually the correct one?

[00:16:51] Luke Austin: Yeah, so the most helpful comparison is to look at the data in the context of the core financial document of the business, which is going to be a P&L, a closed-out P&L for some time period. So that's what we'll do: we'll integrate the data, pull it in across the data sources, and then actually build out the current month's and past months' P&L based on the data we have integrated, and compare it to your finalized, actual financial documents. That lets us say, okay, these financial metrics that we have in Statlas, in the context of your P&L that closed out last month, or two months, or three months ago, are really close. And that's what I would challenge people to do: whatever data system you have, whatever spreadsheet or software or tools, does it give you all of the data, on a financial level, that you need to make decisions, in a P&L-level format?

If not, that's an issue. If it does, okay, what is the fidelity, the accuracy of that, relative to your actualized P&L historically? You should choose the thing that gives you the highest level of fidelity in terms of how accurate it is relative to those historical P&L figures.

So if you're like, okay, here's the thing we built that gives us a real-time view of these metrics, and here's our actualized P&L from two months ago, and there's a 12% variance between them, that's too much variance. What we'll aim for in terms of the fidelity of these metrics, because there's always going to be some level of variance, is 1 to 2% variance on the core metrics. That should be your threshold. If you have 1.2% or 1.5% variance, that's really close, and that's pretty hard to get to. That's the bar that we hold for ourselves: that level of variance or lower.
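A minimal sketch of the fidelity check Luke describes: take a month that has already closed, compare the real-time dashboard's core metrics against the actualized P&L, and flag anything outside the roughly 1 to 2% band. Metric names and figures are hypothetical.

```python
# Sketch: variance between real-time dashboard metrics and the closed-out P&L
# for the same (already actualized) month. Figures are hypothetical.
VARIANCE_THRESHOLD = 0.02  # the bar discussed here: roughly 1-2% variance or lower

dashboard = {"revenue": 1_480_000, "contribution_margin": 402_000, "ad_spend": 510_000}
actual_pnl = {"revenue": 1_495_000, "contribution_margin": 415_000, "ad_spend": 508_000}

for metric, actual in actual_pnl.items():
    variance = abs(dashboard[metric] - actual) / actual
    status = "OK" if variance <= VARIANCE_THRESHOLD else "TOO MUCH VARIANCE"
    print(f"{metric:20s} {variance:6.2%}  {status}")
```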

[00:18:42] Richard Gaffin: Right. And sorry, to be clear, this is variance between the data and, like, the forecasted result versus the actual result?

[00:18:52] Luke Austin: Yeah, it's the data that you're looking at on a day-to-day basis, intra-month, intra-day, to inform your decision making. Like, it's February 13th and you're looking at February month-to-date and yesterday's data: whatever you're seeing in that view, relative to your actualized financial documents, like your P&L from two months ago, and those sources.

[00:19:15] Richard Gaffin: Okay. So once you've established data integrity, that still leaves open the question of, basically, accountability, or who's going to own the number. So let's talk a little bit to that: getting to the point where you're not asking agency one and agency two for their separate opinions on the number, or holding them both accountable but neither accountable at the same time. How do you get to a point where there's a sense that this person owns this number that we all agree on?

[00:19:46] Luke Austin: Yeah, so there's actually a second step here. The first step is data integration and integrity: all your data in one place and agreed-upon consensus on the source of truth. Okay, you've got that checkmark. The second step is context for that data to live in. All your data living in one place is unhelpful unless it lives in the context of something, and it should live in the context of your business objectives, of your forecast for the business.

You have a financial objective that you've set as the expectation with your team or with your board; you have some expectation of where the business needs to head over the coming month, quarter, year. Your data needs to live in the context of that. Otherwise it's going to be completely disconnected, and the decision making is going to be disconnected from what the outcome should be. And then the second piece of this data-and-context bucket is an agreed-upon measurement framework. This gets at your earlier point: you have the ad channel data, you have your revenue data from the platforms,

you have the ROAS data. There's the time spent questioning how much we believe the Google brand ROAS and the Meta acquisition ROAS relative to their real impact. Well, let's look at what Northbeam, Triple Whale, Google last-click, and our in-house number say. The amount of time spent in that triangulation is a big piece of the puzzle. So the data needs to live in the context of, one, your targets, your business objective, the forecast you have set; and two, an agreed-upon measurement framework that informs how that data is visualized. And our contention for the measurement framework is that geo-holdout incrementality testing is the gold standard of measurement.

So how you look at performance data from each of your media channels should be in the context of incrementality-adjusted revenue and ROAS: an incrementality factor applied to each of those channels, based on benchmark factors and then a specific test for your brand.

And then, for the context of the business forecast: we set targets for every single day of every month of every year, and that's how we track whether the data we're seeing in the dashboard is good or bad. Even in the example we teed up, that weekly business review conversation, there was a spreadsheet, a lot of data, even a daily expectation for the data.

But most of the data lived outside of contextualized business targets. So you're just looking at, say, $230,000 of revenue yesterday. Good or bad? I don't know. Well, we're 13 days through a 28-day month, so we should be pacing at about 47%, and we're at about 44%. That's what happens in these conversations: cool, based on how many days we are into the month versus where we should be, we're pacing pretty close, or we're slightly off. That shouldn't be the case. It should be very clear to everyone: we landed at this much revenue yesterday against this much spend, this much contribution margin, this much new customer and returning customer revenue. Here's each of our ad channels' spend and ROAS. Every single one of those things should live in the context of a target that's been set, and anyone should be able to look at any one of those metrics and say good or bad relative to the expectation.
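As a sketch of the two context pieces described here, the snippet below applies a per-channel incrementality factor to platform-reported revenue and ROAS, then checks month-to-date revenue pacing against a target. All factors, targets, and actuals are hypothetical placeholders, not benchmarks.

```python
# Sketch: incrementality-adjusted ROAS per channel, then pacing vs. the monthly target.
# Every number below is an illustrative placeholder.

# 1) Incrementality-adjusted revenue/ROAS: platform-reported revenue scaled by a
#    geo-holdout-derived factor for that channel.
channels = {
    # channel: (reported_revenue, spend, incrementality_factor)
    "meta":         (120_000, 48_000, 0.85),
    "google_brand": ( 60_000, 10_000, 0.35),
    "tiktok_shop":  ( 25_000,  9_000, 0.90),
}
for name, (revenue, spend, factor) in channels.items():
    adj_revenue = revenue * factor
    print(f"{name:12s} reported ROAS {revenue / spend:4.2f} | "
          f"incrementality-adjusted ROAS {adj_revenue / spend:4.2f}")

# 2) Pacing against targets: is $230K of revenue yesterday good or bad?
monthly_target = 6_500_000
days_elapsed, days_in_month = 13, 28
month_to_date_revenue = 2_870_000

expected_pct = days_elapsed / days_in_month          # ~46% of the month elapsed
actual_pct = month_to_date_revenue / monthly_target  # ~44% of the target landed
print(f"Expected pacing {expected_pct:.0%}, actual pacing {actual_pct:.0%}")
```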

[00:23:04] Richard Gaffin: Okay, so right, that's establishing context, establishing what a number means relative to something else, which is the only way to really understand the truth, or rather, I should say, the quality of the number.

So then, I guess, what I'm asking or thinking about is: how do we then assign responsibility for that? Obviously, we've gotten to a point where everybody is clear, everybody has understanding. At what point do you say, hey, you are held accountable for whether this number goes up or down, or is good or bad?

[00:23:41] Luke Austin: Yes. So, checkmark one, you have the data integration and integrity; checkmark two, you have data that lives within context; and then checkmark three, you have accountability for who's solving the problem. The principle here is that the simplest system is the most preferable system.

Ideally, you reduce the amount of complexity required to get the outcome you want. In general, adding more people into the workflow makes it harder to achieve the same level of accountability and, in our experience, the same speed at which insight turns into action.

At this point, if you have data integrity and integration, and it lives inside of context, you should be able to get to the insight pretty quickly, because all of your data is good and it lives in the context of the targets and the measurement framework.

It should be very quick for most people to say: the insight is this, this is what's happening. Then the question becomes, how do we get from that insight to action as quickly as possible? And if there are 17 different people involved in that process, each loosely holding some fraction of the solution, it's going to be really challenging.

So the accountability, in our experience, you want it to exist in the simplest system, with the fewest separate nodes or people possible. Now, what that requires is someone with a high level of capacity to be able to achieve it.

In most cases what the insight is going to surface is: okay, we are behind on media spend on X, Y, Z channels, which means we need to make adjustments to the media buying strategy, to the creative output and the volume necessary on those channels. We might need to reassess, or stand up a new incrementality test to understand, if we're launching a net new channel, what its true impact is. Typically this is where we see four, five, six, seven people involved in that workflow. But the higher the level of capacity you have in an individual to tackle more of those problems, the tighter that insight-to-action loop is going to be.

I think we're going to have a lot more on this in the coming weeks as we get to a more formal launch around this idea that we've been teasing and skirting around for a while now, which is our Prophit Engineer: the thing we have been building around our core growth strategy offering at CTC and that workflow, enabling that person to occupy the roles of growth strategy, creative strategy, media measurement, and management of your Meta media buying as well. Those four things account for a large majority of where the workflow and decision making happens for dot-com businesses as it relates to performance marketing.

And so, being able to have the tooling, the systems, and then a high-capacity individual who can do all of those things: it's our belief that within Statlas we have the data integrated with a high level of integrity, the data lives in the context of targets and a measurement framework rooted in incrementality, and then we have a high-capacity individual, the Prophit Engineer, leveraging the tooling to accomplish this core workflow. That is what's going to deliver the best outcome along these dimensions.

So that's what we've been building for, that's the bet we've been making, and that's what we're focused on. But the capacity piece, that final checkmark of, okay, do you have the right people and the most effective system for accomplishing what's needed based on where your business is at? That is our solution, the thing we've been building to accomplish.

[00:27:44] Richard Gaffin: Yes. And on that note, I had a conversation with Taylor yesterday that we'll be putting out as a podcast a week after you hear this, and that's going to flesh this out a little bit more and give a sense of exactly what's on the way. But just to encapsulate it in a metaphor that I used: one man in a backhoe can dig a hole faster than six men with shovels.

And that's what we're putting together for the world we're moving into. So, obviously, if you want to put this together, or rather, I should say, begin down the path toward clarity of data leading to precision of action, the number one thing you could do, of course, if you are a brand in the $10 to $100 million range, is to talk to us: commonthreadco.com, hit the "hire us" button, let us know you're interested. Now, what's the second most useful thing you could do, Luke, if you had to do one thing right now to get to this point of data clarity?

[00:28:38] Luke Austin: I feel like this is a trick question.

[00:28:40] Richard Gaffin: It's not a trick question, I promise. This is open-ended. Usually I'll ask: if you're not currently a client, what's the thing that you can do?

[00:28:48] Luke Austin: I would do those two things. So, the three steps and checkmarks that we walked through: really evaluate them for yourself. Maybe have a conversation with some of the leaders on your team. Do we all say that we have checkmarks in each of these areas? Or which ones are missing? That helps surface where the biggest gaps are.

And then the second thought exercise, or conversation piece, you could do with some of those people is: how much of our time spent in meetings and phone calls and Slack messages orients around trying to understand what's happening and getting to clarity, versus actually doing things and solving problems? So I would go do those things, have those conversations. I think it's a fun thought exercise and a fun conversation piece, and I think it will highlight where some of the gaps exist in your organization, if they do, as it relates to this conversation.

[00:29:37] Richard Gaffin: That's right. All right, folks, that's going to do it for us for this particular episode. But again, if this is something you want us to build for you: commonthreadco.com, check us out, hit the "hire us" button, we'd love to chat. Until next time, take care, y'all.