From Last-Touch to Incrementality: Adobe Marketing’s Measurement Transformation

[Music] [Lily Chiu-Watson] Hi, everybody. My name is Lily Chiu-Watson. Thank you for coming to our session. I have a very important question first. So who here likes to garden? Anybody? Yes! All right. So this is around the time of year, every year, when I start thinking about what I want to do for my garden plan. And part of this entails going back into my spreadsheet of what I've done in previous years to look at when did I plant my seeds? When did I pot them up? When did I put them in the garden? This is proof I actually have a spreadsheet with pictures. And I say this because there are so many elements that go into planning your garden every season. There are things that are in my control, like what potting soil I use, when I'm going to start my seeds, when I'm going to apply fertilizer. And then there are other things that are outside of my control, like the weather, for example. And all of this is in pursuit of a tomato yield that looks like this, even though, sadly, right now, mine looks more like that. But the reason I bring this up is because I think that marketing measurement and planning is actually not very different. In both gardening and marketing measurement, I'm trying to find signals that help me understand which tactics are actually contributing to my goal, which is to have better and more tomatoes. And as our title says, we're going to talk about moving off of last-touch attribution to incrementality, right? And if I think about what last-touch would look like in my gardening journey, typically, the stage that happens right before you harvest is applying fertilizer. So if I applied last-touch to understand what matters most in trying to grow as many tomatoes as possible, I'd probably focus on just dousing my plants in fertilizer instead of thinking about other factors, like soil health or my watering method, or responding to things that are happening, like garden pests or weather. And so Matt and I are going to talk today about how your marketing measurement should always be in support of helping you make the best decisions to achieve your business outcomes, whether that's tomatoes or revenue.

So I'm Lily. I lead Product Marketing for Adobe Mix Modeler, which is our unified marketing measurement solution. But what a lot of people don't know about Mix Modeler is that it originally grew as an in-house solution that was custom-built for Adobe Marketing. And Matt here was critical to the idea, the design, the development, and the execution of the solution that grew to be Adobe Mix Modeler. When I'm talking to brands about transforming their measurement, I find that organizational alignment and change management are as much an obstacle as the underlying technology itself. And so that's why I'm super excited to have Matt here, because he was on this journey over the last eight years, and he can really dive into the nitty-gritty of how they executed, and also some of the missteps and gotchas along the way. So, Matt, I would love for you to just share a little bit more about what your role is and then what a day in the life looks like. [Matt Scharf] Yeah. Sure. Thank you, Lily. And hi, everybody, I'm Matt.

I run a Growth Marketing Performance organization with a nested Global Media Center of Excellence inside of that, which basically means two things. One, we're on the hook for marketing effectiveness measurement at Adobe, and the other is that we do that on a global scale through a unified measurement framework and operations framework that we partner with our agencies on, as well as our media partners, on a global scale. That's what we do in really quick colloquial terms. But in terms of a day in the life, probably the best way to answer that is to point to three different organizations that we work with internally. I'd start with finance, and you might imagine that's understanding the overall business targets, and understanding and recommending how much marketing budget, and the media mix and the going-in plan, we would need to support those targets and any remaining gaps. The other is a go-to-market organization, which may or may not exist in your organizations, or it's called something else, but basically, they influence all organizations internally at Adobe to build a path and hit the number. And so we work with them really tightly in the quarter, largely in the form of understanding and developing a path to the targets: what are the areas of strength and weakness in the business, how do we build contingency plans to claw back to our targets, and ultimately how do we work together to attain them. Lastly, of course, the marketing teams. We have Adobe Mix Modeler applied internally at Adobe, and we hold the marketing teams accountable on a global scale to hit a target that is commensurately stretched according to the business targets. And we also partner with them to build a path to the number through empirical evidence, through applied analytics and insights. And so Lily teed this up, but the reason why I'm up here with Lily is customer zero. If you saw the keynotes this morning and heard David Wadhwani talk about customer zero: basically, I'm in the marketing organization inside of Adobe, if that wasn't clear yet, and we need to be the best user of our Experience Cloud solutions, period. We hold ourselves to that accountability, and that means two things; it's bi-directional. On one hand, we are the end beneficiary of these tools, and they power all of our customer experience across Adobe.com and marketing channels. On the other hand, and increasingly so, we, just like everybody in the room, have basically the same challenges, and oftentimes we'll develop our own solutions to those problems that end up being products that we sell. A great example is Adobe Mix Modeler; another is GenStudio, which you probably heard all about this morning and leading up to Summit. This is a long-winded intro, but I feel compelled to share a goal and then my hope for everybody; I'm just going to lay it all out there. Our measurement journey over the last 8 to 10 years, as Lily said, I'll tell you what we learned. Some of it is hindsight, 20/20. Some of it was premeditated, and some of it worked and some of it didn't. I'm going to share as much as I can on that. And my hope is that you either learn something you want to apply internally from a change management perspective, or you learn something you definitely don't want to do because we did it wrong. So that's the goal.

[Lily] Awesome. So maybe we can start by talking about technology, people, process.
I feel like this is something that I have a lot of conversations with customers about, where they really want to understand, again, how did we think about this holistically?

[Matt] Yeah. And I'll bang through these pretty quick, but technology, people, and process, we see these as basically three ingredients in a formula that allow us to do two things. One, it powers our data-driven operating model internally, and I'll just flag that for the time being; I'll talk about it in just a moment. The other is, these are the variables that we play with to navigate change, and we're doing that a lot lately as a marketing organization. So the first is technology. Technology is basically the foundational footing of everything we do. And again, it's the Experience Cloud solutions that power all of our customer experience for us as a marketing organization. And then we get to the people and process. The people and process, and this is a conceptual way we think about things, surround the technology. The process is the highest ideal organizational outcome we have in mind, whatever that may be. And the people is where we hire for behavior: what is the behavior that is in service of the overall internal process that we want to achieve organizationally? This whole concept is going to show up throughout the course of the conversation because it's such an integral piece of how we think about doing things. And lastly, going back to DDOM. A lot of this session is about how we've navigated change and how we've built relationships internally and externally with our agency partners over time. The Data-Driven Operating Model, which we refer to internally as DDOM, is basically the way we run the business now. As for the catalyst for it becoming a thing, I would point to when we cut over to the cloud subscription model, which accelerated the need for us at Adobe to have our finger on the pulse of what's happening all of the time, on a day-to-day, week-to-week basis. We went from an arm's-length to a 24/7 relationship with our customers, and that was one of the catalysts for us to think differently as a marketing organization and evolve the way we approach marketing and marketing measurement. So that's the foundational underpinning. I do want to take just a quick moment to tell a story that I think illustrates how all of these constituent pieces work together, where the whole is greater than the sum of its parts. It goes back about two years, to when our CEO actually asked us to start sharing a perspective with the executive team on business performance. And if that seems weird as an ask to a marketing or analytics organization, I think it's a reflection of how these things are coming together. Marketing is increasingly a huge strategic partner nowadays in this Data-Driven Operating Model. And we think a lot about the modern marketer evolving to be the connective tissue between what's happening out in the landscape, whether it's AI or other developments, and the implications for all of our internal organizations. That's putting us more front and center in these conversations with the executive team and so forth. So it's been almost two years of us writing a weekly written-word summary of business performance.
And I think it's because, if you anchor on the technology side of things really quick: internally at Adobe, because of our application of Adobe Mix Modeler, there's a ubiquitous belief, and this is my POV and I'm a little bit biased, but a ubiquitous belief that marketing has a firm grip on performance and on understanding the marketing performance of the business. And if you go to the process and people and click out on that concentric circle a couple times, it's not too far of a stretch to ask the marketing organization to also have a perspective on the business performance. So in a few weeks, it'll be about two years of us writing this email up to the E-team, coming from the marketing organization, which I think is reflective of this technology that we've adopted.

[Lily] Okay. So it sounds like things are in a good place now. Can you go back in time and tell us how things were at the start of this journey?

[Matt] Yeah. Let's get into that, because it wasn't all great. Going way back, we had last-touch, but we had last-touch reported independently across three different marketing channels. And the implication of that, from a people and process perspective, was that we had a 1 + 1 = 3 problem.

[Lily] Sorry, can you describe what that means, 1 + 1 = 3?

[Matt] Yeah. So just imagine sitting in front of finance and saying, "Hey," I'll just use simple numbers, "we drove 100 conversions last week." And they say, "Well, the business only saw 80 conversions." So what's going on here, right? There was a totally disjointed, arm's-length relationship with finance. We weren't really speaking the same language at the time, and that was largely the catalyst for us to think differently, in addition to the catalyst I mentioned earlier of us shifting to a far more accelerated cadence of reading performance in the business.

[Lily] So then, what did you guys move on to next? What was identified as the solution?

[Matt] Yep. It was back in the day, of course, when cookies were still highly valuable to us. We immediately cut over to a multi-touch attribution model. But we took a crawl, walk, run approach, and this was the crawl: we kept the same channels and moved over to a deterministically stitched, fact-based, time-stamped type of attribution model.

[Lily] Okay. So I've talked to a lot of brands where they're making decisions based on last-touch attribution. I won't do a raise of hands, but I suspect it would be a large part of this room. But when you move to multi-touch, or any other attribution method, actually, or measurement, some channels go up and some channels go down, right? And I think when we think about last-touch, we think paid search looks really, really good. How do you manage that change?

[Matt] Yeah. We should spend a little time on this one, because navigating the change management here was everything. I'll just leap to where we ended, which was a quantum leap forward in our sophistication, our relationship with finance, and our relationships with other organizations internally. It was amazing. But how we got there was more difficult.
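To put numbers on that 1 + 1 = 3 problem, here is a minimal sketch in Python. The paths and channels are invented for illustration; the mechanic is simply that each channel team's independently reported numbers claim the same conversions, so the channel reports sum to more than the conversions the business actually saw.

```python
# Made-up conversion paths; each list is the channels one converter touched.
paths = [
    ["display", "paid_search"],
    ["social", "display", "paid_search"],
    ["paid_search"],
    ["display", "email"],
]

# Independent reporting: each channel team, looking only at its own data,
# claims every conversion it touched.
claimed = {}
for path in paths:
    for channel in set(path):
        claimed[channel] = claimed.get(channel, 0) + 1

print(claimed)                # display: 3, paid_search: 3, social: 1, email: 1
print(sum(claimed.values()))  # 8 conversions claimed by marketing...
print(len(paths))             # ...but finance only saw 4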
And so I think I would point to, and again, this is one of those examples where it's hindsight, 20/20, but there was a managing-up, a managing-down, and a managing-out component. And each one of those was a different persona that warranted a different conversation. If we define them: the managing up was connecting with executives, and the nature of the conversation was, "You're used to seeing these numbers, you're going to keep seeing these numbers, and they're going to be far lower." But here's what that means. It was a conversation on incrementality, a conversation on ROI, a conversation around: when you give us budget and we tell you what it's going to drive, you will see that in the business. That component was far easier from a change management perspective, at least for us internally, where the executive team was like, "Well, we just want something more sophisticated, and those use cases sound fantastic." I want to avoid the word recommendation, my opinion on what you should and shouldn't do, but for us, there was a little bit of trepidation about whether or not we were going to indict what we were doing before, and my only recommendation would be: don't worry about that, especially with the executive team. They just want something more sophisticated. So that was the managing-up piece. The managing-down piece, we can maybe define that as media practitioners, and I can't overstate how important this component was. Going in, we didn't know how much change management time this would require. And I would say: spend a disproportionate amount of time with paid search. At the risk of preaching to the choir, through last-touch, paid search is the hero, right? And through multi-touch, or any other degree away from last-touch, they're not going to be as much of the hero. So there was some concern about optics and budgetary decisions going forward and things like that. One thing that we did that seemed to work in navigating that change was, we took the new measurement capability and rescored history, whether you want to go back a quarter or two or three or a whole year, and we showed-- Well, this is getting into what we did with the media teams, so just real quick, I'll insert this other piece: we used a bit of a crutch, or a long on-ramp, if you will. There were two outcomes of this new model. One was a read on incrementality, an incremental read, and the other was a fractional read. And of course, the fractional read is spread across the whole path: it's still the conversion, but spread across the whole path. It was a little bit of a baby step toward incrementality, and we agreed with everybody, "Listen, we have to run the business on incrementality, because you're not going to give marketing budget just to get inside of a path that's going to exist anyway. We have to drive incremental conversions beyond what would happen." So we had this period of time where we read out on both outcomes, and over time, as we were interpreting the business, we all started to get more familiar and comfortable with the lower number that was reflective of incrementality. That made it an easier transition for us, but it was a little bit of an on-ramp, a crutch, if you will.
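As an illustration of those two reads, here is a hedged sketch with invented numbers. Adobe's actual model was deterministically stitched and more sophisticated; this just shows the mechanic of a fractional read that spreads each conversion across its path, next to an incremental read that scales channels down by an assumed incrementality factor.

```python
from collections import defaultdict

# Invented paths; one conversion at the end of each.
paths = [
    ["display", "paid_search"],
    ["social", "display", "paid_search"],
    ["paid_search"],
]

# Fractional read: split each conversion's credit evenly across its path.
# (Even credit is an assumption here; real models weight touchpoints.)
fractional = defaultdict(float)
for path in paths:
    for channel in path:
        fractional[channel] += 1.0 / len(path)

assert abs(sum(fractional.values()) - len(paths)) < 1e-9  # still sums to 3

# Incremental read: scale each channel by an incrementality factor, i.e. the
# share of its conversions that would NOT have happened anyway. These factors
# are made up; in practice they come from the model or from experiments.
incrementality = {"display": 0.5, "paid_search": 0.4, "social": 0.8}
incremental = {ch: cr * incrementality[ch] for ch, cr in fractional.items()}

print(sum(incremental.values()))  # the "lower number" everyone had to get used to
```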
Lastly, managing out, so with media partners and agencies. I remember this time very vividly because I was on the display team, the display analytics and operations team. We had long engagements with all of our media partners, and what we did is we showed a juxtaposed view of how you're performing through last-touch and how you're performing through this new measurement solution that we're about to launch. And we obfuscated everybody else except for the partner we were talking to at the time. The best example I can give is that through last-touch, we had a partner that was head and shoulders number one on return on investment, and they were dead last, by a long shot, in the new multi-touch attribution model. These examples are probably not too common, but this happened. So the nature of that conversation was: if you don't do anything different, then you're going to be at risk of not being a part of the media mix. And we tried to educate along the way, with a long runway before the actual launch of the new measurement capability.

[Lily] Got it. And then I know we've talked about the channel expansion. I don't know if you want to talk a little bit about that strategy.

[Matt] Yeah. So again, this is the nitty-gritty of our evolution, but we moved over to multi-touch, and we developed really great relationships internally, and we had a seat at the table strategically. And then we actually started having such high fidelity on what the budget would do for us that we started turning down budget with finance. That had a short-term positive impact, because they were like, "Wow. Awesome. Thanks for the fiscal responsibility." But then quickly, we realized we were doing that too often, and it was because we didn't have enough investment points in our media mix. And so we stood up a channel expansion strategy. We added Snapchat, TikTok, YouTube, you name it.

And for a short period of time, things got even better, and we had an even longer runway where we were investing in the things that we needed to, and the model wasn't a restrictor to investing in things we knew we needed to invest in. But that was very, very short-lived, frankly, because of what happened next, and this is not novel information, we all remember this: between 2018 and 2024, and this is not even including all of the announcements, cookie deprecation and consumer privacy laws just started rolling out very fast. This had two impacts on us. One was that we started losing fidelity in all of the channels we had just expanded into, and that kept us from having a really firm justification to continue investing in them. So that became a problem very fast. The other thing is, we were looking out on the horizon a little bit here, and this is a little bit more of a premeditated strategy, and it's not really too eye-opening, but we knew that this pendulum was not going to swing back in our favor in terms of cookie dependency. Those days were just long gone. And so we reached this fork in the road. We sat back with this, and there were three possible solutions. One was to revert back to last-touch, and we immediately divorced ourselves from that, because while it would have been easier, it would rip out all of the foundational pillars we were standing on to have these better relationships, a seat at the table strategically, and a way to run the business internally at Adobe. The other was to keep playing whack-a-mole. And what I mean by that is, we went through a pretty long period of time, a couple of years, where we were trying probabilistic stitching, fancy solutions, getting data that propped everything up, and every time we did that, it got ripped out from underneath us. And the last solution, the one we ended up going with, was to just build something for the future, something that was a little bit more future-proof, and that was a pretty seminal change for us in the journey.

[Lily] It sounds like a no-brainer, but I guess, how did you know that a solution that was future-proof existed? How did you actually get it to go from an idea to an executed success, right? At Adobe, we're a big company. We have a lot of ideas. Not all of them make it through. And so I'm curious, can you walk us through a little bit of the pieces of that process?

[Matt] Yeah. Well, I have the benefit of being the mouthpiece of this whole journey, but there were a lot of people involved. I would give my boss a ton of credit here. He's a very prophetic thinker, and he's able to point in the direction that we need to go and provide air cover for the team. And the one thing we leaned on first is, at Adobe we've gone through all of these digital transformation changes over time, and we've developed a culture that basically throws out the playbook every single time and starts new. And we did that this time as well.

But the way we did this is, leadership pointed in the right direction, that we needed something else, and the only guidance provided was that we couldn't lose three things. We couldn't lose incrementality out of this new measurement solution. We couldn't lose a weekly read of it. And we couldn't lose the level of granularity of understanding performance at which the media teams actually conduct optimizations. Those were the only guardrails. And then practitioners went off for a three-day offsite, and they were told, you have the time to do the thing you need to do, and off they went. The talent profiles are on the screen: data engineering, media strategy, ops, data science, all sorts of talent profiles, and they got together for three days. And what came out was a proposal and a roadmap to get there.

And what it was, was a very stark departure from a single measurement solution to three different solutions supporting each other in somewhat of a three-legged-stool approach. And so--

[Lily] Why is there a picture of Swiss cheese?

[Matt] Yep. So the Swiss cheese. This is a good one, and hopefully this analogy lands. The solution was part mix modeling, part attribution modeling, and part experimentation. And the Swiss cheese analogy, which is why we felt good about moving away from a single solution, is that each solution is going to have holes in it, just like Swiss cheese. Then you layer another methodology on top of that, and you block some of those prior holes. And then you layer experimentation on top of that as the third slice of cheese. By the time you get to that point, you've blocked most of the holes. So we're relying on all three in concert with each other. What I would say here is, oftentimes you may hear me say model, but this is more of a measurement ecosystem now than a dedicated single model. It is a single model in the sense that it spits out a solution, or at least the KPI that we're looking for. But we use the whole thing, that threefold, the three-legged stool, the three slices of cheese, as the whole ecosystem.
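One way to picture the three layers backing each other up is this illustrative sketch, with assumed numbers and a hypothetical helper, not the actual Adobe implementation: use an experiment-measured lift to calibrate the attribution read, then check it against the mix model's estimate for the same channel.

```python
def calibrated_read(attribution_conversions: float, incrementality: float) -> float:
    """Scale an attribution read by experiment-measured incrementality."""
    return attribution_conversions * incrementality

attribution_read = 1_000    # conversions the attribution layer credits to a channel
lift_incrementality = 0.55  # a holdout test says 55% were truly incremental
mmm_read = 600              # the mix model's estimate for the same channel

calibrated = calibrated_read(attribution_read, lift_incrementality)  # 550.0

# If the calibrated read and the MMM read diverge wildly, one slice of cheese
# has a hole there -- the trigger to investigate or run more tests.
gap = abs(calibrated - mmm_read) / mmm_read
print(f"calibrated={calibrated:.0f}, mmm={mmm_read}, gap={gap:.0%}")  # gap=8%
```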

[Lily] So that's a pretty big change on the technology side. How did the organization change as a result, or not? Did you already have all the right people?

[Matt] Yeah. And this is where, and Lily and I talk with customers a lot, as I look back, I didn't realize this at the time, but what was crazy is that during the period of time when we were building that new measurement solution, we were also engaged in an almost year-long agency RFP.

[Lily] Did you want to do both things at the same time?

[Matt] No. And that's one thing I would not recommend. But lo and behold, we launched the new model live in market at the same time we kicked off with our agency. The timing just happened to fall in place like that, and it worked out. As I was reflecting on this, there are a number of things I would point to as components of that success. The agency RFP was going from 15 agencies down to 2 on a global scale, and this was a pretty big transition. But on the technology side, if we define technology here as our measurement solution for measuring media, there's such adoption internally within the walls of Adobe, and such trust in that model, that we did something really unique in the agency RFP, which brings us to the people. The two things we did here: we made it really clear at the beginning of the RFP, but also during the staffing stage once we selected the agencies, that we didn't want any media analytics resources on the agency side, which I think is pretty unique. You guys can tell me. And the reason is because we have the technology internally, and we have the analysts internally who surround that from a people and process perspective. The operating model we wanted to establish is the same one we have internally with our own employees, which is that the analysts tell the media teams how they're performing against the target, they hold them accountable to meet that target, and they also help them meet that target. And so with the money saved on what we would have staffed on the media analyst side, we hired a bunch of media strategists, and we obsessed over operations. I came from the agency side running an operations team, so I'm, again, a little biased on that, but we obsessed over operations. So we loaded up on those talent profiles instead. The other thing we did on the people side goes back to hiring for behavior, in service of the organizational behavior we want to strike.

We conducted chemistry meetings with the agencies, and just did a vibe check to make sure that we were mind-melded on behavior. We really articulated what our ideal organizational behavior is and discussed how we could get there together, and that was super fruitful.

On the process side, this is interesting too, in that we embedded the agencies directly into our pre-existing motions. What that means is, we get together multiple times on a weekly basis with the media teams, and we sit down, and the nature of the conversation is: are we red, are we green, and what are we going to do about it? It's a conversation about what they're doing out in market and how that's reflected in the results, and then how we can lean into some of those tailwinds or optimize away from some of those headwinds, and we bring the agencies into that conversation. That's packaged with basically opening every little detail of the model to our media teams, which I'm not entirely sure how unique that is, but for us, it's predicated on this: I subscribe to the belief that if you have confidence in your measurement solution, you should allow the media teams to game the outcomes. So we share all of the variables that are at play, how things get scored, the coefficients. Just a little quick story to bring this to life with a real-life example. Oftentimes, we get into little debates where the media team is like, "The model is not giving me credit for this thing that I think should actually get more credit." And then we're like, "Well, okay. We need to do one of two things." If we agree, then we just get it into the roadmap and we fix it. If we don't agree, then we surround it with experimentation to confirm or refute the hypothesis. In one particular instance, just two weeks ago, we launched a new version of this measurement capability internally that ripped out a variable that was actually under-crediting the more upper-funnel media. After some empirical evidence and testing, it proved out that it was an old variable in the model that used to be relevant but no longer is, because of shifts in how people are converting on Adobe.com. That's ultimately going to allow the media teams to invest in more upper-funnel tactics, more in line with the strategy that we set at the beginning of the year. Just an example of what we're doing on the process side.
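For that "surround it with experimentation" step, here is a minimal sketch of what confirming or refuting a credit hypothesis can look like: a holdout test evaluated with a two-proportion z-test. All numbers are invented, and the session doesn't describe Adobe's actual test design.

```python
from math import sqrt
from statistics import NormalDist

# Invented holdout results for a disputed channel.
exposed_n, exposed_conv = 50_000, 600   # audience that saw the channel
holdout_n, holdout_conv = 50_000, 500   # audience where it was suppressed

p1, p2 = exposed_conv / exposed_n, holdout_conv / holdout_n
p_pool = (exposed_conv + holdout_conv) / (exposed_n + holdout_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / exposed_n + 1 / holdout_n))
z = (p1 - p2) / se
p_value = 2 * (1 - NormalDist().cdf(abs(z)))
relative_lift = (p1 - p2) / p2

# z ~ 3.0, p ~ 0.002, lift ~ 20%: evidence the channel deserves more credit.
print(f"z={z:.2f}, p={p_value:.4f}, lift={relative_lift:.0%}")
```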
[Lily] In that example, it's really important that your technology can actually support that kind of response, right? Because I think a lot of times people are doing planning, or running a model, or rebuilding a model once or twice a year. And in that case, you couldn't really do that.

[Matt] No. No. Yeah. It gets into the two use cases, the planning and the in-quarter optimization components, that this particular measurement solution allows us to get into, and we can totally talk more about that one, for sure.

[Lily] Yeah. Maybe, so, you've talked about operational efficiency, and improvements in relationships within Adobe and with our partners. So maybe we can have you talk a little bit about actual metrics and revenue.

[Matt] Yeah. There are a lot, but in the spirit of this session, navigating change and developing relationships internally, I'll put my finance hat on as I speak about this one, because we have a great relationship with them, but that doesn't preclude them from interrogating us on how we're performing either. Over the course of the last five years, we've increased our budget by more than 2.5 times. And if that's not an indication of internal adoption and understanding and advocacy of the model, then I don't know what is. During that course of time, too, we have increased our share of contribution to Adobe.com recurring revenue by 75%. So marketing share over that same period of time has gone up disproportionately, which is a huge number that we track; every time we get into a marketing effectiveness review, we revisit that number. The thing that is not reflected here, that is probably most valuable, is that the standard economics with media, as we all know, are: the more you spend, the lower your ROI is going to go, and you just try to make sure it doesn't go below break-even, right? But during that same period of time, we've actually increased our return on investment. And I point to two things. One is the planning motions, so a responsible allocation of the proper budget to support the targets without it going negative to start the quarter. And then, in quarter, the ongoing weekly optimizations that we're conducting are the other component that allowed us to increase the contribution while also increasing ROI at the same time.

[Lily] So maybe we can stay on that, because I think most brands are doing that pre-quarter planning, but I'm not certain that the metrics driving the planning are also driving in-quarter optimization. So what does that look like tactically?

[Matt] Yeah.

That may be a bit of a differentiator in how we're operating, I agree. I'll spend less time on the planning and more time on the optimization side, because I think most companies in this room are probably familiar with planning. In the simplest terms, the outcome of the pre-quarter planning is an actual going-in plan, which is inclusive of the budget, the media mix, the flighting, the percent of the target that is covered by marketing, and so forth. So we have a really firm, documented going-in plan, and then there's this baton handoff. The relationship between the two is that the same measurement ecosystem guides both: we can use it to predict what a given marketing budget is going to drive, and then actually monitor and optimize against that to hold true to it throughout the quarter in the in-quarter optimization. So if the left side is the going-in plan, the right side is the plan to diverge from the plan, basically. My favorite quote from Mike Tyson is, "Everybody's got a plan until they get punched in the face," and that analogy serves here. What this looks like on the in-quarter side is: we've got the targets, we've got the budget, we've got the media mix, we've got the tactics in place. And then there are these two types of interlocks at different cadences. Interlock one is a weekly interlock with the business, going back to the top of the conversation about DDOM, that internal operating model. It's all organizations getting together on a weekly basis to understand the pockets of strength and weakness in the business, the attainment to target, and what we are going to do about it, that kind of motion every single week. And there's a subset version of that, which I talked about, with the agencies and the media teams as well. So that's what I would call interlock one; it's basically business performance interpretation. It's also one of the mechanisms that derives this weekly E-team summary of business performance. The other interlock is more of an episodic cadence, but still pretty frequent. It's a dedicated, premeditated conversation at week three, week six, and week nine with go-to-market, finance, and other organizations, and it's part tell and part ask. The tell is: okay, we're in week six. We have done seven budget shifts across channels and regions and product portfolios, and we are increasing our targets and holding the marketing teams on a global scale accountable to a stretch target that you can take to the bank in terms of an outlook. That's the tell part, and we don't ask for permission on that. We just do it. The ask part is more of us putting optionality on the table, so to speak, for the business, depending on the risk profile we're looking at in the quarter and how we're performing. It could be in the form of a small, medium, and large type of funding ask, where we say, and I'm just making this up: you can give us $2 million to return $3 million, or you can give us $6 million to return $6 million or thereabouts. And that is a function of the same model giving us a sense of where our marginal returns hit break-even. When does that next dollar go negative?
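That marginal-return logic can be sketched with a toy saturating response curve. The curve shape and every number here are assumptions, tuned only to mirror the made-up $2M-returns-$3M and $6M-returns-$6M figures above, not Adobe's actual model.

```python
import math

# Toy diminishing-returns curve: revenue ($M) = A * (1 - exp(-B * spend)).
A, B = 26.0, 0.24          # assumed parameters, chosen for illustration
CURRENT_SPEND = 5.0        # assumed current quarterly spend, $M

def revenue(spend_m: float) -> float:
    return A * (1 - math.exp(-B * spend_m))

def marginal_return(spend_m: float) -> float:
    return A * B * math.exp(-B * spend_m)   # d(revenue)/d(spend)

for ask in (2.0, 6.0):     # the small/large funding asks
    incr = revenue(CURRENT_SPEND + ask) - revenue(CURRENT_SPEND)
    print(f"+${ask:.0f}M -> +${incr:.1f}M incremental (ROI {incr / ask:.2f})")

# The next dollar breaks even where the marginal return equals 1:
# A * B * exp(-B * s) = 1  =>  s = ln(A * B) / B
breakeven_spend = math.log(A * B) / B
print(f"Marginal $1 returns $1 at about ${breakeven_spend:.1f}M total spend")
```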
And then the conversation there is, if things get funded, we say, "Listen, we're going to put this into market in week seven. By week 13, you will see this recurring revenue show up in this region, against this product, and so on and so forth, against this route to market." It's that kind of conversation, and that's the relationship between the two motions.

[Lily] Okay. So I feel like we went through a lot. I want to leave time for questions, so maybe you can recap the key takeaways from what we just talked about, and then we can open it up to people.

[Matt] Yep. Sure. Sticking with this technology, people, and process framework: on the technology, the measurement side, I think what we've learned is to evolve measurement continuously and take a crawl, walk, run approach, if that makes sense, and don't be afraid to use multiple methods, like the Swiss cheese analogy. That's what we've learned, and it seems to have worked for us. I know all problems manifest differently in different organizations, but again, that's what worked for us. Then, incorporate other variables. This comment is more applicable to the planning motion: whether it's explicit input into the model, like economic data that you're plugging into the motions to guide budgetary allocations, that's been huge. Or even implicit observation of data: brand search volume changes over time, competitive pressures happening. Bring those into the conversation around planning and in-quarter optimizations, and it enriches the premise as to why we need more budget; it enriches that conversation.
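As a sketch of that "explicit input" idea, here is a deliberately simplified linear model on synthetic data (real mix models add adstock, saturation, and seasonality): including a macro covariate keeps its effect from leaking into the media coefficients.

```python
import numpy as np

rng = np.random.default_rng(0)
weeks = 104
econ_index = rng.normal(100, 5, weeks)                             # macro indicator (synthetic)
search = 100 + 5 * (econ_index - 100) + rng.normal(0, 10, weeks)   # budgets track the economy
display = rng.uniform(20, 80, weeks)

# Ground truth: revenue responds to media AND the economy.
revenue = 3.0 * search + 1.5 * display + 8.0 * econ_index + rng.normal(0, 50, weeks)

X_with = np.column_stack([search, display, econ_index, np.ones(weeks)])
X_without = np.column_stack([search, display, np.ones(weeks)])

coef_with, *_ = np.linalg.lstsq(X_with, revenue, rcond=None)
coef_without, *_ = np.linalg.lstsq(X_without, revenue, rcond=None)

print(coef_with[:2])     # close to the true 3.0 and 1.5
print(coef_without[:2])  # search coefficient inflated: the economy's lift gets credited to media
```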

Lastly, on technology: understand content effectiveness. We've heard a lot about GenStudio, and we're sinking our teeth into this one heavily. The way I think about it is, you can measure a touchpoint, but going deeper than that, you're measuring the creative asset behind the touchpoint. And increasingly, we're trying to penetrate even further to understand the components within the creative asset. GenStudio is a big asset for us there, to understand that, for whatever reason, and I'm making this up, on TikTok a black background is better than a red background, but on Meta a yellow background is better for the same exact creative asset. We're learning these insights, and the proliferation of creative that a marketer needs to get out into the wild is just unbelievable. I'm preaching to the choir: an unbelievable hockey stick, and the consumer expectation for that is huge. So anyway, we're investing a lot in content effectiveness. On the people and process side, I've tried to collapse the learnings for both an agency RFP, if you're going through that exercise, and evolving your measurement solution. I would say focus the initial scope on something that is gettable, that teams can sprint to, a crawl, walk, run approach, something to that effect, so everybody has a really clear understanding of what a 30-day, 60-day, 90-day win would be. And then manage up, down, and out. Again, this is executives, practitioners, and media partners and agencies, and each one of those, for us, justified a very different conversation. And then organize the teams ahead of the change. That's one big thing we did with this agency transition: we had a lot of new people who joined to help rally around the transition, and we brought them in far ahead of the actual change. The same applies to a measurement change you may be undergoing. And I'd say invest in operations. These are the unsung heroes, in my opinion. If you don't hear from them, that means they're doing an amazing job, because if there are no data problems, they've avoided garbage in, garbage out. As part of this global operating model with our agencies, we have an in-house team that defines the standards by which we do naming conventions, trafficking processes, data flows, and all of that, so that across the globe, we allow for some divergence in strategy and execution, but the operational execution is on rails. And then, hire for ideal behavior. Whether it's an internal full-time employee or an agency, we're really hiring for the behavior that is in service of the ultimate organizational behavior we're going after. And lastly, bake in transition time. I'll just say really quickly on this one: every time we do this, we have a one-quarter "do no harm" concept, where the agency comes onboard and doesn't do anything fancy immediately. Just carry through the existing strategy for one quarter and let's get through this. We can't have ebbs and flows in the business; the whole point of this thing is to increase our contribution over time, as opposed to having hiccups. So that's the last one I would point to. Again, reflecting back, hindsight being 20/20, these are the things that seemed to work for us.

[Lily] Awesome. So we talked a lot about operations, change management, and organizational alignment today. We have a lot more sessions at Adobe Summit around marketing measurement, and one I wanted to call out: if you want to go more deeply into the technology itself and the modeling decisions that were made, we have a session tomorrow directly with our data science team as they talk through that development process. That's tomorrow at 2:30; it's called Marketing Mix Modeling at Adobe: Learn to Predict the Future Like We Did. And we also have a session with H&R Block, one of our customers, tomorrow morning at 8 am. So we would love for you all to attend other sessions to learn more about the technology itself. All right, well, thank you, Matt, and thank you all so much for attending our session. [Matt] Thank you. [Lily] Really appreciate it.

[Music]

About the Session

Go behind the scenes on how Adobe marketing increased its contribution to subscription growth by 75% after moving beyond last-touch attribution and building a measurement system that truly reflected marketing’s impact. The journey? A bold, step-by-step transformation that aligned executives, united marketing and finance, restructured agency collaboration, and implemented a sophisticated, future-proof modeling solution.
 
In this session with Adobe’s VP of Growth Marketing Performance, we’ll break down:
•    Why we abandoned last-touch attribution—and the executive alignment needed to make the shift
•    How we built a scalable measurement framework—from strategy workshops to implementation
•    The role of modeling and experimentation in overcoming data loss from cookies
•    Our strengthened agency relationships and processes, and how consolidation and clear accountability improved measurement accuracy
 
If you’re struggling with measurement challenges in a privacy-first world, this session will provide practical, battle-tested guidance straight from Adobe’s playbook. Learn what worked, what didn’t, and how to apply these insights to your own organization.

Industry: Advertising/Publishing, Automotive, Commerce, Consulting/Agency, Consumer Goods, Financial Services, Healthcare and Life Sciences, High Tech, Media, Entertainment, and Communications, Retail, Telecommunications, Travel, Hospitality, and Dining

Technical Level: General Audience

Track: Customer Acquisition

Presentation Style: Case/Use Study

Audience: Marketing Executive, Marketing Analyst, Marketing Operations, Business Decision Maker

This content is copyrighted by Adobe Inc. Any recording and posting of this content is strictly prohibited.


