Distilling the Content Cocktail: Discover Ingredients That Convert

[Music] [Jennifer Werkmeister] But I'm super excited to talk to you about some content today. So I'm Jennifer Werkmeister. I'm a Senior Product Manager at Adobe. I've been with Adobe for over five years now. And for the last year, year and a half, I have only thought about content. So very excited. [Abhinav Saxena] Hey, guys. This is Abhinav. I've been with Verizon for more than eight years now, and I've been doing Adobe Analytics even before that. This is about the fifth Summit I'm attending. Very excited to be here in the presenter role to show you the work I did with Jennifer here.

Great. So just to set the stage a little bit, the problem that we're all looking to solve, organizations are spending a lot of money on content. Now content is everything. It's what grabs interest. It's what creates engagement. It's what converts. But it can be incredibly expensive to create, manage, and serve all of that content.

So just to give you a bit of an example, this is a real example from a real Adobe customer. They have around 1,000 products, and they have around 25 images for each of those products. And that's actually pretty modest. So if they're going to alter those images in any way for any of the 15 global regions that they're operating in, that already balloons the number of images they have to manage to 375,000. Now imagine they try to personalize that for any number of different customers or customer contexts; that number usually grows into the millions. That's obviously not feasible for a lot of organizations, but we all know that GenAI is going to start helping organizations meet this demand for content.

So GenAI is going to help. Content is going to balloon. It's going to meet that demand. But marketers are still going to be left with these questions. We need some sort of measurement solution that scales with the way we're scaling content creation. You still need to understand how that content's actually performing. What do you actually need to prioritize as you're creating that content? What's the ROI on those efforts? It's still not free. And are your personalization efforts actually worth it? So we've been hearing this a lot from customers, and these are the questions that have been keeping me and a pretty large team at Adobe up at night for a while now. So I'm super excited to introduce you to Adobe Content Analytics. The whole point of this solution is to help you measure the ROI of your content within the context of your customer journey. What it does is leverage GenAI-derived attributes to help you understand all of your content at a super granular level, and then measure that content comprehensively across your site. So once you understand how content and content attributes are actually resonating with your different users and impacting their journeys, you're going to be set up for better optimization and personalization.

So let's get into a little bit of how that's actually going to work. All of this really hinges on the AI/ML services that are helping to break all of your content down into those little granular pieces. This is something that would be impossible to do manually. So let's talk about those pieces and how they all ladder up together. First, there's the experience. And you can think of the experience as just a unit of content in a specific arrangement at a specific point in time. For most of this presentation, I'm going to be focusing on the web context because that's where we've started with Content Analytics. However, keep in mind that we will be expanding to other channels very soon. But for now I'm mostly going to focus on web. So right now we're defining the experience as the entire web page. And we absolutely understand that you think of content units as something maybe a little bit smaller than that, and we're working on making that more granular. But just for now bear with me. Think of it as an entire web page. So if you have a home page and you have that home page running as it is for about a week, that's an experience. However, if you change out the banner, change out any of the content on your home page, now it's different. That now counts as a different experience.

So within an experience, you have text and you have images, and let's be real, you have other content like videos, and we are getting to those. But for now we're going to focus on text and images. For the text, the service breaks it down into all of the attributes that you could want to know about that text. That's going to be obvious things: all of the counts you could possibly want. Word count, sentence count, paragraph count, emoji count. It also has obvious things like keywords. But what I'm really interested in is when you start getting into things like tone of voice, persuasion strategy, narratives. This is what's going to help you dial in. "How do I actually speak to my customers?" "What voice is connecting with them?" "What persuasion strategy do they respond to?" Okay. So then, also within experiences, you have assets. And I'm going to use the words asset and image pretty interchangeably. So assets, we have a bunch of attributes on them as well. You have things like the subject matter, obviously, background scenes, foreground colors, background colors, overall tone. We even go down to the camera angle of the photograph that was taken. If it's not a photograph, we also capture the photography style or image type.
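To make the shape of that data a little more concrete, here is a rough sketch of how an experience and its GenAI-derived attributes could be modeled. The field names are hypothetical illustrations based only on the attributes described above; they are not the actual Content Analytics schema.

```typescript
// Hypothetical model of the pieces described above; names are illustrative,
// not Adobe's actual Content Analytics schema.
interface TextAttributes {
  wordCount: number;
  sentenceCount: number;
  paragraphCount: number;
  emojiCount: number;
  keywords: string[];
  toneOfVoice: string;          // e.g. "playful", "authoritative"
  persuasionStrategy: string;   // e.g. "scarcity", "social proof"
  narrative: string;
}

interface AssetAttributes {
  assetId: string;        // tied to the asset URL
  perceptionId: string;   // shared by visually identical assets
  subject: string;
  backgroundScene: string;
  foregroundColors: string[];
  backgroundColors: string[];
  overallTone: string;
  cameraAngle?: string;   // present for photographs
  imageType?: string;     // photography style / image type otherwise
}

interface Experience {
  experienceId: string;   // a page's content in a specific arrangement at a point in time
  pageUrl: string;
  firstSeen: Date;
  lastSeen: Date;
  text: TextAttributes;
  assets: AssetAttributes[];
}
```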

So each asset gets an asset ID. And so the asset ID is associated with the URL. However, we found from the customers that we've been working with in early applications of this and in betas that they actually have the exact same image occurring all across their site that for one reason or another has different URLs assigned to it. And so these assets also get something called a perception ID. And what the perception ID does is it looks at assets that are visually the same and says, "Hey, this is the same image. We're going to assign it the same ID." So now you can look at the performance of that asset as a whole, even though it's coming from different URLs. You can also then break the performance of that asset down by the different placements and context of where it appears.

And for those of you who are interested, who maybe are already with us with Analytics or Customer Journey Analytics, I'll get a little bit into the architecture of how this works. So the content, its extraction, and the events that happen with the content (the impressions, the clicks, any interactions with that content) are collected by the Web SDK. Now I know some of you may not have migrated to Web SDK, and that's totally okay. You just have to be okay using Web SDK to collect those additional content events. If you are still using Adobe Analytics and you're sending that through to AEP, you can do that too. So Web SDK collects all of these events, and it's all sent to AEP, the Adobe Experience Platform. From there, it goes to the featurization service, or the content identity and feature service as we have it labeled here. That's what assigns things like the perceptual ID and all of the other AI/ML output, assigning the attributes, etcetera. So now you have the content with its full metadata profile and all of the content events sitting in AEP alongside your customer journey event data and any other data that you've collected in AEP. And that's now available for analysis in Analysis Workspace in CJA.
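As a rough illustration of the collection step, here is how a content event might be sent through the Web SDK's standard sendEvent command. The sendEvent call with an xdm payload is normal Web SDK usage; the custom field group and field names below are hypothetical placeholders (the actual Content Analytics schema isn't shown in the session), and in practice the asset IDs, perception IDs, and attributes are assigned by the featurization service in AEP after collection.

```typescript
// Sketch of collection: a content impression sent via the Adobe Experience
// Platform Web SDK ("alloy"). "sendEvent" with an xdm payload is standard Web
// SDK usage; _contentExample and its fields are hypothetical placeholders.
declare const alloy: (command: string, options?: Record<string, unknown>) => Promise<unknown>;

alloy("sendEvent", {
  xdm: {
    eventType: "web.webpagedetails.pageViews",
    web: {
      webPageDetails: { URL: "https://www.example.com/store/home" },
    },
    // Hypothetical content payload; perception IDs and GenAI-derived attributes
    // are added later by the featurization service in AEP, not on the client.
    _contentExample: {
      experiencePageUrl: "https://www.example.com/store/home",
      assetImpressions: [
        { assetUrl: "https://cdn.example.com/img/hero-banner.jpg" },
        { assetUrl: "https://cdn.example.com/img/fanny-pack.jpg" },
      ],
    },
  },
});
```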

Now the reason why this architecture is important and the reason I'm showing it to you today is because AEP is the foundational architecture on which the other Adobe DX applications sit. And so what this is going to do is put us in a good position to start ingesting content datasets from other channels, and in the future to provide integrations between Content Analytics and the places where content authors and marketers work, because we know they are not always stepping into Analysis Workspace. Is that right? Yeah.

So one of the questions that I get asked a lot as I'm talking to customers about Content Analytics is, "Okay, now we have all this extra data, how do I think about it in the context of what I'm already doing? How does this fit with, for example, an A/B test?" So up until this point, we haven't had this scale of information or this granularity of information. And so the A/B test or even a multivariate test has been one of the only ways that you can figure out what content is actually working. But with Content Analytics, I'm going to ask you to think a little bit beyond the A/B test. So we're going to go through a little exercise today. So let's A/B test these two cocktails against one another. I think cocktails are a good analogy. You have all of these different ingredients. Together, they make a whole just like your content. And for my dry people, by cocktail, it doesn't necessarily have to have liquor. It can totally be a dry cocktail. I come from Utah. The dry cocktail scene is thriving, and it's pretty awesome. Okay. With that said, raise your hand if you're more interested in the cocktail on the left.

OK. Raise your hand if you're more interested in the cocktail on the right.

OK. It's a pretty even split. I think the cocktail on the left won. I did make this presentation before I quizzed you all, so that's a good thing.

So we know the cocktail on the left is a winner. And if you ask me, I would have said, "Yes." I like the cocktail on the left. It looks like an old-fashioned thing. I like those.

But, unfortunately, marketers are not like bartenders. Bartenders find a winning combination. They put it on the menu. They serve it again and again and again. Marketers, you can't do that. Content has to be fresh. You're constantly having to come up with new content. So now what do you do if I ask you to mix another drink? And all you know is that I liked that cocktail. What about it was successful? Was it the underlying liquor? Was it the cherry? I do like a Luxardo cherry. Anyone? Was it the blood orange? You don't know. And then when you mix it with other ingredients, how's it going to react? How do you know how to make a successful cocktail? Well, obviously, if you serve me a bunch of cocktails, you may start to notice some trends over time. You'd find that I like both that margarita-looking thing and the old-fashioned-looking thing. You wouldn't have known that if you just A/B tested it, but I like both. I also like that mojito-looking thing. And so you might suspect, "Oh, she might like citrus." But to really understand how I'm going to react to a new cocktail or what cocktail to make for me, you'd really have to keep track of many, many cocktails and carefully itemize all of the ingredients in those cocktails and how I responded to them. Now if you had done that, you could say with some level of confidence, "Yeah, most of the cocktails with whiskey in them, she likes. She also likes some gin or tequila depending on what they're mixed with." And so if you had a pretty good idea of that across not just the base liquor, but the fruit, the garnish, the glass, you would have a pretty good idea of how I'm going to respond to a new cocktail with any mixture of those things. You'd also have a pretty good idea of how to mix a new cocktail, especially for me.

So that's what Content Analytics does. It's not a replacement for the A/B test. You may still want to mix two new cocktails and test them against each other, but now you have a lot more information going into what you're creating and what you're testing. Because it collects the ingredients across all of the experiences, all of the content on your site, and because it's overlaid with that customer journey information, now you have that comprehensive view. You can start to correlate which elements of the content are resonating with customers. And not just resonating with customers: how do different customers respond to different content? And so this is one of the things that's really exciting to me. Historically, we have grouped customers based on things that we think make them alike. And from that, we've tried to understand, again, what content is going to resonate with that group. But what if you could group people based on what they prove they're interacting with and resonating with? So think of it the other way around. With these insights, you're going to be, again, in a much better place to optimize your content, to personalize your content, and for any kind of real-time activation.

So I get really jazzed about the attributes and the personalization concept because this is a new granularity that we really have not had access to before. But Content Analytics has value on multiple levels besides that. When I first started this content journey in talking to customers, one of the levels that I really overlooked was just where are the assets appearing? Am I overusing or underusing assets? Most of our customers are really flying blind when it comes to content. So even just an inventory of where on the site an asset appears is something that hasn't been available to them. Because we're able to capture those accurate impressions, we can tell you which pages that image still appears on.

The second, and I think of this as table stakes for any content analytics solution, is just, what is the engagement with this content? What's the average click-through rate for certain assets or, even at the attribute level, certain attributes? And how can I tell if an experience is still effective? Maybe when you serve content, it's really effective in the beginning while it's still fresh, but we know that content gets stale. How do you know when it's stale? The third level is where I start getting excited. This is where it's really magical to overlay that content granularity with your customer journey event data: getting to not just what content led to engagement, but what content led to conversion. So far, we've been pretty good at tracking click-throughs and the customer journey down to conversion. But what happens when someone just sees an image? Maybe they don't click on it. Maybe there's no way of registering what the engagement was with that image. Now you can accurately capture that impression, and you can attribute having seen that image back to any other action that you're wanting the customer to take. So this is really great if your images are non-clickable assets. It's also overlaying it with that customer journey data that's going to get you to the ROI. So if you're tracking revenue that comes through on your site, you're able to attribute that revenue back down to the individual asset level or the individual attribute level.

And then, of course, what I think is the most exciting, what I really want to see it used for more is this idea of personalization. Again, now with AI, we are in a much better position to actually personalize content for customers. So when we talk to customers, there's two different paths that we see them taking in personalization. The first is, "Hey, content is still really expensive to produce. I know I have to personalize, but please tell me where I absolutely have to personalize. If I test this piece of content with this market and this market, is it okay for both, or do I have to make a tweak?" And then you have the customers that are all in on personalization, but they still need to understand how. Like, how are you supposed to personalize it? And so this helps them to understand what are really the opportunities for personalization within their content.

For those of you who are already familiar with Customer Journey Analytics, that's something that already allows you to pull in all of your customer journey data at an event level. You may ask, "Okay, what is Content Analytics providing me on top of what Customer Journey Analytics already provides?" And this is it in a nutshell. You're going to get new metrics and dimensions. So like I said, all of the counts of those things, also all of the clicks, all of the impressions; you're going to get the experience ID, the asset ID, the perception ID. And as I mentioned, all of the different attribute categories and their dimension items, all of them right there in the left rail in Analysis Workspace.

Also you're going to get that automatic attribute association. So it is incredibly labor-intensive to try and tag all of these assets yourself. This gives you a comprehensive, out of the box way of tagging all of the content that appears on your site.

And I've mentioned before, and I'll double down on it, it's the accurate impressions. We have customers that have been getting to impressions in different ways. But, typically, that has meant tagging and implementing a specific unit of code on their site, and the impression usually fires when the content is loaded with the page. When we talk about asset impressions, we're talking about the image actually being viewable within the viewing window. So if you have rotating content, if you have rotating assets on your page, some kind of carousel, what it's actually doing is looking for new asset URLs showing up on the page, and it only counts as being viewed if it's 75% viewable within the viewing window. So assets that are appearing below the fold, for instance on your product pages, maybe they're not even getting seen. And so this lets you have that actually accurate impression count so that when you start attributing actions downstream back to those asset impressions, it's correct.
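To make the 75% rule concrete, here is a minimal sketch of how viewability could be detected in the browser using the standard IntersectionObserver API. This is only an illustration of the idea, not the Web SDK's actual implementation.

```typescript
// Illustration of the 75%-viewable rule using the standard IntersectionObserver
// API; not the Web SDK's internal implementation. Each asset URL is counted as
// viewed once it is at least 75% visible within the viewing window.
const viewedAssets = new Set<string>();

const observer = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      const img = entry.target as HTMLImageElement;
      if (entry.intersectionRatio >= 0.75 && !viewedAssets.has(img.src)) {
        viewedAssets.add(img.src);                 // count each asset URL once
        console.log("asset impression:", img.src); // in practice, emit a content event
      }
    }
  },
  { threshold: 0.75 } // callback fires when 75% of the element is in view
);

// Observe every image on the page, including ones below the fold or in carousels.
document.querySelectorAll("img").forEach((img) => observer.observe(img));
```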

And the last is that perceptual de-duplication. As I mentioned, we have customers that have the same asset appearing from different URLs all over the place. This helps you bring that down to a single ID for that asset, so you can treat it as the same asset and understand its performance in aggregate. One of the things that we're doing is looking at how we can expand that definition of perceptually the same. Right now it catches them if they're exactly the same. But what happens when it's just a little bit smaller, lower resolution? It's been cropped a little bit, or maybe there's a little border around it. We're looking at how we can expand the definition so it can capture all of these as basically the same asset, so that when you want to consider its performance, you consider it in aggregate.
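For intuition on how visually identical images can share one ID even when their URLs differ, here is a toy average-hash sketch. Adobe's actual perception ID algorithm isn't described in the session, so treat this purely as an illustration of the concept, assuming images have already been scaled down to an 8x8 grayscale grid.

```typescript
// Toy perceptual-hash sketch: visually identical images produce the same hash
// even if they live at different URLs, so they can share one "perception ID".
// Illustration only; this is not Adobe's actual algorithm.
function averageHash(gray8x8: number[][]): string {
  const pixels = gray8x8.flat();
  const mean = pixels.reduce((sum, p) => sum + p, 0) / pixels.length;
  // Each bit records whether a pixel is brighter than the mean brightness.
  return pixels.map((p) => (p > mean ? "1" : "0")).join("");
}

function hammingDistance(a: string, b: string): number {
  let distance = 0;
  for (let i = 0; i < a.length; i++) if (a[i] !== b[i]) distance++;
  return distance;
}

// Near-identical renditions (slightly cropped, lower resolution, thin border)
// could be grouped by allowing a small hash distance instead of exact equality.
const shareOnePerceptionId = (hashA: string, hashB: string) =>
  hammingDistance(hashA, hashB) <= 4;
```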

Okay. So I've talked a lot about what it can do for you and the art of the possible, but let's get into a live demo. So let me preface this by saying this is a live project and this is live data. This is from our own Adobe Store, so Adobe employees actually like shopping and buying things in the Adobe Store. So say a little prayer to the demo gods for me.

If you all could say the prayer to the demo gods one more time, that would be great.

Okay. So this is a slightly tweaked version of the template that we're going to be providing to all Content Analytics users. And I'm starting here with just experiences. And I'll be honest with you, the Adobe Store was recently redone, so we're working with a pretty short window of data. And we see that there's a drop in the click-through rate of the experiences. So that's just going to give me an idea of the engagement on the site, a little bit of what to look for. And that drop is not surprising to me. Having spent some time with the Adobe Store, I could see that people were buying a lot of merch prior to Summit. And in the last week, not so much.

So scrolling down, I have all of the experiences that have been viewed the most in the last week. I have them ordered by views. And I would typically expect that to be mostly our home page. And I do see that here. I see several different instances of the home page. And remember, an experience is how that home page appeared in a certain way at a certain point in time. So as we trade out content, as we trade out different Adobe objects that we're promoting, you're going to get a different experience. And so if I wanted to see what version of that homepage I'm looking at, I could click the little info icon...

And it pops up this little window. So I have a bigger thumbnail. It doesn't show me the whole page, just a little snapshot at the top, but that helps me to quickly visually understand what version of the page this was. I can also see how many impressions it's had of all time. I can see the first time that experience appeared and the last time that experience appeared, and I can see how many assets were within that experience. So I see that there's 18 here. If I go ahead and break it down, it's going to pop me out of that window, and it's going to do the breakdown right in the table so I can see all of the different assets that occurred in that experience. And with that, I can start to understand how those different assets contributed to the success of the experience. Here, I have orders. I'm using a linear attribution model. Because it's Workspace, you can obviously use whatever attribution model fits you the best.
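As a quick illustration of the attribution mentioned here, the sketch below spreads order credit evenly across the assets a shopper saw before converting, which is what a linear model does. Workspace computes this for you; this is just the arithmetic, with made-up asset names.

```typescript
// Minimal sketch of linear attribution: every asset impression that preceded an
// order gets an equal share of the credit. Asset names are made up; Workspace
// lets you swap in whichever attribution model fits best.
function linearAttribution(assetsSeen: string[], orders: number): Map<string, number> {
  const credit = new Map<string, number>();
  const share = orders / assetsSeen.length;
  for (const assetId of assetsSeen) {
    credit.set(assetId, (credit.get(assetId) ?? 0) + share);
  }
  return credit;
}

// One order preceded by three asset impressions -> each asset gets 1/3 of the order.
console.log(linearAttribution(["hero-banner", "fanny-pack", "eco-sign"], 1));
```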

But I can see that, curiously, this top image on our home page actually looks like it's contributing to a lot of orders. And all that means is that people see it, and for the number of times it's been seen, there are a lot of orders that have been attributed to it.

So that's a good way of looking at experiences, but I know with the Adobe Store, we're actually rotating content a lot. And the same product will appear multiple times on all of the different pages. So it makes sense for me to scroll down and look at my assets.

So here, I have the assets actually sorted by the orders that are attributed to them. And here again, I see, curiously, this first image on our home page actually contributing a lot to orders. The second one, however-- Let's go ahead and zoom into that. So it's this funky little fanny pack. It's getting viewed a lot. And while it's still contributing to a lot of orders, it's contributing comparatively less than that first image on the home page. So this starts to give me an idea of which assets are actually really making a difference. Now what we typically see, particularly with this attribution model, is that the number of times an asset has been seen tends to be correlated with a similar number of order conversions, but there are always outliers. So one of the great things, because this is Workspace and all of these dimensions are available to use, is that you can actually start dropping them into visualizations. So here I actually have a scatter plot where I've plotted the asset views on the Y-axis and the orders attributed to them on the X-axis. And you can see there's a very strong correlation line, and that's what we would expect. But where the opportunity occurs for us as marketers is in those outliers.
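The scatter-plot reading described here (more views usually means proportionally more orders, and the outliers are the interesting part) can be sketched as a simple ratio check. The rule below is illustrative only; it is not how Workspace draws or flags the plot.

```typescript
// Sketch of the outlier idea: flag assets whose attributed orders are well above
// what their view count would predict from the site-wide baseline rate.
// Illustrative only; not how Workspace computes the scatter plot.
interface AssetStats { id: string; views: number; orders: number; }

function highConvertingOutliers(assets: AssetStats[], factor = 2): AssetStats[] {
  const totalViews = assets.reduce((sum, a) => sum + a.views, 0);
  const totalOrders = assets.reduce((sum, a) => sum + a.orders, 0);
  const baselineOrdersPerView = totalOrders / totalViews;
  return assets.filter(
    (a) => a.views > 0 && a.orders / a.views > factor * baselineOrdersPerView
  );
}
```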

So here, in this bottom quadrant, I can see that there are assets that are not being viewed very much, but are associated with comparatively a lot more orders than most of the assets.

So I'm just going to pop open this little table here.

And if I hover over this bottom asset in that quadrant again, not that many views, but a lot of orders, I can see that it is that top asset on the home page again.

So let's talk a little bit more about the attributes because, again, that's the exciting thing. That's the secret sauce. So I've talked about the different attribute categories: foreground colors, background colors, background scenes, subjects, etcetera. But one of the things that we've also done with Content Analytics is we've smooshed all of those attributes together, so we have one dimension for all of the asset attributes. And so by plotting the top-converting asset attributes here, I can start to see which of those attributes really matter. So maybe background colors don't really matter unless it's something offensive, like lime green or orange. I don't know. So again, this is limited data, so the insights from this are not awesome yet. I don't actually think an overall tone of neutral is what's driving Adobe employees to purchase a lot of product. But you can imagine how over time, this is going to give me a really good idea of what does and doesn't matter. I've plotted the percent change over time. And, again, imagine this happening over a longer time frame (thank you for redoing the Adobe Store).

If this were over a longer time frame, I could look at these different attributes and say, if there was a big percent change, something was spiking, I could look into that particular attribute and very quickly identify, "Is this a spike?" Or if that line graph were ever to populate, there we go. "Is this actually a trend? Is this something that I, as a marketer, can capitalize on?" So let me go ahead and show you something else that's cool. So let me back out of here a little bit.

If I were to look at just one attribute category, so say asset foreground colors, I can also very quickly identify what's changing and what's popping. So again, orders are generally down. We're all coming here to Summit. Too late to order anything more. Not as many orders. But I do see that green is contributing to comparatively more orders just in the last week. And so this is different than the way everything else has moved. And so again, because everything is available here in the left rail, what I can easily do is drop in the asset ID. And I can see exactly, "Okay, you said green. Do you really mean green? What kind of green? What assets are contributing to that green?" And I can see there's mostly this weird little palm tree and this eco sign. But you see there's a problem here. If I expand this...

You can see that it's collecting that palm tree and that eco sign a whole bunch of times. And that's because on the Adobe Store, just like many of your websites, it's actually for some reason coming from a different URL. And so that's where I want to use the perceptual ID.

And so if I drop that in instead, it's going to take that from 10 assets down to 4. So we still have two of those little green eco signs. If you look really closely, they are very slightly different. But this has helped me to really reduce that information, and I can see, yes, it is still mostly this one green eco sign, that's appended to a bunch of the Adobe products that for some reason, in the last week, Adobe employees are responding to more.

One more thing before I conclude the demo. I want to show you something that's really fun and kind of interesting to do. So imagine I am optimizing the Adobe Store for the different regions. We have Adobe employees all over the country, all over the world. If I look at asset foreground colors and I look at associated orders, I can then segment it by Adobe employees who are in California, where we have a huge office, or by my home office in Utah. And I've just done a real quick percent change between those two just to get a quick visual with conditional formatting: what's popping, what's not. And I know that there are 75% fewer employees in the Utah office than the California office, so I'm using that as a baseline. But if I scroll down with this conditional formatting, I can very quickly see employees in the Utah office have been purchasing after seeing assets that are olive and assets that are dark green, and less so after seeing assets that are mustard or brown. So you can imagine, again, I don't think that that's necessarily true because this is a week's worth of data. But you can imagine how over time, I could see some real trends in the preferences between different offices, different locations. And so if I wanted to optimize the Adobe employee store for the different offices, I could actually pick and choose which products I'm showing based on those preferences.
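A hedged sketch of the comparison described here: since the Utah office has roughly 75% fewer employees than California, raw order counts need to be put on a comparable footing before the percent change means much. The exact Workspace calculation isn't shown in the session; the scaling and numbers below are made up to illustrate one way to fold that baseline in.

```typescript
// Illustrative normalization for the Utah vs. California comparison: scale Utah
// orders up to a California-sized audience (Utah ~25% the size), then compute
// the percent change used for conditional formatting. Numbers are made up.
function utahVsCaliforniaPercentChange(
  utahOrders: number,
  californiaOrders: number,
  utahToCaliforniaRatio = 0.25 // Utah office is ~75% smaller
): number {
  const utahEquivalent = utahOrders / utahToCaliforniaRatio;
  return ((utahEquivalent - californiaOrders) / californiaOrders) * 100;
}

// e.g. 10 Utah orders vs. 30 California orders for "olive" assets:
// 10 / 0.25 = 40 equivalent orders -> about +33% relative to California.
console.log(utahVsCaliforniaPercentChange(10, 30)); // ≈ 33.3
```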

Here we go. And so for that, I actually want to turn it over to Abhinav. He's been working with us in the early days of Content Analytics production. He's been an awesome beta tester. And so he's gotten a little bit of a preview of what Content Analytics can do for Verizon, and his insights are way cooler. Yep. Thanks, Jennifer, for showing us the power of Content Analytics.

Yes. So as Jennifer was saying, we at Verizon started getting into the beta for Content Analytics when Jennifer mentioned that something was coming.

We jumped on it. We were very happy and excited to see what we can do with Content Analytics, and that's what I'm going to share here, yeah? So before Content Analytics, how have things been working historically? It's not like we are not capturing or tracking content. We are. But you can understand how many images and assets show up on the site, so we have to go and tag them or track them in a particular way. Again, one image might show on the site in one place, and there are carousels. We are updating our-- Let's stick to the homepage here. So the homepage is updated at least once a week, maybe more, depending on new product launches, depending on new offer launches. Right? So we update carousels, and one image showing in one place in a carousel and the same image showing in a third place in a carousel, after a week, constitutes two rows, because we capture locations, positions. We capture the name of the image. We capture everything, all of the impressions for an image, yeah? We put it into a list where it shows up in its own row items, yeah? That's how it has been working. Now you can't do that for each and every thing on the site. You can understand how big that is going to be. So we do it for most of the marketing landing pages and the homepage. And then it comes to reporting, right? When we have tracked it all, we have to build reports on it, which again is another complex task. You see up there, all of the data is redacted, but the blue ones are all segments, yeah? So the analysts have to build segments based on locations, based on positions, based on the names of the assets they want to report on, right? The marquees, the hero banners, everything, in accordance with the metrics, right? So how is the marquee performing today versus the marquee that changed, and two days later, how is that one performing? We put them side by side, yeah? So this is how it is working today. Labor-intensive, both in terms of implementation and reporting.

Impressions, clicks. We can do it, but not the best way, not how it should be done, not how Content Analytics can do it for us.

Okay. Moving on. So what can Content Analytics do for us now, yeah? First, it gives visibility into everything, all of the assets, right? I mentioned, we do it, but we cannot do it for everything, yeah? Content Analytics can do it for everything, because we do not have to go and tag it all; implementation is easy. Jennifer showed it: we pick up the assets, we put them into AEP, we build a schema, push it into CJA, yeah? As simple as that.

Moving on, accurate impressions. Right? So currently, when the page loads, all of the tiles, images, whatever you call them, they load, above the fold, below the fold, it doesn't matter; as the page is loading, we are capturing all of it. Yeah. Content Analytics, per the definition, only tracks an image once it is 75% or more in view on the page. So your impression count will only increase for that image if it actually comes onto the screen, not every time it loads on the page, yeah? Next one, asset groupings, yeah? So Jennifer showed in the great demo that you don't have a different line item for all of these assets which are very similar. They can be grouped together. The same asset showing on the homepage and buried somewhere else on the site can be grouped together; drop the page name on it, break it down by page name, and you will see how that asset is performing on the home page versus maybe on the product page itself, yeah? Okay, this one, again, I love. You just saw in the demo that you can actually see the image in Content Analytics, yeah? So imagine it this way: you build a report today that says your marquee is performing X, Y, Z. Your image at level two, position one is performing great. But then how do you know what that image is, yeah? You go onto the site, you look at what that image is, and then you make that connection, yeah? And imagine your report, and this report is going to an executive, yeah? So you tell them that our hero image last week was performing way better than our hero image today. They don't have a clue what last week's hero image or today's hero image was. But when you have it in your report, in the workspace, you click on it.

You can expand the thumbnail. You will see all of the thumbnails of the images showing in your report, yeah? So it's a great way to see it, a great way to analyze it right then and there, to make the connection that this image is performing better than that image, yeah? Okay.

What did I miss? Yeah. Let's move on. So that's where we are here. We have these images. You see similar images on the left and on the right; we're showing some phones. One in red, one in yellow. What the team did, working on the Content Analytics beta, was put some images out there on the home page. We came away with a very straightforward realization from our insights: yellow is performing better than red, yeah? We don't know why, but the data is showing it.

Another one which we saw as a trend: being a phone company, we always expected assets showing phones to dominate. We built the report for the assets, and obviously, phones showed up. But then we built a report for the conversions. For the assets, phones were up. We put them in ascending order by conversion rate, and suddenly watches popped up, yeah? And it's not just Content Analytics telling you; visually, you can see that all of the top images for conversion, the ones shown less but converting more, were watches, yeah? This can be used in optimizing the site again.

Yeah. Another one. Bright-color images convert better compared to something like black or gray or a neutral color.

So we have multiple examples out there where this can be used. Not just color; it can be tone. It can be whatever, by channels, by devices. You can see, as Jennifer was mentioning, it is part of your CJA, so you can drop your devices in there. You can see what colors work on a mobile phone versus what colors work on desktop, yeah? Another great development I'm waiting for, again on the vision, is which images work better on which channel, yeah? The same image working in an email versus on a banner ad versus on the site. Yep. So where are we going from here? Right? The future.

We all are hearing about content in the Summit Keynote.

What is that other tool we have? GenStudio? - Yeah, yeah, GenStudio. - I think, yeah. GenStudio. Yeah. Everywhere we are pumping out content. AI is pumping out content. There will be millions and millions of pieces of content out there which we can't track, yeah? And you have seen this image: plan what you have to do, create, deliver, but how do you report on it, yeah? There's no point in creating and delivering without tracking it. That's where Content Analytics comes in, yeah? The more you deliver-- The more images you pump out, the more you put on the screen, the more you personalize, the more tracking you need, yeah? The more reports you need. Which image is working better? Are we creating more and more images just for the sake of it? Are we creating them because AI can do it? Are there images lying out there which are not converting, which are not generating any clicks? Yeah? These kinds of metrics and insights we need to be generating sometime in the very near future, and that's where Content Analytics is going to start helping us, hopefully. Yeah. So, Jennifer, do you want to take some example use cases from here? - Yeah. Sure. Right. - Thank you. - Thank you so much. - Sure.

Again, huge thanks to Abhinav for working with us. It was so cool to see this work on Verizon data, and some of those insights were super unexpected.

I could not have predicted things like the watches showing up as performing really well across all of the pages. And because of the visual nature of the UI, it was just super, super obvious. Just real quick, here are other examples, other use cases, that we're hearing from some of the customers that we're working with. Again, evaluating assets that are being overused or underused. Understanding if content is getting stale, both from an attribute perspective and from an asset perspective.

Being able to understand how the location, the context of where that asset is placed, impacts its performance. So again, you can see the performance in aggregate, but we all know the context matters. And so being able to break it down by that context is super helpful. Being able to verify which creative attributes across all of the images are seeing the most engagement. Easily identifying the ROI for your content. So again, we were working with another customer who's actually tracking their revenue and they said, "Wow, being able to attribute an actual dollar amount back to individual assets, that's everything." And also being able to remove creative that has symptoms of content fatigue or that you need to get off your site for other reasons. Now you can actually figure out where it is. Wow. They are having a party.

Taking notes.

So whoever you are, whatever industry you're in, content's going to matter. We know this. Right? And so my hope and my vision is that now, by understanding what it is about content that actually matters and drives those actions you want your customers to take, you're going to be able to use content so much better, so much smarter. It's not going to be a guessing game anymore. And you're going to be able to really engage your customers. So I'm super excited to announce that this is going to be released for general availability next week. A huge team at Adobe and I have been working very hard on this, so we are very excited. And just like we had some really unexpected insights at Verizon, I'm really excited to see how this is used by other customers. So this is going to be super cool.

With that, I'm going to open it up for questions. But as I'm answering questions, please go into your Adobe Summit app and please rate the session. You can win Starbucks gift cards. You can also potentially win a pair of Bose headphones. I wear them all of the time. I live in them. They're worth it if you haven't used them. So please rate the session. And with that, I'll open it up to questions. Thank you so much for joining me. I know it sucks to sit in a presentation after lunch, so thank you.

- Thank you. - Thank you. [Music]

In-Person On-Demand Session

Distilling the Content Cocktail: Discover Ingredients That Convert - S104


About the Session

When optimizing content for different audiences, how do you choose the right colors, scenes, or tone of voice? Adobe Content Analytics closes the loop on the content supply chain, delivering insights on exactly which content resonates with your audience for optimal conversion. AI and ML models break your content down to its most granular level, eliminating the time and resources needed for manual tagging. Tie content impressions and interactions to individual users to understand how they impact their journeys. Easily surface which subjects drive engagement and which persuasion strategy converts, and attribute revenue back to individual assets.

Key takeaways:

  • Attribute conversion back to individual image assets
  • Identify which content attributes are most correlated with conversion
  • Focus content personalization on the highest impact opportunities

Technical Level: Intermediate

Track: Analytics

Presentation Style: Tips and Tricks

Audience: Campaign Manager, Digital Analyst, Digital Marketer, Web Marketer, Marketing Analyst, Content Manager
