Introduction: A tool for attributing and optimizing ad performance
Since May 2013, when Adbrix first launched its service, we have watched the mobile advertising market grow. Now that the market has matured in both scale and sophistication, attribution functionality for measuring ad performance is recognized as essential across all app categories.
The ultimate goal for growth marketers running ads in the mobile app environment is to ‘optimize ad performance’. Adbrix’s attribution functionality, built specifically for mobile apps, measures and analyzes the impact of app advertising and provides a basis for evaluating performance. It also delivers the analyzed data back to media companies (postback), so they can use it to optimize advertising.
In other words, an attribution engine is a tool that must be used to convert the abstract concept of ‘optimizing ad performance’ into concrete numbers and functions. It wouldn’t be an exaggeration to say that it’s impossible to promote your app without using attribution tools.
However, for those new to mobile app attribution, the concept itself and the wide range of advertising-related features can be daunting. These features also keep evolving along with the mobile app environment, which leaves many managers confused.
In this article, we will organize and explain the basic concepts to understand ‘mobile app attribution’, focusing on the functionalities provided by Adbrix.
Mobile app analytics criteria: Understanding advertising identifiers
Any analysis requires a minimum unit of analysis: for example, cookies on the web, a user’s membership ID, email address, or phone number. On mobile devices, each operating system vendor (Apple for iOS, Google for Android) provides a unique advertising identifier on a per-device basis. The advertising identifier provided by Google’s Android is called the ADID, and the one provided by Apple’s iOS is called the IDFA. The attribution engine collects this value from the user’s smartphone and uses it as the smallest unit of analysis.
In the mobile app environment, the existence of a trusted advertising identifier is critical, because it is the means by which app user behavior is analyzed and accumulated on a per-device basis. Although we do not know who the users of the app actually are, we can see the exact records left by their smartphones (more precisely, by their advertising identifiers).
This allows you to analyze app usage behavior directly. Since the same advertising identifiers are also used by media companies, the accumulated data is essential for advertising; it is fair to say that the mobile app advertising ecosystem runs on advertising identifiers.
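To make the idea concrete, here is a minimal sketch (not Adbrix’s implementation; the identifier value is made up) of what it means to use the advertising identifier as the smallest unit of analysis: every behavioral record is accumulated under the device’s ADID or IDFA.

```python
from collections import defaultdict
from datetime import datetime

# Every behavioral record is keyed by the device's advertising identifier
# (ADID on Android, IDFA on iOS).
events_by_device = defaultdict(list)

def record_event(ad_id, event_name, when):
    """Accumulate an in-app event against the device's advertising identifier."""
    events_by_device[ad_id].append({"event": event_name, "timestamp": when})

# We don't know who the user is, but everything this device does is grouped
# under one identifier, which is what makes per-device analysis possible.
record_event("10000000-0000-4000-8000-000000000001", "app_open", datetime(2021, 2, 8, 10, 0))
record_event("10000000-0000-4000-8000-000000000001", "purchase", datetime(2021, 2, 8, 10, 5))
print(len(events_by_device["10000000-0000-4000-8000-000000000001"]))  # -> 2
```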
Three features of mobile app performance measurement tools
It is important to have a clear idea and understanding of what the tool offers. Let’s look at the functionality an attribution tool provides by dividing it into three parts: measuring advertising effectiveness (Attribution), evaluation, and reposting (postback).
1. Measuring advertising effectiveness (Attribution)
The first function of a performance measurement tool is to determine which media contribute to advertising performance.
Performance measurement tools are often called attribution tools. Attribution is a commonly used concept in psychology, defined as ‘the process of finding the cause of a particular result or action’. For example, if my test score is low (the outcome), it is a natural human thought process to look for the cause: were the questions unusually difficult, or did I simply not study enough? This is the attribution process.
If we apply attribution to measuring advertising performance, it can be defined as ‘the process of establishing and applying criteria to determine which advertising medium contributed to a specific result’. Let’s take a closer look at the key functions and concepts, focusing on Adbrix.
# The two steps of the attribution process (app opens and in-app events)
The first step of the functionality provided by the attribution engine is to determine which medium the user came from when the app was opened (App Open). The easiest example to understand is deduplication of new users; in other words, determining whether a user brought in through an advertisement is genuinely new.
Let’s say you run ads on medium A, medium B, and medium C without using attribution. Each medium measures advertising performance according to its own rules, and because they do not share the advertising identifiers of users who engaged with the ads, duplicate ad participants will occur. As a result, marketers end up paying to advertise to the same users over and over again.
However, you can prevent this by using an attribution tool. The attribution engine filters duplicate ad participants across media based on the advertising identifier discussed earlier. In other words, the performance measurement engine acts as a data collection hub and relay, while also providing deduplication for new installs. For example, once the advertising identifier ‘axg01bf…’ that triggered a new install on a particular medium has been collected and stored through the Adbrix SDK, Adbrix will not recognize that identifier as a new install on any other medium.
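A minimal sketch of the deduplication idea, assuming a simple in-memory store (the identifier is the truncated example from the text, used purely for illustration):

```python
seen_install_ids = set()

def classify_install(ad_id):
    """Count an advertising identifier as a new install only the first time it is seen,
    no matter which medium reports it."""
    if ad_id in seen_install_ids:
        return "duplicate"
    seen_install_ids.add(ad_id)
    return "new_install"

# Media A, B and C all report the same device; only the first report counts.
print(classify_install("axg01bf..."))  # -> new_install
print(classify_install("axg01bf..."))  # -> duplicate
```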
Adbrix classifies app opens driven by advertising as “Open Attribution”. This stage covers the Ad-Touch and the App Open. Ad-touches are divided into click, display, play, and video, and the app opens resulting from each ad-touch are classified as New Install, Reinstall, or Deeplink Open.
Here, New Install means installing the app for the first time, Reinstall means deleting and installing it again, and Deeplink Open means the app is opened via a deep link while it is already installed. These are the three ways an app can be opened through an ad. Marketers using Adbrix can measure whether an app open from a specific medium was a new install or a reinstall, and whether a given reinstall was driven by a click or by an impression.
Once open attribution has distinguished the ad-touch from the resulting app open, the next step is to measure which events happened after the open. Adbrix classifies this stage as Application Event Attribution; it covers the App Open and the In-App Events that follow. For example, you can distinguish, or aggregate, whether a conversion such as a purchase came from a New Install or a Reinstall.
Adbrix divides the attribution process into two phases, Ad-Touch ~ App Open and App Open ~ In-App Event, allowing growth marketers to derive detailed analytics tailored to the characteristics of the ads they run.
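As an illustration of this two-stage model, the following sketch uses hypothetical field names (not Adbrix’s actual schema) to show how an app open is tied to an ad-touch, and an in-app event is tied back to that open:

```python
from dataclasses import dataclass
from datetime import datetime

AD_TOUCH_TYPES = ("click", "display", "play", "video")      # how the ad was touched
OPEN_TYPES = ("new_install", "reinstall", "deeplink_open")   # what kind of open resulted

@dataclass
class OpenAttribution:
    ad_id: str          # advertising identifier of the device
    medium: str         # medium credited with the open
    ad_touch: str       # one of AD_TOUCH_TYPES
    open_type: str      # one of OPEN_TYPES
    opened_at: datetime

@dataclass
class AppEventAttribution:
    open_ref: OpenAttribution   # the attributed app open this event follows
    event_name: str             # e.g. "purchase"
    occurred_at: datetime
```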
# Open attribution methods
When attribution engines, including Adbrix, tie an app open (new install or reinstall) back to an Ad-Touch, they use one of three methods: the Google referrer, advertising identifier matching, and fingerprinting. Because these methods determine which medium gets credit (i.e. attribution), it is important to understand the characteristics of each and apply them according to your advertising strategy and operating policy.
The Google referrer method determines the inflow path using the referrer value that Google Play provides when the app is installed and launched directly through Google Play. It is close to 100% accurate, but it only works on Google Play, and tracking can be cut off if the ad participant leaves the Google Play Store.
Advertising ID matching is a method in which the medium provides the advertising identifier (ADID/IDFA) directly to the performance measurement tool. Like the Google referrer method, it is close to 100% accurate, but it can only be used if the medium is able to pass an advertising identifier.
Fingerprinting is applied when the Google referrer and advertising identifier matching methods cannot be used. It determines the inflow path using information gathered from the user’s app usage environment, such as the IP address and device characteristics. Its advantage is that it can still measure performance when the other two methods are unavailable. However, because the information used is indirect, there is a possibility of false positives or false negatives, albeit with very low probability.
So, if needed, you can lower the priority of fingerprint-based open attribution or apply a relatively short lookback window, such as 24 hours.
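As a rough sketch of the fallback order described above (simplified, and not Adbrix’s actual matching logic), the three methods can be tried in sequence:

```python
def identify_install_source(install, touches):
    """install: dict describing the app open; touches: recorded ad touches, newest first.
    Try Google Play referrer, then advertising-ID matching, then fingerprinting."""
    if install.get("play_referrer"):                              # near-100% accuracy, Google Play only
        return ("google_referrer", install["play_referrer"])
    for touch in touches:
        if touch.get("ad_id") and touch["ad_id"] == install.get("ad_id"):
            return ("ad_id_match", touch["medium"])               # medium passed the ADID/IDFA with the touch
    for touch in touches:
        if (touch.get("ip"), touch.get("device_model")) == (install.get("ip"), install.get("device_model")):
            return ("fingerprint", touch["medium"])               # indirect signals, small chance of error
    return ("organic", None)                                      # no ad touch matched
```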
# Open Attribution Modeling
When running ads, it is common to run two or three media at once, and sometimes as many as ten. In these cases, users may respond to multiple ads. For example, if a user clicks an ad on medium A, then clicks an ad on medium B a moment later and a new install follows, which medium should be credited with the new install?
The credit for the new install can be given to A, to B, or split equally between them. These are called the First, Last, and Linear attribution models, respectively. The First Ad Touch Attribution model gives credit to the medium that generated the first click: if a new install occurs after multiple clicks, the medium that caused the first click gets the performance. Conversely, in the Last Ad Touch Attribution model, performance is attributed to the medium where the last click occurred, and in the Linear attribution model, credit is split equally among every medium that generated a click. Of course, there are more attribution models than these.
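A toy sketch of how the three models split credit (illustrative only; Adbrix itself applies Last Ad Touch, as noted below):

```python
def credit(touched_media, model="last"):
    """Split credit for one app open across the media the user touched, in touch order."""
    if model == "first":
        return {touched_media[0]: 1.0}
    if model == "last":
        return {touched_media[-1]: 1.0}
    if model == "linear":
        share = 1.0 / len(touched_media)
        return {medium: share for medium in touched_media}
    raise ValueError("unknown model: " + model)

# User clicked medium A, then medium B, then a new install occurred.
print(credit(["A", "B"], "first"))   # {'A': 1.0}
print(credit(["A", "B"], "last"))    # {'B': 1.0}
print(credit(["A", "B"], "linear"))  # {'A': 0.5, 'B': 0.5}
```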
Attribution tools, including Adbrix, all use the Last Ad Touch Attribution model: credit is given to the advertising medium that triggered the final response to the ad.
But what if you want to prevent one of the identification methods reviewed earlier from being recognized as the last ad touch, or you need to fine-tune which method wins? Adbrix’s attribution model keeps Last Ad Touch as the basic rule, but works on Attribution Units, each a combination of an ad touch and an attribution method. For each unit you can adjust the priority (level) used when recognizing the last ad touch, as well as the Ad Touch ~ App Open lookback window.
For example, you can assign a high priority and a long lookback window to attribution units judged to be highly accurate or to have a strong influence on app installs, and assign a low priority and a short lookback window to units with relatively low accuracy or a weak influence on installs.
Assume that the Click - Identifier and Click - Referrer units are assigned to Level 1 and the Click - Fingerprint unit is assigned to Level 2. Adbrix first checks whether any of the recorded ad touches match the Level 1 units (Click - Identifier, Click - Referrer) and takes the most recent match as the last touch. Only if no Level 1 unit matches does it search, in the same way, among the units assigned to Level 2.
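A simplified sketch of this level-based search, assuming the unit names above (hypothetical identifiers, not Adbrix internals):

```python
from datetime import datetime

PRIORITY = {
    ("click", "ad_id_match"): 1,   # Click - Identifier unit  -> Level 1
    ("click", "referrer"): 1,      # Click - Referrer unit    -> Level 1
    ("click", "fingerprint"): 2,   # Click - Fingerprint unit -> Level 2
}

def pick_last_touch(touches):
    """touches: list of dicts with 'medium', 'unit' ((ad_touch, method) tuple) and 'at' (datetime).
    Take the most recent touch among Level 1 units; fall back to Level 2 only if none match."""
    for level in (1, 2):
        candidates = [t for t in touches if PRIORITY.get(t["unit"]) == level]
        if candidates:
            return max(candidates, key=lambda t: t["at"])
    return None

touches = [
    {"medium": "A", "unit": ("click", "referrer"),    "at": datetime(2021, 2, 8, 9, 0)},
    {"medium": "B", "unit": ("click", "fingerprint"), "at": datetime(2021, 2, 8, 9, 30)},
]
print(pick_last_touch(touches)["medium"])  # -> A (Level 1 wins even though B's touch is later)
```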
The implications of attribution modeling are simple yet powerful. By using the attribution model provided by Adbrix, you don’t have to rely exclusively on the Last Ad Touch Attribution model. For example, if you want to continue using fingerprinting but want to reduce its impact, you can assign fingerprinting to Level 2 or Level 3 instead, or significantly reduce the lookback interval.
# Lookback window
Another important concept in performance measurement is the lookback window. It answers the question, “For how long after an ad touch will performance still be credited?” Adbrix supports setting lookback windows separately for Ad-Touch ~ App Open and App Open ~ App Events.
Ad-Touch ~ App Open lookback window
First, let’s look at the Ad-Touch ~ App Open phase.
Within how long after the last click on Medium A must a new install occur to be credited to Medium A? 24 hours? 7 days? 15 days? There is no single correct answer, but as with setting priorities (levels), the window should be tuned to your advertising strategy and operating policy.
Let’s say you have an ad with a lookback window of 7 days (168 hours). If a new install occurs on February 10, two days after a February 8 click on Medium A, the install is credited to Medium A because it falls inside the 7-day lookback window. However, if the new install occurs on February 17, it is treated as an organic install rather than Medium A’s performance because it falls outside the 7-day window.
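The same check, expressed as a small sketch (the dates are just the example above; the year is arbitrary):

```python
from datetime import datetime, timedelta

def within_lookback(click_time, install_time, lookback_days=7):
    """Is the install inside the Ad-Touch ~ App Open lookback window?"""
    return timedelta(0) <= install_time - click_time <= timedelta(days=lookback_days)

click = datetime(2021, 2, 8)
print(within_lookback(click, datetime(2021, 2, 10)))  # True  -> credited to Medium A
print(within_lookback(click, datetime(2021, 2, 17)))  # False -> counted as an organic install
```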
These settings are made in the Open Attribution Modeling menu, where you can set a lookback window for each Attribution Unit. In the example screen below, the lookback window is set to 7 days (168 hours) for the Click - Referrer unit and 1 day (24 hours) for the Click - Fingerprint unit.
adbrix Open Attribution Model settings menu (default settings state)
App Open ~ In-App Event lookback window
Once an app has been opened via an ad, within how long must an ad KPI be achieved to be credited to that ad? As with the Ad-Touch ~ App Open phase, there is no single right answer, but it is important to set internal standards.
With the settings in the example below, only purchases that occur within the 24-hour lookback window after an ad-driven New Install are captured and analyzed. If you want to see purchases that occur after both New Installs and Reinstalls, simply add Reinstall to the Open Attribution Type.
App Open ~ In-App Event lookback window selection screen in adbrix Attribution Conditions
In Adbrix, you can adjust the App Open ~ App Events lookback window anywhere from 1 hour to 31 days and compare the results. For example, you can set benchmarks for your app by comparing ROAS measured with a 1-day (24-hour), a 7-day (168-hour), and a 15-day (360-hour) lookback window.
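That comparison could be expressed as a small sketch (the figures are made up, purely for illustration):

```python
from datetime import datetime, timedelta

def revenue_within(open_time, purchases, hours):
    """Sum purchase revenue inside the App Open ~ In-App Event lookback window."""
    cutoff = open_time + timedelta(hours=hours)
    return sum(p["amount"] for p in purchases if open_time <= p["at"] <= cutoff)

open_time = datetime(2021, 2, 8, 9, 0)
purchases = [
    {"at": datetime(2021, 2, 8, 20, 0), "amount": 5.0},    # within 24 hours of the open
    {"at": datetime(2021, 2, 12, 11, 0), "amount": 20.0},  # day 4 after the open
]
ad_spend = 10.0
for hours in (24, 168, 360):  # 1-day, 7-day and 15-day windows
    print(hours, "h ROAS:", revenue_within(open_time, purchases, hours) / ad_spend)
```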
2. Evaluation
The second function of a performance measurement tool is to evaluate the ads you are running. It provides analytics and reports so measured ad performance can be evaluated against a variety of criteria. Adbrix offers four report types and six segmentation criteria for evaluation and analysis, and metrics can be broken down by campaign, medium, ad creative, inflow keyword, and so on for detailed analysis.
The clearest way to analyze ad performance is to break it down into an inflow perspective and a post-inflow in-app activity perspective. From the inflow perspective, users can be divided into new users (New Install) and returning users (Reinstall, Reopen); from the activity perspective, into new-user activity (Engagement) and re-engagement of existing users (Re-Engagement). At the app launch stage, the focus is on acquiring new users (New Install) and on how engaged those users become (Engagement). On the other hand, for an app that has been live for a long time, or one running retargeting ads, accurate performance analysis requires metrics that focus on re-opens driven by ads.
Recently, it has become common to measure specific post-install user activity through the App Events integration provided by performance measurement tools. Integrating the basic SDK already gives you metrics such as new installs and retention, but with App Events you can analyze specific in-app behavior such as sign-up, character creation, add-to-cart, and purchase. Think of these as the points in the app that can be evaluated as key KPIs. In other words, it is important to distinguish whether newly acquired users reach the app’s KPIs (Engagement) and whether re-engaged users reach them again (Re-Engagement).
For example, even if the key starting point of an app is logging in or creating a character, ad performance can only be evaluated by install metrics if the corresponding App Event is not integrated, which leaves the evaluation criteria limited. Integrating custom events lets you define a broader range of user behavior and thereby broaden your evaluation criteria.
To analyze App Events, you need to call the Event API provided by Adbrix at the desired points. The important part of this process is defining the App Events, that is, specifying which points in your app you want to analyze. Once the key points are identified, you can check whether they can be instrumented in the app and then proceed with the integration.
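Conceptually, the integration looks like the following sketch. The client class and method names are hypothetical stand-ins, not the actual Adbrix Event API; the point is only that each KPI point in the app is instrumented explicitly:

```python
# Hypothetical client for illustration; the real integration calls the Adbrix SDK's Event API,
# whose method names and parameters differ.
class AnalyticsClient:
    def log_event(self, name, properties=None):
        print("send to attribution tool:", name, properties or {})

analytics = AnalyticsClient()

# The points defined as in-app KPIs are instrumented one by one:
analytics.log_event("sign_up")
analytics.log_event("create_character", {"class": "mage"})
analytics.log_event("purchase", {"sku": "item_001", "revenue": 9.99})
```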
Another reason App Event integration matters is that these events are essential not only for measuring and analyzing performance, but also for optimizing ad performance and for retargeting. We’ll look at this in more detail in part 3, Repost.
3. Repost
As we’ve mentioned multiple times, all measured and analyzed data is accumulated against advertising identifiers, and the same goes for the App Events we looked at earlier. The third function of a performance measurement tool is to support the active use of this analyzed and accumulated data for advertising. In other words, performance measurement tools are not limited to analyzing and reporting numbers.
The key concept in data usage is the ‘repost’ (postback). Reposting means sending data collected by the performance measurement tool back to a media company. Through postback links with the measurement tool, media companies get the data they need to optimize performance or run retargeting. The scope of postbacks includes not only simple installs but also the user actions captured as App Events, so if the goal is to optimize ads or retarget based on logins, character creation, purchases, and so on, setting up App Events is essential.
For example, if you want to optimize ad performance based on logins rather than new installs, or run ads optimized for completed logins, you will need to repost the ‘Sign In’ App Event. In Google’s case, UAC App Action optimization can be enabled using App Event postbacks delivered from Adbrix. Retargeting is also supported: you can target users who installed the app but never completed sign-up, or a group of users who open the app daily but have no purchase history.
Depending on the purpose of the ad, you also need to decide which App Events to repost and under what scope. For retargeting, multiple App Events are typically set up to segment the desired target user groups, and medium-independent postbacks (All) are used, sending events regardless of which medium drove the install, so that the medium has enough targeting data. If, on the other hand, the ad is only being optimized for login completions, medium-dependent postbacks (Only) are enough, because only logins attributed to that medium need to be sent.
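A configuration-style sketch of the two scopes (field names and media names are hypothetical, not Adbrix’s actual postback schema):

```python
postback_rules = [
    # Retargeting: send sign_up / purchase events for every user ("All"),
    # regardless of which medium was credited with the install.
    {"medium": "retargeting_network_x", "events": ["sign_up", "purchase"], "scope": "all"},
    # Optimization: send sign_in only when the install was attributed to this medium ("Only").
    {"medium": "ad_network_y", "events": ["sign_in"], "scope": "only"},
]

def should_postback(rule, event_name, attributed_medium):
    """Decide whether a given in-app event should be reposted to the medium in this rule."""
    if event_name not in rule["events"]:
        return False
    return rule["scope"] == "all" or attributed_medium == rule["medium"]

print(should_postback(postback_rules[0], "purchase", "other_medium"))  # True  (All)
print(should_postback(postback_rules[1], "sign_in", "other_medium"))   # False (Only)
```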
Conclusion
So far, we’ve looked at the three functions provided by attribution engines. The ultimate goal that can be achieved through these three functions is ‘advertising performance optimization’. By measuring advertising performance against set standards, analyzing it from various angles, and making effective use of the accumulated data, you can turn the vague goal of optimizing ad performance into something concrete.