Research

Research Interests

Substantive: Digital / Mobile Marketing, Promotions, and Advertising

Methodological: Bayesian Statistics, Field Experiments, and Natural Language Processing

Research Projects

Wait For Free: A Consumption-Decelerating Promotion for Serialized Digital Media

Abstract: Promotions for digital goods have typically focused on enticing users to accelerate their consumption. Here, we investigate the effects of a novel consumption-decelerating promotion, “Wait For Free” (WFF), applied to serialized digital content – sequences of interconnected episodes – monetized via episode-level paywalls. Specifically, customers can sample early episodes of promoted series for free and can continue reading for free by waiting a pre-specified time or, if unwilling to wait, by paying. Analysis of viewership data from an online digital comics platform suggests that the WFF promotion can in fact boost paid readership for the promoted series at the platform level, net of cannibalization, when applied to an appropriate set of comics. Using a combinatorial genetic algorithm, we efficiently search for the sets of series that, when promoted, maximize paid viewership over time windows of different lengths, and find that the genres and overall popularity of series in the solution set can change with the planning horizon. Finally, to understand individual user-level promotional effects, we employ a proportional hazards framework to identify the degree of within-series inertia in content consumption and of switching behavior across series, and how each is affected by WFF.

Keywords: serialized digital content, promotion optimization, Wait For Free, digital content monetization, consumption deceleration  
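
To convey the flavor of the combinatorial search, the short Python sketch below runs a basic genetic algorithm over binary promote/do-not-promote vectors. It is only an illustration of the general approach: the fitness function, the promotion-budget constraint, and all numeric settings are hypothetical placeholders, not the model or constraints used in the paper.

import numpy as np

rng = np.random.default_rng(0)
n_series, pop_size, n_gens = 200, 60, 100
budget = 20                               # assumed cap on simultaneously promoted series
lift = rng.random(n_series)               # stand-in for model-based per-series lift estimates

def predicted_paid_viewership(mask):
    # Toy fitness: summed lift minus a crowding penalty, standing in for the paper's
    # platform-level paid-viewership objective (net of cannibalization).
    return float(mask @ lift - 0.01 * mask.sum() ** 2)

def repair(mask):
    # Enforce the promotion budget by switching off randomly chosen extras.
    on = np.flatnonzero(mask)
    if on.size > budget:
        mask = mask.copy()
        mask[rng.choice(on, on.size - budget, replace=False)] = 0
    return mask

pop = np.array([repair(rng.integers(0, 2, n_series)) for _ in range(pop_size)])

for _ in range(n_gens):
    fitness = np.array([predicted_paid_viewership(m) for m in pop])
    parents = pop[np.argsort(fitness)[-pop_size // 2:]]          # keep the fitter half
    children = []
    while len(children) < pop_size:
        a, b = parents[rng.integers(len(parents), size=2)]       # pick two parents
        cut = rng.integers(1, n_series)                          # single-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        child = np.where(rng.random(n_series) < 0.01, 1 - child, child)   # bit-flip mutation
        children.append(repair(child))
    pop = np.array(children)

best_set = np.flatnonzero(pop[np.argmax([predicted_paid_viewership(m) for m in pop])])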

Comparing the Effectiveness of Retargeting and Acquisition Online Banner Ads: A Flexible Approach to Estimating Ad Stock

Abstract: One of the earliest and most extensive literatures in quantitative marketing concerns the measurement of ad effectiveness. Because ads do not “work” immediately, and often require multiple exposures, econometric approaches to measuring the cumulative impact of advertising typically rely on the concept of a (latent) “ad stock” or “goodwill”. Following the influential work of Nerlove and Arrow (1962), it has often been assumed that the contribution of each ad to the ad stock decays exponentially over time at a constant rate, captured by a single parameter common across all users and ad types. Here we examine how two different types of online advertising campaigns – acquisition and retargeting – differentially affect online users’ behaviors by proposing flexible parametric and nonparametric Bayesian approaches for regularizing the weights placed on past ads. Using an online panel of individual-level ad impression data from a French financial services firm, together with the online activity of the internet users who were shown its banner ads, we demonstrate that relaxing the restrictive “single-parameter decay” assumption allows us to capture more flexibly the differential impacts of acquisition and retargeting ads on website visits, both in terms of their initial impact on the day of exposure and their lingering effects over time. Our results also suggest that a constant decay rate over time may be an over-regularization, introducing substantive artifacts dictated by the common Nerlove-Arrow assumption.

Keywords: online banner ads, retargeting ads, ad stock, Nerlove-Arrow, Gaussian processes
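
The core modeling contrast can be sketched in a few lines of Python: the classic Nerlove-Arrow ad stock implies a single geometric decay rate, whereas the flexible alternative lets the weight on each past exposure be estimated freely (in the paper these weights are regularized with Bayesian priors such as Gaussian processes; that estimation is not shown here). The impression series and lag weights below are made up purely for illustration.

import numpy as np

def adstock_nerlove_arrow(impressions, lam):
    # Single-parameter geometric decay: A_t = x_t + lam * A_{t-1} = sum_k lam^k * x_{t-k}.
    stock = np.zeros(len(impressions))
    for t, x in enumerate(impressions):
        stock[t] = x + (lam * stock[t - 1] if t > 0 else 0.0)
    return stock

def adstock_flexible(impressions, weights):
    # Freely weighted lags: A_t = sum_k w_k * x_{t-k}; the w_k need not be geometric.
    K = len(weights)
    padded = np.concatenate([np.zeros(K - 1), np.asarray(impressions, dtype=float)])
    return np.array([padded[t:t + K][::-1] @ weights for t in range(len(impressions))])

x = np.array([0, 3, 0, 0, 5, 1, 0, 0, 0, 2], dtype=float)   # daily ad impressions (made up)
print(adstock_nerlove_arrow(x, lam=0.7))
# A hump-shaped lag profile, which a single geometric decay rate cannot represent:
print(adstock_flexible(x, weights=np.array([0.4, 0.8, 0.5, 0.2, 0.05])))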

To Whom, When, and What to Ask?: Mitigating Unhealthy Behaviors and Detecting Relapse with Customized Real-Time Mobile Interventions

Abstract: Ecological Momentary Assessments (EMAs), a form of repeated real-time self-measurement, have been used extensively in health-related studies, especially those concerning substance use, owing to their capabilities both as a form of intervention and as an unobtrusive way of collecting granular data for predicting individual-level outcomes. However, one often-cited practical concern with EMAs is response fatigue, which leads to participant inattentiveness and even attrition, and may therefore limit researchers’ ability to gather diagnostic information and hinder participants’ progress in curbing unhealthy behaviors. To assuage such concerns, we propose a Bayesian dynamic factor model that allows researchers to curtail EMA activations for individuals who are more likely to experience response fatigue and whose responses are expected to vary little over time, as well as to omit participant-specific “temporally redundant” items (i.e., those for which responses are expected to remain stable) from EMAs. We will apply the framework to a rich panel dataset from a smoking cessation program, where we expect that balancing participants’ response burdens against researchers’ ability to gather timely and relevant data will result in a decline in EMA noncompliance and an improvement in health outcomes.
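
As a purely hypothetical illustration of how such a model could drive scheduling decisions, the sketch below thresholds posterior-predictive variances and a fatigue score (both of which would come from the proposed dynamic factor model, which is not implemented here): items whose responses are predicted with near certainty are dropped as temporally redundant, and an entire EMA is withheld from a fatigued participant when no remaining item is expected to be informative. All quantities, cutoffs, and dimensions are made up.

import numpy as np

def select_ema_items(pred_var, fatigue, var_cutoff=0.05, fatigue_cutoff=0.8):
    # pred_var: (participants x items) posterior-predictive variances;
    # fatigue: (participants,) predicted response-fatigue scores.
    informative = pred_var > var_cutoff                      # items still worth asking about
    skip_ema = (fatigue > fatigue_cutoff) & (informative.sum(axis=1) == 0)
    return ~skip_ema, informative

pred_var = np.array([[0.02, 0.03, 0.01],     # participant 0: all responses look stable
                     [0.40, 0.25, 0.33]])    # participant 1: responses still uncertain
fatigue = np.array([0.9, 0.2])
send_ema, items_to_ask = select_ema_items(pred_var, fatigue)
# Participant 0 is fatigued and no item is expected to be informative, so no EMA is sent;
# participant 1 receives all three items.
print(send_ema, items_to_ask)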

Impartial Judges or Architects of Echo Chambers?: The Role of Moderators’ Interventions on Reddit

Abstract: Many subcommunities in online forums and discussion boards have their own sets of moderators who enforce their respective content policies. While moderators are expected to engage in impartial and consistent content moderation, they are often accused of harboring biases, most often of a political nature, and of unfairly punishing posts and comments that run counter to their personal beliefs. Victims of such inconsistencies may start self-censoring or become disengaged, gradually turning the subcommunity into an ideological “echo chamber” in which only conforming opinions are tolerated and progressively reinforce one another. By applying natural language processing techniques to text data obtained from the Reddit API and the website “reveddit,” which archives Reddit content deleted by moderators, we plan to seek empirical evidence of such moderation biases across various subreddits. We aim to explore the role of moderators’ interventions in the formation of echo chambers, beyond that of users’ self-selection into subreddits or of content recommendation algorithms, by measuring shifts in the engagement levels of users with different political leanings within each subreddit.
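
The kind of descriptive comparison the project envisions can be sketched schematically: combine live and moderator-removed comments (the latter recovered via reveddit), attach each author’s estimated political leaning (from a separate NLP classifier, not shown), and contrast removal rates and subsequent engagement across leanings within each subreddit. The toy data, column names, and subreddit below are entirely hypothetical.

import pandas as pd

comments = pd.DataFrame({
    "subreddit": ["r/example"] * 6,
    "author_leaning": ["left", "left", "right", "right", "center", "right"],
    "removed_by_mod": [False, True, False, True, False, True],
    "author_posts_next_month": [12, 3, 9, 0, 7, 1],
})

# Within-subreddit comparison of moderation outcomes and later engagement by leaning.
summary = (comments
           .groupby(["subreddit", "author_leaning"])
           .agg(removal_rate=("removed_by_mod", "mean"),
                later_engagement=("author_posts_next_month", "mean")))
print(summary)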

Optimizing One-shot Promotional Inducements in a Two-sided Choice Setting: An Application to Scholarship Offerings

Abstract: Each year, university admissions administrators face the complex task of distributing scholarship funds across admitted students with the objective of attracting the most desirable cohort (in terms of expected gender parity, standardized test scores, etc.) to accept their offers, subject to various stochastic constraints (e.g., expected total scholarship expenditure; cohort size). This constitutes a unique two-sided choice setting in which schools, when determining the optimal scholarship amounts, need to incorporate their beliefs about each student’s offers from competing schools. We use a dataset from a university graduate program to infer each student’s unobserved choice set of admission and scholarship offers, and then evaluate the prospects of converting each student under various scholarship amounts. To accurately assess the impact of scholarships on students’ propensity to accept an offer, we collaborate with the financial aid directors to conduct two rounds of efficient field experiments, one year apart, on subsets of admitted students. Specifically, scholarship amounts are orthogonally adjusted within each stratum of estimated offer-acceptance propensities to induce maximal variation in treatment.
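
A minimal sketch of the assignment mechanism, under hypothetical column names and dollar amounts: admitted students are stratified by their model-estimated acceptance propensity, and scholarship offsets are randomized in a balanced way within each stratum, so that treatment variation is orthogonal to the propensity used for stratification.

import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
admits = pd.DataFrame({
    "student_id": range(12),
    "accept_propensity": rng.uniform(0.1, 0.9, 12),    # from a prior-cycle model (assumed)
})
admits["stratum"] = pd.qcut(admits["accept_propensity"], q=3, labels=["low", "mid", "high"])

adjustments = np.array([-2000, 0, 2000, 4000])          # candidate offsets to baseline awards

design = []
for _, group in admits.groupby("stratum", observed=True):
    grid = np.resize(adjustments, len(group))           # balanced, repeated grid of offsets
    design.append(group.assign(scholarship_adjustment=rng.permutation(grid)))
design = pd.concat(design)
print(design.sort_values(["stratum", "student_id"]))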