Forging the Future of Work

AdGazer: Improving Contextual Advertising with Theory-Informed Machine Learning
Journal of Marketing

Contextual advertising involves matching features of ads to features of the media context in which they appear. We propose AdGazer, a new machine learning procedure to support contextual advertising. It comprises a theoretical framework organizing high- and low-level features of ads and contexts, feature engineering models grounded in this framework, an XGBoost model predicting ad and brand attention, and an algorithm that optimally assigns ads to contexts. AdGazer includes a multimodal large language model to extract high-level topics predicting the ad-context match. Our research uses a unique eye-tracking database containing 3,531 digital display ads and their contexts, along with aggregate ad and brand gaze times. We compare AdGazer's predictive performance to two feature learning models, VGG16 and ResNet50. AdGazer predicts highly accurately, with hold-out correlations of 0.83 for ad gaze and 0.80 for brand gaze, outperforming both feature learning models and generalizing better to out-of-distribution ads. Context features jointly contributed at least 33% to predicted ad gaze and about 20% to predicted brand gaze, good news for managers practicing or considering contextual advertising. We demonstrate that the theory-informed AdGazer effectively matches ads to advertising vehicles and their contexts, optimizing ad gaze more than current practice and alternatives like text-based and native contextual advertising.

Michel Wedel (UMD Smith), Jianping Ye (UMD PhD student), and Rik Pieters (Tilburg University, the Netherlands)


AI for Customer Journeys: A Transformer Approach
Journal of Marketing Research (forthcoming)

Zipei Lu and P. K. Kannan, Smith School of Business, University of Maryland

The paper introduces a novel artificial intelligence (AI) framework for modeling customer journeys in digital marketing. Leveraging transformer-based models, originally developed for natural language processing, this approach analyzes complex sequences of customer interactions across multiple channels (e.g., search, email, display ads). Unlike traditional models, this method considers both the timing and type of interactions, making it uniquely suited to modern multi-touchpoint environments.

“Transformers give us the ability to see the journey as a whole, not just as a series of isolated interactions. That’s a major leap in marketing analytics.”
— P. K. Kannan

The core innovation lies in its use of multi-head self-attention mechanisms, which model each customer’s journey as a dynamic sequence of touchpoints. This allows marketers to not only predict the likelihood of purchase but also identify when and through which channels interventions are most effective. Furthermore, the model is extended to capture individual-level heterogeneity, enabling personalized insights into how different customers respond to marketing efforts.
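The self-attention idea can be illustrated with a minimal sketch. Everything here is hypothetical and drastically simplified: the channel codes, the tiny embedding size, and the random weight matrices stand in for the learned parameters and projections of the paper's actual multi-head model, and a real implementation would use several heads plus training.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical journey of 5 touchpoints: a channel code and a recency in days.
# Channel codes are illustrative: 0 = search, 1 = email, 2 = display ad.
channels = np.array([0, 1, 2, 0, 0])
days_ago = np.array([20.0, 14.0, 9.0, 3.0, 1.0])

d = 8                                           # toy embedding dimension
channel_emb = rng.normal(size=(3, d))           # one vector per channel type
time_enc = days_ago[:, None] * rng.normal(size=(1, d)) * 0.01  # crude time encoding
x = channel_emb[channels] + time_enc            # (5, d) touchpoint representations

# One attention head with random (untrained) query/key/value projections.
Wq, Wk, Wv = (rng.normal(size=(d, d)) / np.sqrt(d) for _ in range(3))
q, k, v = x @ Wq, x @ Wk, x @ Wv

scores = q @ k.T / np.sqrt(d)                   # (5, 5) pairwise relevance of touchpoints
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)   # softmax: each row sums to 1
journey = weights @ v                           # attention-weighted journey summary

print(weights.shape, journey.shape)             # → (5, 5) (5, 8)
```

Each row of `weights` shows how much one touchpoint "attends" to every other touchpoint in the journey; in a trained model, inspecting these weights is what yields the whole-journey view and the intervention-timing insights described above.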

“We designed the model to capture the complexity and individuality of digital customer journeys—something traditional models often overlook.”
— Zipei Lu

Using rich data from a major hospitality firm – including over 92,000 users and over half a million visits – the model demonstrates substantial improvements over traditional approaches (e.g., Hidden Markov Models, Poisson Point Process Models, and LSTMs). For example, the proposed model achieves an AUC of 0.92 for out-of-sample conversion predictions, compared to 0.85 for the LSTM and below 0.70 for the others. Moreover, it identifies high-potential customers with far greater precision – top-decile predictions yield an 88% true conversion rate versus 34% for the LSTM.

Beyond prediction, the model offers descriptive marketing insights, such as how the effectiveness of email or display ads varies over time and across customers. For instance, the study finds that customer-initiated interactions (like direct visits) have stronger and longer-lasting effects than firm-initiated ones (like emails), and the optimal window for intervention is typically within 7 to 14 days before purchase.

The model’s structure also enables profiling and customer segmentation based on latent self-attention patterns, helping marketers understand nuanced motivations like last-minute business bookings versus long-term vacation planning. This insight can inform targeted messaging and A/B testing strategies.

Overall, this AI framework not only enhances predictive accuracy but also delivers actionable insights that can improve ROI, optimize channel mix, and enable real-time personalization in customer engagement.

Zipei Lu, Ph.D. Candidate in Marketing, and P. K. Kannan, Dean's Chair in Marketing Science, both at the Smith School
