Political Science
Public attitudes towards algorithmic personalization and use of personal data online: evidence from Germany, Great Britain, and the United States
A. Kozyreva, P. Lorenz-Spreen, et al.
This study by Anastasia Kozyreva, Philipp Lorenz-Spreen, Ralph Hertwig, Stephan Lewandowsky, and Stefan M. Herzog examines public attitudes toward algorithmic personalization and data privacy in Germany, Great Britain, and the United States. It documents strong objections to the use of sensitive data in political campaigns and highlights the importance of transparent, user-respecting personalization strategies.
~3 min • Beginner • English
Introduction
The study is motivated by the widespread use of AI-driven personalization online (across social media feeds, targeted ads, recommender systems, and search) and by concerns about its effects: impacts on democratic discourse (e.g., targeted political messaging, misinformation, extremism), discrimination through targeting, and opaque data harvesting that can infer sensitive traits. The research asks: How aware are people of algorithmic influence online, and how acceptable do they find the use of their information for personalization? Given regulatory debates (e.g., the EU Digital Services Act, platform policies) and the lack of public involvement in shaping algorithms and data practices, the study aims to provide empirical evidence on attitudes toward personalized services, the collection and use of personal information and data, privacy concerns and behaviors, and potential moderation by political leaning and demographics in Germany, Great Britain, and the United States.
Literature Review
Prior work indicates that attitudes toward personalization are context-dependent, with more positive views of commercial applications than of political personalization (e.g., Ipsos MORI; Smith). Many people in the US and Europe feel they have little control over their personal data and harbor substantial privacy concerns, yet behavior often fails to reflect these concerns, a phenomenon termed the privacy paradox (Acquisti et al.; Barth & de Jong; Kokolakis; Norberg et al.; with nuances from Dienlin & Trepte and a meta-analysis by Baruh et al.). Algorithms can amplify misinformation and extremism and facilitate discriminatory targeting based on demographics and inferred traits, and regulatory calls have emerged to govern online political advertising and increase transparency. However, most studies have treated attitudes toward personalized services and data privacy separately, despite their interdependence. This study addresses that gap by jointly examining attitudes toward services, the information used for personalization, and the data collected.
Methodology
Design: Cross-national online survey conducted in German and English in 2019 (Germany: September; Great Britain and US: November). Samples: Germany (N=1065), Great Britain (N=1092), United States (N=1059). Sampling used quotas with post-stratification weights to match age (18–65), gender, and education distributions. IRB approval was obtained from the Max Planck Institute for Human Development.
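For readers unfamiliar with post-stratification, the idea can be sketched in a few lines of R (matching the language of the study's published code); the demographic cells and population shares below are illustrative stand-ins, not the authors' actual weighting scheme.

# Minimal sketch of cell-based post-stratification weighting: a respondent's
# weight is the population share of their demographic cell divided by that
# cell's share in the sample. All values here are made up for illustration.
sample_df <- data.frame(
  age_group = c("18-29", "18-29", "30-49", "30-49", "50-65"),
  gender    = c("f", "m", "f", "m", "m")
)

# Hypothetical population shares per age x gender cell (sum to 1).
pop_share <- c("18-29.f" = 0.10, "18-29.m" = 0.10,
               "30-49.f" = 0.20, "30-49.m" = 0.20,
               "50-65.f" = 0.20, "50-65.m" = 0.20)

cell <- as.character(interaction(sample_df$age_group, sample_df$gender))
samp_share <- table(cell)[cell] / nrow(sample_df)
sample_df$weight <- as.numeric(pop_share[cell] / samp_share)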
Measures:
(1) Awareness of AI and personalization online: familiarity with terms (e.g., artificial intelligence, machine learning, recommender systems, targeted/personalized advertising) and recognition of AI use across contexts (e.g., smart assistants, search ranking, social media advertising, news feed curation, dating recommendations).
(2) Attitudes toward algorithmic personalization, assessed across three components: services (e.g., political campaign messages, news front pages, social media posts, commercial ads, search results, recommendations for people, restaurants/shops, movies/music, local events); information used for personalization (e.g., age, gender, political views, sexual orientation, religious views, ethnicity, household income, relationship status, personality, personal events/tragedies); and data collected/used (e.g., content of emails/online messages, online interactions, location history, browsing/search history, typing/scrolling behavior, purchasing history, videos watched, likes/shares on social media). Response options for acceptability: not acceptable at all; not very acceptable; somewhat acceptable; very acceptable.
(3) Privacy: concern level (not at all, not very, somewhat, very concerned), privacy-protecting settings used in the last year (e.g., Facebook privacy, Facebook ad preferences, Google privacy/personalization, Google activity controls, Amazon privacy, browser privacy), and current privacy tools/measures (e.g., ad blockers, incognito browsing, avoiding monitoring, avoiding certain sites/platforms, privacy-respecting search engines, adjusting privacy/ad settings).
(4) Demographics (age, gender, education, urban/rural) and political leaning (1 = left-wing to 7 = right-wing).
Analysis: Weighted analyses using post-stratification weights. Acceptance level was defined as the arithmetic mean of a respondent's item ratings, mapped onto [0,1]; the acceptability gap as the within-respondent difference between acceptance of services and acceptance of information/data. Reported margins of error (95% CI, assuming a true proportion of 50%): ±3 percentage points for N≈1000; ±10 for N≈100. Anonymized data and R code: https://osf.io/7nj8h.
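To make these definitions concrete, here is a minimal sketch in R (the language of the study's published code), using simulated data and illustrative variable names rather than the authors' actual ones; the 4-point scale is assumed to be mapped linearly onto [0,1].

# Acceptance level, aggregate acceptability gap, and margin of error.
to_unit <- function(x) (x - 1) / 3   # map a 1-4 rating onto [0, 1]

set.seed(1)
n <- 1000
dat <- data.frame(
  services    = sample(1:4, n, replace = TRUE, prob = c(0.1, 0.2, 0.4, 0.3)),
  information = sample(1:4, n, replace = TRUE, prob = c(0.3, 0.3, 0.3, 0.1)),
  weight      = runif(n, 0.5, 1.5)   # stand-in post-stratification weights
)

# Weighted acceptance level per component, on [0, 1].
acc_services <- weighted.mean(to_unit(dat$services), dat$weight)
acc_info     <- weighted.mean(to_unit(dat$information), dat$weight)
acc_services - acc_info              # aggregate acceptability gap

# 95% margin of error for an estimated proportion of 50% (worst case).
moe <- function(n, p = 0.5) 1.96 * sqrt(p * (1 - p) / n)
round(100 * c(moe(1000), moe(100)), 1)   # ~3.1 and ~9.8 percentage points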
Key Findings
Awareness: Majority familiarity with the term artificial intelligence (GER: 86%; GB: 74%; US: 67%) and targeted/personalized advertising (GER: 70%; GB: 58%; US: 50%); lower familiarity with recommender systems (GER: 34%; GB: 12%; US: 12%) and machine learning (GER: 42%; GB: 31%; US: 33%). Awareness of AI use was higher for smart assistants (GER: 70%; GB: 66%; US: 63%), search ranking (GER: 59%; GB: 52%; US: 48%), and social media advertising (GER: 57%; GB: 56%; US: 55%) than for dating recommendations (GER: 38%; GB: 41%; US: 40%) or news feed curation (≈44% across countries).
Attitudes toward services: Personalized political advertising was rated unacceptable by most in GER (61%) and GB (61%) and about half in the US (51%). Many in GER and GB opposed personalized news (front pages: GER 52%, GB 54%; social media news feeds: GER 57%, GB 51%). In the US, majorities found personalized online newspapers (60%) and social media news feeds (62%) acceptable. Majorities in all countries approved of entertainment recommendations (movies/music: GER 77%; GB 84%; US 88%), shopping recommendations (GER 77%; GB 82%; US 89%), and personalized search results (GER 63%; GB 60%; US 71%).
Attitudes toward information used for personalization: The use of sensitive information was widely rejected. Using political views was rated unacceptable by 71% in GER, 59% in GB, and about 49% in the US; sexual orientation by 71% (GER), 62% (GB), and 51% (US). Household income and personal tragedies were also widely rejected. Only age and gender were acceptable to a majority across all countries. US respondents were more accepting than respondents in GER and GB of using personal events (55%), ethnicity (57%), relationship/marital status (62%), and personality traits (68%).
Attitudes toward data collection/use: Majorities opposed collection/use of online interaction data (with whom/how often: GER 77%; GB 66%; US 60%), location history (GER 69%; GB 57%; US 55%), and browsing/search history (GER 63%; GB 58%; US 53%). Roughly half or more accepted purchasing history (GER 44%; GB 47%; US 51%), videos watched (GER 44%; GB 52%; US 62%), and likes/shares on social media (GER 43%; GB 54%; US 65%).
Acceptability gap: At the aggregate level, acceptance of services exceeded acceptance of information/data by roughly one sixth to one quarter of the response-scale range: largest in GER (≈1/4), smaller in GB (≈1/5) and the US (≈1/6 for information, ≈1/5 for data). At the individual level, 84–89% of respondents showed at least one acceptability gap (services > information and/or data); 64–75% showed a gap for both information and data; only 13–16% showed no gap.
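The individual-level classification can be illustrated with a short R sketch using made-up acceptance scores; a respondent shows a gap when their acceptance of services exceeds their acceptance of information and/or of data.

# Classifying respondents by acceptability gap (toy values on [0, 1]).
services    <- c(0.9, 0.6, 0.3, 0.8)
information <- c(0.4, 0.6, 0.3, 0.5)
data_use    <- c(0.5, 0.7, 0.3, 0.4)

gap_any  <- services > information | services > data_use   # at least one gap
gap_both <- services > information & services > data_use   # gap for both
round(100 * c(any = mean(gap_any), both = mean(gap_both), none = mean(!gap_any)))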
Privacy concerns and behaviors: High concern levels: somewhat or very concerned—GER 82%, GB 81%, US 82%; not at all concerned—GER 4%, GB 4%, US 6%. Reported behaviors were modest: used in last year—Facebook privacy settings (GER 59%; GB 60%; US 63%), Google privacy/personalization (GER 47%; GB 44%; US 53%), browser privacy settings (GER 47%; GB 42%; US 48%), Google activity controls (GER 43%; GB 37%; US 48%), Facebook ad preferences (GER 28%; GB 35%; US 36%), Amazon privacy/personalization (GER 34%; GB 24%; US 32%). Current tools—adjusting privacy/ad settings (GER 37%; GB 39%; US 40%), ad blockers (GER 33%; GB 34%; US 36%), incognito browsing (GER 38%; GB 28%; US 35%), avoiding monitoring (GER 26%; GB 22%; US 24%), avoiding certain sites/platforms (GER 25%; GB 21%; US 24%), privacy-respecting search engines (GER 16%; GB 15%; US 20%). None used: tools—GER 20%; GB 24%; US 19%; settings—GER 20%; GB 24%; US 19%. Higher concern was associated with greater use of tools/settings.
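One simple way to probe the reported concern-behavior association, sketched here on simulated data (the coding and the data-generating rule are assumptions, not the authors' analysis):

# Privacy concern (1-4) vs. number of privacy tools used (simulated).
set.seed(3)
concern <- sample(1:4, 500, replace = TRUE, prob = c(0.05, 0.13, 0.42, 0.40))
n_tools <- rpois(500, lambda = 0.5 + 0.4 * concern)   # toy generating rule

tapply(n_tools, concern, mean)                # mean tool count per concern level
cor(concern, n_tools, method = "spearman")    # rank correlation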
Demographics and politics: Acceptance generally declined with age; among US men, acceptance followed a slight inverted U-shape, peaking around age 40. Older respondents were more concerned about privacy, and men were slightly less concerned than women. There were no notable associations with education or urban/rural location. Crucially, there was no political polarization: attitudes toward services, information, data, and privacy concerns were similar across the left–right spectrum in all three countries.
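An inverted-U age pattern of the kind reported for US men can be checked with a quadratic age term; the sketch below uses simulated data, and the authors' actual model specification is not reproduced here.

# Testing an inverted-U age pattern with a quadratic term (simulated data).
set.seed(4)
age <- sample(18:65, 1000, replace = TRUE)
acceptance <- 0.4 + 0.012 * age - 0.00015 * age^2 + rnorm(1000, sd = 0.1)

fit <- lm(acceptance ~ age + I(age^2))
coef(fit)                                        # negative age^2 => concave
-coef(fit)["age"] / (2 * coef(fit)["I(age^2)"])  # implied peak, here ~age 40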
Discussion
The findings reveal widely shared ethical boundaries for algorithmic personalization. People accept commercial personalization (shopping, entertainment, search) but oppose personalization for political campaigning and, in Europe, for news sources and social media feeds. They also reject the use of sensitive personal information and many types of behavioral data for personalization. The observed acceptability gap—greater acceptance of personalized services than of the data/information needed to deliver them—suggests a tension between valuing convenience and valuing privacy. Potential explanations include trade-offs among incommensurable values and a lack of transparency/awareness about data practices underlying personalization, which may lead users to underestimate the extent of data collection. Despite high privacy concern, protective actions are limited, though more concerned individuals engage more in privacy management. The lack of political polarization indicates broad public support for privacy protections and constraints on political personalization, offering a clear mandate for policy action and platform self-regulation aligned with user preferences.
Conclusion
This study provides cross-national evidence that, while users broadly accept commercial personalization, they object to political personalization and to the use of sensitive information and many types of behavioral data for targeting. There is a robust acceptability gap between services and the data/information used to fuel them, present at both the population and individual levels, and attitudes are consistent across the political spectrum. The results underscore the need for transparent, user-controllable personalization that minimizes personal data use, avoids personalized political advertising, and respects public preferences. Policy and design implications include adopting data minimization, enhancing the transparency and comprehensibility of privacy controls, and treating privacy as a networked public good. Future research should investigate the drivers of attitudes (and of the acceptability gap), elucidate how people negotiate service–privacy trade-offs, and incorporate behavioral data to complement self-reports.
Limitations
The survey is cross-sectional and relies on self-reported attitudes and behaviors, which limits causal inference and may not accurately reflect actual behavior. Samples cover adults aged 18–65 in three countries (Germany, Great Britain, the United States) and may not generalize beyond these contexts or age ranges. The study does not identify the causal drivers of attitudes or of the acceptability gap, nor does it directly observe privacy behaviors in situ. Awareness and acceptance measures depend on respondents' understanding of AI and personalization, which may be incomplete given the knowledge gaps the survey itself documents.