Rebekah Tromble

Associate Professor of Media and Public Affairs

George Washington University

Rebekah Tromble is an associate professor in the School of Media and Public Affairs and director of the Institute for Data, Democracy, and Politics at George Washington University. Her research focuses on political communication, digital research methodology, and research ethics. Tromble is particularly interested in political discourse on social media, as well as the spread and impact of online misinformation.

Areas of Expertise: Political Communication, Digital Research Methodology, and Research Ethics

Featured Publications

  • Tromble, R. (2021). Where Have All the Data Gone? A Critical Reflection on Academic Digital Research in the Post-API Age. Social Media + Society, 7(1).

    Abstract: In the wake of the 2018 Facebook–Cambridge Analytica scandal, social media companies began restricting academic researchers’ access to the easiest, most reliable means of systematic data collection via their application programming interfaces (APIs). Although these restrictions have been decried widely by digital researchers, in this essay, I argue that relatively little has changed. The underlying relationship between researchers, the platforms, and digital data remains largely the same. The platforms and their APIs have always been proprietary black boxes, never intended for scholarly use. Even when researchers could mine data seemingly endlessly, we rarely knew what type or quality of data were at hand. Moreover, the largesse of the API era allowed many researchers to conduct their work with little regard for the rigor, ethics, or focus on societal value we should expect from scholarly inquiry. In other words, our digital research processes and output have not always occupied the high ground. Rather than viewing 2018 and Cambridge Analytica as a profound disjuncture and loss, I suggest that digital researchers need to take a more critical look at how our community collected and analyzed data when it still seemed so plentiful, and use these reflections to inform our approaches going forward.

    Full Paper

  • Bailard, C., Tromble, R., Zhong, W., Bianchi, F., Hosseini, P., & Broniatowski, D. (2024). “Keep Your Heads Held High Boys!”: Examining the Relationship between the Proud Boys’ Online Discourse and Offline Activities. American Political Science Review, 1-18.

    Abstract: This study examines the relationship between online communication by the Proud Boys and their offline activities. We use a supervised machine learning model to analyze a novel dataset of Proud Boys Telegram messages, merged with US Crisis Monitor data of violent and nonviolent events in which group members participated over a 31-month period. Our analysis finds that intensifying expressions of grievances online predict participation in offline violence, whereas motivational appeals to group pride, morale, or solidarity share a reciprocal relationship with participation in offline events. This suggests a potential online messaging–offline action cycle, in which (a) nonviolent offline protests predict an increasing proportion of motivational messaging and (b) increases in the frequency and proportion of motivational appeals online, in turn, predict subsequent violent offline activities. Our findings offer useful theoretical insights for understanding the relationship between online speech and offline behavior.

    Full Paper

  • González-Bailón, S., Lazer, D., Barberá, P., Zhang, M., Allcott, H., Brown, T., … Tromble, R., … & Tucker, J. A. (2023). Asymmetric ideological segregation in exposure to political news on Facebook. Science, 381(6656), 392–398.

    Abstract: Does Facebook enable ideological segregation in political news consumption? We analyzed exposure to news during the US 2020 election using aggregated data for 208 million US Facebook users. We compared the inventory of all political news that users could have seen in their feeds with the information that they saw (after algorithmic curation) and the information with which they engaged. We show that (i) ideological segregation is high and increases as we shift from potential exposure to actual exposure to engagement; (ii) there is an asymmetry between conservative and liberal audiences, with a substantial corner of the news ecosystem consumed exclusively by conservatives; and (iii) most misinformation, as identified by Meta's Third-Party Fact-Checking Program, exists within this homogeneously conservative corner, which has no equivalent on the liberal side. Sources favored by conservative audiences were more prevalent on Facebook's news ecosystem than those favored by liberals.

    Full Paper

  • Vidgen, B., Harris, A., Nguyen, D., Tromble, R., Hale, S., & Margetts, H. (2019, August). Challenges and frontiers in abusive content detection. Proceedings of the Third Workshop on Abusive Language Online, Association for Computational Linguistics, pp. 80–93.

    Abstract: Online abusive content detection is an inherently difficult task. It has received considerable attention from academia, particularly within the computational linguistics community, and performance appears to have improved as the field has matured. However, considerable challenges and unaddressed frontiers remain, spanning technical, social and ethical dimensions. These issues constrain the performance, efficiency and generalizability of abusive content detection systems. In this article we delineate and clarify the main challenges and frontiers in the field, critically evaluate their implications and discuss potential solutions. We also highlight ways in which social scientific insights can advance research. We discuss the lack of support given to researchers working with abusive content and provide guidelines for ethical research.

    Full Paper

  • Nguyen, D., Liakata, M., DeDeo, S., Eisenstein, J., Mimno, D., Tromble, R., & Winters, J. (2020). How we do things with words: Analyzing text as social and cultural data. Frontiers in Artificial Intelligence, 3.

    Abstract: In this article we describe our experiences with computational text analysis involving rich social and cultural concepts. We hope to achieve three primary goals. First, we aim to shed light on thorny issues not always at the forefront of discussions about computational text analysis methods. Second, we hope to provide a set of key questions that can guide work in this area. Our guidance is based on our own experiences and is therefore inherently imperfect. Still, given our diversity of disciplinary backgrounds and research practices, we hope to capture a range of ideas and identify commonalities that resonate for many. This leads to our final goal: to help promote interdisciplinary collaborations. Interdisciplinary insights and partnerships are essential for realizing the full potential of any computational text analysis involving social and cultural concepts, and the more we bridge these divides, the more fruitful we believe our work will be.

    Full Paper

  • Tromble, R., & McGregor, S. C. (2019). You Break It, You Buy It: The Naiveté of Social Engineering in Tech – And How to Fix It. Political Communication, 36(2), 324–332.

    Facebook’s mission statement promises to “give people the power to build community and bring the world closer together”, and in his commencement speech to the Harvard class of 2017, CEO and founder Mark Zuckerberg spoke about his dream, even in college, of connecting the whole world. What becomes increasingly clear with each revelatory news story is that Zuckerberg – and the leaders, engineers, and designers at other tech firms like Google and Twitter – failed to see how an engineering mindset applied to achieve social goals could wreak havoc on society and democracy.

    Full Paper
