INSIGHTS

Dissolution of Phases and Contexts

Preprints as Scientific and Media Tools, and the Public Role-Taking of Scientists

The use of preprints (scientific manuscripts shared before peer review) has expanded rapidly since 2013, particularly during the COVID-19 pandemic, when around 25% of related research first appeared as preprints. To understand how researchers engage with this format, we surveyed 1,360 German researchers across disciplines. The findings show that while concerns about credibility remain, many scientists use preprints strategically to gain visibility more quickly, obtain early feedback, and complement the traditional publishing process. Adoption varies by discipline, reflecting different norms around openness and publication speed.

Preprints have also become more visible in journalism, especially during periods of rapid information flow. An analysis of three major German news outlets over a ten-year span (2015–2024) shows that scientific references in the news increased during high-attention moments such as the early COVID-19 phase or major climate events. During these times, journalists increasingly cited preprints as timely supplements to peer-reviewed sources rather than as replacements for them. However, the preliminary nature of preprints was not always clearly stated, raising concerns about how uncertainty is framed for the public.

At the same time, scientists' roles in news coverage have shifted. A content analysis of 2,000 German news articles (2015–2024) reveals that scientists are often framed not only as experts but also as advisors, advocates, or even political actors, particularly in contested fields such as public health and climate science. Linguistic analysis of quotations and the verbs that introduce them shows tensions between how scientists present themselves and how journalists portray them. This blurring of roles reflects broader changes in science communication, where urgency, activism, and digital media increasingly shape the public image and authority of scientists.
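To give a flavour of this kind of verb-based analysis, the minimal sketch below counts reporting-style versus advocacy-style verbs in a German news text using spaCy. The pipeline name, the verb lists, and the two categories are illustrative assumptions and do not reflect the project's actual coding scheme.

# Minimal sketch, assuming spaCy's small German pipeline is installed;
# the verb lists and categories are illustrative, not the study's codebook.
import spacy
from collections import Counter

nlp = spacy.load("de_core_news_sm")

# Hypothetical verb groups contrasting neutral reporting with advocacy framing.
REPORTING = {"sagen", "erklären", "berichten"}
ADVOCACY = {"fordern", "warnen", "kritisieren"}

def verb_frames(text):
    """Count verbs from each (assumed) category in the given text."""
    counts = Counter()
    for token in nlp(text):
        if token.pos_ == "VERB":
            lemma = token.lemma_.lower()
            if lemma in REPORTING:
                counts["reporting"] += 1
            elif lemma in ADVOCACY:
                counts["advocacy"] += 1
    return counts

print(verb_frames("Die Forscherin fordert strengere Regeln und erklärt die Datenlage."))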

Perception of Roles, Credibility and Trust

How do people perceive information from different senders on social media, especially when it comes to science? This question guided two online experiments in which we explored whether users correctly categorise, and consequently remember, the roles of different senders (e.g., scientists or laypeople), or whether they confuse these roles because they lack sufficient information about the senders. Our findings suggest that people do confuse senders' roles in the context of science communication, particularly when profile information is not visible. However, users who actively sought out the senders' profile information were significantly better at identifying their roles. These results are presented in a scientific article currently under review at Human Communication Research.

Building on the experiments on role perception, we are conducting a second series of experiments to investigate how scrolling behaviour on social media affects the perceived credibility of posts and their senders. Participants were exposed to posts sharing scientific papers, preprints, or misinformation, supposedly posted by scientists, journalists, or laypeople. Initial findings reveal that both scientists and journalists tend to be seen as more credible than laypeople, even when participants are merely scrolling. Misinformation, as expected, is rated as less credible than peer-reviewed papers or preprints. Intriguingly, the type of content seems to influence credibility judgments more than the sender's role.

To gain a deeper understanding of how demographic and psychological predictors, as well as (social) media use, affect trust in science in the long term, we also conducted a four-wave longitudinal survey involving 1,462 participants. In addition to demographic data, the survey captured psychological and media-related variables such as science-related populist attitudes, conspiracy mentality, critical thinking, and social media usage. The full dataset has been collected and is now being analyzed to uncover long-term patterns in public trust in science.
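One possible way to analyse such panel data is sketched below: a linear mixed-effects growth model estimated with statsmodels. The variable names (trust, wave, conspiracy_mentality, sm_use, pid) and the file name are placeholder assumptions, not the survey's actual codebook, and the model is only an illustration of how long-term trajectories could be examined.

# Hedged sketch of a growth model over the four waves; column names are placeholders.
import pandas as pd
import statsmodels.formula.api as smf

# Long format: one row per participant (pid) and wave, with the outcome (trust)
# and two illustrative predictors.
panel = pd.read_csv("trust_panel_long.csv")

# Random intercept and random slope for wave per participant.
model = smf.mixedlm(
    "trust ~ wave + conspiracy_mentality + sm_use",
    data=panel,
    groups=panel["pid"],
    re_formula="~wave",
)
result = model.fit()
print(result.summary())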

NLP for Classifying Sources and Roles

As participation in scientific online discourse has grown, especially during the COVID-19 pandemic, understanding the actors involved, their roles, and the sources they share has become essential. We have built a pipeline that extracts a diverse set of linguistic features, covering lexical, syntactic, semantic, and opinion-related information, to classify scientific accounts on X. The next step is to build a dataset of scientists, politicians, journalists, and laypeople in order to perform a large-scale analysis of the communication patterns of different user groups and roles. Furthermore, we took part in the Multi-Author Writing Style task at CLEF 2025. Meet us there in September 2025!
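A much-simplified sketch of such a classifier is shown below. The actual pipeline combines lexical, syntactic, semantic, and opinion features, whereas this example only illustrates the general setup with TF-IDF word and character n-grams in scikit-learn; the toy texts and labels are assumptions.

# Simplified sketch of a feature-based account classifier (illustrative data).
from sklearn.pipeline import Pipeline, FeatureUnion
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = ["New preprint out on wastewater surveillance!", "Vote for our climate bill."]
labels = ["scientist", "politician"]  # toy stand-ins for the planned dataset

clf = Pipeline([
    ("features", FeatureUnion([
        ("word", TfidfVectorizer(ngram_range=(1, 2))),                       # lexical word n-grams
        ("char", TfidfVectorizer(analyzer="char_wb", ngram_range=(3, 5))),   # stylistic character n-grams
    ])),
    ("model", LogisticRegression(max_iter=1000)),
])

clf.fit(texts, labels)
print(clf.predict(["Our new bill will cut emissions by 2030."]))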

Longitudinal Online Discourse Analysis

Stance Detection in Scientific Online Discourse

As scientific resources are increasingly shared and discussed online, there is a growing need to develop methods that compute and contextualize altmetrics beyond traditional citation counts. Our goal is to better understand how scholarly publications are referenced and received across different online platforms. To capture the stance of social media users towards the scientific work they cite, we manually annotate 2,000 posts from X (formerly Twitter) and Reddit. These annotations form the basis for training and evaluating automatic stance detection models and enable a large-scale analysis of how scientific findings are perceived, supported, or challenged in scientific online discourse.
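As a sketch of how the annotations could feed into model training, the snippet below fine-tunes a pretrained transformer with the Hugging Face Trainer. The stance labels (support / neutral / oppose), the checkpoint, and the CSV layout (a text column and an integer label column) are assumptions for illustration rather than the project's final annotation scheme.

# Hedged sketch: fine-tuning a sequence classifier on assumed stance annotations.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

label_names = ["support", "neutral", "oppose"]  # assumed stance labels
checkpoint = "distilbert-base-uncased"          # any sequence-classification checkpoint works

# Assumed CSV with columns "text" and integer "label" (0..2).
dataset = load_dataset("csv", data_files="stance_annotations.csv")["train"]
dataset = dataset.train_test_split(test_size=0.2)

tokenizer = AutoTokenizer.from_pretrained(checkpoint)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

dataset = dataset.map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint, num_labels=len(label_names)
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="stance_model", num_train_epochs=3),
    train_dataset=dataset["train"],
    eval_dataset=dataset["test"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
print(trainer.evaluate())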

Science News Sharing

While we focus on first-order citations (e.g., posts with links to scholarly works) when analyzing stances, scientific resources are also frequently shared via second-order citations (e.g., posts with links to news articles that refer to scholarly works). We analyze how news outlets with different political leanings report on science, examine which scientific publications they cover, and track how often these articles are shared on social media (X / Twitter). This allows us to identify potential biases in the selection and amplification of scientific topics and to better understand how political orientation might shape the visibility of particular research topics and disciplines.
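The sketch below shows one way such counts could be aggregated with pandas; the file name and column names (leaning, discipline, doi, shares_x) are placeholder assumptions standing in for the collected news and sharing data.

# Minimal sketch of aggregating second-order citations by political leaning.
import pandas as pd

# Assumed: one row per news article that cites a scholarly work.
news = pd.read_csv("science_news_links.csv")

# How often does each political leaning cover and amplify which disciplines?
coverage = (
    news.groupby(["leaning", "discipline"])
        .agg(n_articles=("doi", "count"), total_shares=("shares_x", "sum"))
        .reset_index()
        .sort_values("total_shares", ascending=False)
)
print(coverage.head(10))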