Submit a Manuscript to the Journal

Communication Methods and Measures

For a Special Issue on

Understanding and Addressing Biases in Computational Social Science

Abstract deadline
15 April 2024

Manuscript deadline
16 August 2024


Special Issue Editor(s)

Valerie Hase, LMU Munich, Germany
[email protected]

Marko Bachl, Freie Universität Berlin, Germany
[email protected]

Nathan TeBlunthuis, University of Michigan
[email protected]


Understanding and Addressing Biases in Computational Social Science

The Communication Methods and Measures special issue, Understanding and Addressing Biases in Computational Social Science (CSS), calls for work on biases in and through computational methods. Bias results from systematic deviations (e.g., due to sampling, operationalization, or data analysis) between the true value of a theoretical concept in a population and the value a study observes. Biases may arise when data collection methods from CSS (e.g., APIs, tracking) sample non-representative populations or when limitations in data analysis methods (e.g., network analysis, machine learning, automated content analysis, simulations) distort inferences (e.g., Hovy & Prabhumoye, 2021; Sen et al., 2021). For example, computational modes of data collection enable large samples, but larger sample sizes often do not improve, and may even harm, data quality and representativeness (e.g., González-Bailón et al., 2014; Keusch et al., 2019). Empirical indicators for theoretical constructs are usually selected based on availability (e.g., “likes” as popularity cues) without validating that they correspond to the concepts of interest (e.g., Baden et al., 2022; Wagner et al., 2021). Large language models are increasingly used in automated content analysis despite concerns that they reproduce biases in their training data (Navigli et al., 2023). Biases may thus propagate through different stages of the research process, leading to biased estimates and inferential conclusions. If not considered and accounted for, they may threaten the validity of measurements, samples, and results (Bachl & Scharkow, 2017; TeBlunthuis et al., 2024).

We welcome submissions that reflect CSS’s interdisciplinarity: Understanding and addressing biases in computational methods requires engaging with methodological literature and practices across various disciplines, including the social sciences, computer science, and statistics.
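To fix ideas, the notion of bias invoked above admits a standard formal statement (our notation, not taken from the cited works): if θ denotes the true value of a concept in a population and θ̂ a study’s estimate of it, bias is the systematic component of the estimation error,

```latex
\[
\mathrm{Bias}(\hat{\theta}) = \mathbb{E}[\hat{\theta}] - \theta ,
\]
```

which, unlike random sampling error, does not shrink as samples grow larger.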

With this special issue, we want to create a platform for critical but constructive discussions of computational methods, their biases, and their applications in CSS research. Scholars employing computational methods need to take a critical perspective by better defining, systematizing, and quantifying biases. At the same time, they should aim for a constructive approach by developing solutions to address biases. To do so, researchers can and should leverage their knowledge of social science theories (e.g., how to operationalize concepts of interest) and social science methods (e.g., shared understandings of quality criteria such as validity; error correction methods from survey or content analysis research). Finally, the risks of biases in CSS are not confined to the academy: CSS data and methods are used by organizations, communication professionals, and social media platforms. Communication researchers are well positioned to interrogate the consequences of biases for communication phenomena in such contexts.
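As one minimal sketch of what such an error correction method can look like (an illustration on our part, not a method prescribed by this call; the function name and numbers are hypothetical), the classic Rogan-Gladen correction recovers a true class proportion from a proportion observed through an imperfect classifier, given validation estimates of its sensitivity and specificity:

```python
def corrected_proportion(observed: float, sensitivity: float, specificity: float) -> float:
    """Rogan-Gladen correction: estimate the true class proportion from a
    proportion observed through an imperfect classifier with known
    sensitivity (true positive rate) and specificity (true negative rate)."""
    denominator = sensitivity + specificity - 1.0
    if denominator <= 0:
        raise ValueError("Classifier must beat chance (sensitivity + specificity > 1).")
    corrected = (observed + specificity - 1.0) / denominator
    # Sampling noise can push the estimate outside [0, 1]; clamp it.
    return min(max(corrected, 0.0), 1.0)

# Hypothetical numbers: a classifier flags 30% of posts as uncivil, and
# validation against human coding yields sensitivity 0.85, specificity 0.90.
print(corrected_proportion(0.30, 0.85, 0.90))  # ~0.267: the raw 30% is inflated
# because false positives outnumber false negatives at low prevalence.
```

The broader point the example makes is that validation data about a method’s error rates can be turned into a correction rather than merely reported.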

We call for submissions that take a critical and/or constructive perspective on biases in CSS. The special issue welcomes conceptual, methodological, and empirical studies that…

● define biases (e.g., develop frameworks to systematize different types of biases and how they may threaten the validity of CSS),
● quantify biases and their effects (e.g., develop or apply methods to quantify biases and their downstream consequences, for example, algorithmic biases related to social media or organizations; discuss or analyze ethical implications, for example, racial or gender biases in machine learning and how they might lead to discrimination),
● address biases (e.g., adapt existing methods from survey research, content analysis, statistics, or other fields; develop new social science methods to mitigate biases or their downstream consequences),
● clarify the relationship between biases and validity (e.g., illustrate how different biases relate to construct, internal, and external validity; adapt and develop methods for assessing validity in the presence of biases; define thresholds for (un)acceptable biases, for example, considering the bias-variance tradeoff; see the decomposition sketched after this list).
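On the last point, the bias-variance tradeoff can be sketched in standard notation (ours, not the journal’s): for an estimator θ̂ of θ, the mean squared error decomposes into a systematic and a random component,

```latex
\[
\mathbb{E}\!\left[(\hat{\theta} - \theta)^2\right]
  = \underbrace{\left(\mathbb{E}[\hat{\theta}] - \theta\right)^2}_{\text{bias}^2}
  + \underbrace{\mathrm{Var}(\hat{\theta})}_{\text{variance}} ,
\]
```

so a slightly biased but low-variance estimator can outperform an unbiased but noisy one; thresholds for acceptable bias could be defined against this tradeoff.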

Submission Instructions

Timetable for Special Issue

● Submission of abstracts (April 15th, 2024). Authors should submit an abstract without identifying information (900–1,200 words, including references) and a title page (including author names and affiliations) to the Editorial Team ([email protected]) as two separate PDF files. The abstract should clarify the research goal, key questions, and, if applicable, the research design. Authors may, but are not required to, include results in the abstract. If they do not, they need to clearly state the status of the project and the anticipated timeline for finalizing results.

● Notifications about invitations to submit full papers (April 30th, 2024). An invitation to submit a full paper does not guarantee publication, as full manuscripts will undergo the journal’s usual double-blind peer review.

● Submission of full papers (August 16th, 2024). Authors should submit full papers as PDFs, in line with the journal’s instructions for authors, to the Editorial Team ([email protected]), not through the journal’s submission system. We encourage authors to follow open science practices, especially those related to computational reproducibility. Once accepted, articles will be published “Online First” in Communication Methods and Measures. The special issue will be published as issue 4, 2025.
