BCD21: CSCW21 Workshop - Biases in Crowdsourced Data
Virtual, NY, United States, October 24, 2021
Conference website: https://sites.google.com/view/biases-in-crowdsourced-data
Submission link: https://easychair.org/conferences/?conf=bcd21
Abstract registration deadline: September 10, 2021
Submission deadline: September 10, 2021
Investigating and Mitigating Biases in Crowdsourced Data
A workshop to explore how specific crowdsourcing workflows, worker attributes, and work practices contribute to biases in data. We also plan to discuss research directions for mitigating labelling biases, particularly in a crowdsourced context, and the implications of such methods for workers.
The workshop will be held at ACM CSCW 2021, virtually on the 23rd of October 2021 from 3 PM to 8 PM EDT.
Submission Guidelines
We invite participants to take part in the workshop challenge and/or submit a position paper.
- Submit a short position paper on previous or ongoing research on biases in crowd data.
Submissions can be up to 3 pages in length (excluding references) and should follow the ACM Manuscript format (available as a LaTeX template and a Word template). The review of submissions will follow a juried process (see https://chi2022.acm.org/for-authors/note-on-selection-processes/). Submissions will be selected based on their relevance to the workshop themes and the originality and novelty of the submitted ideas. Manuscripts should be submitted in PDF format via the workshop submission website by 10 September 2021. At least one author of each accepted paper must attend the workshop, and all participants must register for both the workshop and at least one day of the conference.
- Participate in the Crowd Bias Challenge.
We plan to introduce a workshop challenge where participants will gather a crowdsourced dataset for a given problem. More details on the Crowd Bias Challenge page.
Workshop Themes
Through this workshop, we aim to foster discussion on ongoing work around biases in crowd data, provide a central platform to revisit the current research, and identify future research directions that are beneficial to both task requesters and crowd workers.
Understanding how annotator attributes contribute to biases
Research on crowd work has often focused on task accuracy, whereas other factors such as biases in data have received limited attention. We are interested in reviewing existing approaches and discussing ongoing work that helps us better understand the annotator attributes that contribute to biases.
Quantifying bias in annotated data
An important step towards bias mitigation is detecting such biases and measuring their extent in data. We seek to discuss methods, metrics, and challenges in quantifying biases, particularly in crowdsourced data. Further, we are interested in ways of comparing biases across different samples and in investigating whether specific biases are task-specific or task-independent.
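As a concrete illustration of one such metric, the sketch below computes the gap in positive-label rates across annotator groups on a toy dataset. This is a minimal, hypothetical example of quantifying group-dependent labelling, not a metric prescribed by the workshop; the data and function names are assumptions for illustration only.

```python
from collections import defaultdict

def label_rate_gap(annotations):
    """Largest gap in positive-label rate across annotator groups.

    `annotations` is a list of (annotator_group, label) pairs with
    labels in {0, 1}. A large gap suggests group-dependent labelling.
    """
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, label in annotations:
        totals[group] += 1
        positives[group] += label
    rates = {g: positives[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values())

# Toy data: group "A" labels positive 75% of the time, group "B" 25%.
data = [("A", 1), ("A", 1), ("A", 1), ("A", 0),
        ("B", 1), ("B", 0), ("B", 0), ("B", 0)]
print(label_rate_gap(data))  # 0.5
```

Comparing such a statistic across tasks is one simple way to probe whether a given bias is task-specific or recurs task-independently.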
Novel approaches to mitigate crowd bias
We plan to explore novel methods that aim to reduce biases in crowd annotation in particular. Current approaches include worker pre-selection, improved task presentation, and dynamic task assignment. We seek to discuss the shortcomings and limitations of existing and ongoing approaches and to ideate future directions.
Impact on crowd workers
We want to explore how bias identification and mitigation strategies can impact crowd workers, positively or negatively. For example, workers in certain groups may face increased competition or reduced task availability, and collecting worker attributes for profiling could raise ethical concerns.
Organizing Committee
- Danula Hettiachchi, RMIT University
- Emine Yilmaz, University College London and Amazon
- Gabriella Kazai, Microsoft Research
- Jorge Goncalves, The University of Melbourne
- Mark Sanderson, RMIT University
- Matthew Lease, University of Texas at Austin and Amazon
- Mike Schaekermann, Amazon
- Simo Hosio, University of Oulu
Contact
Please check the workshop website for contact details.