Dynamics of Gender Bias in Computing
- URL: http://arxiv.org/abs/2211.03905v1
- Date: Mon, 7 Nov 2022 23:29:56 GMT
- Title: Dynamics of Gender Bias in Computing
- Authors: Thomas J Misa
- Abstract summary: This article presents a new dataset focusing on formative years of computing as a profession (1950-1980) when U.S. government workforce statistics are thin or non-existent.
It revises commonly held conjectures that gender bias in computing emerged during professionalization of computer science in the 1960s or 1970s.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Gender bias in computing is a hard problem that has resisted decades of
research. One obstacle has been the absence of systematic data that might
indicate when gender bias emerged in computing and how it has changed. This
article presents a new dataset (N=50,000) focusing on formative years of
computing as a profession (1950-1980) when U.S. government workforce statistics
are thin or non-existent. This longitudinal dataset, based on archival records
from six computer user groups (SHARE, USE, and others) and ACM conference
attendees and membership rosters, revises commonly held conjectures that gender
bias in computing emerged during professionalization of computer science in the
1960s or 1970s and that there was a 'linear' one-time onset of gender bias to
the present. Such a linear view also lent support to the "pipeline" model of
computing's "losing" women at successive career stages. Instead, this dataset
reveals three distinct periods of gender bias in computing and so invites
temporally distinct explanations for these changing dynamics. It significantly
revises both scholarly assessment and popular understanding about gender bias
in computing. It also draws attention to diversity within computing. One
consequence of this research for CS reform efforts today is the data-driven
recognition that legacies of gender bias beginning in the mid-1980s (not in
earlier decades) are the problem. A second consequence is correcting the public
image of computer science: this research shows that gender bias is a contingent
aspect of professional computing, not an intrinsic or permanent one.
Related papers
- Survey of Bias In Text-to-Image Generation: Definition, Evaluation, and Mitigation [47.770531682802314]
Even simple prompts could cause T2I models to exhibit conspicuous social bias in generated images.
We present the first extensive survey on bias in T2I generative models.
We discuss how these works define, evaluate, and mitigate different aspects of bias.
arXiv Detail & Related papers (2024-04-01T10:19:05Z)
- The Gender-GAP Pipeline: A Gender-Aware Polyglot Pipeline for Gender Characterisation in 55 Languages [51.2321117760104]
This paper describes the Gender-GAP Pipeline, an automatic pipeline to characterize gender representation in large-scale datasets for 55 languages.
The pipeline uses a multilingual lexicon of gendered person-nouns to quantify the gender representation in text.
We showcase it to report gender representation in WMT training data and development data for the News task, confirming that current data is skewed towards masculine representation.
arXiv Detail & Related papers (2023-08-31T17:20:50Z)
- VisoGender: A dataset for benchmarking gender bias in image-text pronoun resolution [80.57383975987676]
VisoGender is a novel dataset for benchmarking gender bias in vision-language models.
We focus on occupation-related biases within a hegemonic system of binary gender, inspired by Winograd and Winogender schemas.
We benchmark several state-of-the-art vision-language models and find that they demonstrate bias in resolving binary gender in complex scenes.
arXiv Detail & Related papers (2023-06-21T17:59:51Z)
- Fairness in AI Systems: Mitigating gender bias from language-vision models [0.913755431537592]
We study the extent of the impact of gender bias in existing datasets.
We propose a methodology to mitigate its impact in caption based language vision models.
arXiv Detail & Related papers (2023-05-03T04:33:44Z)
- Gender Bias in Big Data Analysis [0.0]
This paper measures gender bias when gender prediction software tools are used in historical big data research.
Gender bias is measured by contrasting personally identified computer science authors in the well-regarded DBLP dataset.
arXiv Detail & Related papers (2022-11-17T20:13:04Z)
- Gender Bias in Computing [0.0]
It offers new quantitative data on the computing workforce prior to the availability of US Census data in the 1970s.
A novel method of gender analysis is developed to estimate women's and men's participation in computing beginning in the 1950s.
arXiv Detail & Related papers (2022-10-29T00:10:25Z)
- Temporal Analysis and Gender Bias in Computing [0.0]
Many given names change ascribed gender over the decades: the "Leslie problem".
This article identifies 300 given names with measurable "gender shifts" across 1925-1975.
This article demonstrates, quantitatively, that there is a net "female shift" that likely results in the overcounting of women (and undercounting of men) in earlier decades.
arXiv Detail & Related papers (2022-09-29T00:29:43Z)
- Towards Understanding Gender-Seniority Compound Bias in Natural Language Generation [64.65911758042914]
We investigate how seniority impacts the degree of gender bias exhibited in pretrained neural generation models.
Our results show that GPT-2 amplifies bias by considering women as junior and men as senior more often than the ground truth in both domains.
These results suggest that NLP applications built using GPT-2 may harm women in professional capacities.
arXiv Detail & Related papers (2022-05-19T20:05:02Z)
- Are Commercial Face Detection Models as Biased as Academic Models? [64.71318433419636]
We compare academic and commercial face detection systems, specifically examining robustness to noise.
We find that state-of-the-art academic face detection models exhibit demographic disparities in their noise robustness.
We conclude that commercial models are as biased as, or more biased than, academic models.
arXiv Detail & Related papers (2022-01-25T02:21:42Z)
- Assessing Gender Bias in the Information Systems Field: An Analysis of the Impact on Citations [0.0]
This paper outlines a study to estimate the differences in scholarly citations that female IS academics accumulate vis-a-vis their male colleagues.
By doing so we propose to contribute knowledge on a core dimension of gender bias in academia, which is, so far, almost completely unexplored in the IS field.
arXiv Detail & Related papers (2021-08-22T18:18:52Z)
- Multi-Dimensional Gender Bias Classification [67.65551687580552]
Machine learning models can inadvertently learn socially undesirable patterns when training on gender biased text.
We propose a general framework that decomposes gender bias in text along several pragmatic and semantic dimensions.
Using this fine-grained framework, we automatically annotate eight large scale datasets with gender information.
arXiv Detail & Related papers (2020-05-01T21:23:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.