Dynamics of Gender Bias in Software Engineering
- URL: http://arxiv.org/abs/2508.21050v1
- Date: Thu, 28 Aug 2025 17:54:49 GMT
- Title: Dynamics of Gender Bias in Software Engineering
- Authors: Thomas J. Misa
- Abstract summary: The field of software engineering is embedded in both engineering and computer science, and may embody gender biases endemic to both. This paper surveys software engineering's origins and its long-running attention to engineering professionalism, profiling five leaders. It then examines the field's recent attention to gender issues and gender bias.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The field of software engineering is embedded in both engineering and computer science, and may embody gender biases endemic to both. This paper surveys software engineering's origins and its long-running attention to engineering professionalism, profiling five leaders; it then examines the field's recent attention to gender issues and gender bias. It next quantitatively analyzes women's participation as research authors in the field's leading International Conference on Software Engineering (1976-2010), finding a dozen years with statistically significant gender exclusion. Policy dimensions of research on gender bias in computing are suggested.
Related papers
- Twenty Years of Personality Computing: Threats, Challenges and Future Directions [76.46813522861632]
Personality Computing is a field at the intersection of Personality Psychology and Computer Science. This paper provides an overview of the field, explores key methodologies, discusses the challenges and threats, and outlines potential future directions for responsible development and deployment of Personality Computing technologies.
arXiv Detail & Related papers (2025-03-03T22:03:48Z) - Gender Dynamics in Software Engineering: Insights from Research on Concurrency Bug Reproduction [0.5284425534494986]
We present a literature review to assess the gender ratio in this field. Our findings indicate that female researchers are underrepresented compared to their male counterparts in this area.
arXiv Detail & Related papers (2025-02-27T17:15:23Z) - Beyond Binary Gender: Evaluating Gender-Inclusive Machine Translation with Ambiguous Attitude Words [85.48043537327258]
Existing machine translation gender bias evaluations are primarily focused on male and female genders.
This study presents a benchmark AmbGIMT (Gender-Inclusive Machine Translation with Ambiguous attitude words)
We propose a novel process to evaluate gender bias based on the Emotional Attitude Score (EAS), which is used to quantify ambiguous attitude words.
arXiv Detail & Related papers (2024-07-23T08:13:51Z) - Navigating the Path of Women in Software Engineering: From Academia to Industry [2.2732417897161934]
We focus on Brazilian women to extend existing research, which has largely focused on North American and European contexts.
Our findings highlight persistent challenges faced by women in software engineering, including gender bias, harassment, work-life imbalance, undervaluation, low sense of belonging, and impostor syndrome.
arXiv Detail & Related papers (2023-12-08T02:58:26Z) - Data-Driven Analysis of Gender Fairness in the Software Engineering Academic Landscape [4.580653005421453]
We study the problem of gender bias in academic promotions in the informatics (INF) and software engineering (SE) Italian communities.
We first conduct a literature review to assess how the problem of gender bias in academia has been addressed so far.
Next, we describe a process to collect and preprocess the INF and SE data needed to analyse gender bias in Italian academic promotions.
From the conducted analysis, we observe how the SE community presents a higher bias in promotions to Associate Professors and a smaller bias in promotions to Full Professors compared to the overall INF community.
arXiv Detail & Related papers (2023-09-20T12:04:56Z) - VisoGender: A dataset for benchmarking gender bias in image-text pronoun resolution [80.57383975987676]
VisoGender is a novel dataset for benchmarking gender bias in vision-language models.
We focus on occupation-related biases within a hegemonic system of binary gender, inspired by Winograd and Winogender schemas.
We benchmark several state-of-the-art vision-language models and find that they demonstrate bias in resolving binary gender in complex scenes.
arXiv Detail & Related papers (2023-06-21T17:59:51Z) - Dynamics of Gender Bias in Computing [0.0]
This article presents a new dataset focusing on formative years of computing as a profession (1950-1980) when U.S. government workforce statistics are thin or non-existent.
It revises commonly held conjectures that gender bias in computing emerged during professionalization of computer science in the 1960s or 1970s.
arXiv Detail & Related papers (2022-11-07T23:29:56Z) - Towards Understanding Gender-Seniority Compound Bias in Natural Language Generation [64.65911758042914]
We investigate how seniority impacts the degree of gender bias exhibited in pretrained neural generation models.
Our results show that GPT-2 amplifies bias by considering women as junior and men as senior more often than the ground truth in both domains.
These results suggest that NLP applications built using GPT-2 may harm women in professional capacities.
arXiv Detail & Related papers (2022-05-19T20:05:02Z) - Assessing Gender Bias in the Information Systems Field: An Analysis of the Impact on Citations [0.0]
This paper outlines a study to estimate the impact of scholarly citations that female IS academics accumulate vis-a-vis their male colleagues.
By doing so we propose to contribute knowledge on a core dimension of gender bias in academia, which is, so far, almost completely unexplored in the IS field.
arXiv Detail & Related papers (2021-08-22T18:18:52Z) - Gender Stereotype Reinforcement: Measuring the Gender Bias Conveyed by Ranking Algorithms [68.85295025020942]
We propose the Gender Stereotype Reinforcement (GSR) measure, which quantifies the tendency of a search engine to support gender stereotypes.
GSR is the first specifically tailored measure for Information Retrieval, capable of quantifying representational harms.
arXiv Detail & Related papers (2020-09-02T20:45:04Z) - Multi-Dimensional Gender Bias Classification [67.65551687580552]
Machine learning models can inadvertently learn socially undesirable patterns when training on gender biased text.
We propose a general framework that decomposes gender bias in text along several pragmatic and semantic dimensions.
Using this fine-grained framework, we automatically annotate eight large scale datasets with gender information.
arXiv Detail & Related papers (2020-05-01T21:23:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.