Gender Bias in Artificial Intelligence: A Systematic Review of the Literature


Rosileine Mendonça de Lima
Barbara Pisker
Victor Silva Corrêa


Keywords: Bias, Gender, Artificial Intelligence, Systematic Literature Review


This study presents a Systematic Literature Review (SLR) of Gender Bias in Artificial Intelligence (AI). The research combined two techniques: a domain-based SLR process that provides a bibliometric description of the sample, and an in-depth examination of the thematic categories that emerged from inductive categorization during the reading and interpretation of the 35 articles in the final sample. To answer three key research questions on the types, causes, and mitigation strategies of gender bias in artificial intelligence, three thematic treemaps were constructed, providing a systematic overview as an essential contribution to the literature. The main types of gender bias found in AI are categorized as societal, technical, and individual. Societal and socio-technical factors stand out as the leading causes of bias, while debiasing, dataset design, and gender sensitivity were the most frequent strategies for overcoming it. The study also proposes theoretical, practical, managerial, capacity-building, and policy implications that address broad socio-technical challenges and point to the changes necessary to create bias-free artificial intelligence.




