G7 Research Group G7 Information Centre
University of Toronto

GEAC 2025

Emerging Technology (AI and Quantum Technologies)

Closing the gender gap in emerging technology (tech) is not only a matter of equity; it is also a driver of innovation, economic growth, and democratic resilience. While ground has been lost in traditional information technologies, emerging fields such as artificial intelligence (AI) and quantum computing offer a second chance: a unique opportunity to reset the course and ensure that women help shape the future. GEAC calls on the G7 to build on its previous commitments in science, technology, engineering, and mathematics (STEM) education and in combating tech-facilitated gender-based violence by addressing the related issues in AI and quantum technologies.

Recommendations

Increase women’s participation in the design and governance of AI and quantum technologies

Protect women and girls from AI-facilitated violence and harassment such as pornographic deepfakes

Ensure AI systems are bias-free

Rationale

Most AI solutions are developed by men.[1] This reflects the low share of women among STEM graduates, which stands at 33% across the G7 countries.[2] Over the past decade, progress has been minimal, while demand for tech talent is accelerating. Accurate and comprehensive data is essential for understanding and addressing gender gaps in emerging technologies.

The gender gap in the AI workforce reinforces existing biases: technology designed and built mostly by men is susceptible to being skewed toward their experiences and perspectives.

Women’s involvement and leadership in AI and quantum technologies bring diverse perspectives, increase the likelihood that gender-based biases are addressed, and help drive innovation and economic growth. Moreover, AI and quantum technologies cannot remain concentrated in large firms and elite labs. Scaling access for women entrepreneurs, especially in non-tech sectors (such as care, climate, and creative industries), will expand inclusive growth. Bridging the gender digital divide could save US$500 billion globally in the coming years.[3]

Online harms, including AI-generated violence and harassment such as pornographic deepfakes, primarily target women and girls, undermining their safety and their participation in digital spaces and public life. Between 96% and 98% of online deepfake videos are pornographic and nonconsensual, and 99% of them target women.[4]

Data bias in AI is widespread: one study found that 44% of AI systems exhibit gender biases and stereotypes,[5] exacerbating existing inequalities. For example, AI-driven recruitment systems may screen out women candidates whose names are not male-associated, or match female applicants to lower-paying or less prestigious positions. Such biases can have devastating effects in, for example, the health sector, where AI systems often misdiagnose women because they are trained on predominantly male data.


Footnotes

[1] Pal, Siddi, Ruggero Marino Lazzaroni, and Paula Mendoza. AI's Missing Link: The Gender Gap in the Talent Pool. (2024)

[2] OECD. Education at a Glance: OECD Indicators. (2023)

[3] UN Women and DESA. Op. Cit.

[4] A 2019 study by DeepTrace found that 96% of online deepfake videos were pornographic and nonconsensual. A 2023 study by Home Security Heroes found that deepfake porn makes up 98% of all deepfake videos online, with 99% of them targeting women.

[5] Smith, Genevieve, and Ishita Rustagi. “When Good Algorithms Go Sexist: Why and How to Advance AI Gender Equity.” Stanford Social Innovation Review. (2021)


Source: Official website of Canada's 2025 G7 presidency


This Information System is provided by the University of Toronto Libraries and the G7 Research Group at the University of Toronto.
Please send comments to: g7@utoronto.ca
This page was last updated April 12, 2026.

All contents copyright © 2026. University of Toronto unless otherwise stated. All rights reserved.