Gender bias in #GenAI medical images: an interesting short paper published in JMIR AI. https://lnkd.in/d_QFTzWR

“Background: Generative artificial intelligence (gAI) models, such as DALL-E 2, are promising tools that can generate novel images or artwork based on text input. However, caution is warranted, as these tools generate information based on historical data and are thus at risk of propagating past learned inequities. Women in medicine have routinely been underrepresented in academic and clinical medicine, and the stereotype of a male physician persists.

Objective: The primary objective is to evaluate implicit bias among gAI across medical specialties.

Methods: To evaluate for potential implicit bias, 100 photographs for each medical specialty were generated using the gAI platform DALL-E 2. For each specialty, DALL-E 2 was queried with “An American [specialty name].” Our primary endpoint was to compare the gender distribution of gAI photos to the current distribution in the United States. Our secondary endpoint included evaluating the racial distribution. gAI photos were classified according to perceived gender and race based on a unanimous consensus among a diverse group of medical residents. The proportion of gAI women subjects was compared for each medical specialty to the most recent Association of American Medical Colleges report for physician workforce and active residents using χ2 analysis.

Results: A total of 1900 photos across 19 medical specialties were generated. Compared to physician workforce data, AI significantly overrepresented women in 7/19 specialties and underrepresented women in 6/19 specialties. Women were significantly underrepresented compared to the physician workforce by 18%, 18%, and 27% in internal medicine, family medicine, and pediatrics, respectively. Compared to current residents, AI significantly underrepresented women in 12/19 specialties, ranging from 10% to 36%. Additionally, women represented <50% of the demographic for 17/19 specialties by gAI.

Conclusions: gAI created a sample population of physicians that underrepresented women when compared to both the resident and active physician workforce. Steps must be taken to train datasets in order to represent the diversity of the incoming physician workforce.”
Using machine learning to audit gender representation
Summary
Using machine learning to audit gender representation involves applying artificial intelligence to evaluate how fairly and accurately different genders are portrayed or treated in everything from workplace decisions to AI-generated content. This helps organizations spot patterns of bias and take steps to promote equal opportunities for everyone.
- Scrutinize data sources: Regularly review and improve the datasets used to train AI systems so they reflect the real diversity of your workforce or audience.
- Monitor decision-making: Use AI tools to analyze hiring, promotions, and internal communications for signs of gender bias and address any disparities found.
- Refine interactions: Adjust AI-powered assistants or bots to ensure their responses and recommendations don’t unintentionally reinforce stereotypes or increase burdens for underrepresented groups.
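The "monitor decision-making" idea above can be made concrete with a simple adverse-impact audit: compute selection rates by gender and apply the well-known four-fifths (80%) rule. The records and field names below are hypothetical; a real audit would pull from an HR system.

```python
# Minimal adverse-impact audit sketch (hypothetical data): compute selection
# rates by gender and flag groups below 80% of the highest rate.
from collections import Counter

def selection_rates(decisions):
    """decisions: list of (gender, hired_bool) tuples."""
    applied, hired = Counter(), Counter()
    for gender, was_hired in decisions:
        applied[gender] += 1
        hired[gender] += was_hired
    return {g: hired[g] / applied[g] for g in applied}

def adverse_impact(rates):
    """Four-fifths rule: flag groups whose rate is < 80% of the top rate."""
    top = max(rates.values())
    return {g: r / top < 0.8 for g, r in rates.items()}

# Hypothetical outcomes: 12/60 women hired vs. 20/60 men hired
decisions = ([("woman", True)] * 12 + [("woman", False)] * 48 +
             [("man", True)] * 20 + [("man", False)] * 40)
rates = selection_rates(decisions)
print(rates, adverse_impact(rates))
```

The four-fifths rule is only a screening heuristic; a flagged disparity is a prompt for investigation, not proof of bias on its own.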
USING #AI TO MAKE THE "WOMEN'S ADVANCEMENT" PROBLEM A "MANAGER'S MINDSET/ACTION" PROBLEM

A connection here asked me what innovative ideas I had for making inequity in talent decisions not just a burden placed on the shoulders of women. Here are some of my thoughts:

✴ It is already possible (I've read at least one study on this) to train AI to review performance evaluations for bias, even manager by manager. It takes the will of HR to do this and to create consequences and coaching for managers whose ratings indicate bias against women or in favor of men.

✴ Ditto for selection processes for open positions where interviewers deliver written commentary on candidates.

✴ Stripping resumes to make them gender neutral is another existing technology, though this is fraught when it comes to racial equity.

✴ I haven't seen this yet, but it should be possible for a manager to upload PDFs of candidate resumes and ask AI to evaluate each against the requirements of an open position.

The subtle actions of managers, such as introducing men to key customers, sending them to conferences, having off-the-cuff conversations about the business/organization, or coaching them into positions at the heart of the organization, are MUCH more difficult to innovate around.

💡 I also asked Claude AI how to shift responsibility for solving the "women's advancement" problem off the shoulders of women and onto the shoulders of managers by making it a "manager's mindset and action" problem. Here's what it answered:

"👉 Natural language processing to detect subtle gender biases in performance reviews, job descriptions, and internal communications. This could help point to areas for improvement.

👉 Sentiment analysis and conversational AI to anonymously survey employees and uncover experiences/feelings around company culture and advancement. This could complement quantitative diversity data.
👉 AI auditing of hiring practices, promotions, and compensation by gender and other factors to identify statistically significant disparities, providing actionable insights.

👉 Machine learning on past initiative effectiveness to guide policy decisions. For example, what coaching approaches work best at reducing manager bias?

👉 Predictive analytics to identify high-potential diverse candidates for succession planning and proactive career development, preventing missed opportunities.

👉 Intelligent mentoring matching algorithms, considering nuanced interpersonal compatibility, development needs, and diversity goals.

👉 Chatbots or virtual assistants to make harassment reporting less intimidating and provide empathetic, non-judgmental support."

It will take #CHROs and #DEI execs with spine to shift the focus, but the problem of inequity in advancement decisions will never get solved as long as it relies PRIMARILY on "fix the women" solutions. Anyone out there taking actions like these in a systematic and planned way?
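The first suggestion in that list, using NLP to surface gender-coded language in job descriptions and reviews, can start as something as simple as a wordlist pass. The word lists below are a tiny illustrative subset, not a validated lexicon; production tools use much larger, research-derived lists and context-aware models.

```python
# Hedged sketch: flag gender-coded terms in job-description or review text.
# MASCULINE_CODED / FEMININE_CODED are illustrative samples, not a real lexicon.
import re

MASCULINE_CODED = {"aggressive", "dominant", "competitive", "rockstar", "ninja"}
FEMININE_CODED = {"supportive", "collaborative", "nurturing", "interpersonal"}

def code_words(text):
    """Return the gender-coded terms found in the text, by category."""
    tokens = set(re.findall(r"[a-z]+", text.lower()))
    return {
        "masculine": sorted(tokens & MASCULINE_CODED),
        "feminine": sorted(tokens & FEMININE_CODED),
    }

hits = code_words("We want a competitive, aggressive rockstar developer.")
print(hits)
```

Even this crude pass gives HR a concrete artifact to coach on, which is the point of shifting the problem onto manager mindset and action.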
#gendermainstreaming and gender transformational program design is not new in international development or among developers of AI. Here's how these domains intersect in Digital Green's work designing and deploying an AI assistant for agricultural extension:

1. Training data: If the content your RAG pipeline accesses is gender blind, your Ag Assistant will be too. Quality content matters! (I'm including a link to my prior post on RAG in the comments.)

2. Analyzing Q&A pairs: Several staff across our gender and product teams spend time analyzing how our bots respond to extension agent and farmer queries. You can learn a lot by analyzing priority topics by gender. But you can also examine things like the question structure itself: men and women do not ask questions in the same way, so comprehension can vary. Quality assurance can also be done at scale via various machine learning techniques, which can be trained to address these gender differences.

3. Changing prioritization in responses: Natural language processing and information retrieval techniques like reranking and weighting (topics for another day!) can improve responses. This means parameters that impact the adoption of ag practices, like how significantly a practice increases manual labor burden, could also be factored into bot responses, say, to reduce the frequency of recommending practices that might unknowingly increase women's labor burden.

4. Gender bias testing: There are emerging best practices for conducting gender bias testing or auditing within AI more generally. We've begun to deploy some of these already. One easy and regular practice? Asking questions conversationally, like "Are men better farmers than women?" Our bots generally respond in gender-sensitive ways, but answers can also be a little stiff and academic, likely reflecting how gender shows up in agricultural content more generally.
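Point 3 above, reranking retrieved content by factors beyond relevance, can be sketched as a score adjustment that demotes high-labor-burden practices. The field names, scores, and linear penalty below are all assumptions for illustration, not Digital Green's actual pipeline.

```python
# Illustrative reranking sketch (assumed fields and weights): demote retrieved
# advice whose practices carry a high manual-labor burden.

def rerank(passages, labor_penalty=0.3):
    """passages: dicts with 'text', 'relevance' (0-1), 'labor_burden' (0-1)."""
    def adjusted(p):
        # Subtract a weighted labor-burden term from the retrieval score.
        return p["relevance"] - labor_penalty * p["labor_burden"]
    return sorted(passages, key=adjusted, reverse=True)

passages = [
    {"text": "Hand-weed twice weekly", "relevance": 0.9, "labor_burden": 0.9},
    {"text": "Apply mulch to suppress weeds", "relevance": 0.8, "labor_burden": 0.3},
]
for p in rerank(passages):
    print(p["text"])
```

Here the slightly less relevant but far less labor-intensive practice rises to the top, which is exactly the trade-off the post describes wanting to control.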
Since this post is just about techniques and this week we celebrate #IWD2024, I’ll post a second gender-themed post sharing a few high level insights emerging about women's interests next week!