Roadmap

Promote gender equality by identifying potential gender bias in letters of recommendation and evaluations
The goal of Reading for Gender Bias is to develop a web-based automated text analysis tool that will scan letters of recommendation or evaluations and return feedback to the user on how to revise the writing to address implicit gender bias identified in the text.
What do we need to do?
This issue is the roadmap. It is the place to start finding ways you can contribute. Check out the milestones listed below or explore issues by label.
Milestones

- **Define target patterns of bias (signals) from the literature** (March 15th): Review the existing literature on gender bias in letters and select key patterns (signals) to include in this project
- **Draft website** (March 29th): Create a mock-up version of the website with a text field, something like this or this one
- **Create issues for each signal** (April 5th): Create an issue for each signal with instructions on the level of the signal (word, sentence, paragraph, page) and what information to return to the user
- **Program text analysis for first signal** (April 12th): Write code that scans text for the first signal and returns a summary to the user
- **Create deadlines for all signals** (April 19th): Prioritize the signals and add deadlines to their issues
- **Prototype** (May 3rd): Prototype of the website with user input, at least one signal (text analysis), and a summary (output)
- **Mozilla Global Sprint** (May 10th)
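As a rough illustration of the "program text analysis for first signal" milestone, a word-level signal scanner could look something like the sketch below. Everything here is an assumption, not a project decision: the function name `scan_signal` is hypothetical, and the word list is a tiny illustrative sample of the "grindstone" adjectives (e.g. "hardworking", "diligent") that the gender-bias literature reports appear more often in letters written for women.

```python
import re

# Hypothetical word-level signal: a small sample of "grindstone"
# adjectives. The real project would draw its lists from the literature
# review milestone above.
GRINDSTONE_WORDS = {"hardworking", "diligent", "conscientious", "dependable"}

def scan_signal(text, signal_words=GRINDSTONE_WORDS):
    """Scan text for a word-level signal and return a summary for the user."""
    tokens = re.findall(r"[a-z']+", text.lower())
    return {
        "signal": "grindstone adjectives",
        "matches": sorted({t for t in tokens if t in signal_words}),
        "count": sum(1 for t in tokens if t in signal_words),
    }

summary = scan_signal("She is hardworking, diligent, and a dependable colleague.")
print(summary["count"])    # 3
print(summary["matches"])  # ['dependable', 'diligent', 'hardworking']
```

Sentence- or paragraph-level signals would follow the same shape: a scanner function per signal, each returning a summary dict the website can render as feedback.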
Long-term goals

- Expand from rule-based to statistical analysis
- Develop a program/app that can interface with online student evaluation systems to test efficacy
- Expand to cover other types of bias, such as racism and classism
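To make the first long-term goal concrete: where a rule-based signal matches a fixed word list, a statistical approach would learn per-word scores from labeled example letters. The sketch below is one possible minimal approach (smoothed log-odds of word frequencies), with function names and toy training data invented purely for illustration:

```python
import math
import re
from collections import Counter

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def train_word_scores(biased_docs, neutral_docs, smoothing=1.0):
    """Learn per-word log-odds scores from labeled example letters."""
    biased = Counter(t for d in biased_docs for t in tokenize(d))
    neutral = Counter(t for d in neutral_docs for t in tokenize(d))
    vocab = set(biased) | set(neutral)
    b_total = sum(biased.values()) + smoothing * len(vocab)
    n_total = sum(neutral.values()) + smoothing * len(vocab)
    return {
        w: math.log((biased[w] + smoothing) / b_total)
           - math.log((neutral[w] + smoothing) / n_total)
        for w in vocab
    }

def score(text, word_scores):
    """Positive totals suggest the text resembles the biased examples."""
    return sum(word_scores.get(t, 0.0) for t in tokenize(text))

# Toy labeled data, invented purely for illustration.
scores = train_word_scores(
    biased_docs=["she is hardworking and diligent"],
    neutral_docs=["he is brilliant and outstanding"],
)
assert score("a diligent student", scores) > score("a brilliant student", scores)
```

The practical difference is that the word lists and their weights would come from data rather than being hand-curated, at the cost of needing a labeled corpus of letters.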
Please check out the contributors' guidelines and code of conduct to help you get started. For an overview of the project, check out the README, if you haven't already!