For millions of workers, online job listings provide the first point of contact with potential employers. As a result, job listings and their word choices can significantly affect the makeup of the responding applicant pool. Although women make up over half of the professional workforce, they hold only about a quarter of positions in computing fields. In recent years, web services have begun claiming that gendered wording in job advertisements deters applicants of particular genders from applying. These services claim that simply altering or removing certain words can dramatically improve the diversity of the applicant pool.
We study potentially gender-biased terminology in job listings and its impact on job applicants, using job listings on LinkedIn spanning 10 years. We develop algorithms to detect and quantify gender bias, validate them using external tools, and use them to measure job listing bias over time. We then conduct a user survey to validate our findings and to quantify the end-to-end impact of such bias on applicant decisions.
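The paper does not specify its detection algorithms here, but a minimal sketch of one common approach is word-list matching: score a listing by the relative frequency of masculine-coded versus feminine-coded terms. The word lists below are illustrative assumptions, not the paper's actual lexicons.

```python
import re

# Illustrative gendered word lists (assumed for this sketch, not taken
# from the paper); real systems use much larger, validated lexicons.
MASCULINE = {"competitive", "dominant", "leader", "aggressive", "ambitious"}
FEMININE = {"supportive", "collaborative", "nurturing", "interpersonal", "loyal"}

def bias_score(text: str) -> float:
    """Score a listing in [-1, 1]: positive means masculine-coded wording
    predominates, negative means feminine-coded, 0 means neutral/balanced."""
    words = re.findall(r"[a-z']+", text.lower())
    m = sum(w in MASCULINE for w in words)
    f = sum(w in FEMININE for w in words)
    return 0.0 if m + f == 0 else (m - f) / (m + f)
```

Applied over a dated corpus of listings, such a score can be averaged per year to track bias over time, as the study's longitudinal analysis does at a high level.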