The Department of Justice (DOJ) recently warned that automated employment application screening has the potential to unlawfully discriminate against disabled workers, violating the Americans with Disabilities Act (ADA). The report outlined the potential for discrimination, the reasonable accommodations employers should provide when leveraging computer-based screening tools, and the safeguards that need to be in place moving forward. The Department's recent news release is part of a larger pattern of governmental agencies stepping up to provide guidance and litigation on AI-based hiring tools that have previously gone unchecked, resulting in high rejection rates amongst more disadvantaged workers, including those with disabilities.
How AI impacts “hidden workers”
With hybrid or entirely remote positions increasingly becoming the norm, there is an opportunity for more inclusion and increased participation in the workforce amongst many unemployed and underemployed Americans – whether that be the woman in a wheelchair for whom a daily commute to an office is a logistical challenge, or the father who needs to pick up his children from school at 3:30. Yet, they continue to face high rates of automated rejection before their resumes even land on a person’s desk.
At a moment when companies are dealing with high turnover and a boom in demand for talent, it hardly seems as if American companies can afford to be rejecting qualified applicants. Yet many use AI tools to screen applicants. These range from simple resume and job description matching programs to more complex systems such as resume "scoring" tools or video interview software. While computer programs are often thought of as less biased, they are only as unbiased as the data they are trained on and, often, the teams that built them. A video interview tool that claims to measure a candidate's enthusiasm or expertise would need to know how to understand that candidate's accent, voice tone, or way of speaking. A resume screening tool that hasn't been trained on resumes with employment gaps might unfairly filter out new parents, not because they aren't qualified for a job, but because it hasn't been trained to evaluate people like them.
Companies that use computer screening programs are keenly aware of their shortcomings. A recent report from Accenture and Harvard Business School (HBS) found that 88% of employers agree that "qualified high skills candidates" were filtered out because of these systems. In fact, the report determined that due, in part, to these automated screening systems, the U.S. has an estimated 27 million "hidden workers." These include Americans with disabilities, caregivers, veterans, immigrants, refugees, retirees hoping to return to work, the long-term unemployed, and those without college degrees. People falling into these categories are willing, able, and aspiring to work, but cannot make it through the application process to get the opportunity to do so. This provides a profoundly different picture of unemployment in the U.S., which official figures put at about 5.9 million unemployed Americans as of April 2022.
How to ensure compliance with ADA guidelines
There are simple, yet impactful, ways that companies can actively curb the negative impact of automated screenings and avoid violating ADA guidelines.
- Be mindful of how candidates who aren't in the majority are evaluated, and accommodate atypical professional journeys. This could include "hidden workers" such as women, those with disabilities, or those returning from career breaks. Normalizing small differences in work histories, such as a maternity break, and ensuring that technology is not counting these differences against candidates, can be impactful in getting so-called invisible candidates through the door.
- Measure each part of the hiring process, including initial computer screening, rounds of interviews, other assessments, and onboarding. Keeping a close eye on the metrics of each level of evaluation can help identify issues as they arise. Action should be taken if there is one part of the hiring process during which diverse candidates disproportionately get filtered out or drop out.
- Specifically when it comes to the ADA, accessibility testing is crucial. Organizations should have a third party test their website, application process, and any other tools or assessments used in hiring (such as video interview applications or technical assessments) to ensure that people aren't turned away before they even have an opportunity to apply.
- Lastly, ensure that diversity hiring, whether of candidates with disabilities or other underrepresented workers, is an issue that the whole organization owns. As noted in the HBS report, plenty of companies engage with these populations of hidden workers, yet they do so through their Corporate Social Responsibility (CSR) programs rather than through their HR function. While all diversity efforts are good, this perpetuates the notion that hiring these candidates is an act of charity. In reality, these workers are valuable contributors who want and deserve the same opportunities afforded to everyone else.
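The monitoring recommended above can be made concrete with a simple calculation. One common signal, drawn from the EEOC's "four-fifths" rule of thumb rather than from the DOJ guidance itself, is the adverse impact ratio: compare the selection rate of the least-selected group at a hiring stage to that of the most-selected group. The sketch below is purely illustrative; the group labels and applicant counts are hypothetical.

```python
def selection_rate(passed: int, applied: int) -> float:
    """Fraction of applicants in a group who passed a hiring stage."""
    return passed / applied

def adverse_impact_ratio(rates: dict) -> float:
    """Ratio of the lowest group selection rate to the highest.

    Under the four-fifths rule of thumb, a ratio below 0.8 is a
    common signal that the stage warrants closer review.
    """
    return min(rates.values()) / max(rates.values())

# Hypothetical results for one resume-screening stage
stage = {
    "no_employment_gap": selection_rate(300, 1000),  # 30% pass
    "employment_gap": selection_rate(150, 1000),     # 15% pass
}

ratio = adverse_impact_ratio(stage)
print(f"Adverse impact ratio: {ratio:.2f}")  # prints 0.50
if ratio < 0.8:
    print("Flag this stage for review.")
```

Running this check at every level of the funnel, not just the initial screen, makes it possible to see exactly where candidates from a given group disproportionately drop out.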
The new DOJ report is a step in the right direction. While there is much talk of new legislation to regulate the use of AI in hiring, existing equal employment guidelines and laws such as the ADA can be leveraged right now to create better rules around AI screening tools. These tools are costing companies strong workers, but more importantly, they are causing undue harm to millions of Americans who are losing opportunities to be employed through no fault of their own.
Rena Nigam is founder and CEO of Meytier.