Artificial Intelligence, Algorithms & the EEOC - What Employers Must Know


Artificial intelligence (AI), machine learning, and algorithms have been the subject of ongoing Equal Employment Opportunity Commission (EEOC) concern over discriminatory violations. Newly released guidance from the EEOC discusses how algorithmic hiring tools can discriminate against people with disabilities. Employers who use algorithmic processes should take note and ensure that their practices do not violate EEOC rules and are compliant with the Americans with Disabilities Act (ADA). Candidates seeking employment should be aware of the limitations of AI and algorithms and of how these tools can affect their prospects for employment. Studies have shown that approximately 70% of all businesses use artificial intelligence and algorithms during the hiring process.

The Center for Democracy and Technology has reported that algorithms carry a “risk of discrimination written invisibly into their codes” and that for “people with disabilities, those risks can be profound.”

Equal Employment Opportunity Commission (EEOC) Guidance

As a starting point, this section explains the meaning of three central terms used in this document—software, algorithms, and artificial intelligence (“AI”)—and how, when used in a workplace, they relate to each other.

  • Software: Broadly, “software” refers to information technology programs or procedures that provide instructions to a computer on how to perform a given task or function. “Application software” (also known as an “application” or “app”) is a type of software designed to perform or to help the user perform a specific task or tasks. The United States Access Board is the source of these definitions.

There are many different types of software and applications used in employment, including: automatic resume-screening software, hiring software, chatbot software for hiring and workflow, video interviewing software, analytics software, employee monitoring software, and worker management software.

  • Algorithms: Generally, an “algorithm” is a set of instructions that can be followed by a computer to accomplish some end. Human resources software and applications use algorithms to allow employers to process data to evaluate, rate, and make other decisions about job applicants and employees. Software or applications that include algorithmic decision-making tools may be used at various stages of employment, including hiring, performance evaluation, promotion, and termination.
  • Artificial Intelligence (“AI”): Some employers and software vendors use AI when developing algorithms that help employers evaluate, rate, and make other decisions about job applicants and employees. In the National Artificial Intelligence Initiative Act of 2020, at section 5002(3), Congress defined “AI” to mean a “machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations or decisions influencing real or virtual environments.” In the employment context, using AI has typically meant that the developer relies partly on the computer’s own analysis of data to determine which criteria to use when making employment decisions. AI may include machine learning, computer vision, natural language processing and understanding, intelligent decision support systems, and autonomous systems. For a general discussion of AI, which includes machine learning, see National Institute of Standards and Technology Special Publication 1270.
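
To illustrate the distinction the guidance draws between a hand-written algorithm and an AI-based one, the short Python sketch below is a deliberately simplified, hypothetical example: the program “learns” which resume keywords correlated with past hires and then uses those learned keywords as its screening criteria. The toy dataset, thresholds, and function names are assumptions made purely for illustration; real vendor tools are far more complex.

```python
# Hypothetical, simplified illustration of an "AI" screening tool in the sense
# used by the guidance: instead of an employer hand-writing the screening
# criteria, the program derives them from past hiring data.
# The toy dataset and thresholds below are assumptions for illustration only.

from collections import Counter

# Toy historical data: (resume keywords, whether the person was hired).
PAST_APPLICANTS = [
    ({"python", "sql", "teamwork"}, True),
    ({"python", "excel"}, True),
    ({"sql", "retail"}, False),
    ({"excel", "customer service"}, False),
]


def learn_criteria(history, min_count=2):
    """Derive screening keywords from past hires.

    The computer's own analysis of the data, not a human, decides which
    keywords matter: any keyword appearing in at least `min_count` past
    hires becomes a screening criterion.
    """
    counts = Counter()
    for keywords, hired in history:
        if hired:
            counts.update(keywords)
    return {kw for kw, n in counts.items() if n >= min_count}


def screen(applicant_keywords, criteria):
    """Pass the applicant if they match any learned criterion."""
    return bool(applicant_keywords & criteria)


if __name__ == "__main__":
    criteria = learn_criteria(PAST_APPLICANTS)
    print("Learned criteria:", criteria)          # e.g. {'python'}
    print(screen({"python", "sql"}, criteria))    # True
    print(screen({"retail", "excel"}, criteria))  # False
```

Because the criteria come from patterns in past hiring rather than from an explicit human decision, any exclusionary pattern embedded in that history can be reproduced without anyone ever writing it down—the sense in which risk can be “written invisibly” into the code.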

Employers may rely on different types of software that incorporate algorithmic decision-making at a number of stages of the employment process. Examples include: resume scanners that prioritize applications using certain keywords; employee monitoring software that rates employees on the basis of their keystrokes or other factors; “virtual assistants” or “chatbots” that ask job candidates about their qualifications and reject those who do not meet pre-defined requirements; video interviewing software that evaluates candidates based on their facial expressions and speech patterns; and testing software that provides “job fit” scores for applicants or employees regarding their personalities, aptitudes, cognitive skills, or perceived “cultural fit” based on their performance on a game or on a more traditional test. Each of these types of software may include AI.
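
To make this concrete, the sketch below shows how a simple keyword screener with a rigid, pre-defined rule of the kind described above might behave. It is a hypothetical illustration, not the EEOC’s example or any vendor’s actual product; the keyword list, the six-month gap threshold, and the `screen_resume` function are assumptions made for illustration. It shows how a rigid rule can reject a qualified candidate whose employment gap reflects, for example, disability-related medical leave.

```python
# Hypothetical sketch of a keyword-based resume screener with a rigid,
# pre-defined rule, illustrating how such tools can unintentionally
# screen out qualified candidates. All names, keywords, and thresholds
# here are assumptions made for illustration only.

from dataclasses import dataclass

# Keywords the (hypothetical) employer has decided signal a qualified applicant.
REQUIRED_KEYWORDS = {"python", "sql", "project management"}

# Rigid pre-defined rule: reject any applicant whose longest employment gap
# exceeds this many months, regardless of the reason for the gap.
MAX_GAP_MONTHS = 6


@dataclass
class Resume:
    name: str
    text: str                 # full resume text
    longest_gap_months: int   # longest gap between jobs, in months


def keyword_score(resume: Resume) -> int:
    """Count how many required keywords appear in the resume text."""
    text = resume.text.lower()
    return sum(1 for kw in REQUIRED_KEYWORDS if kw in text)


def screen_resume(resume: Resume) -> bool:
    """Return True if the resume passes the automated screen."""
    # Rule 1: the applicant must mention enough of the required keywords.
    if keyword_score(resume) < 2:
        return False
    # Rule 2: the rigid gap rule. An applicant whose gap reflects
    # disability-related medical leave is rejected just like anyone else.
    if resume.longest_gap_months > MAX_GAP_MONTHS:
        return False
    return True


if __name__ == "__main__":
    applicants = [
        Resume("Applicant A", "Python and SQL developer, project management lead", 2),
        # Equally qualified applicant with a 9-month gap due to medical leave:
        Resume("Applicant B", "Python and SQL developer, project management lead", 9),
    ]
    for a in applicants:
        print(a.name, "passes screen:", screen_resume(a))
```

In this sketch, Applicant B is rejected despite identical qualifications, solely because of the length of an employment gap. An employer relying on such a rule without a reasonable-accommodation process could face exactly the ADA concerns the guidance describes.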

Conclusion

Employers are using new technologies to streamline laborious hiring processes and to manage the often overwhelming volume of applications and resumes. Even as artificial intelligence and algorithms have advanced technically, the potential for unintentional discriminatory results remains. Employers should exercise due diligence when adopting these technologies to ensure compliance with the Americans with Disabilities Act.

For additional information regarding artificial intelligence and algorithms, please visit: https://www.eeoc.gov/laws/guidance/americans-disabilities-act-and-use-software

For additional information regarding this guidance, please visit: https://tabb.net/hiring-platform-overview-2/