Using the wrong vendor for criminal record checks can land your company in hot water

Too many background check companies rely on what they describe as artificial intelligence (AI) to conduct background checks, and in particular the criminal record research process. In reality, this methodology is neither artificial nor intelligent. Rather than conducting criminal record research through a primary source, such as the State Police in a given state or a county court, background check companies promote a dangerous research product based on “webscrape technology.”

Checkr is the most recent of the big five background investigation vendors to run afoul of the Fair Credit Reporting Act (FCRA) and is the subject of numerous lawsuits due to its failure to properly confirm that criminal records found in a search of its database actually match the identity of the candidate. Checkr exploded onto the background investigation scene in 2014 and now has 10,000 clients and processes more than 1 million background checks every month. It became the tech darling of the stodgy and mature background investigation industry.

Checkr grew on the demand for fast and cheap criminal record checks from Uber, Lyft and other tech giants hiring tens of thousands of workers, without regard for the consequences. A race to the bottom in an ever more cost-sensitive industry has driven vendors toward automated proprietary criminal databases built with webscrape technology, rather than a proper criminal search through a state agency specifically authorized to provide access to State Police records.

Automated software applications, “bots,” are designed to interact with other computer systems and scrape publicly available data from municipal, county and state websites, state correctional and sex offender websites, and other sources on the Internet. Data scraped from these sources are then compiled into a searchable database. The problem is that not all courts or government agencies can be “webscraped,” leaving a huge void: the resulting database will miss a large number of actual criminal records. Additionally, many of the sites that are scraped publish limited personal identifiers, making a correct match with the candidate difficult to accomplish.
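To see why limited identifiers cause trouble, consider a minimal sketch, in Python, of the kind of matching a webscraped database permits. The data, field names and matching rule here are hypothetical, not any vendor’s actual code:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ScrapedRecord:
    # Many scraped court sites expose only a name and a partial date of
    # birth, with no SSN or address to disambiguate common names.
    name: str
    birth_year: Optional[int]  # often the only DOB field published, if any
    offense: str

def candidate_matches(record: ScrapedRecord, name: str, birth_year: int) -> bool:
    """Naive match on the only identifiers the scraped source provides."""
    if record.name.lower() != name.lower():
        return False
    # When the source omits the birth year, a name-only match is all the
    # database can do -- exactly how the wrong "John Smith" gets flagged.
    return record.birth_year is None or record.birth_year == birth_year

records = [
    ScrapedRecord("John Smith", 1985, "felony theft"),
    ScrapedRecord("John Smith", None, "DUI"),  # no DOB on the source site
]
hits = [r for r in records if candidate_matches(r, "John Smith", 1990)]
print(hits)  # the DOB-less record matches a candidate born in 1990 anyway
```

With a common name and no second identifier, a stranger’s record attaches to the candidate, and nothing in the scraped data can prove otherwise.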

Background vendors tout AI’s ability to lower costs, increase speed and improve accuracy as a disrupting force in the background check industry. Is it? Checkr has been the subject of numerous class action lawsuits, as well as many individual lawsuits brought by people who were denied employment because criminal or expunged records were erroneously attributed to them. Clients have been the subject of lawsuits as well. Records in the AI databases are often old and stale, having been collected weeks or months earlier. Charges may have been dropped or records expunged by the time of the candidate’s background check, but the search runs against data collected earlier that has not been updated to reflect the current status of the case. The result is the erroneous reporting of a record.
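The staleness problem can be made concrete with a small sketch (the names, dates and fields here are invented for illustration):

```python
from datetime import date

# A record as it sat in a scraped database (hypothetical example):
scraped_record = {
    "name": "Jane Doe",
    "charge": "misdemeanor trespass",
    "status": "pending",           # status at the moment the bot visited
    "scraped_on": date(2023, 1, 10),
}

# What actually happened at the court after the scrape:
dismissed_on = date(2023, 3, 2)    # charge dismissed and later expunged

# A background check run in April searches the snapshot, not the court:
check_date = date(2023, 4, 15)
if scraped_record["scraped_on"] < dismissed_on <= check_date:
    # The database still says "pending" -- the dismissal is invisible to
    # any search that never goes back to the primary source.
    print("Stale record: reported status no longer reflects the court file")
```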

The primary issue in the lawsuits is violation of the FCRA: compiling information, conducting a criminal record database search on the candidate and reporting criminal records without checking their accuracy. Unlike a criminal search through the State Police, none of the websites scraped by AI will ever include the offender’s Social Security number, which would help confirm that a record belongs to the candidate. In practice, when a record believed to belong to the candidate is found in an AI database, the vendor must conduct a primary source search (State Police or county court) to confirm that the identity of the individual in the record matches the candidate and that the offense is reported accurately. In an ever-increasing drive to reduce costs, this confirmation process is instead conducted with “AI,” without a human intervening to confirm that the record matches the identity of the candidate or to run additional criminal record checks as required. Why? The truth of AI in this industry is a relentless push to cut costs in an ever more competitive environment.
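A compliant workflow would gate every database hit behind a primary source re-check and a human review before anything is reported. A minimal sketch of that gate, with hypothetical helper names, might look like this:

```python
from typing import Callable, Optional

Record = dict  # a database hit: name, DOB, offense, source court, etc.

def report_record(
    hit: Record,
    primary_source_confirms: Callable[[Record], bool],
    human_reviewer_confirms: Callable[[Record], bool],
) -> Optional[Record]:
    """The gate a compliant workflow needs before a database hit is
    reported: re-verify at the primary source (State Police or county
    court), then have a person confirm the identity match. The helper
    names are assumptions; the lawsuits allege these steps are skipped."""
    if not primary_source_confirms(hit):
        return None  # record absent, expunged, or changed at the courthouse
    if not human_reviewer_confirms(hit):
        return None  # identifiers never verified by a person
    return hit       # only now is the record safe to report

# Example: a hit that fails primary source confirmation is suppressed.
hit = {"name": "John Smith", "offense": "DUI", "county": "Example County"}
print(report_record(hit, lambda r: False, lambda r: True))  # None
```

The failure mode described above is equivalent to wiring both checks to always return True: the gate exists on paper only.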

What are AI’s limitations? Connecticut, Maryland and Minnesota, for example, have websites available to the public for criminal record research. Bots are able to scrape data from these three sites for resale. Other states, however, are well aware of the value of their criminal records and have joined the background investigation industry as a source of accurate information. These state agencies have capitalized on their criminal records by creating web-based portals that allow background check companies to research actual State Police or state agency records, earning tens of millions of dollars, and in some cases over $100 million, annually on behalf of the taxpayer. To these states, AI is a competitor, and like any business they jealously guard their data and ward off competitors.

New York developed a tremendous revenue stream selling statewide criminal records for employment purposes through the Office of Court Administration, earning well over $100 million annually. With revenues like this, New York will not allow court records to be scraped at the cost of lost revenue. New York is so determined to protect its data that it has ordered county courts to deny access to criminal record terminals, and court clerks are prohibited from conducting searches for the public. A bot will be unable to access any criminal record in New York through the court system. AI is highly touted by background investigation vendors, yet it can webscrape only a limited number of criminal records in New York, which raises the question: how thorough can a search be when records from an entire state cannot be obtained? Companies operating in New York or nearby states, beware.

Primary source criminal research vs. webscrape technology

We are used to movies and television shows in which someone pulls up a full criminal history on an individual from a laptop. This is not possible unless you are a member of law enforcement. There is no true national criminal record database maintained by a government agency that is available to a background vendor without fingerprinting. To conduct a proper criminal record search, each state must be searched individually.

Primary source criminal record research is both expensive and time consuming. In most cases, the background vendor must log into a State Police website and manually enter the candidate’s name and personal identifiers, one search at a time. The state fees add up as well. Some examples: Florida charges $25, New Jersey charges $18, and New York, the most expensive state in which to conduct a criminal record search, charges $95.

Many employers balk at this expense, with good reason. If a background vendor states that it can cover the entire country for between $8 and $10, why would you use a vendor that must pass along state fees of between $10 and $95 for a single state? Conversely, what are you getting for an $8.00 national criminal check? Cheap can be very expensive. AI searches a webscraped database in seconds, avoiding both the manual entry on a state website and the state-mandated search fees. However, AI has trouble correctly matching the criminal records it develops to the candidate, and its databases are incomplete because they cannot access all criminal records.
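The economics are easy to see with a little arithmetic, using the state fees cited above and assuming, hypothetically, a candidate with history in three states:

```python
# Rough cost comparison using the state fees quoted in this article.
state_fees = {"Florida": 25, "New Jersey": 18, "New York": 95}

primary_source_cost = sum(state_fees.values())  # $138 for three states
webscrape_cost = 8                              # flat "national" database fee

print(f"Primary source (3 states): ${primary_source_cost}")
print(f"Webscraped database:       ${webscrape_cost}")
# The 17x price gap is the entire economic case for webscraping --
# and the reason corners get cut on verification.
```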

Class action and individual lawsuits give us a window into how many individuals knew they lost a job opportunity and took action against the background vendor and the employer. What is not known is how many individuals lost job opportunities without ever knowing why. On the employer side of the equation, it is not known how many individuals with a serious criminal history were missed due to the limitations of AI webscraping. How many gig economy drivers and individuals delivering groceries and packages to your door have been involved in violent or sex-related crimes? Only time will tell.

Cheap can be very expensive. Employers beware: failure to follow FCRA guidelines can result in a lawsuit against your company as well, due to errors caused by the vendor. The core problem with AI background vendors is that, to save time and money, they automate processes that run without supervision rather than having them conducted by real people. The FCRA mandates that if an employer intends to deny an opportunity “in whole, or in part” due to information contained in a background investigation, the employer must send the candidate a pre-adverse action letter notifying them that information in the report may result in rescinding an offer. The letter must include the name, address and phone number of the background vendor and notice that the candidate has a right to dispute information contained in the report with the vendor. The employer must also include a copy of the background investigation with the letter, so that the candidate is aware of the negative information and can dispute incorrect information with the vendor. This is a simple matter of fairness, so that a candidate is not denied an opportunity to advance their career due to an error caused by the vendor.
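Taken together, the required elements of the letter amount to a checklist. A minimal sketch of validating that each element described above is present; the field names are hypothetical, the requirements are not:

```python
def pre_adverse_action_letter_is_complete(letter: dict) -> bool:
    """Check that a pre-adverse action package contains every element
    the FCRA requires, as described in the paragraph above."""
    required = [
        "vendor_name",            # name of the background vendor
        "vendor_address",         # vendor's address
        "vendor_phone",           # vendor's phone number
        "dispute_rights_notice",  # right to dispute the report with the vendor
        "copy_of_report",         # full copy of the background investigation
    ]
    return all(letter.get(field) for field in required)

letter = {
    "vendor_name": "Acme Screening LLC",  # hypothetical vendor
    "vendor_address": "1 Main St, Anytown",
    "vendor_phone": "555-0100",
    "dispute_rights_notice": True,
    "copy_of_report": True,
}
print(pre_adverse_action_letter_is_complete(letter))  # True
```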

Many of the AI-related lawsuits stem from the employer’s failure to follow FCRA guidelines and the vendor’s failure to perform due diligence when reporting criminal histories and to properly reinvestigate disputes to ensure the accuracy of the information they provide.