When It Raines, it Pours: HR Tech Solution Providers May Be Liable for Discrimination in California

NEWSLETTER VOLUME 1.19 | September 14, 2023

Editor's Note


Sorry about the title. I couldn't resist.

 

Raines is the California Supreme Court case holding that third-party agencies providing services related to hiring and employment can be directly liable for discrimination under state law. 

 

It's not a stretch to think that HR Tech providers that offer products related to recruiting and hiring may also be liable under Raines. While the EEOC has said that employers are definitely responsible for their hiring decisions even if they rely on AI and tech in the process, it did not preclude liability for the tech provider. 

 

Most HR Tech sales contracts contain indemnity and other clauses that protect the tech companies from the consequences of using their products. Whether those clauses will hold up will depend on the facts of the situation. In allocating responsibility, here are some of the questions the courts will ask: 

 

  • Did the tech provider know that its product may produce discriminatory results? 
  • What steps, if any, did the tech provider take to inform users of the potential for discrimination and how to mitigate it? 
  • How does the tech product work, and what suggestions, predictions, and "insights" does it provide? 
  • Does the tech decide who makes the cut, or is there meaningful human oversight? 
  • Is it reasonable for users of the product to rely on these insights in making employment decisions, or does the decision require other information? 
  • Which is less biased: humans or the tech product? 
  • Do principles of product liability apply to HR Tech, and what does it mean to use the product as designed? 
  • How do we determine causation and whether the tech product caused the discriminatory decision? Does a ranking or prediction that is discriminatory cause an applicant to be rejected? If the tech influenced a human hiring decision, is that enough for causation? 
  • What public policies are involved, and do those outweigh the parties' attempts to allocate liability through contracts?  

 

Another important aspect of using HR Tech is that the data, history, and processes are all recorded and generally discoverable in a lawsuit. HR Tech is potentially the black box of HR. 

 

As states continue to regulate the use of AI in employment decisions and allocate liability by statute, both employers and HR Tech companies will need to better educate the people making the decisions on what the tech can and cannot tell them, what else to consider, and how to question and pressure-test the tech's insights. 

 

The good news is that it has never been easier to audit the outcomes of employment decisions for discrimination. So, if you use HR Tech in employment decisions, be sure to monitor and handle any issues right away. 

 

- Heather Bussing

 

California Supreme Court’s Expansion of “Employer” under FEHA Could Have Implications for AI Regulation

by Michelle Barrett Falconer, Marko Mrkonich, Cristina Piechocki, Niloy Ray, and Alice Wang

at Littler

The California Supreme Court issued a ruling this week that expands the definition of employer under the state’s main discrimination statute, the Fair Employment and Housing Act (FEHA). This expansion not only increases the number of defendants that can be swept into a FEHA action, but it may also have a significant impact on California’s burgeoning efforts to regulate the use of artificial intelligence in employment decisions. 

Background 

As we previously noted, on March 16, 2022, the U.S. Court of Appeals for the Ninth Circuit certified to the Supreme Court of California the following question: 

Does California’s Fair Employment and Housing Act, which defines “employer” to include “any person acting as an agent of an employer,” permit a business entity acting as an agent of an employer to be held directly liable for employment discrimination?1 

 

In Raines v. U.S. Healthworks Medical Group, the California Supreme Court answered this question in the affirmative, first concluding that an employer’s business entity “agents” may be considered “employers” for purposes of the statute, and then holding that such an agent may be held directly liable for employment discrimination in violation of the Fair Employment & Housing Act when it has at least five employees2 and “when it carries out FEHA-regulated activities on behalf of an employer.” The court recognized that its ruling “increases the number of defendants that might share liability” when a plaintiff brings FEHA-related claims against their employer. 

 

In reaching its holding, the court analyzed the language of FEHA Section 12926(d), stating that the “most natural reading” supports the determination that an employer’s business-entity agent “is itself an employer for purposes of FEHA.” The court further addressed the statute’s legislative history, tracing the origins of the definition of “employer” to the Fair Employment Practices Act (FEPA) enacted in 1959, which adopted the National Labor Relations Act’s (NLRA) “agent-inclusive language.” The court also looked to federal case law, finding support for the idea that “an employer’s agent can, under certain circumstances, appropriately bear direct liability under the federal antidiscrimination laws.” Significantly, the court found that its prior rulings in Reno v. Baird3 and Jones v. Lodge at Torrey Pines Partnership,4 which did not extend personal liability for claims of discrimination or retaliation to supervisors, did not dictate the result here. 

 

The court also reviewed policy reasons that could impact the reading of the statutory language: 

  • Imposing liability on an employer’s business entity agents broadens FEHA liability to the entity that is “most directly responsible for the FEHA violation” and “in the best position to implement industry-wide policies that will avoid FEHA violations”; 
  • Imposing liability on an employer’s business entity agents “furthers the statutory mandate that the FEHA ‘be construed liberally’ in furtherance of its remedial purposes”; and 
  • The court’s reading of the statutory language “will not impose liability on individuals who might face ‘financial ruin for themselves and their families’ where held directly liable under the FEHA.” 

Equally important are the rulings the court did not make in Raines. The California Supreme Court noted that it was not deciding the significance, if any, of an employer’s control over an agent’s acts that gave rise to a FEHA violation, nor did the court decide whether its conclusion extends to business-entity agents that have fewer than five employees. Critically, it also did not address the scope of a business-entity agent’s potential liability pursuant to FEHA’s aiding-and-abetting provision. 

Impact on California’s Efforts to Regulate AI in Employment Decisions   

Raines will likely have a significant impact on businesses that provide services or otherwise assist employers in the use of automated-decision systems for recruiting, screening, hiring, compensation, and other personnel management decisions. Coupled with proposed revisions to the state’s FEHA regulations, this expansion of the statute’s reach takes California one step closer to establishing joint and several liability across the AI tool supply chain. 

Under the Fair Employment & Housing Council’s proposed regulations5 addressing the use of artificial intelligence, machine learning, and other data-driven statistical processes to automate decision-making in the employment context, it is unlawful for an employer to use selection criteria—including automated decision systems—that screen out, or tend to screen out, an applicant or employee (or a class of applicants or employees) on the basis of a protected characteristic, unless the criteria are demonstrably job-related and consistent with business necessity. The draft regulations explicitly define “agent” broadly to include third-party providers of AI-driven services related to recruiting, screening, hiring, compensation and other personnel processes, and redefine “employment agency” to similarly cover these third-party entities.6 One key proposal – under the aforementioned aiding-and-abetting provision – even extends liability to the “design, development, advertisement, sale, provision, and/or use of an automated-decision system.” The high court’s decision in Raines unquestionably supports the Council’s proposed revisions, and enhances joint and several liability for artificial intelligence tool supply chains regardless of the final incarnation of the Council’s regulations. 

Footnotes 

1 See Raines v. U.S. Healthworks Medical Group, 28 F.4th 968, 969 (9th Cir. 2022). 

2 The “five employees” threshold for an agent is consistent with the FEHA’s own definition of employer, which includes “any person regularly employing five or more persons….” Cal. Gov. Code § 12926(d). 

3 18 Cal. 4th 640 (1998). 

4 42 Cal. 4th 1158 (2008). 

5 Civil Rights Council Proposed Modifications to Employment Regulations Regarding Automated-Decision Systems, Attachment C, version 2/10/2023.  

6 “Employment Agency” is defined to include “[a]ny person providing, for compensation, services to identify, screen, and/or procure job applicants, employees or opportunities to work, including persons undertaking these services through the use of an automated decision system.” 

 
