
US Penalizes Chinese Firm Misusing AI in Recruitment


In a stark reminder of the legal risks of misusing AI in business operations, a significant settlement has been reached in the US, the first to resolve a case involving AI-driven recruitment tools. The Equal Employment Opportunity Commission (EEOC) settled its dispute with a Chinese online education platform, underscoring the growing importance of ethical AI practices in the hiring domain.

The focal point of the settlement is iTutorGroup, which came under scrutiny in 2020 for allegedly using AI tools to discriminate during recruitment. The platform, which hires online educators across a range of subjects, was accused of automatically screening out older job applicants through its AI-powered hiring process.


The EEOC, in its lawsuit filed in 2022, stated, “Three interconnected enterprises offering English-language tutoring services under the ‘iTutorGroup’ brand in China violated federal law by programming their online recruitment software to automatically dismiss older candidates based on their age.”

Having launched an initiative in 2021 to ensure that artificial intelligence software used by US employers complies with anti-discrimination legislation, the EEOC underscored its commitment to scrutinizing and addressing AI misuse. According to a report by the Economic Times, the EEOC made clear that it would focus its enforcement efforts on companies found to be misusing AI capabilities.

This effort culminated in a settlement agreement, with iTutorGroup agreeing to pay $365,000 to more than 200 older job applicants whose applications were allegedly rejected because of their age. The settlement, documented in a joint filing with the New York federal court and reported by Reuters, includes remedies such as back pay and liquidated damages.

Central to the allegations against iTutorGroup was its AI software’s systematic exclusion of female candidates aged 55 or older and male candidates aged 60 or older, in violation of the Age Discrimination in Employment Act (ADEA). The case underscores the importance of applying AI fairly and lawfully in HR processes.

Interestingly, a parallel lawsuit accuses another company, Workday, of having developed AI-powered software that helps employers screen out applicants based on characteristics such as race, age, and disability. The suit was brought by Derek Mobley, who claims that Workday’s software facilitated biased candidate screening. Mobley, a Black man over 40 who deals with anxiety and depression, alleged that Workday’s software worked against him as he applied for positions at organizations using Workday’s recruitment screening tool.

The case highlights the imperative for automated AI systems that assist HR departments to be equitable and secure. Noteworthy players like Accenture and Lloyds Banking Group have already incorporated innovative techniques such as virtual reality games into their hiring processes. With the rise of AI in recruitment, a report by Aptitude Research found that 55% of companies are increasing their investments in recruitment automation, underscoring the need for a thoughtful, ethical, and legal approach to AI in the employment sphere.