An unsuccessful job applicant is suing Sirius XM Radio in federal court, claiming the company’s AI-powered hiring tool discriminated against him based on his race. In Harper v. Sirius XM Radio, LLC, filed on August 4 in the Eastern District of Michigan, the plaintiff alleges that the company’s AI system relied on historical hiring data that perpetuated past biases – resulting in his application being downgraded despite his qualifications. The lawsuit accuses Sirius XM of violating federal anti-discrimination statutes through its use of AI tools in hiring, an allegation we’re seeing more and more of lately. Here’s what you need to know about this case and how it fits into the larger puzzle of workplace AI litigation – plus 10 best practices you should follow as a result.
Case Summary: Harper v. Sirius XM Radio
In Harper, a job applicant proceeding pro se (representing himself without an attorney) recently filed suit in a Michigan federal court.
Harper asserts two legal theories: disparate treatment, alleging intentional discrimination in the design or use of the AI tool; and disparate impact, claiming the tool’s outcomes had an unlawful discriminatory effect even if the bias was unintentional.