Philadelphia Partner Discusses How Employers Can Limit Liability Amid Artificial Intelligence Explosion
News
3.30.22
David Walton was interviewed by Law.com for an article about the rise in popularity of artificial intelligence (AI) tools, which is opening up companies and their legal departments to a range of new legal risks.
In one scenario presented to David, a company using an AI tool to sort candidates against job criteria was prioritizing people who lived in rural areas, because it had seen studies finding that rural residents tend to stay in their jobs longer than urban residents.
“If you are … disfavoring people from urban areas, you’re also potentially disfavoring minorities, because as a general rule there are more minorities in urban areas than in rural areas,” said David, who estimates that 80% of HR departments are using some form of AI or predictive analytics. “That’s how a facially neutral policy … could have a disparate impact on minorities.”
He discussed a number of local laws that have cropped up in recent years with the explicit aim of regulating how AI can be used. He also said he wouldn't be surprised if future legislation and guidance from agencies such as the U.S. Equal Employment Opportunity Commission and the Federal Trade Commission, both of which have indicated they are weighing rules on artificial intelligence, place more focus on the AI tools themselves.
“I think you’re going to have a second wave of laws,” he predicted. “You’re going to have some states try to pass laws that put the onus on the AI company to make sure that there’s no bias” that results from their algorithms.
He said that in-house lawyers at companies looking to adopt these tools should be vigilant.
“Don’t blindly rely on the algorithms,” he said. “Look for ways to test the algorithm or the tool before you put it into use. … Use it as a way to help you, but don’t use it as the final arbiter, the final decision-maker.”
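One way a legal or HR team might act on that advice is to spot-check a screening tool's output for adverse impact before deployment. The sketch below, in Python, compares selection rates across groups using the EEOC's familiar four-fifths guideline, under which a group whose selection rate falls below 80% of the highest group's rate is commonly flagged for review. The group labels and results are hypothetical, and this is a minimal illustration of the idea rather than a substitute for a formal validation study or legal review.

```python
from collections import Counter

def selection_rates(candidates):
    """Compute the selection rate per group from (group, selected) records."""
    totals, selected = Counter(), Counter()
    for group, was_selected in candidates:
        totals[group] += 1
        if was_selected:
            selected[group] += 1
    return {g: selected[g] / totals[g] for g in totals}

def adverse_impact_ratios(rates):
    """Compare each group's selection rate to the highest-rate group.

    Under the EEOC's four-fifths guideline, a ratio below 0.8 is a
    common flag for potential adverse impact.
    """
    benchmark = max(rates.values())
    return {g: rate / benchmark for g, rate in rates.items()}

# Hypothetical screening results: (group label, passed the AI screen?)
results = [
    ("urban", True), ("urban", False), ("urban", False), ("urban", False),
    ("rural", True), ("rural", True), ("rural", True), ("rural", False),
]

rates = selection_rates(results)
for group, ratio in adverse_impact_ratios(rates).items():
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rates[group]:.0%}, impact ratio {ratio:.2f} ({flag})")
```

In this made-up example, the urban group's selection rate is one-third of the rural group's, so the tool would be flagged for further review before being relied on, which is the kind of pre-deployment testing the quote describes.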
Read the full article at Law.com (subscription required).