As recruiters adopt advanced technologies in their quest to identify, court and hire candidates, attorneys are looking into the legal and regulatory issues those new tools may bring into play.
Lawyers, recruiting experts and technology vendors say legal teams are examining compliance concerns even as their colleagues in HR and IT evaluate products that leverage artificial intelligence, machine learning and other innovative approaches. Not only are they exploring the ramifications of privacy requirements such as Europe's GDPR, they're considering the possible impact of biases that may be inherent in a data set or unwittingly applied by algorithms.
"I think we're at the beginning of sorting out what all this means, but I think it's definitely something people are thinking about," said Jeffrey Bosley, San Francisco-based partner in the labor and employment practice of law firm Davis Wright Tremaine. "It's a new technology and it's evolving. Whenever you have a new technology, you do have growing pains and you do have these issues that come up," he said.
Advanced technologies have gotten much attention recently, particularly as people inside and outside the business world consider the impact AI may have on jobs and livelihoods. At the same time, some well-intentioned efforts have generated media coverage for results that were diametrically opposed to what their developers set out to do.
In 2018, for example, Amazon abandoned an effort to build a machine-learning tool for recruiters after the system proved to be favoring men over women. According to Reuters, the tool downgraded resumes that included the word "women's," as well as the resumes of graduates of two all-women's colleges.
Sources inside Amazon said the system, which had been under development since 2014, was meant to review resumes so recruiters could spend more time building candidate relationships and actually hiring people. It worked by comparing applicants against patterns found among resumes the company had received over a 10-year period. However, it didn't account for the dominance of men in the technology workforce. As a result, the system taught itself that male candidates were stronger than females.
Advanced technology "is at an awkward stage where it's not really intelligent," said William Tincup, president of the industry website RecruitingDaily.com. While he sees great potential for AI and other tools to streamline the work of recruiters and even address bias in the hiring process, he believes systems are limited in how much they can accomplish.
Why? In a word, people. "What are machines learning when they're learning from humans?" Tincup asked. Hiring managers can't help but operate with a number of possible preconceptions in their minds, from unconscious bias about race or gender to a preference for the candidate they most recently interviewed or who seems the most like themselves. Such biases, Tincup observed, live on in the makeup of a company's existing workforce. And that leads to the troubles Amazon faced, where the data set reflects decisions made in the past more than it positions a process to understand needs of the future.
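The dynamic Tincup describes can be made concrete with a toy model. The sketch below is purely illustrative, not Amazon's actual system; all resumes and tokens are synthetic. A naive scorer trained on historical hiring outcomes ends up penalizing any term that happened to correlate with past rejections, including a gendered token like "womens."

```python
# Hypothetical sketch of proxy bias: a scorer trained on skewed historical
# outcomes learns to penalize tokens correlated with the underrepresented group.
from collections import defaultdict

# Synthetic "historical" data: (resume tokens, was_hired). Because past hires
# skewed male, the token "womens" appears only among rejected candidates.
history = [
    (["python", "engineer"], True),
    (["java", "engineer"], True),
    (["python", "womens", "chess", "club"], False),
    (["java", "womens", "college"], False),
    (["python", "chess", "club"], True),
]

def train_token_weights(history):
    """Weight = hires containing the token minus rejections containing it."""
    weights = defaultdict(int)
    for tokens, hired in history:
        for t in set(tokens):
            weights[t] += 1 if hired else -1
    return weights

def score(tokens, weights):
    """Sum the learned weights of a resume's distinct tokens."""
    return sum(weights[t] for t in set(tokens))

weights = train_token_weights(history)
# Two otherwise-identical resumes diverge only on the gendered token:
print(score(["python", "engineer"], weights))            # prints 3
print(score(["python", "engineer", "womens"], weights))  # prints 1
```

Nothing in the training step mentions gender; the penalty emerges entirely from the historical pattern, which is exactly why auditing the data set matters as much as auditing the algorithm.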
Technology Races Ahead
The situation is complicated by the fact that technology has outpaced legal and business practices. While they believe that will eventually change, analysts and technology vendors don't see it changing quickly.
"Right now, technology's moving super-fast," said Ankit Somani, co-founder of the talent acquisition and management platform AllyO, headquartered in Palo Alto, California. "Generally, regulators and the folks who control compliance standards don't move so quickly. But, honestly, we're like three lawsuits away from somebody taking it very seriously."
"Therein lies a real big rub," Tincup said of regulation's lag behind talent acquisition and HR practices. Nearly all of the processes involved with turning candidates into employees touch some kind of employment law or EEOC-related issues, but "all of those rules are outdated," he said. "We've been working outside of the rules for 15 or 20 years. I would argue that there isn't a company in the United States that's 100 percent compliant from sourcing to outplacement."
Talent acquisition teams, and HR in general, understand that and are beginning to adapt, said Brian Delle Donne, president of Talent Tech Labs, an industry analyst and consulting firm based in New York. However, he believes determining exactly how and where compliance fits in with the use of new technologies has been complicated by the way "artificial intelligence" has been "grossly generalized" in industry conversations.
"Most of the time they're talking about machine learning, or sometimes just automated workflow processing," Delle Donne said. "When you get into true artificial intelligence, where the machine is making decisions, it's a higher threshold that's required for our concern about the accuracy of [its] recommendations and predictions." The distinction between true AI and what might be called "advanced technology" is important, he believes, because people assume that the machine is prescient when it's usually not. "In most cases, it will be quite a while until machines are actually making decisions on their own," Delle Donne observed.
Even in its current state, advanced technology has become widespread enough to raise concerns about whether it might, inadvertently, nudge an employer out of compliance. For example, AI-driven tools may use personal information in unplanned ways that a candidate hasn't given permission for. That would raise privacy concerns. Or, tools might present results that, intentionally or not, run afoul of fair-employment legislation. "On both fronts, you're talking about compliance statutory norms," said Delle Donne.
AI's Behavior
Such concerns, along with widespread speculation about AI's impact, have made advanced technology "front of mind for many people," said Bosley. In response, governments at all levels have begun generating "a patchwork" of laws that sometimes conflict with one another.
For example, Illinois's Artificial Intelligence Video Interview Act went into effect Jan. 1, 2020. The law sets out transparency and consent requirements for video interviews, as well as limits on who can view the interviews and how long they can be stored. However, Bosley said, the law's mandate to destroy videos within 30 days may conflict with the preservation requirements of other state and federal laws, including the Civil Rights Act of 1964 and the Americans with Disabilities Act.
"It puts employers in a position where they're really going to need to assess risk," Bosley said. "They're going to need to come up with creative solutions to try and work around some of this risk."
Not all employers may feel exposed in the near term, Tincup suggested. He estimates that each year only a handful of legal actions are taken because of a candidate's unhappiness with the recruiting process. People practices, technology practices and civil and social discourse are "way ahead of employment law," he explained. "So is this something that's going to create an immense amount of risk? No." Employers today, he believes, put themselves at more risk by hiring a salesperson with a history of sexual harassment. In that regard, "you could spend more money in risk mitigation … than in recruitment technology," he said.
At the same time, an organization's risk may be based on activities that aren't related to recruiting or the workforce, Bosley points out. "This isn't just a human resources issue anymore. It's not only an employment law issue anymore. It's much broader than that," he said. "You have data protection, data compliance, privacy and the potential for disparate impact claims as opposed to disparate treatment claims."
Bosley anticipates more claims will be filed that look into a database's contents, what data's being looked at, how it's being processed and whether algorithms are static or refined over time. Essentially, these claims will examine how advanced technology is making its decisions. "It's going to be something where human resources leaders are looking to involve others in the organization and make sure that they're both issue-spotting and getting ahead of some of these compliance issues," he said.
Indeed, Somani believes this notion of "explainability," laying out what a system does and how it's doing it, will become more important in the realms of recruiting technology and compliance. "There should, in my mind, be more compliance standards around that," he said.
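In practice, one common form of explainability is reporting which inputs drove a score alongside the score itself. The sketch below is a minimal, hypothetical illustration of that idea for a simple linear screening score; the feature names and weights are invented for the example, not drawn from any vendor's product.

```python
# Illustrative "explainability" sketch: a linear screening score whose
# per-feature contributions can be reported alongside the decision.
# All feature names and weights here are hypothetical.
def explain_score(features, weights):
    """Return the total score plus each feature's contribution, largest first."""
    contributions = {name: weights.get(name, 0.0) * value
                     for name, value in features.items()}
    total = sum(contributions.values())
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return total, ranked

weights = {"years_experience": 0.5, "certifications": 1.0, "referral": 2.0}
candidate = {"years_experience": 6, "certifications": 2, "referral": 1}

total, ranked = explain_score(candidate, weights)
print(total)   # prints 7.0
print(ranked)  # experience contributes most for this candidate
```

A report like `ranked` is what lets a compliance reviewer ask the follow-up questions Bosley anticipates: which inputs mattered, and whether any of them is a proxy for a protected characteristic.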
Evolving Standards
Even at a basic level, compliance standards for using technology in recruiting "don't exist," Somani said. For example, does texting about a job opportunity constitute a form of marketing? Is such a text permissible if it's personalized? Because the answer's not clear, he believes many companies are putting stricter guidelines in place.
Somani also said legal departments are becoming more involved in the purchase and implementation of recruiting technology. For tools handling communications, such as those that facilitate SMS messaging between recruiters and candidates, they're trying to anticipate issues by creating policies that cover not only privacy, but data collection and permissions. "It's an explicit ask in almost every deal we go into: 'If a consumer doesn't want to interact with your system, how do you follow that?'" he said. When it comes to issues related to AI's under-the-hood work, vendors focus on transparency and disclosure by presenting disclaimers on their product or within their privacy policies.
For enterprises, compliance issues "can be a deal-breaker," said Megan Gimbar, the Holmdel, New Jersey-based product marketing manager for iCIMS Hiring Suite, at least at the corporate level. While compliance and consistency are important components of her product, she said, talent acquisition teams often shy away from the topic.
In the past, employers tried to ensure compliance through training. Their approach, said Delle Donne, was to make hiring managers aware of interview questions that shouldn't be asked (such as inquiring whether a woman intended to have children) or information that shouldn't be considered (the candidate's age or ZIP code). "That's a fairly low bar," he observed.
The bar began getting higher "once we started saying algorithms are going to make that determination for us," Delle Donne continued. "Algorithms might actually do a better job, [or] may actually be set up in a way that they might do a better job, than humans do at avoiding compliance issues through bias." However, he said, that requires planning and a focus on non-discrimination features when algorithms are designed.
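One concrete non-discrimination feature a designer can build in is an adverse-impact check based on the EEOC's "four-fifths rule," which flags possible adverse impact when one group's selection rate falls below 80 percent of the highest group's rate. The sketch below shows the arithmetic; the group names and counts are made up for illustration.

```python
# Minimal four-fifths (80%) rule check, as described in the EEOC's
# Uniform Guidelines on Employee Selection Procedures. Data is synthetic.
def selection_rates(outcomes):
    """outcomes: {group: (selected, applicants)} -> {group: selection rate}."""
    return {g: sel / total for g, (sel, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Return groups whose rate falls below threshold * the highest rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {g: r / top for g, r in rates.items() if r / top < threshold}

# group_a: 50 of 100 selected (50%); group_b: 30 of 100 selected (30%).
# group_b's rate is 60% of group_a's, below the 80% threshold, so it's flagged.
flagged = four_fifths_check({"group_a": (50, 100), "group_b": (30, 100)})
print(flagged)
```

Running a check like this on a tool's output each scoring cycle is one way to surface the disparate impact issues Bosley describes before they become claims.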
Compliance Further Afield
The compliance issues raised by using AI in recruiting aren't limited to talent acquisition alone. For one thing, Somani notes, recruiters today leverage a variety of tools that were introduced into other functions.
Think of how candidate management systems and customer management systems align. When using those technologies, compliance may involve adapting the standards used by marketing or sales so they can be applied to talent acquisition and HR.
That road goes both ways. Even solutions designed for recruiters raise issues that aren't unique to hiring, Delle Donne said. "As HR tries to digitize, there are many, many places where technology can streamline processes and save time and perhaps be more beneficial to the employee or the party," he said. Many, if not all, of those will lead to some kind of compliance question. For example, a bot used in benefits administration may build a profile of confidential medical information. Or, a learning program might enter performance scores into an employee record without informing the employee. That could be a problem if those scores impact a person's future promotions or career path.
As it digitizes, the tools implemented by HR "will bring in these technologies and there's going to have to be some focus or some attention given to not inadvertently creating bias or discrimination, or revealing private information," Delle Donne said. "If you take a step back, it just could be like whack-a-mole. I mean, 'Hey, we see it over here in talent acquisition. Let's go chase that down and… Oh, wait. We just saw this going on over there.'"