
Workforce

Tag: business ethics

Posted on April 14, 2020 (updated June 29, 2023)

Regulating recruiting amid constant technological innovations


As recruiters adopt advanced technologies in their quest to identify, court and hire candidates, attorneys are looking into the legal and regulatory issues those new tools may bring into play.

Lawyers, recruiting experts and technology vendors say legal teams are examining compliance concerns even as their colleagues in HR and IT evaluate products that leverage artificial intelligence, machine learning and other innovative approaches. Not only are they exploring the ramifications of privacy requirements such as Europe’s GDPR, they’re considering the possible impact of biases that may be inherent in a data set or unwittingly applied by algorithms.

“I think we’re at the beginning of sorting out what all this means, but I think it’s definitely something people are thinking about,” said Jeffrey Bosley, San Francisco-based partner in the labor and employment practice of law firm Davis Wright Tremaine. “It’s a new technology and it’s evolving. Whenever you have a new technology, you do have growing pains and you do have these issues that come up,” he said.

Advanced technologies have gotten much attention recently, particularly as people inside and outside the business world consider the impact AI may have on jobs and livelihoods. At the same time, some well-intentioned efforts have generated media coverage for results that were diametrically opposed to what their developers set out to do.

In 2018, for example, Amazon abandoned an effort to build a machine-learning tool for recruiters after the system proved to be favoring men over women. According to Reuters, the tool downgraded resumes that included the word “women’s” as well as the graduates of two all-women’s colleges.

Also read: Is there room for an ethics code for tech companies?

Sources inside Amazon said the system, which had been under development since 2014, was meant to review resumes so recruiters could spend more time building candidate relationships and actually hiring people. It worked by comparing applicants against patterns found among resumes the company had received over a 10-year period. However, it didn’t account for the dominance of men in the technology workforce. As a result, the system machine-taught itself that male candidates were stronger than females.
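To see how that failure mode arises, consider a minimal, hypothetical sketch — not Amazon’s system — of a resume screener trained on skewed historical hiring decisions. Because a gender-correlated term appears only in resumes that were rejected in the past, the model learns to penalize it:

```python
# Illustrative sketch only -- not Amazon's system. It shows how a model trained on
# skewed historical hiring decisions can learn to penalize a gender-correlated term.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical past resumes and the (biased) historical decisions made on them.
resumes = [
    "software engineer java distributed systems",        # advanced
    "backend developer c++ linux kernel",                 # advanced
    "data engineer spark hadoop pipelines",               # advanced
    "captain of women's chess club, software engineer",   # rejected
    "women's coding society lead, backend developer",     # rejected
]
advanced = [1, 1, 1, 0, 0]  # labels reflect past decisions, not candidate ability

vectorizer = CountVectorizer()
X = vectorizer.fit_transform(resumes)
model = LogisticRegression().fit(X, advanced)

# The token "women" appears only in the rejected resumes, so it picks up a negative
# weight: the bias in the training data becomes bias in the model.
weights = dict(zip(vectorizer.get_feature_names_out(), model.coef_[0]))
print(f"weight for 'women': {weights['women']:.3f}")
```

The model isn’t measuring ability at all; it is reproducing the pattern in its training labels, which is exactly the trap the Amazon episode illustrates.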

Advanced technology “is at an awkward stage where it’s not really intelligent,” said William Tincup, president of the industry website RecruitingDaily.com. While he sees great potential for AI and other tools to streamline the work of recruiters and even address bias in the hiring process, he believes systems are limited in how much they can accomplish.

Why? In a word, people. “What are machines learning from? They’re learning from humans,” Tincup said. Hiring managers can’t help but operate with a number of preconceptions in mind, from unconscious bias about race or gender to a preference for the candidate they most recently interviewed or who seems most like themselves. Such biases, Tincup observed, live on in the makeup of a company’s existing workforce. That leads to the trouble Amazon faced, where the data set reflects past decisions rather than positioning the process to understand future needs.

Technology Races Ahead

The situation is complicated by the fact that technology has outpaced legal and business practices. Analysts and technology vendors believe that will eventually change, but they don’t see it changing quickly.

“Right now, technology’s moving super-fast,” said Ankit Somani, co-founder of the talent acquisition and management platform AllyO, headquartered in Palo Alto, California. “Generally, regulators and the folks who control compliance standards don’t move so quickly. But, honestly, we’re like three lawsuits away from somebody taking it very seriously.”

Also read: Artificial intelligence is a double-edged sword. Here’s how HR leaders can properly wield it

 “Therein lies a real big rub,” Tincup said of regulation’s lag behind talent acquisition and HR practices. Nearly all of the processes involved with turning candidates into employees touch some kind of employment law or EEOC-related issues, but “all of those rules are outdated,” he said. “We’ve been working outside of the rules for 15 or 20 years. I would argue that there isn’t a company in the United States that’s 100 percent compliant from sourcing to outplacement.”

Talent acquisition teams, and HR in general, understand that and are beginning to adapt, said Brian Delle Donne, president of Talent Tech Labs, an industry analyst and consulting firm based in New York. However, he believes determining exactly how and where compliance fits in with the use of new technologies has been complicated by the way “artificial intelligence” has been “grossly generalized” in industry conversations.

“Most of the time they’re talking about machine learning, or sometimes just automated workflow processing,” Delle Donne said. “When you get into true artificial intelligence, where the machine is making decisions, it’s a higher threshold that’s required for our concern about the accuracy of [its] recommendations and predictions.” The distinction between true AI and what might be called “advanced technology” is important, he believes, because people assume that the machine is prescient when it’s usually not. “In most cases, it will be quite a while until machines are actually making decisions on their own,” Delle Donne observed.

Even in today’s state, the use of advanced technology has become widespread enough to raise concerns about whether it might, inadvertently, nudge an employer out of compliance. For example, AI-driven tools may use personal information in unplanned ways that a candidate hasn’t given permission for. That would raise privacy concerns. Or, tools might present results that, intentionally or not, run afoul of fair-employment legislation. “On both fronts, you’re talking about compliance statutory norms,” said Delle Donne.

AI’s Behavior

Such concerns, along with widespread speculation about AI’s impact, have made advanced technology “front of mind for many people,” said Bosley. In response, governments at all levels have begun generating “a patchwork” of laws that sometimes conflict with one another.

For example, Illinois’s Artificial Intelligence Video Interview Act went into effect Jan. 1, 2020. The law sets out transparency and consent requirements for video interviews, as well as limits on who can view the interviews and how long they can be stored. However, Bosley said, the law’s mandate to destroy videos within 30 days may conflict with the preservation requirements of other state and federal laws, including the Civil Rights Act of 1964 and the Americans with Disabilities Act.

Also read: How Will Staney continues to change the talent acquisition game

“It puts employers in a position where they’re really going to need to assess risk,” Bosley said. “They’re going to need to come up with creative solutions to try and work around some of this risk.” 

Not all employers may feel exposed in the near term, Tincup suggested. He estimates that each year only a handful of legal actions are taken because of a candidate’s unhappiness with the recruiting process. People practices, technology practices and civil and social discourse are “way ahead of employment law,” he explained. “So is this something that’s going to create an immense amount of risk? No.” Employers today, he believes, put themselves at more risk by hiring a salesperson with a history of sexual harassment. In that regard, “you could spend more money in risk mitigation … than in recruitment technology,” he said.

At the same time, an organization’s risk may be based on activities that aren’t related to recruiting or the workforce, Bosley pointed out. “This isn’t just a human resources issue anymore. It’s not only an employment law issue anymore. It’s much broader than that,” he said. “You have data protection, data compliance, privacy and the potential for disparate impact claims as opposed to disparate treatment claims.”

Bosley anticipates more claims will be filed that look into a database’s contents, what data’s being looked at, how it’s being processed and whether algorithms are static or refined over time. Essentially, these claims will examine how advanced technology is making its decisions. “It’s going to be something where human resources leaders are looking to involve others in the organization and make sure that they’re both issue-spotting and getting ahead of some of these compliance issues,” he said.

 Indeed, Somani believes this notion of “explainability” — laying out what a system does and how it’s doing it — will become more important in the realms of recruiting technology and compliance. “There should, in my mind, be more compliance standards around that,” he said.
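What that kind of “explainability” looks like in practice is still open to interpretation, but a minimal sketch — all feature names and weights below are hypothetical — is a scoring function that records which inputs drove each recommendation, so the decision can be audited later:

```python
# A minimal sketch of "explainability" in a scoring tool: every score carries a record
# of which inputs drove it. The feature names and weights are hypothetical.

def score_candidate(features: dict, weights: dict) -> dict:
    contributions = {name: weights.get(name, 0.0) * value for name, value in features.items()}
    return {
        "score": round(sum(contributions.values()), 3),
        "contributions": contributions,  # the auditable "how it got there"
    }

weights = {"years_experience": 0.4, "skills_match": 1.2, "assessment_score": 0.8}
candidate = {"years_experience": 5, "skills_match": 0.7, "assessment_score": 0.9}

result = score_candidate(candidate, weights)
print("score:", result["score"])
for name, contribution in sorted(result["contributions"].items(), key=lambda kv: -abs(kv[1])):
    print(f"  {name}: {contribution:+.2f}")
```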

Evolving Standards

Even at a basic level, compliance standards for using technology in recruiting “don’t exist,” Somani said. For example, does texting about a job opportunity constitute a form of marketing? Is such a text permissible if it’s personalized? Because the answer’s not clear, he believes many companies are putting stricter guidelines in place.

Somani also said legal departments are becoming more involved in the purchase and implementation of recruiting technology. For tools handling communications, such as those that facilitate SMS messaging between recruiters and candidates, they’re trying to anticipate issues by creating policies that cover not only privacy, but data collection and permissions. “It’s an explicit ask in almost every deal we go into: ‘If a consumer doesn’t want to interact with your system, how do you follow that?’ ” he said. When it comes to issues related to AI’s under-the-hood work, vendors focus on transparency and disclosure by presenting disclaimers on their product or within their privacy policies.  
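Somani didn’t describe a specific implementation, but the kind of opt-out handling he alludes to can be sketched in a few lines — all names here are hypothetical — by checking a consent registry before any message goes out and logging every decision for later review:

```python
# A hypothetical sketch of honoring candidate opt-outs before any recruiting text is
# sent, with an audit trail of every decision. Names and structure are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class ConsentRegistry:
    opted_out: set = field(default_factory=set)
    audit_log: list = field(default_factory=list)

    def opt_out(self, phone: str) -> None:
        """Record that a candidate replied STOP or otherwise withdrew consent."""
        self.opted_out.add(phone)

    def try_send(self, phone: str, message: str) -> bool:
        allowed = phone not in self.opted_out
        # Log sent and suppressed messages alike so compliance reviews have a trail.
        self.audit_log.append({
            "phone": phone,
            "allowed": allowed,
            "at": datetime.now(timezone.utc).isoformat(),
        })
        if allowed:
            print(f"sending to {phone}: {message}")  # a real system would call an SMS API here
        return allowed


registry = ConsentRegistry()
registry.opt_out("+15551230000")
registry.try_send("+15551230000", "A new role just opened up")   # suppressed
registry.try_send("+15551239999", "A new role just opened up")   # sent
```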

For enterprises, compliance issues “can be a deal-breaker,” at least at the corporate level, said Megan Gimbar, the Holmdel, New Jersey-based product marketing manager for iCIMS Hiring Suite. While compliance and consistency are important components of her product, she said, talent acquisition teams often shy away from the topic.

In the past, employers tried to ensure compliance through training. Their approach, said Delle Donne, was to make hiring managers aware of interview questions that shouldn’t be asked (such as inquiring whether a woman intended to have children) or information that shouldn’t be considered (the candidate’s age or ZIP code). “That’s a fairly low bar,” he observed.

The bar began getting higher “once we started saying algorithms are going to make that determination for us,” Delle Donne continued. “Algorithms might actually do a better job, [or] may actually be set up in a way that they might do a better job, than humans do at avoiding compliance issues through bias.” However, he said, that requires planning and a focus on non-discrimination features when algorithms are designed.

Also read: The ethics of AI in the workplace

Compliance Further Afield

The compliance issues raised by using AI in recruiting aren’t limited to talent acquisition. For one thing, Somani noted, recruiters today leverage a variety of tools that were introduced into other functions.

Think of how candidate management systems and customer management systems align. When using those technologies, compliance may involve adapting the standards used by marketing or sales so they can be applied to talent acquisition and HR.

That road goes both ways. Even solutions designed for recruiters raise issues that aren’t unique to hiring, Delle Donne said. “As HR tries to digitize, there are many, many places where technology can streamline processes and save time and perhaps be more beneficial to the employee or the party,” he said. Many, if not all, of those will lead to some kind of compliance question. For example, a bot used in benefits administration may build a profile of confidential medical information. Or, a learning program might enter performance scores into an employee record without informing the employee. That could be a problem if those scores impact a person’s future promotions or career path.

As it digitizes, the tools implemented by HR “will bring in these technologies and there’s going to have to be some focus or some attention given to not inadvertently creating bias or discrimination, or revealing private information,” Delle Donne said. “If you take a step back, it just could be like whack-a-mole. I mean, ‘Hey, we see it over here in talent acquisition. Let’s go chase that down and… Oh, wait. We just saw this going on over there.’”

Scheduling employees is one major HR task for which technology can help. Make more accurate, data-driven scheduling decisions in just a few clicks with Workforce.com’s comprehensive scheduling software.

Posted on March 9, 2020 (updated October 18, 2024)

The ethical use of AI on low-wage workers


The impact of technology has not been equal among different segments of employees. 

The introduction of automation and artificial intelligence-enabled labor management systems raises significant questions about workers’ rights and safety, according to the “AI Now 2019 Report,” which explores the social implications of AI technologies. AI Now is a nonprofit that works with stakeholders such as academic researchers, policymakers and impacted communities to understand and address issues raised by the introduction of AI.

While the use of these systems puts more power and control in the hands of the company, it also harms mainly low-wage workers, who are disproportionately people of color, according to the report. These systems don’t work for employees when they set unrealistic productivity goals that can lead to injury or psychological stress and when they impose “unpredictable algorithmic wage cuts” on gig workers that undermine their financial stability, for example. 

Also read: Should there be a code of ethics in technology? 

Hourly workers such as warehouse workers may be adversely impacted by AI-enabled workforce management systems.

Lower-wage workers stand to lose the most with the rise of automation while white-collar workers are generally unaffected, the report noted. It cited a McKinsey & Co. study that concluded “labor automation will further exacerbate the racial wealth gap in the U.S. absent any interventions.” 

Unions have been the traditional way for workers to contest harmful practices, but many employees don’t have access to union membership and many fear retaliation if they bring up their concerns. Meanwhile, the report noted, tech companies like Amazon are using a range of tactics to prevent unions from forming in their workforces. For example, whistleblowers have disclosed that during a period of employee unrest, Google hired a consulting firm “known for its anti-union work.”

It’s critical to get the perspective of hourly workers on how technology is playing into their lives, said Annelies M. Goger, a David M. Rubenstein Fellow at the Brookings Institution. Her research focuses on workforce development policy, the future of work and inclusive economic development. She was not talking about unions specifically in her interview with Workforce, but she did stress the importance of respecting and addressing employees’ concerns.

There are certain aspects of how technology is used in their jobs that hourly workers may appreciate, but they also have concerns or frustrations about issues like the influx of automated checkout lines and lack of consistency in scheduling, she said. 

“There’s a range of people who really want to embrace technology, but they want to make sure that workers have a voice at the table and that they have a way to provide feedback,” she said. 

These employees may also have concerns when management changes at their company, Goger said. 

As restructuring takes place, new management might not take into account the needs of hourly workers, and these employees end up having less input on the quality of their jobs.

Also read: Ensuring equity in the digital age

“Food, retail and grocery workers have witnessed rapid change in recent years, especially in the front end of their stores. Most feel they lack voices in these changes and feel pessimistic about the future for humans in their stores,” according to “Worker Voices: Technology and the Future for Workers,” a November 2019 paper by Molly Kinder and Amanda Lenhart. Kinder is a David M. Rubenstein Fellow at the Brookings Institution’s Metropolitan Policy Program and a nonresident Senior Fellow at New America. Lenhart is the deputy director of the Better Life Lab at New America.

“Worker Voices” also noted that low-wage workers’ low pay and economic insecurity are a barrier to preparing for jobs that aren’t as impacted by new technology. An excerpt:

“While technological change is not the direct cause of workers’ precarity, it can add insult to injury. Automation and the adoption of new workplace technologies can exacerbate financial insecurity when jobs change, wages or hours are suppressed, or when workers are displaced altogether. Economic insecurity also limits workers’ resilience to technology changes by undermining their ability to weather a job transition, pay for training or schooling, and move into better paying—and less automatable—work. If workers cannot afford to make ends meet today, they will be ill-equipped to prepare for tomorrow. Raising income, reducing inequality and improving the economic security of workers is key to enabling a better future of work for those at greatest risk of change.”

Skill development is on some people’s minds. Chris Havrilla, leader of the HR technology practice for Bersin, Deloitte Consulting LLP, said that one application of AI could be to go through data and find potential new roles for people, in terms of talent mobility. From there, organizations can think about what employees need to accomplish and possibly help them develop the skills they need to get there. 

“I’m seeing some interesting things around, ‘We don’t want to lose people who already know how to work within our organization. How do we help them find other roles that might be applicable to them?’” she said. 
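Havrilla didn’t point to a particular product, but the talent-mobility idea can be illustrated with a simple, hypothetical sketch: compare an employee’s existing skills against the skills each open role requires, then surface the closest matches along with the gaps to close.

```python
# A hypothetical sketch of AI-assisted talent mobility: rank open roles by how much of
# each role's required skill set an employee already covers, and show the gaps.

def role_matches(employee_skills: set, roles: dict, top_n: int = 3) -> list:
    scored = []
    for role, required in roles.items():
        coverage = len(employee_skills & required) / len(required) if required else 0.0
        gaps = sorted(required - employee_skills)  # skills the employee would need to build
        scored.append((coverage, role, gaps))
    scored.sort(key=lambda item: item[0], reverse=True)
    return scored[:top_n]

open_roles = {
    "Data Analyst": {"sql", "excel", "visualization"},
    "HR Business Partner": {"coaching", "employment law", "communication"},
    "Recruiting Operations": {"sql", "ats administration", "communication"},
}
employee = {"sql", "communication", "excel"}

for coverage, role, gaps in role_matches(employee, open_roles):
    print(f"{role}: {coverage:.0%} skill coverage, gaps: {gaps or 'none'}")
```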

Posted on February 13, 2020 (updated June 29, 2023)

Why ethics is the crux of employee engagement


Throughout 2019, numerous factors forced companies to recognize the importance of ethical leadership. 

From Google’s employee protests and walkouts to the onset of GDPR and data privacy troubles of companies like Facebook, ethics has not only dominated the headlines but also become a catalyst of both employee satisfaction and business success.

In this climate, employee alignment is simultaneously more important and more difficult than ever to achieve. Employers are demanding more from their employees but at the same time face a range of evolving preferences and digital distractions that make it hard to capture workers’ attention and trust. Business and HR leaders must adopt an ethos of ethical leadership while thoughtfully implementing engagement strategies, or they risk losing top employees and the ability to recruit the best as the war for talent rages on.

Ethical practices, or a lack thereof, will either give organizations a competitive advantage or lead to their demise.

Set the tone at the top

While they may seem insignificant at the time, small actions and decisions by company leadership can add up to big consequences and contribute to the ethical fabric of the workplace. First and foremost, business and HR leaders must prioritize a renewed commitment to transparency — and make it known. Then, they can incorporate tools and strategies to make their values more visible across the entire company, including frontline and deskless workers (i.e., the 80 percent of the workforce that doesn’t sit at a computer).

Also read: 5 ways leaders ruin employee engagement 

An authentic presence from leaders is the most important element in building trust with employees. This not only boosts productivity and performance but also prevents behavior that creates a toxic work environment. Leaders who are genuine and open in their communications can also keep digital water coolers from spreading misinformation around the workplace, especially on today’s social and collaboration platforms, which make it easy for anyone to do so.

When challenges do arise, it is critical to get ahead of the conversation through proactive, honest communication, sharing the “why” behind decisions so employees hear it straight from the source. Business and HR leaders should be vigilant in sharing these types of company updates to instill trust and reinforce values. 

Shockingly, only 16 percent of employees worldwide consider themselves fully engaged. And in the face of an engagement crisis, annual or quarterly surveys aren’t enough to ensure employees’ needs are met. Instead, an approach that draws on data from employee behavior and pulse polls delivered at optimal times can give leaders a real-time temperature read on their organization. These insights can quickly be turned into action to most efficiently reach and engage all employees.

The most effective way to align the workforce must take employee preferences into account. For example, some workers may find nontraditional and more interactive forms of communications to be a welcome change from email or chat, which can create an “always on” culture and lead to burnout. 

More vibrant media, such as audio and video, make the quality of interactions far richer, facilitating community-building and allowing distributed workers to feel closer to the business. Whatever their preferences may be, tailoring engagement strategies through a data-based, personalized approach ensures all employees get the information they need to build trust.

Empower employees to speak up

In an era of employee activism, organizations must not only support but actively encourage employees to make themselves heard. Instead of top-down communications, establishing two-way communication channels and mechanisms for feedback gives employees the opportunity to provide perspectives and ask questions in a way that holds leaders accountable. 

With this in mind, organizations should acknowledge and ensure that all employee feedback is heard and proper action is being taken. HR managers should use the data and insights from these channels and programs to reevaluate their diversity, equal pay or other policies and make sure they are as impactful as possible. They should also use these tools and insights to implement valuable recognition programs, whether rewards, promotions, bonuses or other programs so employees feel motivated to do their best work. 

In 2020, ethical leadership will no longer be an option, but an imperative that directly impacts the bottom line, pushing companies to build ethics into policies and practices, place a renewed focus on culture and seek ways to measure the impact of their efforts. 

In a digital workplace, business leaders must adopt tools, technologies and practices to create a more connected, engaged and productive workforce or risk losing trust in an era when it’s needed most.


 
