HR chatbots like the ones Claypool encountered are increasingly being used in industries like healthcare, retail and restaurants to filter out unqualified applicants and schedule interviews with the ones who might be right for the job.

McDonald's, Wendy's, CVS Health and Lowe's use Olivia, a chatbot developed by Arizona-based $1.5 billion AI startup Paradox. Other companies, like L'Oréal, rely on Mya, an AI chatbot developed in San Francisco by a startup of the same name. (Paradox didn't respond to a request for comment about Claypool's experience.)

Most hiring chatbots are not as advanced or elaborate as contemporary conversational chatbots like ChatGPT. They've primarily been used to screen for jobs that have a high volume of applicants: cashiers, warehouse associates and customer service assistants. They are rudimentary, asking fairly straightforward questions: "Do you know how to use a forklift?" or "Are you able to work weekends?" But as Claypool found, these bots can be buggy, and there isn't always a human to turn to when something goes wrong.

And the clear-cut answers many of the bots require could mean automatic rejection for some qualified candidates who might not answer questions the way a large language model wants them to. That could be a problem for people with disabilities, people who are not proficient in English and older job applicants, experts say.

Aaron Konopasky, senior attorney advisor at the U.S. Equal Employment Opportunity Commission (EEOC), fears chatbots like Olivia and Mya may not provide people with disabilities or medical conditions with alternative options for availability or job roles. "If it's a human being that you're talking to, there's a natural opportunity to talk about reasonable accommodations," he told Forbes. "If the chatbot is too rigid, and the person needs to be able to request some kind of exemption, then the chatbot might not give them the opportunity to do that."

"It's sort of like how Netflix recommends movies based on other movies you like," said Jeremy Schiff, CEO and founder of RecruitBot.

Discrimination is another concern. Underlying prejudice in the data used to train AI can bake bias and discrimination into the tools in which it's deployed. "If the chatbot is looking at things like how long it takes you to respond, or whether you're using correct grammar and complex sentences, that's where you start worrying about bias coming in," said Pauline Kim, an employment and labor law professor at Washington University whose research focuses on the use of AI in hiring tools. But such bias can be tough to detect when companies aren't transparent about why a potential candidate was rejected.

Recently, government authorities have introduced legislation to monitor and regulate the use of automation in hiring tools. In early July, New York City enacted a new law requiring employers who use automated tools like resume scanners and chatbot interviews to audit them for gender and racial bias. In 2020, Illinois passed a law requiring employers who apply AI to analyze video interviews to notify applicants and obtain their consent.

Still, for companies looking to trim recruiting costs, AI screening agents seem an obvious option. HR departments are often one of the first places to see staff reductions, said Matthew Scherer, a senior policy counsel for workers' rights and technology at the Center for Democracy and Technology. "Human resources has always been a cost center for a company; it's never been a revenue-generating thing," he explained. "Chatbots are a very logical first step to try and take some of the load off of recruiters."

That's part of the rationale behind Sense HQ, which provides companies like Sears, Dell and Sony with text messaging-based AI chatbots that help their recruiters wade through thousands of applicants. The company claims its chatbots have already been used by some 10 million job applicants, and co-founder Alex Rosen told Forbes such numbers mean a much bigger pool of viable candidates.