Today: July 09, 2025

AI could determine whether you get hired or fired as more managers rely on the technology at work


Here’s a scary thought: Your job security could be in the hands of AI. 

That’s according to a new study from career site Resumebuilder.com, which found that more managers are relying on tools like ChatGPT to make hiring and firing decisions.

Managers across the U.S. are increasingly outsourcing personnel-related matters to a range of AI tools, despite not being well-versed in how to use the technology, according to the survey of more than 1,300 people in manager-level positions across different organizations.

The survey found that while one-third of people in charge of employees’ career trajectories have no formal training in using AI tools, 65% use them to make work-related decisions. Managers appear to be leaning even more heavily on AI when deciding who to hire, fire or promote, according to the survey: 94% of managers said they turn to AI tools when tasked with determining who should be promoted, earn a raise or even be laid off.

The growing reliance among managers on AI tools for personnel-related decisions is at odds with the notion that these tasks often fall under the purview of human resources departments. But companies are quickly integrating AI into day-to-day operations, and urging workers to use it. 

To be sure, there are risks associated with using generative AI to determine who climbs the corporate ladder and who loses their job, especially if those using the technology don’t understand it well. 

“AI is only as good as the data you feed it,” Pandey said. “A lot of folks don’t know how much data you need to give it. And beyond that … this is a very sensitive decision; it involves someone’s life and livelihood. These are decisions that still need human input — at least a human checking the work.”

In other words, problems can arise when AI is increasingly determining staffing decisions with little input from human managers.

“The fact that AI could be in some cases making these decisions start to finish — you think about a manager just asking ChatGPT, ‘Hey, who should I lay off? How many people should I lay off?’ That, I think, is really scary,” Pandey said.

Companies could also find themselves exposed to discrimination lawsuits. 

“Report after report has told us that AI is biased. It’s as biased as the person using it. So you could see a lot of hairy legal territory for companies,” Pandey said. 

AI could also struggle to make sound personnel decisions when a worker’s success is measured qualitatively rather than quantitatively.

“If there aren’t hard numbers there, it’s very subjective,” Pandey said. “It very much needs human deliberation. Probably the deliberation of much more than one human, also.”
