The HR tech industry continues to introduce innovations that promise to make hiring smarter, faster, and easier. Yet even as the number of applications per position grows, some talent leaders are rethinking how, and how much, they want to rely on this technology alone.
During a From Day One webinar on recruiting trends and technology heading into 2026, talent leaders debated the ways in which tech will automate and augment the hiring process, and how to strike the right balance between human and artificial intelligence. They agreed on this: AI is here to stay, but so is the human touch.
What Is Responsible AI?
For Kim Stevens, director of talent acquisition at recruiting platform Employ, responsible AI can be defined simply: “It’s technology that supports people. It doesn’t replace them.”
In practice, that means AI never overrides human judgment, and a person always makes the final call. It also means the technology can explain itself. When Employ’s AI screening tool flags standout candidates, it provides a clear rationale. This information helps keep the company audit-ready, a legal necessity and, some argue, a moral one.
Mike Rockwell, VP of account management at Verified First, which conducts employee background checks, added that responsibility also includes security. “If you think about the most popular AI tools people are using, like ChatGPT, if you put something in there, everyone has it,” he said. Sensitive hiring information can’t be treated casually. Employers need to ensure the tools they adopt have the infrastructure to keep candidate data protected.
Transparency with candidates is part of responsible AI use. If a company relies on AI tools in recruiting, it should be upfront about that, Rockwell said. Job seekers who feel misled or entirely cut off from real human interaction aren’t likely to walk away with a positive view of the employer.

Erica Wallace, senior talent acquisition manager at HR management software company BambooHR, said that some organizations still struggle to formalize internal guidelines. “We can’t say people are breaking the rules if we haven’t established what those rules are,” she said. At BambooHR, guardrails are built directly into internal tools. Even if a recruiter tries to ask the AI who they should hire, the system is designed to decline.
Clarity also extends to the candidate experience. Some job seekers assume recruiters haven’t looked at resumes in years, Wallace joked. To counter that perception, her team tells candidates upfront if AI will be used in the interview process and gives them the option to opt out. Every AI-generated decision, from whether to advance a resume to the next round to any ranking of applicants, is still audited by a human.
The panelists agreed that it’s always worth questioning whether technology is saving time or quietly creating more work. “That’s something we’re asking all the time,” said Wallace. Her rule of thumb: AI should only be introduced to solve a clearly named problem. Too many vendors, she said, are inventing products for problems that hiring teams don’t actually face. Adopting tech for tech’s sake is a reliable way to burn hours, not reduce them.
Some tasks require a combination of the human touch and tech power. Fraud prevention, for instance, has pushed BambooHR back toward more in-person interviews to verify a candidate’s identity. Employ has also increased the amount of screening done by human recruiters. And Stevens cautioned against “over-engineering” the process by letting AI handle too much candidate messaging, especially deeper in the funnel where a personal touch matters most.
What Employers Should Focus on in 2026
As hiring teams plan for the new year, Stevens encourages leaders to think about candidate communication as a baseline requirement. With so many applicants across industries, it’s common for job seekers to hear nothing or receive only canned responses. “It is our responsibility as humans to treat other humans as such,” she said. AI can help clear the noise and reduce administrative work, but it shouldn’t replace meaningful interaction.
That means reinvesting time in the humans doing the hiring. Spend more time with your recruiters, Stevens said. Help them become better interviewers, better communicators, and more empathetic guides in a challenging job market. AI can accelerate workflows, but it can’t build trust or make someone feel valued.
Technology should enhance the human element, not erode it, panelists agreed. “We have an obligation to try the best we can to remain human and keep that human element, even with the advancements in technology and AI,” said Stevens.
“One way to differentiate is to lead with kindness and empathy in everything you do,” Rockwell said. “There’s someone on the other end that’s trying to get a job because they need to pay bills, they need to feed their kids, they need to be sitting in a seat so they have a career. It’s really easy to forget about that when everything’s happening through a computer.”
Editor’s note: From Day One thanks our partner, Employ, for sponsoring this webinar.
Emily McCrary-Ruiz-Esparza is an independent journalist and From Day One contributing editor who writes about business and the world of work. Her work has appeared in the Economist, the BBC, The Washington Post, Inc., and Business Insider, among others. She is the recipient of a Virginia Press Association award for business and financial journalism. She is the host of How to Be Anything, the podcast about people with unusual jobs.
(Photo by Take Production Studios/Shutterstock)
The From Day One Newsletter is a monthly roundup of articles, features, and editorials on innovative ways for companies to forge stronger relationships with their employees, customers, and communities.