Artificial Intelligence is a part of our world more than ever. From Alexa and Cortana in our homes telling us what time it is, to digital twins solving supply chain problems, to websites giving us suggestions based on our shopping, AI has touched our lives and become present in at least some of our everyday activities. One big place AI shows up is when we apply for a job. AI-based hiring is, at its core, an undertaking by companies to make “better” hiring decisions for talent. The pandemic disrupted the future of work outright, pushing work to become virtual or home-based and making applying online more appealing, and more common, than ever before. With hundreds and hundreds of resumes flowing in for a single company role, the eyes of a human recruiter can only be so accurate. Enter AI to help identify the targeted experience (as that is almost always the measuring stick for companies) as well as the knowledge and skills of applicants from their resumes. From there, AI takes over the repetitive and tedious tasks of analyzing, sorting, and delivering the resumes that match the experience and skills the company believes will achieve success in the role.
Supposedly, AI’s evolution from hiring into supporting candidate tasks such as sourcing, selection, onboarding, and even terminations creates opportunities for companies to become more cost-effective and efficient in filling roles while transitioning HR brainpower to more value-driven work and decision making. We must always realize, though, that AI is still a man-made resource that will think like we do and decide like we do, in some cases even more rigidly than we do. From that position, human bias can originate and take root, ensuring that some candidates do not have a fair and equal opportunity to be hired. I point to this specifically in experience (if you don’t have the full 5 years, but 4.25, does that put your resume in the rejection pile?), but the greater consternation is bias around human demographics, including ability, race, age, and gender for starters. AI is built from resources including training data and machine learning models, and both are built to understand human trends. It is all but a solid guarantee that those resources carry some form of human bias when the company using them is trying to achieve a hiring goal for its roles. With companies driven toward the obvious economic goal, increasing revenue and decreasing costs, they will favor specific traits or characteristics, and that favoring can cause exclusionary behavior in hiring.
With human bias present, AI should play an extremely small role in the hiring process, supplying information only and making no judgment communications or statements about hiring. With its power to gather information and decipher it as we do, AI should instead shift toward organizational development and organizational change for the professionals already working in the company. Development activities include employee engagement (deploying chatbots to perform wellness checks throughout the day or to help with work performance), employee experience (helping with employee benefits or team collaboration), and direct professional development (initiating training, learning, and even coaching). In that last area, AI can guide the value-driven “5W-1H”: who needs training, what subjective or objective goal they need it on, when they would need it, where they could perform it, why they would need it, and how it would be deployed.
One way AI could positively impact training and learning is in deciding when to engage the employee. Using the four types of analytics (descriptive, diagnostic, predictive, and prescriptive), AI could take performance data from a job role and recommend that employees be placed in training or learning for a leadership or management role. One example would be using predictive analytics to anticipate how a professional’s relationship-building skills could impact leadership in a specific business function, and whether they would meet or exceed the standards the company sets for that function. Another would be using diagnostic analytics to identify the cultural behaviors of the company’s workforce and build a career curriculum, leadership or otherwise, for a group of employees that exhibits certain characteristics. From there, the company can design its workforce positions to create “win-win” opportunities: employees get the career opportunities they want, in increased wages and other career rewards, and companies increase retention of their workforce and create a leadership pipeline for the future. Still another, and probably the most common at first thought, would be descriptive analytics: reporting patterns in role or talent performance and productivity, not only giving company leadership a glimpse of which work patterns are and are not working, but also recommending remedies or interventions, from appreciative inquiry (building on strength or opportunistic patterns) to organizational change (improving on weakness or threatening patterns), that would help companies allocate resources accordingly.
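To make the descriptive-analytics idea concrete, here is a minimal sketch in Python. All employee names, metrics, and thresholds are hypothetical; the point is only to show how a system might summarize raw performance history into the strength, weakness, or inconsistent patterns described above.

```python
# Minimal sketch of descriptive analytics on performance data.
# All names, numbers, and thresholds here are hypothetical.
from statistics import mean

# Weekly quota attainment per employee (1.0 = 100% of quota).
performance = {
    "rep_a": [1.20, 1.15, 1.10, 1.25],
    "rep_b": [0.85, 0.80, 0.90, 0.75],
    "rep_c": [1.05, 0.70, 1.10, 0.65],
}

def describe(history):
    """Summarize a performance history into a simple pattern label."""
    avg = mean(history)
    if avg >= 1.0 and min(history) >= 1.0:
        return avg, "strength"      # candidate for appreciative inquiry
    if avg < 1.0 and max(history) < 1.0:
        return avg, "weakness"      # candidate for a targeted intervention
    return avg, "inconsistent"      # needs human review before any action

for name, history in performance.items():
    avg, pattern = describe(history)
    print(f"{name}: avg attainment {avg:.2f} -> {pattern}")
```

Note the deliberate third bucket: a mixed record is routed to human review rather than to an automatic recommendation, which is consistent with keeping AI in an informational role.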
Having AI present in the training, learning, and coaching of employees, or expanding into organizational development or change, is not without its own exposure to human bias. Remember, companies can input whatever development they think will make their workforce successful in the market where they do business. A company can request that training be in sales development and direct the AI to meticulously watch and report on patterns among professionals in the sales department. A sales professional could do great the first two weeks by exceeding quota, then do worse the next two weeks, and the dip could stretch into a pattern. The AI would report this and suggest that the person take sales development training because of the pattern, even when the professional does not need it. Another instance could be the selection of leaders. The aforementioned demographics could linger, with the AI consistently recommending leadership courses to a group within a specific age range (again, due to positive performance results) and creating discrimination, because it cannot tell the difference or make a reporting decision that accounts for age.
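The false-positive scenario above can be sketched in a few lines. In this hypothetical example, a naive rule recommends training whenever recent performance trends downward, and it flags a rep who is still exceeding quota every single week:

```python
# Hypothetical sketch: a naive trend rule that recommends training
# on any decline, even when the employee still exceeds quota.

def recommend_training(weekly_attainment):
    """Flag training if the recent half averages below the earlier half."""
    half = len(weekly_attainment) // 2
    early = sum(weekly_attainment[:half]) / half
    recent = sum(weekly_attainment[half:]) / (len(weekly_attainment) - half)
    return recent < early

# Exceeds quota (>= 1.0) every week, but dips from a very strong start.
strong_rep = [1.40, 1.35, 1.10, 1.05]
print(recommend_training(strong_rep))  # True: flagged despite exceeding quota
```

The rule is internally consistent yet produces exactly the unneeded recommendation described above, because it measures relative decline rather than whether the professional actually needs development.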
For employee development, AI should be a complementary partner, providing information and recommendations across a fair board. While using it, we should be aware that it, like us, is not pristine and flawless in its findings. I do think that using AI for training and learning would be far more vital to company success than using it to find and hire talent, but even with those responsibilities, it should not be a judge in the development of professionals. Deploying AI in professional development would also be a positive and accelerated step toward investing in employees, and it holds the potential for C-suite leadership to connect with its workforce through training and learning. It must always be the rule that humans lead learning and training for other humans while AI plays a fixed role. In that role, AI can be the sidekick and even the heroic partner, but it must never be the hero of professional development.
Dr. Simpson’s approach is that of a talent and people business partner who deploys extensive experience in organizational development, people and organizational analytics, and stakeholder-entrepreneurship business development to drive employee engagement, thinking, and behavior toward “business partnership and leadership”.