AI is everywhere. It's found in our homes, offices, and in public places. But does it have a spot in the world of recruiting and hiring? MetroStar's interns and Client Solutions Group (CSG) — a research and development group focused on applying new and emerging technologies — spent the summer researching and creating a tool (StarFinder) to tackle ethical AI in recruitment.
Artificial intelligence (AI) and machine learning (ML) can accelerate recruitment and spare hiring teams from sifting through sometimes thousands of resumes, each of which a recruiter may only spend a few seconds reviewing. Adding AI tools to the recruitment process also aims to ease the constant context-switching recruiters face when juggling multiple job openings.
AI/ML can perform an initial 'first pass' on resumes, freeing recruiters to focus on more detailed candidate assessments with less time pressure. The first pass checks a resume against the basic job requirements (e.g., clearance level, education, skillsets, and years of experience). This check is vital for government jobs. Unlike recruiters for commercial positions, recruiters in the world of government contracting must follow strict hiring rules defined in legal frameworks like the LCAT.
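To make the 'first pass' concrete, here is a minimal sketch of what such a requirement check might look like. The field names, degree ranking, and thresholds are hypothetical illustrations, not StarFinder's actual schema.

```python
# Illustrative "first pass" screen: check a resume against a job's basic
# requirements. Field names and values are hypothetical.

def first_pass(resume: dict, job: dict) -> tuple[bool, list[str]]:
    """Return (passes, failed_requirements) for the basic requirement check."""
    failures = []
    if job.get("clearance") and resume.get("clearance") != job["clearance"]:
        failures.append("clearance")
    degree_rank = {"none": 0, "associate": 1, "bachelor": 2, "master": 3, "phd": 4}
    if degree_rank.get(resume.get("degree", "none"), 0) < degree_rank.get(job.get("degree", "none"), 0):
        failures.append("education")
    if resume.get("years_experience", 0) < job.get("min_years", 0):
        failures.append("years of experience")
    missing = set(job.get("required_skills", [])) - set(resume.get("skills", []))
    if missing:
        failures.append("skills: " + ", ".join(sorted(missing)))
    return (not failures, failures)

job = {"clearance": "Secret", "degree": "bachelor", "min_years": 3,
       "required_skills": ["python", "sql"]}
candidate = {"clearance": "Secret", "degree": "master", "years_experience": 5,
             "skills": ["python", "sql", "spark"]}
ok, why = first_pass(candidate, job)
```

A candidate who clears every gate moves on to human review; one who fails carries a list of the specific unmet requirements, which keeps the screen auditable.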
The Labor Category (LCAT) stipulates the years of experience, degree, certifications, and U.S. citizenship required of candidates due to clearance requirements. If a contractor tries to hire someone into a position who doesn't meet the experience, degree, or other requirements, the contracting officer will push back and decline the candidate.
"AI may not find the perfect resume, but it will find the ones that are best to review," said recruiter Stuart Strange. "Tools like StarFinder can also search internal databases throughout a company to find current employees' new projects as their current ones come to an end. In contracting, it's more than hiring. It's also about retaining great people."
The purpose of the StarFinder project is to expedite the recruitment process and lower attrition. During the internship, the team collected and prepared resume data, processed it for further analysis, and integrated the results with a web interface the interns built, making the information more accessible across the organization.
While StarFinder is a proof of concept, it can be extended to handle larger volumes of job listings and candidates in the future. More efficient data storage solutions and other ways to improve code efficiency, such as multithreading and cluster computing, will need to be explored.
The advancement of AI/ML in the workforce has been on the minds of many Americans. According to the Pew Research Center, 56 percent of the 11,004 Americans surveyed think AI will have a significant impact on the workforce. Not surprisingly, 41 percent of those surveyed oppose the use of AI in the hiring process.
Acknowledging these concerns and working to make the process as ethical as possible is not just a nice-to-have for organizations but a vital requirement. StarFinder is still in its early stages, but it already builds in ethical and transparent measures.
"I give resumes four to six seconds of review," said Strange. "Tools like this aid us to find the top candidates quicker. We bring back the human side when we interview potential candidates. So, I don't have any concerns because if those top five (or more) weren't a match after interviews, we can go back and dig to find a candidate with all the must-haves."
Currently, the tool removes personally identifiable information like a candidate's name, address, email, and phone number from the resume to anonymize them; prompts are separated and only given the information needed for their specific task. These prompts search for things such as candidate education and work experience and feed that information into the comparison prompts, which helps limit the scope of potentially sensitive information being used as an input.
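As a simplified illustration of that anonymization step, the sketch below strips emails and phone numbers with regular expressions and redacts a known candidate name. The patterns are intentionally basic; a production system like StarFinder would likely rely on more robust entity detection.

```python
import re

# Simplified resume anonymization: redact emails and phone numbers via
# regex, and a known candidate name by direct replacement. Patterns are
# illustrative, not StarFinder's actual implementation.

EMAIL = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
PHONE = re.compile(r"(\+?1[\s.-]?)?\(?\d{3}\)?[\s.-]?\d{3}[\s.-]?\d{4}")

def anonymize(text: str, known_name: str) -> str:
    text = EMAIL.sub("[EMAIL]", text)
    text = PHONE.sub("[PHONE]", text)
    return text.replace(known_name, "[NAME]")

resume_header = "Jane Doe | jane.doe@example.com | (703) 555-0142 | Arlington, VA"
clean = anonymize(resume_header, "Jane Doe")
```

Redacting before any prompt is assembled means the downstream comparison prompts never see the sensitive fields at all, which is the point of separating the prompts by task.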
It's important to note that hiring involves more nuanced elements that start before applications are ever reviewed. "An ethical way of hiring starts with where you post the job. You need to ensure the posting gets in front of the eyes of different people with different backgrounds," said Strange. "You can't just hope diverse people will see your job and apply. You must be proactive and not reactive."
The tech and tools used during this project not only gave the interns insight into real-world solutions but also produced a well-crafted proof of concept that can benefit MetroStar. The project's three main subsections are data science, data analysis, and development. Below is a deeper look at the tools that make AI in recruitment possible.
At the core of the system, an OpenAI model assesses job listings and candidates by extracting information from both. The team breaks down job requirements and resume details and gives the model specific instructions to keep the results accurate and consistent. Once the data is collected and organized, information like skills, years of experience, education, clearance, and certifications is sorted.
StarFinder compares job requirements with the data of all the candidates. It assigns a score to each candidate-job pair by using different questions and methods for each requirement category. These scores are combined to make an overall score, which ranks the candidates for that job.
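One plausible way to combine per-category scores into an overall ranking is a weighted average, sketched below. The weights and score values are invented for illustration; StarFinder's actual scoring questions and weighting are not published.

```python
# Hypothetical weighting scheme for combining per-category scores (each in
# [0, 1]) into one overall candidate-job score. Weights are illustrative.

WEIGHTS = {"skills": 0.35, "experience": 0.25, "education": 0.15,
           "clearance": 0.15, "certifications": 0.10}

def overall_score(category_scores: dict) -> float:
    """Weighted average of per-category scores."""
    return round(sum(WEIGHTS[c] * category_scores.get(c, 0.0) for c in WEIGHTS), 3)

def rank_candidates(scores_by_candidate: dict) -> list:
    """Return candidate ids sorted by overall score, best first."""
    return sorted(scores_by_candidate,
                  key=lambda cid: overall_score(scores_by_candidate[cid]),
                  reverse=True)

scores = {
    "cand_a": {"skills": 0.9, "experience": 0.8, "education": 1.0,
               "clearance": 1.0, "certifications": 0.5},
    "cand_b": {"skills": 0.6, "experience": 1.0, "education": 0.5,
               "clearance": 1.0, "certifications": 1.0},
}
ranking = rank_candidates(scores)
```

Keeping the per-category scores around, rather than only the final number, is what later lets a recruiter see why a candidate ranked where they did.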
When a new candidate joins the system, their information is compared against all available job listings in the same way. Every day, the system checks for new candidate and job listing information and processes it as needed, ensuring the system always has the most up-to-date information.
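The daily incremental pass can be sketched as follows: only unseen candidates are scored, each against every open listing. The skill-overlap scorer is a toy stand-in for StarFinder's prompt-based comparison, and the id fields are invented.

```python
# Sketch of a daily incremental pass: each previously unseen candidate is
# scored against every open job listing. skill_overlap is a toy stand-in
# for the real prompt-based comparison.

def skill_overlap(cand: dict, job: dict) -> float:
    need = set(job["skills"])
    return len(need & set(cand["skills"])) / len(need) if need else 0.0

def process_new(candidates: list, jobs: list, seen_ids: set) -> dict:
    results = {}
    for cand in candidates:
        if cand["id"] in seen_ids:
            continue  # already processed on a previous day
        seen_ids.add(cand["id"])
        results[cand["id"]] = {job["id"]: skill_overlap(cand, job) for job in jobs}
    return results

jobs = [{"id": "j1", "skills": ["python", "sql"]},
        {"id": "j2", "skills": ["java"]}]
cands = [{"id": "c1", "skills": ["python"]},
         {"id": "c2", "skills": ["python", "sql", "java"]}]
seen = {"c1"}
new_scores = process_new(cands, jobs, seen)
```

Tracking which ids have been seen keeps the daily job idempotent: re-running it on the same data does no duplicate work.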
Databricks helps the team build a pipeline between Greenhouse, the hiring system, and Azure Blob Storage. This process makes it easier for recruiters to go through candidate information. The team used spaCy's "en_core_web_md" model to find skills and other requirements in job descriptions and resumes.
Databricks lets the team extract, transform, and load data into Blob Storage. The process is flexible: the Python code can transform incoming data easily and run on a set schedule. For more complex parsing, the team used the spaCy model to label data and improve the model's accuracy.
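The article doesn't detail how the spaCy model is applied, but a common pattern for finding skills is matching tokenized text against a skill vocabulary (spaCy offers a PhraseMatcher for exactly this). The pure-Python sketch below mimics that idea without requiring the en_core_web_md model download; the vocabulary is invented.

```python
# Pure-Python stand-in for vocabulary-based skill extraction. spaCy's
# PhraseMatcher does this over tokenized text with the en_core_web_md
# model; this sketch avoids the model dependency for illustration.

SKILL_VOCAB = {"python", "machine learning", "sql", "databricks"}

def extract_skills(text: str) -> set:
    lowered = text.lower()
    return {skill for skill in SKILL_VOCAB if skill in lowered}

job_description = "Seeking a data engineer with Python, SQL, and Databricks experience."
found = extract_skills(job_description)
```

Running the same extractor over both job descriptions and resumes yields comparable skill sets, which is what feeds the downstream comparison step.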
When working with resumes, the team made the data easier to read by using Azure OpenAI. This made it possible for the many different resume formats to be read accurately, despite each one looking different.
An API was created to pull data from the job-listings pipeline, which contains the information for each job role. The API retrieves the top five potential candidates, processes the data, and transforms it into visual representations such as bar graphs, radar charts, and tables so a recruiter can quickly understand the information.
The top five candidates are passed into a results table with links for further details. The details page shows a breakdown of the overall score and how the candidate earned it. It also includes additional information depending on the job post: if the post requires certifications, the page shows whether those certifications are met. Other requirement categories include skills, education, years of experience, and clearances.
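The shape of such a details-page payload might look like the sketch below: per-category scores always appear, while a certifications entry is added only when the job post requires certifications. All field names here are hypothetical.

```python
# Hypothetical details-page breakdown: per-category scores, plus a
# certifications entry only when the job post requires them.

def build_breakdown(cand: dict, job: dict) -> dict:
    breakdown = {
        "skills": cand["skills_score"],
        "education": cand["education_score"],
        "years_experience": cand["experience_score"],
        "clearance": cand["clearance_score"],
    }
    if job.get("required_certs"):
        breakdown["certifications"] = {
            "required": job["required_certs"],
            "met": [c for c in job["required_certs"] if c in cand.get("certs", [])],
        }
    return breakdown

job = {"required_certs": ["Security+", "AWS SAA"]}
cand = {"skills_score": 0.9, "education_score": 1.0, "experience_score": 0.7,
        "clearance_score": 1.0, "certs": ["Security+"]}
details = build_breakdown(cand, job)
```

Showing required versus met certifications side by side, rather than a single pass/fail flag, is what makes the breakdown useful to a recruiter deciding whether a gap is disqualifying.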
Curious about AI/ML technology and the future of recruiting? Learn from an expert.