Large Companies Collaborate to Develop Self-Regulation Framework for AI in Hiring

In response to the relative lack of government regulation surrounding the use of artificial intelligence (AI) in hiring, a group of 18 major companies has taken matters into its own hands by crafting a voluntary set of principles and protocols for self-regulation. Developed in partnership with BBB National Programs, the guidelines aim to ensure fairness, transparency, and inclusivity in AI-powered hiring.

Among the companies involved in this initiative are prominent names such as Amazon, Koch Industries, Microsoft, Qualcomm, and Unilever. According to data cited by BBB National Programs, an overwhelming 99% of Fortune 500 companies utilize talent-sifting software, and 55% of human resources leaders rely on predictive algorithms during the hiring process.

Eric Reicin, president and CEO of BBB National Programs, highlighted the need for self-regulation in the absence of clear government rules. He noted that the surge in job applications has driven employers to adopt AI technologies to manage the volume fairly and reduce potential bias in recruitment and hiring. One participating organization, for instance, received more than 20 million applications last year, making it impractical to handle every decision manually. AI-enabled tools and machine learning algorithms have emerged as practical ways to cope with such volumes of data.

The primary objectives of the AI Principles and Protocols developed by the working group include ensuring the validity and reliability of AI systems; promoting equitable outcomes by mitigating harmful biases; fostering inclusivity; enhancing compliance, transparency, and accountability; and striving for systems that are safe, secure, resilient, explainable, interpretable, and privacy-enhanced.

To obtain certification under this self-regulatory framework, employers must take specific steps, including giving applicants clear notice that AI is used to process their applications and ensuring applicants understand how the AI tools function. Companies must also monitor these systems continuously to mitigate biased outcomes. Independent third parties evaluate and certify compliance with the principles and protocols.

However, it is essential to note that this self-regulatory framework does not supersede any existing federal, state, or local laws governing AI use in hiring. Policymakers are beginning to address the subject with proposed legislation and regulations, although the federal government has yet to enact any major law in this domain.

At the local level, New York City has already taken the lead by enacting its Automated Employment Decision Tools law, which requires employers in the city that use AI or machine learning tools in hiring decisions to notify candidates of that use. The law also requires that automated employment decision tools be audited for potential bias by independent auditors.
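To give a sense of what such a bias audit can involve, the sketch below shows one common type of calculation: comparing selection rates across applicant groups and computing impact ratios, in the spirit of the widely cited four-fifths rule. The data, group labels, and 0.8 threshold are illustrative assumptions for this example, not requirements spelled out by the NYC law or the BBB National Programs framework.

```python
# Illustrative sketch of a selection-rate / impact-ratio check, the kind of
# calculation an independent bias audit of an automated hiring tool might run.
# The records and the 0.8 (four-fifths rule) threshold are assumptions for
# demonstration only.

from collections import defaultdict

# Hypothetical audit records: (applicant group, was the applicant selected?)
records = [
    ("group_a", True), ("group_a", False), ("group_a", True), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

# Selection rate per group = number selected / total applicants in that group.
totals, selected = defaultdict(int), defaultdict(int)
for group, was_selected in records:
    totals[group] += 1
    selected[group] += was_selected

rates = {group: selected[group] / totals[group] for group in totals}
best_rate = max(rates.values())

# Impact ratio = a group's selection rate divided by the highest group's rate.
# Ratios below roughly 0.8 are often treated as a flag for possible adverse impact.
for group, rate in rates.items():
    ratio = rate / best_rate
    flag = "review" if ratio < 0.8 else "ok"
    print(f"{group}: selection rate {rate:.2f}, impact ratio {ratio:.2f} ({flag})")
```

In practice, an auditor would run this kind of comparison on real outcome data, broken out by the demographic categories the applicable rules require, and the flagged results would feed into the continuous monitoring the framework calls for.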

BBB National Programs, established in 2019 through the restructuring of the Council of Better Business Bureaus (CBBB), houses industry self-regulation programs. In contrast, the International Association of Better Business Bureaus (IABBB) serves as the headquarters for local Better Business Bureaus across the country, which handle business complaints and scams. Through collaborative efforts like this one, companies are taking proactive steps to implement AI in hiring ethically and responsibly while advocating for fairness and inclusivity in the job market.
