TY - JOUR
T1 - Demystifying AI for the workforce
T2 - the role of explainable AI in worker acceptance and management relations
AU - Yang, Miles M.
AU - Lu, Ying
AU - Cooke, Fang Lee
N1 - Publisher Copyright:
© 2025 Society for the Advancement of Management Studies and John Wiley & Sons Ltd.
PY - 2025
Y1 - 2025
N2 - In the digital era, organizations are increasingly leveraging artificial intelligence (AI) to optimize their operations and decision-making. However, the opaqueness of AI processes raises concerns over trust, fairness, and autonomy, especially in the gig economy, where AI-driven management is ubiquitous. Drawing on cognitive load theory, this study investigates how explainable AI (xAI), through the comparative use of counterfactual versus factual and local versus global explanations, shapes gig workers’ acceptance of AI-driven decisions and management relations. Using experimental data from 1,107 gig workers, we found that both counterfactual (relative to factual) and local (relative to global) explanations increase the acceptance of AI decisions. However, the combination of local and counterfactual explanations can overwhelm workers, thereby reducing these positive effects. Furthermore, worker acceptance mediated the relationship between xAI explanations and management relations. A follow-up study using a simplified scenario and additional procedural controls confirmed the robustness of these effects. Our findings underscore the value of carefully tailored xAI in fostering equitable, transparent, and constructive organizational practices in digitally mediated work environments.
AB - In the digital era, organizations are increasingly leveraging artificial intelligence (AI) to optimize their operations and decision-making. However, the opaqueness of AI processes raises concerns over trust, fairness, and autonomy, especially in the gig economy, where AI-driven management is ubiquitous. Drawing on cognitive load theory, this study investigates how explainable AI (xAI), through the comparative use of counterfactual versus factual and local versus global explanations, shapes gig workers’ acceptance of AI-driven decisions and management relations. Using experimental data from 1,107 gig workers, we found that both counterfactual (relative to factual) and local (relative to global) explanations increase the acceptance of AI decisions. However, the combination of local and counterfactual explanations can overwhelm workers, thereby reducing these positive effects. Furthermore, worker acceptance mediated the relationship between xAI explanations and management relations. A follow-up study using a simplified scenario and additional procedural controls confirmed the robustness of these effects. Our findings underscore the value of carefully tailored xAI in fostering equitable, transparent, and constructive organizational practices in digitally mediated work environments.
KW - acceptance of AI-driven decision
KW - explainable AI (xAI)
KW - gig worker
KW - management relations
KW - scenario-based experiment
UR - https://www.scopus.com/pages/publications/105024548062
U2 - 10.1111/joms.70039
DO - 10.1111/joms.70039
M3 - Article
AN - SCOPUS:105024548062
SN - 0022-2380
JO - Journal of Management Studies
JF - Journal of Management Studies
ER -