The human touch in Artificial Intelligence (AI) decision-making: investigating employee fairness perceptions of human-AI collaboration in the workplace

  • Anna Pauline Gescher (Student)

Student thesis: Master's Thesis


The future of work is envisioned as a collaboration between humans and machines, but the use of AI for management decision-making raises concerns about employees’ fairness perceptions. An experimental study investigated 797 employees’ perceptions of the procedural, interpersonal, and informational fairness of three proposed hybrid decision-making approaches, comparing the results to perceptions of fully automated decision-making in three common management contexts. The study shows that not all hybrid decision-making approaches are perceived as fairer than fully automated decision-making. The aggregated human-AI decision-making approach, in which AI and humans collaborate by performing specific evaluation tasks based on their respective strengths, is perceived by employees as the fairest. Fairness perceptions remained consistent across the decision contexts of allocating promotions, bonus payments, and career training, suggesting that the perceived fairness of hybrid approaches may not be context-dependent. The qualitative responses shed light on employee satisfaction and concerns related to both AI and human involvement in hybrid decision-making. This thesis discusses the implications of these findings and directions for future research. It contributes to the understanding of fairness perceptions in human-machine collaboration and informs decision-makers about the most effective approach for implementing AI in the workplace.
Date of Award: 16 Oct 2023
Original language: English
Awarding Institution
  • Universidade Católica Portuguesa
Supervisor: Cristina Soares Pacheco Mendonça (Supervisor)


  • Artificial intelligence
  • Hybrid decision-making
  • Algorithmic fairness
  • Organizational justice theory
  • Employee perception


  • Mestrado em Gestão e Administração de Empresas
