TY - JOUR
T1 - When artificial intelligence substitutes humans in higher education
T2 - the cost of loneliness, student success, and retention
AU - Crawford, Joseph
AU - Allen, Kelly-Ann
AU - Pani, Bianca
AU - Cowling, Michael
N1 - Publisher Copyright:
© 2024 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
PY - 2024
Y1 - 2024
N2 - Artificial intelligence (AI) may be the new-new-norm in a post-pandemic learning environment. There is a growing number of university students using AI like ChatGPT and Bard to support their academic experience. Much of the AI in higher education research to date has focused on academic integrity and matters of authorship; yet, there may be unintended consequences beyond these concerns for students. That is, there may be people who reduce their formal social interactions while using these tools. This study evaluates 387 university students and their relationship to–and with–artificial intelligence large-language model-based tools. Using structural equation modelling, the study finds evidence that while AI chatbots designed for information provision may be associated with student performance, when social support, psychological wellbeing, loneliness, and sense of belonging are considered it has a net negative effect on achievement. This study tests an AI-specific form of social support, and the cost it may pose to student success, wellbeing, and retention. Indeed, while AI chatbot usage may be associated with poorer social outcomes, human-substitution activity that may be occurring when a student chooses to seek support from an AI rather than a human (e.g. a librarian, professor, or student advisor) may pose interesting learning and teaching policy implications. We explore the implications of this from the lens of student success and belonging.
KW - AI
KW - ChatGPT
KW - chatbot
KW - intention to leave
KW - sense of belonging
KW - university students
UR - https://www.scopus.com/pages/publications/85188046643
U2 - 10.1080/03075079.2024.2326956
DO - 10.1080/03075079.2024.2326956
M3 - Article
AN - SCOPUS:85188046643
SN - 0307-5079
VL - 49
SP - 883
EP - 897
JO - Studies in Higher Education
JF - Studies in Higher Education
IS - 5
ER -