Understanding international benchmarks on student engagement - Awareness, research alignment and response from a computer science perspective

Michael Morgan, Matthew Butler, Jane E Sinclair, Gerry Cross, Janet Fraser, Jana Jackova, Neena Thota

    Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Other

    Abstract

    There is an increasing trend to use national benchmarks to measure student engagement, with instruments such as the National Survey of Student Engagement (NSSE) in the USA and Canada, the Student Experience Survey (SES) in Australia and New Zealand (previously known as the University Experience Survey, UES), and the Student Engagement Survey (UKSES) in the UK. Unfortunately, Computer Science (CS) rates fairly poorly on a number of measures in these surveys, even when compared to related STEM disciplines. Initial research suggests there may be several reasons for this poor performance: i) the suitability of the instruments to the CS context, ii) a lack of awareness among CS academics of these instruments and the student engagement measures on which they are based, and iii) a misalignment between these instruments and the research focus of computing educators, leading to misdirected efforts in research and teaching practice. This working group focused on the last two aspects of this issue. We carried out an in-depth analysis of international student engagement instruments to facilitate greater awareness of the international benchmarks and of the aspects of student engagement they measure. The working group also examined the focus of current computing education research and its alignment with the student engagement measures on which these instruments are based. Armed with this knowledge, the computing education community can make informed decisions on how best to respond to these measures and consider ways to improve our performance relative to other disciplines. In particular, it is important to understand why certain measures of student engagement are built into these instruments and how they align with our current research practice. Given the global nature of these benchmarks, an ITiCSE working group was needed to obtain the international perspectives required to address these challenges.
This ITiCSE working group facilitated international input to the following activities:

Stage 1 - A study examining trends and variations in the computing-discipline data from several international student engagement instruments (NSSE, SES, UKSES) over the past decade, including:
• A longitudinal study of the data, comparing results across the various instruments.
• Benchmarking of the performance of CS, with a focus on comparisons to other STEM disciplines.
• A summary of the published data and the perceptions it may generate of CS among prospective students and university administrators.

Stage 2 - Analysis of instrument design and the student engagement measures the instruments use, examined through:
• A comparative analysis of the various student engagement survey instruments.
• An analysis of the engagement measures used in the instruments (this analysis was contrasted with the focus of current computing education research).
• A survey of the research literature used to justify the use of these student engagement measures.

Stage 3 - Meta-analysis of current CS research literature related to computing education, with specific focus on initiatives to promote student engagement, including:
• A survey of current topics in the computing education research literature to establish the discipline's current research focus.
• A focus on research and teaching initiatives to promote student engagement.
• An analysis of the alignment of current CS research with the student engagement measures used in international survey instruments.

Stage 4 - Interviews on CS academics' perceptions of the student engagement measures/questions in the various survey instruments, focusing on:
• Interviews with academics examining how they respond to specific questions extracted from current survey instruments, how they interpret the questions in the context of CS, and how the questions relate to their teaching practice.
• Analysis of academics' responses to international student engagement survey instruments and the engagement measures used.

Stage 5 - Responding to the findings by:
• Providing suggestions for adapting the focus of future computing education research to maximise outcomes from existing international benchmarks on student engagement.
• Presenting ways of increasing staff awareness of international benchmarks and student engagement measures.

Since these student engagement measures are widely publicised, and are used by students to make course selection decisions and by administrators to assess courses, it is important for the computing education discipline to have a greater awareness of these instruments and their design. We need to better understand how current computing education research relates to these instruments and the engagement measures they use. A better understanding of the engagement measures used in these instruments, why they are considered important, and how they align with our teaching and research practice, is crucial if we are to improve the performance of the computing discipline in these national benchmarks. Finally, we must investigate ways in which we might respond as a discipline to improve the performance of CS on these student engagement measures.

    Original language: English
    Title of host publication: ITiCSE'17 - Proceedings of the 2017 ACM Conference on Innovation and Technology in Computer Science Education
    Editors: Irene Polycarpou, Guido Rößling
    Place of publication: New York NY USA
    Publisher: Association for Computing Machinery (ACM)
    Pages: 383-384
    Number of pages: 2
    ISBN (Print): 9781450347044
    DOI: 10.1145/3059009.3081324
    Publication status: Published - 28 Jun 2017
    Event: Annual Conference on Innovation and Technology in Computer Science Education 2017 - Bologna, Italy
    Duration: 3 Jul 2017 - 5 Jul 2017
    Conference number: 22nd
    https://dl.acm.org/citation.cfm?id=3059009 (Proceedings)

    Conference

    Conference: Annual Conference on Innovation and Technology in Computer Science Education 2017
    Abbreviated title: ITiCSE 2017
    Country: Italy
    City: Bologna
    Period: 3/07/17 - 5/07/17
    Internet address: https://dl.acm.org/citation.cfm?id=3059009

    Keywords

    • Computer science
    • International benchmarks
    • Student engagement

    Cite this

    Morgan, M., Butler, M., Sinclair, J. E., Cross, G., Fraser, J., Jackova, J., & Thota, N. (2017). Understanding international benchmarks on student engagement - Awareness, research alignment and response from a computer science perspective. In I. Polycarpou, & G. Rößling (Eds.), ITiCSE'17 - Proceedings of the 2017 ACM Conference on Innovation and Technology in Computer Science Education (pp. 383-384). New York NY USA: Association for Computing Machinery (ACM). https://doi.org/10.1145/3059009.3081324

    Scopus record: http://www.scopus.com/inward/record.url?scp=85029492646&partnerID=8YFLogxK